The Economy is Like A Marble on a Ladle – A Friendly Reply to @ProfSteveKeen

In my last post I argued that post-Keynesians needed to take the concept of General Equilibrium seriously and not simply rely on an assumption that everything is in disequilibrium.

The familiar argument that most of the time most of the economy is in disequilibrium is no argument to abandon the concept of equilibrium.  If disequilibrium is the natural state then prices would be essentially random and economics would have nothing to say.  Empirically most prices are fairly stable or follow fairly predictable paths, except in times of crisis.

I don't disagree, and I think it's largely a matter of the terminology and meaning of 'equilibrium'.  I didn't want to get into the math of control theory, so here's a simple analogy.

Imagine a marble in an infinitely large ladle that doesn't wobble.  This kind of system is known as asymptotically stable: the position of the marble will eventually converge to a point of equilibrium.  Indeed the further it is from equilibrium, the greater the force on the marble.  The marble being small and the ladle infinitely large, the marble has little influence on the dynamics of the system.  This is the type of system generally assumed by the term 'equilibrium' in economics, where individual agents are price takers.
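To make the convergence point concrete, here is a minimal sketch (my own illustration; the constants and function name are invented, not anything from Steve's models): the marble as a damped oscillator whose restoring force grows with distance from equilibrium.

```python
# Marble in a bowl as a damped oscillator: x'' = -k*x - c*x'.
# With damping c > 0 the position converges to x = 0 from any start:
# asymptotic stability. (Illustrative constants, not an economic model.)

def settle(x=1.0, v=0.0, k=1.0, c=0.5, dt=0.01, steps=5000):
    for _ in range(steps):
        a = -k * x - c * v   # restoring force grows with distance; damping opposes motion
        v += a * dt
        x += v * dt
    return x

print(settle())  # ~0.0: the marble ends at the bottom of the ladle
```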

Imagine however that the ladle is held by someone trying to guess the movements of the marble.  They will usually be slightly off, so the marble generally moves towards the centre of the ladle but stays within a bounded range, never perfectly at rest at any point.  This type of system is known as Lyapunov stable.

Now imagine the ladle is not infinitely large, so the marble can fall off the edge.  If you guess the future position of the marble badly, it will fall off.  One of the conditions of Lyapunov stability is that there are boundary constraints.

Now finally imagine a system where two ladles are linked together.  If the two holders move roughly in sync the marble will tend to move in a limit cycle; if they don't, it will be wildly unstable.  This kind of system was first studied by Lorenz, with weather patterns as the subject.  If all subjects were blindfolded and uncoordinated then disequilibrium and wild movements would be the norm.
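And a sketch of the linked-ladle case, using Lorenz's own equations (standard parameters; the crude Euler stepper is for illustration only):

```python
# Lorenz's equations with the classic parameters. Two trajectories
# started a hair apart diverge rapidly: sensitive dependence on
# initial conditions, the hallmark of chaos.

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a, b = (1.0, 1.0, 1.0), (1.0 + 1e-9, 1.0, 1.0)
for _ in range(8000):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
print(abs(a[0] - b[0]))  # the initial 1e-9 gap has grown by many orders of magnitude
```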

What kind of system is the economy?  I agree with Steve that neoclassical economics makes simplistic assumptions about equilibrium, using only the narrow class of equilibrium systems that are asymptotically stable – like the marble in an infinitely large ladle, with all movements perfectly predictable and predicted.

The economy is much more like the second class – the marble on a finite ladle, with guesses about its position imperfect.  Position here is a surrogate for price.  It is not like Wicksell's famous analogy of a rocking horse hit by a hammer.

The issue is how 'chaotic' the system is – how much is it like a Lorenz system?  Clearly agents do, most of the time, have good information about the behaviour of others – chaos is the exception rather than the rule.  I would posit that it is not like predicting the weather in England in summer, where you cannot predict one day from the next by observation alone, but like the weather in a desert: most of the time you can guess sunny tomorrow from sunny today, but occasionally you get flash floods with devastating consequences.

Yes, economics does have simplistic models of equilibrium, but less simplistic models of equilibrium with bounded stability exist.  Indeed once you get into the realm of imperfect and adaptive expectations you have to get into describing them.  Which raises the issue of why students are not taught at least the basics of systems and control theory.

 

No, Heathrow Extension Not Thrown into Doubt by Hung Parliament @ZacGoldsmith

Zac Goldsmith today tweeted that expansion is dead.

The Sun

THE PLANNED third runway at Heathrow has been thrown into doubt by a hung Parliament, The Sun can reveal.

Theresa May’s lack of a majority leaves her exposed to as many as 40 Tory rebels opposed to their manifesto commitment to expand the UK’s busiest airport.


And Labour’s own manifesto pointedly made no specific mention of Heathrow — only committing the party to a vague pledge to expand Britain’s airport capacity — leaving the opposition free to wreck the plan.

Prominent Labour figures like Mayor of London Sadiq Khan and Shadow Chancellor John McDonnell are opposed to the new landing strip.

The number of Tories opposed to the move forced the PM to kick the issue into the long grass last October with yet another consultation — but MPs are yet to actually vote on the decision.


Members of public are currently being asked their views ahead of parliamentary scrutiny of their responses – meaning it could be up to a year before the decision reaches the division lobbies.

The Conservatives manifesto reaffirmed the support for new Heathrow runway, however Government Whips believe the plan faces a rough ride through a balanced House of Commons.

Let's do some math.

Heathrow is an EVEL matter

Conservative MPs in England: 317

Other MPs in England: 217

Majority = 100

Even with 40 rebels they would have a majority of 20.

That's assuming all Labour MPs vote against.

McDonnell is very much in a minority amongst Labour MPs as anti-Heathrow.  The unions and most of the PLP are in favour.  The manifesto is strongly pro runway expansion without mentioning Heathrow, so a whip couldn't be enforced.

Before the election only around 20 London Labour MPs opposed expansion.  Let's be generous and say it's now 30.

By parliamentary arithmetic:

For

Cons: 317 - 40 = 277

Lab: 205 - 30 = 175

Total for: 452

Against

Cons: 40

Lab: 30

Others: 10

Total against: 80

Majority in favour: 452 - 80 = 372
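The same sums in a few lines (a quick check using this post's generous assumptions about rebel numbers):

```python
# Division arithmetic under the assumptions above: 40 Tory rebels,
# 30 Labour opponents and 10 others against, all on English seats.
cons, lab = 317, 205
rebels_con, rebels_lab, others_against = 40, 30, 10

ayes = (cons - rebels_con) + (lab - rebels_lab)   # 277 + 175
noes = rebels_con + rebels_lab + others_against   # 80
print(ayes, noes, ayes - noes)                    # 452 80 372
```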

A whopping majority that would humiliate John McDonnell.

If the Conservatives had any sense it would be in the Queen's Speech.

Note these numbers may be generous to the 'no' side as many of the anti-Heathrow London Conservative MPs have now lost their seats.

General Equilibrium – Why Post-Keynesians Can't Live Without it

Perhaps the most fundamental proposition common to all Post-Keynesian economists, irrespective of the particular group or special group to which they belong is a rejection of the Walrasian theory of general equilibrium as the micro-foundation of the macroeconomic theory.

They find the Walrasian theory quite incompatible with Keynesian economics. In the Walrasian system it is crucial to understand that no trading or exchange can take place until planned demand and planned supply match in every market.

M. Agarwal

I don't disagree – however the concept of general equilibrium is older than Walras, Arrow-Debreu or DSGE, and the dominant model of general equilibrium from early neoclassical economics was in any event challenged by an alternative concept of equilibrium from Keynes.  The problem is with the dominant current conception of general equilibrium based on perfect foresight and rational expectations.  This is a model without time, without surprise, without profit, without any feature of economic interest.

Economics cannot do without a theory of equilibrium; it is the most important component of macroeconomics, and without it we can say nothing about the general price level.  What is needed is a properly dynamic theory of equilibrium that deals with economic activity in time, continuously.

Equilibrium is by definition a state in which the relevant variables are not changing, therefore, time is not taken into consideration in equilibrium analysis.

J.R. Hicks

No.  The concept of equilibrium was derived from physics and has been fundamentally misunderstood.  Laying aside issues of physics envy, equilibrium is not 'no change': it is a balance of forces in which motion takes the path of least action (energy).  This matters because if one variable is held constant, or varied, the others must shift to maintain an equilibrium state.  It is important when you consider intertemporal equilibrium: if you maintain a path of consumption, investment must shift (though causally, maintaining a path of investment with consumption shifting is more plausible empirically and theoretically), and the consequential effects on income lead to continually varying states of the key variables.  Again, Brownian motion is not 'no change'; it is continuous change, but when viewed at a certain macroscopic level all forces are in balance.
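The intertemporal point can be put in one line using the standard income identity (my gloss, abstracting from government and trade; this is not Hicks's formulation):

```latex
\[
Y = C + I \quad\Longrightarrow\quad \Delta I = \Delta Y - \Delta C ,
\]
```

so if the consumption path is held fixed, any variation in income must show up as a shift in investment for the identity to keep holding: holding one variable to a path forces the others to move.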

The familiar argument that most of the time most of the economy is in disequilibrium is no argument to abandon the concept of equilibrium.  If disequilibrium is the natural state then prices would be essentially random and economics would have nothing to say.  Empirically most prices are fairly stable or follow fairly predictable paths, except in times of crisis.

Objections to the General Equilibrium (GE) approach seem to be based on the dominant understanding of GE in neoclassical economics – that is, a Walrasian intertemporal equilibrium with rational expectations (perfect foresight), where all prices are set in advance and forever and the equilibrium exists, is stable and is unique; consumers maximise utility based on endowments and the outcome is Pareto optimal.  Alongside these are several (often hidden) assumptions such as perfect competition, no production and constant returns to scale – which are incompatible with the neoclassical theory of the firm.  Prices are also only relative prices unless you set the price of the unit of exchange at unity – which means the price of money is exogenous to the model.  Not helpful.

The ‘fixed’ concept of equilibrium is historically limited and unnecessary; both Adam Smith and Ricardo commented on the observation that in most places most prices are relatively stable, apart from disturbances caused by war, bad harvests and currency  debasement.

Although most students are taught that this version is 'Walrasian', in fact it is a radically simplified version derived from Cassel (1924).  Walras's model was complex and modular and included production, capital goods, money etc. – not always very satisfactorily.  Cassel also dropped utility functions, though in the post-war years this in turn was reversed in favour of Walras's original emphasis on utility.

Almost none of these assumptions are plausible or necessary for a theory of general equilibrium.  All that is necessary for a theory of GE is 'general' stability/price convergence under bounded conditions.  Indeed prior to Walras that was the assumption of the classical competition process.

Following the SMD findings we know that because of wealth effects it is not possible to derive a well-formed excess demand function for the economy as a whole; hence there can be multiple equilibria and prices can form fixed cycles.  This does not mean however that 'anything goes': anything may be possible, but that does not mean it is probable.  Again a statistical framework of analysis is helpful, as a jump between two phases is analogous to the annealing process, whereby a system suddenly jumps from one state of minimum energy to another.  This is helpful as we have a well-established mathematical structure to deal with it – simulated annealing – which does not require modelling of every agent, only the probability of different states; see Wu and Yany (1989) for a summary.  A recent paper by Campbell and Baker (2017) applies the simulated annealing approach to commodity speculation with adaptive expectations, with helpful results.  This approach is useful as it can be used to analyse groups where the probability of agency effects within a group is low – allowing the advantages of agent-based modelling without the overheads.  Ranges of groupings of agents based on factors such as income, assets and indebtedness can be tested in models empirically against past data and the best fits used for projections.
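A minimal simulated-annealing sketch (my own illustration, not the Campbell and Baker model; the 'excess demand' objective and all constants are invented): candidate price vectors are perturbed at random, downhill moves are always accepted, and uphill moves are accepted with a probability that falls as the 'temperature' cools – which is what lets the system jump between states of minimum energy.

```python
import math, random

def excess_demand(p):
    # Toy objective standing in for the magnitude of excess demand (assumed);
    # the sine term gives it several local minima, i.e. multiple equilibria.
    return (p[0] - 2.0) ** 2 + (p[1] - 3.0) ** 2 + math.sin(5 * p[0])

def anneal(p, temp=1.0, cooling=0.999, steps=20000):
    e = excess_demand(p)
    for _ in range(steps):
        q = [x + random.gauss(0, 0.1) for x in p]   # perturb the price vector
        eq = excess_demand(q)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if eq < e or random.random() < math.exp((e - eq) / temp):
            p, e = q, eq
        temp *= cooling                             # cool the system
    return p, e

print(anneal([0.0, 0.0]))  # ends near one of the low-'energy' price states
```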

Walras of course mathematically described what prices could be in an equilibrium, but his simultaneous-equations method does not describe how it formed.  He intuitively described how agents 'groped' towards prices but lacked the tools to describe a market process of swarming and arbitrage – a process clear in the classical model.  What we today see as the 'Walrasian' auctioneer model is one where prices are set forever, prior to trading, by agents with perfect foresight.  As perfect competition is assumed, agents cannot affect prices and so prices cannot change.  Walras was unclear about whether this process was dynamic or static.  After Cassel, and of course Arrow and Debreu, a static version of equilibrium was used, with 'time' reduced to indexing goods of different vintages, so that ice cream at time x could be traded differently from ice cream at time x+y.  This underlined the importance of expectations of future prices.  In later editions Walras seemed to conclude that all that mattered was that GE could be represented by static equations, and Cassel's indexing approach assisted this by stripping away production, money and capital goods – if all you had were 'endowments' and all expectations were realised then all that was needed was a pure exchange economy, and all agents could be reduced to one representative agent.

With Lucas the assumptions of rational expectations, perfect foresight and continuous equilibrium became conjoined.  The assumption was that the economy is in equilibrium all of the time, which requires rational expectations/perfect foresight as an a priori assumption.  The problem here is not the indexing approach – that is valid and goes back to Torrens and Malthus – rather it is the combination of the indexing approach with perfect foresight, which in an ironic attempt to meet the 'Lucas critique' removes agency and time from economics and hence all microfoundations.

What all of this demonstrates is how historically narrow GE theory has become and the need to look for more realistic and broader models.  All one has to do to start with is to relax the assumption of perfect foresight – that all expectations of price are correct.  That immediately brings capital goods and production back in, as the expected price at the time of investment can be different from the price at the time of valorisation.

If expectations can be wrong, with agents adapting their expectations based on past events and ever changing them according to emerging information, then we are firmly in the realm of temporary and approximate equilibrium, where the 'state' of the set of prices that would be formed in GE is momentary and ever changing.  We are also within the realm of the 'corridor stability' of Leijonhufvud (1973), providing expectations stay within a range.

The analogy here is with Brownian motion: stability produced by ever-changing and temporary states.  Of course economic agents are not passive particles; they form expectations, which can be wrong.  The creation of false expectations however creates information and the potential for profit.  If agents have the liquidity to act on false expectations we have, like Brownian motion, a series of infinitely many temporary equilibria following one another, maintaining system stability.  When agents don't have liquidity, because so many plans are wrong at the same time, we have a phase shift and a rapid change in prices.  We can also draw on the massive legacy in finance theory of martingale (Brownian) processes and valuation.
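An illustrative toy of the liquidity point (my own sketch, not a standard model; every constant is arbitrary): while agents have liquidity, forecast errors produce only small Brownian-style corrections – a chain of temporary equilibria; when liquidity is exhausted the price jumps, a phase shift.

```python
import random

price, liquidity, path = 100.0, 1.0, []
for t in range(1000):
    error = random.gauss(0, 1)        # surprise relative to expectation
    liquidity -= abs(error) * 0.02    # acting on surprises consumes liquidity
    liquidity += 0.012                # normal trading slowly rebuilds it
    if liquidity > 0:
        price += error * 0.1          # small corrections: temporary equilibria
    else:
        price *= 0.9                  # phase shift: rapid repricing
        liquidity = 1.0
    path.append(price)
print(path[-1])
```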

Roy Radner's work on the importance of unrealised expectations and information to GE theory has been recently highlighted in a post by David Glasner:

The two differences that are most relevant in this context are the existence of stock markets in which shares of firms are traded based on expectations of the future net income streams associated with those firms, and the existence of a medium of exchange supplied by private financial intermediaries known as banks. In the [Arrow-Debreu-McKenzie] ADM model in which all transactions are executed in time zero, … and since, through some undefined process, the complete solvency and the integrity of all parties to all transactions is ascertained in time zero, the probability of a default on any loan contracted at time zero is zero. As a result, each agent faces a single intertemporal budget constraint at time zero over all periods from 1 to n. Walras’s Law therefore holds across all time periods for this intertemporal budget constraint, each agent transacting at the same prices in each period as every other agent does.

Once an equilibrium price vector is established in time zero, each agent knows that his optimal plan based on that price vector (which is the common knowledge of all agents) will be executed over time exactly as determined in time zero. There is no reason for any exchange of ownership shares in firms, the future income streams from each firm being known in advance.

[If] agents have no reason to assume that their current plans, …will remain optimal and consistent with the plans of all other agents. New information can arrive or be produced that will necessitate a revision in plans. [If] plans are subject to revision, agents must take into account the solvency and credit worthiness of counterparties … The potentially imperfect credit-worthiness of…enables certain financial intermediaries (aka banks) to provide a service by offering to exchange their debt, which is widely considered to be more credit-worthy than the debt of ordinary agents, …. Many agents seeking to borrow therefore prefer exchanging their debt for bank debt, bank debt being acceptable by other agents at face value. In addition, because the acquisition of new information is possible, there is a reason for agents to engage in speculative trades of commodities or assets

I see this as the interaction of two parallel systems.  The first is the calculation of the optimum production system given a schedule of demand.  This is like a Sraffa/Sinha linear production system – showing an infinitely thin slice of time – and can be modelled as a chaining together of such systems, using the principle of least action necessary to reproduce this physical system.
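The 'infinitely thin slice of time' system can be written in textbook Sraffian notation (a standard statement, not Sinha's specific formulation):

```latex
\[
p = (1 + r)\,A\,p + w\,l ,
\]
```

where $A$ is the input-output matrix, $l$ the direct labour vector, $w$ the wage and $r$ the uniform rate of profit; given one distributive variable the system determines relative prices for that slice, and the dynamic story is the chaining of such slices.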

The second is the set of information produced by this system and the difference between this information set and the information set held by agents – which again can be modelled by the principle of least action in its form of maximum information (least entropy).

It is the interaction between these two dynamic equilibrium systems that drives change in the economy.

An economics without assumptions of perfect competition, perfect foresight and Pareto optimality of markets would be a very different one from that taught today, and largely lacking in unthinking a priori normative support for untrammelled markets.  Everything would depend on the model and the evidence.  This baggage is unnecessary and unhelpful; it is not needed and is indeed counterproductive to a modern general equilibrium theory.

Quick Thoughts on Implications of General Election Stalemate on Planning Reform

The most likely outcome of the election is a conservative minority government with a confidence and supply arrangement with the DUP – which means an open border with Ireland, and hence free movement and no hard Brexit.

For planning reform some quick thoughts:

The Conservative and Labour Commitments on Housing were essentially the same.

Same housing targets, same policy on Green Belt.  Labour even promised to extend Help to Buy.

But there was a big shift towards land value capture – for the first time supported by all three parties.

EVEL matters

The Housing and Planning Act was the first to use it.  For planning and housing legislation in England the Tories have a comfortable majority.

Prioritizing Home Ownership over Housing Numbers is Dead

The very short-term impact of Alex Marsh on the 2015 manifesto.

The Neoliberal Urge to Planning deregulation is Over

The whole justification for the NPPF system is gone.  It stays because of a lack of appetite for disruption, but the appeal-led system will weaken.

Barwell is Gone

Best housing minister since Silkin.  He will be missed.  Suspect any new minister will simply progress the Housing White Paper without the expertise.

The Shift Towards Zoning?

Unmistakable in the last three years.  Unclear whether it will be pursued with the same enthusiasm.  But the main change already implemented is the requirement to implement brownfield registers and PiP.

The Systemic Housing Shortfall Caused by Lack of Strategic Planning and Powerful Delivery Bodies Remains

All parties will take to the next election the lesson of a powerful housing offer to the young that does not penalise the old and unwell.

 

No, A Land Value Tax is Not a 'Garden Tax'

The Telegraph this AM on land value tax:

Opponents of the tax say it would cause house prices to plummet, putting homeowners at risk of negative equity and forcing families to sell off their gardens to developers to lessen their tax burden.

Ah, the attack line: start describing it as a 'garden tax'.

However a land value tax is a tax on the value of land if developed to its permitted planning use.  As under the NPPF gardens are not counted as brownfield sites, the size of the garden could not be counted in the valuation – only the footprint of the house itself, the land the house stands on.

Whether it is wise or not to exclude gardens is another thing entirely.  It would be much better to zone which areas with large gardens are desirable to densify, and to what degree, and to allow taxation above a certain level for elderly occupants to be deferrable until death.

As for the calculations of tax rises: nonsense, produced by an afternoon's work by a junior researcher – excluding as it does all development land, business land and vacant land.  According to Savills these account for 25% of land values.

According to those who have done the math, 1.83% of value would raise the same sums as Council Tax, leaving 83% of households better off.

How PiP differs from Outline Planning Permission & How LPAs can ensure the form of Development is Acceptable

A reserved matters application under an outline planning permission is not a planning application but an application to discharge a planning condition on specified reserved matters.  The LPA has the power under the DMPO to refuse to determine the application without sufficient details (T&CP (Development Management Procedure) (England) Order, article 7, and the DMO (Wales) Order 2012, article 3).

One common issue is the means of access.  Highways authorities often refuse to determine applications without sightlines and a plan of the access point – so this is often included in applications with all other matters reserved.

Scale is one of the reserved matters.  However, under well-established principles of planning law, reserved matters cannot abrogate the terms of the outline permission.  So if the description of development is '5 bungalows' you have permission for 5 bungalows – not 6, and not two-storey houses.

With PiPs the LPA has no such power to ask for further details.  And because a PiP is not a planning permission, the requirements over the number of units have to be stated in Schedule 2 of the Town and Country Planning (Brownfield Land Register) Regulations 2017 on the contents of the register.  Note the wording; it is very interesting.

(m)(i) a description of any proposed housing development; or

(ii) the minimum and maximum net number of dwellings, given as a range, which, in the authority's opinion, the land is capable of supporting;

(n) the minimum net number of dwellings which, in the authority's opinion, the land is capable of supporting;

A typical case with an outline permission: an LPA grants planning permission for 5 two-bed units based on an 'illustrative layout', and when the reserved matters come in for 5 four-bed units it won't fit.  Under Crystal Property (London) Limited v Secretary of State for Communities and Local Government and Hackney London Borough Council [2016] EWCA Civ 1265, refusal would be acceptable, as even 'illustrative' drawings show the proposed scale of the development, so the subsequent application would be outside the terms of the permission.

The ability to ask for further information, together with the ever-increasing application of this power, effectively circumscribed outline permissions more and more.  Development interests felt it was no longer possible to 'redline' a site with minimal information.

The 'dual track' approach of the regulations gives LPAs a choice.  If they opt for the clause (m)(ii) route and state in the register 50-65 units, then these would be 50-65 units of any size and form.

If however they take the (m)(i) and (n) clause route, they can fully describe the development and just specify the minimum number of units – which might be as established in the local plan or SHLAA.

The regs don't specify the 'description', so it is entirely possible for LPAs to do a mini planning brief with a diagram and extended description in the schedule.

To my mind it would be foolish for any LPA to specify the maximum number of units in this way.

So, for example, a description could be:

'A minimum of 50 housing units with a dwelling mix in accordance with local plan policy H3, accessed off Jolt Lane (fig x), and including 0.5ha of public open space situated on the eastern part of the site.  No unit shall be above 3 storeys in height, except for units fronting Jolt Lane which shall be a maximum of 4 storeys in height.  The layout shall be in accordance with the approved planning brief for the site.  No built development is permitted within the flood risk area shown on fig x or within 10m of the 400kV overhead lines as shown on fig x.  Sufficient flood storage should be provided on site to ensure no net run-off from its present state in accordance with EA requirements and local plan policy E5.'

This is somewhat of an extreme example, but for large sites with a lot of work already done and continued opposition, and where local plans are not yet adopted, there may be considerable pressure following the required consultation for the LPA not to consider the site suitable, based on ungrounded fears as to what might be permitted on the site.

LPAs need to get their a%%%s into gear given the December deadline.  If any LPA wants assistance on the urban design, capacity and other issues in compiling their brownfield registers please get in touch.  Unless LPAs have a clear audit trail on how they assessed the suitability, deliverability and capacity of sites they could be subject to JR.

Garden Cities East of Oxford and the Restoration of the Wycombe Line

Having considered the opportunities near Cambridge represented by the Varsity Line, it is necessary to look at Oxford – the other end of the Cambridge-MK-Oxford Arc.

Oxfordshire has been struggling to find appropriate locations within and beyond its Green Belt.  The number of big brownfield airbases is much lower than in Cambridgeshire, and those that exist are not always in good places.  Landscape and flood risk constraints around most of Oxford are severe – which would make you think expansion in the least constrained areas would be easier – but it isn't, as Oxford is severely underbounded.

On the restored link to Bicester, through which the Varsity Line will pass, Bicester itself is of course the biggest opportunity.  The only other station in between is Islip, but flooding constraints mean the scale of development suitable there is limited.

The big opportunity lies in restoration of the Wycombe line – the first line to reach Oxford – which originally ran through Cowley to Princes Risborough, Wycombe and Maidenhead.

A number of positive opportunities come together:

  1. Network Rail is considering restoring services on the Cowley Branch, which serves Oxford Business Park and the Mini factory.
  2. With Crossrail coming to Maidenhead, Wycombe council is studying restoring a service to Bourne End – a major growth area.

So why not look at restoring the whole line, which would link up several major towns west of London?

Indeed it offers potential for major expansion at Princes Risborough – where development currently stops dead at the station – as well as a new Garden City beyond the Green Belt north of Junction 7, around a restored station at Tiddington – not a conservation village, and not constrained by flooding until you get to the Thames flood plain.  This to me makes much more sense than the proposal for a new town put forward by landowners south of Junction 7, which would be very car-orientated.  There is about 2km of developable area between the A418 and the A429, which developed to central Oxford-type densities could house around 50,000 people.

Thame could also get a station in the town, though it is already proposed for a major extension in its neighbourhood plan.

Wheatley is also being considered for an extension, though it is in the Green Belt.

Time to Replace PTAL with a Better Measure of PT Accessibility

PTAL was first developed by the London Borough of Hammersmith and Fulham and later adopted by Transport for London as the standard method for calculating public transport accessibility in London.  Through the London Plan it has become embedded in policy with density thresholds based on Public Transport accessibility.

PTAL was a breakthrough, enabling for the first time scientific mapping of relative public transport accessibility.  I was once on the London PTAL working group and do not underestimate the sheer hard work that went into its development and the effort TfL has made in providing London-wide maps.

The PTAL method however is showing its age and needs a replacement.  It simply adds up walking and waiting time to the public transport network; it is therefore a measure of accessibility to the network, taking no account of where the service goes.  Also, reflecting the limited computational power available when it was developed, it has arbitrary cut-off points for walking distance, so that PTAL levels can fall from very high to none in a short distance.  More technically, the PTAL index is calculated against a rectilinear grid of points, which means that accessibility levels are distorted at a local scale as travel times are not equal in all compass directions.  Finally, TfL enormously simplified calculation by only calculating accessibility from population-weighted centroids of super output areas rather than every address, but that averages out levels across a district.  The bands 1-6 were chosen as a suitable range for H&F, but the method has had to be hacked to add additional categories for central and outer London.

TfL's CAPITAL model shows travel times from a node of interest – such as a shopping centre – but does not show accessibility from home in a universal index.  Similarly the newer ATOS model shows access to the nearest schools, 10,000 jobs etc. as a map, but is essentially a measure of local rather than strategic accessibility.

Experience with GIS based accessibility modelling internationally has developed to the extent that a serious research effort is needed in the new London Plan to drive updated policies on the best areas for densification.

A particularly promising approach is offered by the Access Across America project by the Minnesota Department of Transportation and the University of Minnesota.  This maps, for all modes, access to employment (number of jobs) within a certain travel time.  They have mapped all major US metros.

The methodology is to take job numbers and population numbers from centroids of census blocks and to map travel times based on Google transit data.

to reflect the [effects] of transit service frequency on accessibility, travel times are calculated repeatedly for each origin-destination pair using each minute between 7:00 and 9:00 AM as the departure time, and an accessibility value is calculated using each travel time result. The accessibility results are averaged to represent the expected accessibility value that would be experienced by a traveler departing at a random time in this interval.

And then

accessibility is averaged across all blocks in a CBSA [core-based statistical area], with each block’s contribution weighted by the number of workers in that block. The result is a single metric (for each travel time threshold) that represents the accessibility value experienced by an average worker in that CBSA.

In the UK we face more challenges in calculating employment locations: we no longer have a census of employment, and travel-to-work data is taken on a 10% sample.  However various departments do have geocodable employment data which is available.

My suggestion is to base an index on the following (a minimal code sketch follows the list):

  1. The number of jobs accessible within the 95th percentile commute time in England.
  2. The boundaries between the zones should be based on the natural breaks (Jenks) method rather than equal intervals per category, to produce a more uniform spacing of variations in the map, more useful for policy purposes.
  3. The index should be based on the population-weighted centroids of output areas; however, in areas where the number of commuters is too low to calculate a statistically meaningful result they should be weighted.
  4. The results need to be areally interpolated across a uniform grid.  I have used a hexagonal grid – known as a planagon grid – successfully on a number of projects, developing a technique pioneered by the CSIR research institute in Pretoria.  On a hexagonal grid distances are equal in all compass directions to surrounding centroids.  A grid of 7 planagons to 1 sq km (about 212m across) is sufficient; any smaller and it becomes very data heavy.
  5. Extract non-developed, road and no-parcel areas from the planagon polygons.
  6. For employment, weight the number of jobs in the output area by the building heights in the planagon, and do the same for residential, using a linear regression technique pioneered by the Technology University of Riyadh to interpolate census data to buildings.  Then add up the numbers for all buildings with centroids inside the planagon grid hexagon.
  7. Then calculate travel times using the OS ITN network and the TfL services database as per PTAL.  Rather than taking an equally weighted sample across the morning peak, the sample should be weighted by the likelihood of work-related travel across the whole day.
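A minimal sketch of the core calculation in steps 1 and 7, combined with the departure-minute averaging quoted from Access Across America above (my own illustration; the function names, data structures and toy timetable are placeholders, not TfL's or the project's code):

```python
# For each origin, count the jobs reachable within the travel-time
# threshold for every departure minute, then average. Step 7's
# refinement would replace the simple mean with a weighted one.
from typing import Callable

def accessibility(origins: list[str],
                  zone_jobs: dict[str, int],
                  travel_time: Callable[[str, str, int], float],
                  threshold_mins: float,
                  departure_minutes: range) -> dict[str, float]:
    index = {}
    for origin in origins:
        totals = []
        for minute in departure_minutes:           # repeat per departure minute
            reachable = sum(jobs for zone, jobs in zone_jobs.items()
                            if travel_time(origin, zone, minute) <= threshold_mins)
            totals.append(reachable)
        index[origin] = sum(totals) / len(totals)  # expected jobs reachable
    return index

# Toy usage: two destination zones and a fake, frequency-dependent timetable.
zone_jobs = {"zone_a": 1000, "zone_b": 5000}
toy_tt = lambda o, d, m: (20 if d == "zone_a" else 28) + (m % 10)
print(accessibility(["home_area"], zone_jobs, toy_tt, 30, range(0, 120)))
```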

How an Economic Equilibrium is Like and Not Like a Physical Equilibrium

In a physical equilibrium the system is in a state of least action – or minimum free energy – if that system is conservative, that is, if it obeys the law of conservation of energy.  The laws of dynamics and conservation of energy can all be deduced from the principle of least action for conservative systems.

Even in a situation of no growth – simple reproduction, without accumulation – an economic equilibrium is not like a physical equilibrium.  It is a dissipative system: it requires a constant input of energy to maintain production, reproduce the workforce and maintain the capital stock against the wear and tear accounted for in depreciation.  Indeed value creation – work – can be seen as the continual struggle to restore and expand the scale of a dissipative system to a larger conservative one.

This I think is why those attempts to apply the universality of the principle of least action to economics – such as in the Samuelson-Solow growth model – have been less than successful.

There are conservation laws in economics, but they are of a different kind, and it is a great mistake to apply analogies with energy conservation too strictly, or to imply a similar conservation-of-money doctrine (for example Godley).

For example, in a conservative system the principle of least action goes hand in hand with conservation of energy: total energy = potential energy + kinetic energy, and that total is constant.

The equivalent identity in economics would be assets = liabilities + capital.  But assets here are not energy: the maintenance of the asset stock implies a continual input of energy.  A non-conservative system.
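To state the disanalogy precisely (my gloss on the point; the left-hand side is standard mechanics):

```latex
\[
E = T + V \;\text{is conserved}, \qquad S = \int_{t_0}^{t_1} (T - V)\,dt \;\text{is stationary},
\]
```

whereas the balance-sheet identity $\text{assets} = \text{liabilities} + \text{capital}$ holds at every instant by construction: nothing in it is conserved through time, and keeping the asset stock intact requires a continual inflow – which is exactly what makes the economy dissipative.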

There is no universally accepted formulation of the extension of the principle of least action to thermodynamic and dissipative systems – which is not to say it can't be done.  You can derive the laws of thermodynamics from the principle of least action, but with the introduction of external energy sources and sinks you cannot predict the course of the system from internal energy states alone.  Feedback in dissipative systems does not have to be positive – the Le Chatelier principle that Samuelson used in Foundations.

Another key difference is that whether an economy is on an equilibrium or non-equilibrium path depends on whether expectations have been disappointed.  It takes little energy to form an expectation, and there is no difference in energy between a correct and a false expectation.  However a correct expectation can be modelled as information, and an incorrect one as no information.  Therefore equilibrium can be modelled as a state of maximum entropy – which is equivalent to least action.  This value creation system though is parallel to and separate from – though connected to – the system for valuation of past work.  It is the creation of value which links the two.
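Reading 'information' here in the Shannon sense (my gloss, not a formulation from the post's sources):

```latex
\[
H(p) = -\sum_i p_i \log p_i ,
\]
```

where $p_i$ is the probability an agent assigns to outcome $i$.  Correct expectations concentrate probability on the outcomes that are realised; the claim above is that the equilibrium price system extremises this measure subject to the economy's constraints.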

One issue over which there is great confusion is whether economic systems – being inter-temporal and dynamic – are different from physical systems, given that prices shift according to the compatibility of plans and expectations.  If plans are compatible, so that expectations are realised, then an external disequilibrium input will cause a 'shock': there will be a price path, but that path will restore to equilibrium.  This is an example of Wicksell's 'rocking horse', which requires a Frisch-type 'shock' of a hammer hitting it to knock it out of equilibrium.  This is why the term 'inter-temporal equilibrium' is a misnomer.  There is no such thing – rather it is the path of reaction of a conservative system towards equilibrium.  Equilibrium is not a law; rather it is the result of a law – the principle of least action – that describes the state of a system fully at all points along its path in and out of equilibrium.

If however plans are incompatible and expectations are not realised, then negative feedback is introduced – we have a dissipative system – and value will be net created or destroyed.  Most plans are incompatible most of the time.  The creation and destruction will create net winners and losers.  There will be opportunities for arbitrage and for bringing plans back into compatibility.  The act of economic failure and mispricing destroys value but creates information.  In so far as these opportunities are not realised, the negative feedback will continue.  The economic system will drift away from the point of maximum information.  Expected prices will become more and more misvalued.  Only when these expectations change and the price system increases in information content will an equilibrium path be restored.  The introduction of feedbacks makes the system non-linear.  The 'normal' state of the system need not be at rest – it can be a limit cycle, with saddle-point properties outside a certain range.  This is another reason why it is better to think in terms of the underlying causative forces of least action and maximisation of information rather than the fixed point of equilibrium at rest.

This approach also has applications in terms of thinking about money.  If expectations are held for all time and are correctly realised then – as Hahn set down – money has no function.  However if the value of products is uncertain, then a mechanism is needed to guarantee the realisation of a contract in the future.  Money as a store of value is an information store allowing the alignment of two plans in the future.