On the Fallacy of Geometrical Proofs of the Marginal Theory of Distribution

The marginal revolution of the late 19th century arose from the attempt to apply the Ricardian theory of land rent, arising from marginal productivity, to all factor returns.

This immediately led to the ‘product exhaustion’ or ‘adding up’ problem: would the returns paid to each factor at its margin add up to the total product?  The neoclassical solution, deriving from Wicksteed, Wicksell and Barone working independently, relied (though none realised it at the time) on Euler’s theorem on homogeneous functions, so that once each factor is paid its marginal product the factor payments exactly exhaust the total product.  John Pullen’s magisterial history of the theory recounts these controversies.

The theory was immediately attacked for its initial assumptions.  The Euler theorem approach relied on partial derivatives of each factor, which assumes full independence of each factor.  This is unrealistic: the employment of an additional shepherd, for example, may require the use of an additional crook.  In the short run factor proportions are fixed.  This led, for example, Wicksteed and eventually Wicksell to abandon support for the theory; nonetheless it had already entered the textbooks.

Modern general equilibrium theory has no requirement at all for marginal productivity theory, and following the capital theory controversies of the 1950s and 1960s there are fewer mentions of the ‘marginal productivity’ of capital.  This leads to the odd result that mainstream neoclassical economics lacks a satisfactory theory of distribution.

Hence there have been a number of attempts at graphical representations of the marginal productivity theory.  These date back over a century but the most celebrated example is the Samuelson/Hicks model.

A brief recourse to the assumptions of the theory is necessary.  The theory assumes perfect competition and hence no profits.  It also assumes constant returns, to generate a linearly homogeneous production function.  Finally, the theory is not one of prices and so is not even a theory of factor returns; rather it is a theory of factor costs (supply curves), and so only becomes a theory of factor returns assuming general equilibrium – in this case one of perfect competition and full employment.

In this case returns are rental returns on labour, land and capital goods only; there is no surplus, no profit and no interest.

It is worth noting the similarity of this system to Schumpeter’s stationary state, which earns no interest and where all costs can be imputed to ‘stored-up’ land and labour.  In a single-good world with zero interest this is a pure corn model where, at the extensive margin of land rent, all costs are equal to discounted labour values.

Not surprisingly these assumptions are somewhat at variance with reality.  The first to be faced was the assumption of constant returns.  Firms generally face U-shaped cost curves: increasing returns up to the point where each additional unit of input adds less to output at the margin.  Wicksell modified the theory to incorporate U-shaped cost curves, where under perfect competition firms produce at the bottom of the U of their cost curves.  This is a consequence of the assumption of perfect competition, and implies an intersection at that point with the demand curve, and so also implies general equilibrium.

This is the situation implied in the Samuelson/Hicks model of factor returns shown below.

Assume two factors of production, capital goods K and labour L.  (There are hidden assumptions here regarding a single capital good only and the absence of consumption goods – to which we will return.)


TR = TC = P × Q

On the input side P × Q = wL + rK, where w and r are the marginal ‘own rates’ of productivity of each factor.

Therefore P × Q = L(P.MPPl) + K(P.MPPk)

Dividing both sides by P:

Q = L(MPPl) + K(MPPk)

So taking any given price, the factor returns of their MPPs to K and L exactly equal TR.

At this point there is product exhaustion, and it is not necessary to assume a linearly homogeneous production function.
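The identity above can be checked numerically.  A minimal sketch, assuming for illustration a Cobb-Douglas constant-returns function Q = K^a · L^(1−a) (my choice of functional form for the check; the geometrical argument itself assumes no particular function):

```python
# Product exhaustion check: with Q = K**a * L**(1-a) (constant returns),
# paying each factor the value of its marginal physical product
# exhausts total revenue exactly.
a = 0.3                      # capital's output elasticity (illustrative)
K, L, P = 50.0, 200.0, 2.0   # illustrative quantities and price

Q = K**a * L**(1 - a)
MPP_K = a * Q / K            # dQ/dK for Cobb-Douglas
MPP_L = (1 - a) * Q / L      # dQ/dL

total_revenue = P * Q
factor_payments = L * (P * MPP_L) + K * (P * MPP_K)

# The two sides agree to floating-point precision.
assert abs(total_revenue - factor_payments) < 1e-9
```

The assertion holds because L·MPPl + K·MPPk = (1−a)Q + aQ = Q for any homogeneous function of degree one, which is exactly Euler’s theorem at work.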

Of course this approach is nowhere near as tightly defined in the neoclassical textbooks: we still have loose references to ‘profits’ being the capitalist’s return, and references to Euler’s theorem, whilst the assumptions of general equilibrium and perfect competition are often glossed over or not mentioned at all.  Profit and interest remain unexplained.

Factor returns in a three-factor model have nothing to do with profits.

Under this model’s assumption of perfect competition every productive factor receives its marginal product, and this exhausts all output; there is no separate category of income called “interest” or “profit” that is distinct from payments to the owners of labour, land, and capital goods.

As Fisher (1930) put it:

The hire of human beings is wages; the hire of land is rent. What, then, is the hire of (other) capital—houses, pianos, typewriters, and so forth? Is it interest? Certainly not. Their hire is obviously house rent, piano rent, typewriter rent, and so forth…Rent is the ratio of the payment to the physical object—land, houses, pianos, typewriters, and so forth—so many dollars per piano, per acre, per room. Interest, on the other hand, is the ratio of payment to the money value of these things—so many dollars per hundred dollars (or per cent). It is, in each case, the ratio of the net rent to the capitalized value of that rent

It might be objected that partial differentials have come in through the back door.  The Samuelson/Hicks theory assumes cost and revenue curves.  These must be derived, and at the point of intersection one can take the partial derivatives.  However this is not valid: the geometrical method assumes no functional form for the derivation of price and quantity.  You could have, for example, a Leontief-type system of fixed technical coefficients determining price and quantity with fixed ratios of capital and labour inputs – indeed ANY input-output system.

All the geometrical system is doing is expressing an accounting relationship under conditions of perfect competition and general equilibrium.  No causation is implied – somehow a firm has reached the bottom of its average cost curve, that is all.  As such, all the production function is doing is expressing the accounting relationship of the capital/labour shares in inputs.  It therefore falls victim to Shaikh’s criticism of the ‘Humbug’ production function.

So under these restrictions the geometrical approach is valid in showing that the marginal productivity of labour determines the prices of goods.  This is easy to show in an economy with no capital goods, as P × Q = L(MPPl).

Once we introduce time, profits and investment, however, we reach a situation that no rational firm or investor would wish to achieve – one which is incompatible with capitalism.

Look at a firm with increasing returns – at the beginning of its cost curve.  Providing the market can absorb the output at an increasing rate of profit, a ‘capital maximising’ firm will increase output to the point where the marginal increase in the rate of profit is zero, and where the marginal return on other investments is not higher.  If it overshoots the bottom of the cost curve it will reduce production and labour to try to hit the bottom of the curve.  If the period of return on investment, accounting for depreciation, is greater than the current stock of capital goods warrants, that stock will be decreased and production reduced; the converse will lead to investment and an increase in capital goods.

At the bottom of the curve existing capital goods will depreciate and need to be replaced; if replaced, however, there are zero returns to the owner of capital goods.  At zero interest it is not even possible to calculate the rate of value depreciation – indeed the concept is meaningless.  Rather we are simply retaining the physical nature of capital goods only: firms are retaining their capital stock but making no money.  But no rational investor would invest in a firm making zero profits, including those renting capital goods.  Therefore firms will try not to be at the bottom of the cost curve but at a point to the left which generates a return at the expected rate of profit.  If such a firm is generating profit it is not in perfect competition or general equilibrium.  It is producing a good which is in demand and scarce and so generates profits; hence the short side rule applies – the price is set by the shortage in quantity, whichever is less of quantity demanded and quantity supplied: Q = min{Qd, Qs}.  The short side of the market determines the quantity traded.  This profit will eventually be reduced, as it is imputed back to the potential for increased factor prices and as new entrants are attracted to the market – so this situation is dynamically unstable.
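The short side rule is simple enough to state as a one-line sketch:

```python
# Short-side rule: the quantity traded is whichever is less of
# quantity demanded and quantity supplied.
def quantity_traded(q_demanded: float, q_supplied: float) -> float:
    return min(q_demanded, q_supplied)

# A scarce good in demand: supply is the short side, so it sets the trade.
print(quantity_traded(120.0, 80.0))   # 80.0
```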

In such a market there is a surplus, a profit can be made, returns exceed marginal productivity of factors.

Is there a way to rescue the geometrical theory with its fixed assumptions?  No, but we can relax the assumptions and see whether or not it is more robust.

Assume not three factors but four: we add money, and those who own money and rent out that factor – banks – at an interest rate.  Under competition the own rate of return of capital goods must equal the own rate of return on the rental of money (which is interest plus the cost of production of money), and both must be positive for investment, accumulation and the capitalist process to proceed.

Samuelson in his ‘rice model’ objected to our Schumpeterian assumption that the steady state has zero interest rates.  He gave the case of a rice field with yield 100 at year zero and 110 at year one – implying a 10% own rate.  However this pure physical productivity does not explain financial interest.  If the field produces a surplus of 10 and the subsistence of workers is 4, the surplus is 6 – which is rent, not interest.  If an investor wished to invest in a new rice technique producing 120, or in expanding land under cultivation at the extensive margin, that is profit capable of capitalisation and earning interest.  Samuelson makes the pre-Fisherian mistake of confusing rent and interest.  A dead stand of trees still has a value, but a living stand with the same number of trees has a higher value because of agio.  The valuation of the latter at T1 will be imputed back to the value of the land at T0, as indeed Schumpeter pointed out.
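The arithmetic of the rice example, using the figures in the text, can be laid out explicitly:

```python
# Samuelson's rice field: yield 100 at year 0, 110 at year 1.
yield_t0, yield_t1 = 100, 110
own_rate = (yield_t1 - yield_t0) / yield_t0   # 10% physical own-rate

surplus = yield_t1 - yield_t0                 # 10 units of rice
worker_subsistence = 4
rent = surplus - worker_subsistence           # 6 units: rent, not interest

print(own_rate, rent)   # 0.1 6
```

The 10% own rate is purely physical; the 6 units remaining after subsistence are a rent on the field, and only a profit opportunity (the 120-yield technique, or the extensive margin) is capitalisable and earns interest.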

Let us assume that owners of capital goods and banks seek to maintain their rate of return, so the system reproduces at a steady rate of profit – a more classical version of the ‘steady state’.  In this case all factor returns are rent and we can produce a Ricardian rent diagram.

The diagram makes the simplifying assumption that the capital intensity of the industry is such that the revenue on the rent of capital goods is the same as profits.

You will note that here again we have a graphical (Ricardian) illustration of distribution.  Assuming equality of the own rates of money and capital goods, and reproduction at r neither greater nor less, then the marginal productivity of capital goods at that interest rate alone determines the capital goods rental factor share – which determines the interest share (rate) – and wages, or rather maximum wages at that interest rate, are a residual.  Maximum wages, however, are only achieved at full employment; otherwise again the short side rule applies.  There is circularity here – interest determines capital goods investment which determines interest: a Wicksell effect.

However we have assumed competitive equilibrium between the goods market and the loans market so that

the NPV of  (loans – cost of lending) = NPV of (capital returns – depreciation).

In this one-good corn-money model it is the NET marginal productivity of capital goods – accounting for depreciation and the cost of lending – which determines interest rates; we have closed the system.  This makes the slightly heroic assumption that banks are able to keep up with this demand.  In most cases lending will be on the short side and market interest rates will be higher, leading to production being less than its potential.
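The closure condition can be illustrated as an NPV comparison.  A minimal sketch with invented flat annual flows and an assumed 5% discount rate (all figures are illustrative, not from the text):

```python
# Competitive arbitrage condition closing the system:
#   NPV(loans - cost of lending) = NPV(capital returns - depreciation)
def npv(net_flows, rate):
    """Present value of a list of end-of-year net flows."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(net_flows, start=1))

r = 0.05                         # assumed market interest rate
loan_net = [6.0 - 1.0] * 10      # loan income minus cost of lending, 10 years
capital_net = [8.0 - 3.0] * 10   # capital rental minus depreciation, 10 years

# In equilibrium the two NPVs coincide; here the net flows are equal by
# construction, so the identity holds at any discount rate.
assert abs(npv(loan_net, r) - npv(capital_net, r)) < 1e-9
```

If the capital-side net flows exceeded the loan-side flows, investment would be attracted to capital goods until the two NPVs were again equalised – which is the arbitrage the text describes.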

Note that maximum wages set a total pool of wage funding, not the size of an individual wage.  With technical coefficients fixed in the short term, an individual maximum wage can be found which a firm can use as a marker point in advertising positions according to labour market supply conditions.

Consider a case where an entrepreneur innovates, creating a technique which produces a return on capital goods in excess of the return on lending.  This will generate super profits.  This may eventually produce greater investment, or impute back to factor prices and induce entrants.  In the short run, however, there are super profits.  By the Kalecki profits formula, Profits = Investment + capitalist consumption, capitalists here being the producers, renters of capital goods and renters of money in toto.  If we assume that reinvestment occurs up to the point on the cost curve that maintains r, then the residual is capitalist consumption.  All factors impute back to a labour value from which rent is extractive.  At positive profit they impute back to the past present value of the labour of production plus the future present value of the labour of capitalist consumption.  This is equivalent to the modified classical theory of value put forward by Ian Wright.  We can interpret exploitation here in a new light: it is not the totality of surplus value but rather value over and above that necessary to reproduce the system, withdrawn from future investment.
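The Kalecki identity and the residual can be set out in a toy calculation (all figures invented for illustration):

```python
# Kalecki profits identity (closed economy, workers spend all wages):
#   Profits = Investment + capitalist consumption
# Capitalists here are producers, capital-goods renters and money
# renters taken together, as in the text.
profits = 55.0       # gross profits of the capitalist group (illustrative)
investment = 40.0    # reinvestment up to the point maintaining the rate r

# With reinvestment fixed to maintain r, consumption is the residual.
capitalist_consumption = profits - investment
print(capitalist_consumption)   # 15.0
```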

In the long run too, NPV Profits = NPV of (loans – cost of lending) = NPV of (capital returns – depreciation), as investment is attracted.  There is no NPV equivalence for labour.

By abandoning the neoclassical ceteris paribus assumptions and looking at conditions of reproduction we have a more robust theory of distribution.  However, note that the conditions, though relaxed from perfect competition, are still restrictive – we have a one-good corn-money world, and where short of full employment the relative bargaining powers of capital and labour determine whether the maximum labour share is reached.

Introduce two or more goods and multiple techniques with varying compositions of capital and labour and things become more complicated.  We no longer have a straightforward relationship between interest and capital returns.  So what?  Even if there is a change in technique and a reversal in the inter-temporal returns of one more ‘dose’ of any factor, the production identity that NPV Money = NPV Capital Goods still applies.  You cannot weigh past returns on one technique against future returns on another in an investment decision.  Past returns are sunk costs and are irrelevant.  The investment decision must be considered at that moment in time and for all future moments in time, and must be based only on the marginal return of capital advanced; from this, distribution flows from the revised and corrected graphical technique we have presented here.

Green Light Given by Barwell to Bradford’s Local Plan


The housing minister has given Bradford’s local plan the go ahead after a delay caused by concerns regarding Green Belt boundaries.

Bradford Council’s Local Plan Core Strategy was put on a temporary hold by the minister of state for housing and planning, Gavin Barwell, following a technical intervention from Shipley MP Philip Davies.

There were concerns the plans, due to be adopted last October, would alter the boundaries of the Green Belt.

However, in a letter sent today, Mr Barwell wrote: ‘The Secretary of State acknowledges that the Plan does not alter the existing boundaries of the Green Belt and that any future changes to Green Belt boundaries will be through the preparation of Site Allocations.’

While the plan focuses on identifying housing development on brownfield sites, it acknowledges some areas of green field and Green Belt land will inevitably have to be used due to the ‘sheer scale’ of housing needed.

Cllr Alex Ross-Shaw, Bradford Council’s executive member for regeneration, planning and transport, welcomed the housing minister’s decision.

‘The Plan is sustainable and prioritises brownfield sites,’ he said.

‘Before the Holding Direction was issued, a Government Planning Inspector had already indicated that the plan was sound.

‘Now the Government has accepted the judgement of their own inspector, we can get on with the rest of the process to make sure that development in our district isn’t a developer free-for-all.’

Council leader Susan Hinchcliffe said: ‘From the outset everyone knew that the Local Plan obviously complied with all the planning rules set by Government and therefore it would have been odd for Government to find their own rules defective.

‘My biggest concern over this last few months has been that developers would put an application in on Green Belt anywhere in the district and without a Local Plan we would have been powerless to stop them.

‘The Secretary of State confirms our view that Green Belt should only be developed in exceptional circumstances.  We agree.  We have as much concern about the countryside and want to protect it as much as anyone else.’


Response to

An integrated strategic plan

Q1. Can the approach to strategic planning explored in this paper help to:

  1. tackle major constraints on future economic growth – i.e. the undersupply of homes and weaknesses in east-west transport infrastructure;
  2. maximise the potential of major new east-west infrastructure links; and
  3. develop distinct towns and cities into a major economic corridor?

Q2. How could the approach to strategic planning be amended or strengthened to better achieve these aims?



I have a strong interest in this area, having advised both Cambridge and Northampton on strategic planning issues and, though I am now based abroad, having written and advised extensively on strategic planning issues and the opportunities of this area.

The discussion paper sets out cogently the need for strategic planning in the area and the need for joint governance structures; however these are two separate but overlapping issues.  There is a need for a strategic plan to drive joint programmes in a joined-up way; there is also a need for governance structures to manage joint programmes across the region, including the preparation of the strategic plan itself.

The management task then is one of portfolio management, managing a range of complex and integrated programmes including the delivery of a strategic plan and the management of infrastructure and human capital investment programmes deriving from that plan.

The discussion paper ‘jumps straight in’ to the governance arrangements; however a better approach would be to take a ‘project portfolio management’ approach, which would begin with the scoping of the portfolio.

The scoping needs to address the challenges and opportunities the area faces and then build consensus among stakeholders on the commissioning process for the strategy, including but not confined to the governance of the strategy itself.

One of the key lessons learned from past strategic initiatives, such as Thames Gateway, MKSM, London, Stansted-Cambridge, the Arc etc., is that if their purpose is not clearly framed and shared then there will not be sufficient ‘buy in’ from key stakeholders and joint working withers on the vine.  This is a clear risk to current regional initiatives such as the Northern Powerhouse and the ‘Oxford-Cambridge’ Arc.

Emerging out of a political climate that is hostile to regional and sub-regional approaches, it is imperative that the business case for a strategic plan is clear.  It is not enough to state that there are housing affordability problems; though particularly acute in Oxford and Cambridge, these are problems across almost all of England.  The business case needs to be clearer as to what marks this area out and how opportunities cohere to form the region which requires the plan.  When the discussion paper refers to transport corridors it is on clearer ground; however the ‘Oxford-Cambridge Expressway’ will not primarily serve the purpose of commuting between Oxford and Cambridge – and if it did it would be a poor investment – rather it serves the purpose of west-east movement to the East Coast ports and Europe, and regional movements between the South Midlands and East Anglia.  Hence there needs to be much more clarity on the drivers of economic geography.

Ahead of the formal setting up of long-term governance structures, the Infrastructure Commission should draw together stakeholders to scope the strategy based on an analysis of these drivers.  Long-term governance structures might take months, or years if, as is likely, legislation is required.  Hence an interim structure whereby the Commission is client for the strategy should be preferred, with an advisory board which can evolve into a formal governing board with legal authority over time.

In my view the key economic drivers are as follows:

  1. The weak east-west links to the East Coast ports and north of London more generally, which hold back the economic development of the South Midlands/East Anglia.
  2. The tight constraints (both policy and physical) around the university cities of Oxford and Cambridge, which create economic development and housing affordability issues as well as long-distance commuting.
  3. The underdevelopment of knowledge-based industries in other major towns in the region, such as Aylesbury, Northampton and Milton Keynes, which all lack universities.
  4. The opportunities presented by East-West Rail and HS2 freeing up major capacity on WCML – opportunities which converge around Milton-Keynes and Northampton.
  5. The opportunities presented by major former defence bases which have closed or are about to close – such as Alconbury and Lakenheath.
  6. A series of new and expanded towns and Garden Cities which have now reached their originally planned size and need to consider options for future growth.
  7. Past growth in the area being planned without proper consideration of supporting water infrastructure, which is now a constraint on growth (e.g. the Rye Meads treatment plant, now a special protection area, which serves five towns).
  8. The fact that London and Greater Birmingham cannot meet their housing needs within their boundaries.  This need overspills to the surrounding area and could total over 1.5 million homes.  These pressures and opportunities converge in the Arc area.

Arising from this a strategic plan needs to do three things:

  1. Optimize the locations of major employment growth at transport nodes
  2. Link the nodes with regional transport networks
  3. Plan for strategic new housing locations at optimum transport nodes on this network so that jobs can be accessed at lowest cost and sustainably.

This networks approach, where networks link hubs, should replace the flawed language of corridors.  That language led to the flawed and abandoned approach in the early London-Stansted-Cambridge work, where seemingly every field between London and Cambridge was surveyed for potential housing growth, when it was always the case that only a limited number of nodes was required.

Clarity on the drivers is essential.  For example, unless there is clarity on whether the stakeholders, including central Government, accept the need for overspill from the metropolises, the programme could fail later down the line through disagreement about options, or worse, face legal challenges from development interests.

The challenges also provide clarity on the geographical coverage of the strategy and governance, a key issue which the paper omits.

I would strongly recommend that the arc sweeps from Swindon to Ipswich/Bury/Lowestoft not just Oxford/MK/Cambridge, and also considers the intersection of other transport corridors which have great potential such as WCML, WAGN and London/Stansted/Cambridge – and so should include Northampton, Harlow, Stevenage etc.

Finally, follow-through requires strong Treasury support.  The Treasury will soon be piloting development rights auctions in a transport corridor.  This could be strongly compatible with the new ‘permission in principle’ (zoning) model and strategic sites arising from the strategy.  It could solve the infrastructure problem of costs being front-loaded and values back-loaded by giving an income stream through land value capture that could be borrowed against – a model pioneered in the area in the Garden Cities.  Borrowing, however, needs to be ‘off balance sheet’ of the PSBR, and this should be taken into account in setting up governance arrangements for the receipt and spending of value capture and in ensuring compatibility with State Aid rules.  It may be that an arm’s-length company limited by guarantee is needed, in which public sector bodies do NOT have an equity stake.

The lesson of past failures in this region is that weak ‘partnership’ models have no legs.  Without a clear mandate and budget, they have not garnered long term stakeholder support.  Also models which have involved direct central government intervention (e.g. UDCs) had limited achievements because of local hostility.


New Opportunities

Q3. Can the approach to strategic planning explored in this paper provide a basis for improved long-term collaboration and engagement between the corridor and:

  1. housing developers;
  2. infrastructure providers (e.g. in the telecommunications and utilities sectors) and investors; and
  3. central government – through, for example, a new, long-term ‘infrastructure compact’?

Q4. How could the approach to strategic planning be amended or strengthened to better achieve these aims? What else will be required for partners across the corridor to develop these relationships and exploit these opportunities?



Q5. Do you agree with the design principles set out at paragraph 41?

How might these be developed or amended to better enable collective decision-making?

Q6. Should any new cross-corridor governance structures preserve a role for subregional collaboration?

Q7. Can the opportunities afforded by strategic planning, be exploited without statutory governance structures to ‘lock-in’ collaboration over the long-term?

Q8. If informal models of collaboration are to be sufficient, how can local authorities give confidence to wider stakeholders that their commitment to a) their strategic plans, and b) joint-working will sustain over the long-term?


In the post-RS NPPF DTC world the duty to cooperate is supposed to fill the vacuum.  It either hasn’t happened or is happening too slowly.

The reasons for this are that the duty to cooperate takes five or more years to ‘bite’.  First LPAs have to realise they have to cooperate, then commission strategic housing and other studies, and then reach unanimity – as if they were the medieval Polish parliament.  On issues of urban expansion there is a clear bias in favour of the rural districts surrounding urban areas, which they outnumber.  As the geographical area expands, the sheer number of authorities requiring unanimity expands exponentially.

The other major failing regarding the Duty concerns the failure so far to deal with metropolitan overspill and even agree a structure/MoU to tackle it, or even in some cases that there is an issue at all.

Whilst the paper is correct that the role of central government needs to be supportive rather than directive of the strategy and its delivery, a strategy under current legislation, including the DTC, can only work if national government provides clear support.

In my view National Government needs to do five things if the Programme is to be a success:

  1. Set down a policy position that the Arc Strategy is a National priority and to fulfill the DTC stakeholders should work together to further the strategy
  2. The strategy needs to be agreed by a board of affected authorities and the NIC but this need not be subject to a rule of unanimity
  3. The strategy will be subject to an informal panel appointed by the PI/SoS, its recommended changes then going forward to the SoS for endorsement
  4. Once endorsed local plans must be in general conformity with the strategy to fulfill the DTC and be sound
  5. The SoS will devolve infrastructure spending/development auction revenues to an Arc governance body providing it agrees to further the strategy and agrees to a proportionate share of housing need overspill from Greater Birmingham and Greater London as set out in the strategy.

In terms of the sub-regional/regional interface, the lesson of past regional planning projects is that they are largely pieced together – not always coherently – from sub-regional plans.  Retention of strong sub-regional cooperation is essential, not least because of the scale of the area and the success of some existing partnerships.  It is essential though that the regional strategy is properly resourced and driven by a dedicated team/consultants so it coheres and has its own identity – not just being a part-time endeavour.  There should however be no need for a middle tier of sub-regional plans/MoUs.  This would be too complicated and slow to ‘trickle down’; rather sub-regional work should flow up into the strategic plan.


Developing and delivering an integrated strategic plan

 Q9. How could local authorities make early progress in the development of an integrated strategic plan, prior to the development of any new collective governance arrangements?

Q10. How can progress against the plan be assessed and the effectiveness of the plan monitored and evaluated? Are there examples of good practice from which lessons can be learned?

The preparation of the strategy cannot wait until long term governance arrangements are worked through and legislated for.

The NIC should commission the strategy in late summer 2017, following scoping work with local authorities, and then set up an interim programme board to manage the contract.  You would be waiting decades if dozens of individual districts, county councils and combined authorities each took to their members a budget and agreement to commission a strategy.  It would take as long to agree as DTC MoUs – and we already know this doesn’t work in the necessary timescale.

Also, long-term governance cannot wait until there is universal coverage of combined authorities; therefore it is necessary to have large and potentially unwieldy board arrangements.  This should be seen as a supervisory board, however, not an executive board.  An executive board should comprise one representative per county plus one from each of the four main towns – Oxford, Cambridge, MK and Northampton – because for political reasons county-only representation would be unacceptable.  Day-to-day management should in the short term rest with the NIC, as the only constituted and capable body already in existence able to handle the commissioning work.

Broxbourne Local Plan Lead: Green Belt Policy is Outdated

Herts Mercury

“Outdated” green belt policy needs addressing, according to one of the men responsible for a major planning blueprint.

The claim was made by Broxbourne Borough Council’s Paul Seeby, who is overseeing the area’s local plan, which will facilitate the building of 7,000 homes.

Green belt policy was established in the 1940s with the aim of preventing urban sprawl, but a shortage of housing has brought it into question.

The Government stipulates houses can only be built on green belt land if there are “very special circumstances”, but half of the sites laid out in the borough’s plan lie in the coveted land.

“There has been building taking place primarily on brownfield sites,” said cllr Seeby.

“But building on green belt land shouldn’t be seen as a loss or taking something away.

“If you look at sites like High Leigh or Rosedale Park, a lot of green belt can be opened up for people to enjoy. It’s a trade off, there’s not enough brownfield sites.

“I think we need to ask ourselves what we are preserving. A piece of grass isn’t necessarily rich in biodiversity.

“When we are trying to go into the green belt we need to look at ways to open it up so people can enjoy the countryside, albeit with a few houses on the site.”

He added: “The green belt boundaries were quite random. They took London and drew a line round it and you can’t access the land.

“I think [the policy] is outdated and we need to ensure that people have access to green space.”

Local plans are currently being drawn up by every district council in the country.

Around 45 per cent of the 16,000 homes to be built in East Herts lie on the green belt, while in North Herts the figure is around 42 per cent of 12,000 homes.

Richard Bullen, the honourary treasurer of the Hertfordshire branch of the Campaign for the Protection of Rural England, passionately disagreed with Mr Seeby.

“The primary purpose of the green belt is to stop communities being joined together,” he said.

“This stems back to what happened between the wars with houses being built on the side of roads and wherever they wanted.

“It’s called green only because the Ordnance Survey use the colour green to show countryside. It’s there to make sure that we don’t get urban sprawl.

“In 1945 London and Los Angeles were about the same size. In 1947 the green belt policy was introduced.

“If London was the size of LA now it would extend from Bedford to Brighton. The whole purpose of the green belt, to prevent urban sprawl, has been very successful.”

Part of the East Herts District Plan would see 10,000 homes built in Gilston as a new garden town. Spike Hughes, of Stop Harlow North, is campaigning against the development.

On the issue of green belt policy, he said: “I absolutely disagree with Mr Seeby.

“You keep hearing this ‘we are going to build on the green belt because it’s not nice green belt’. What a load of bull.

“If you look at the green belt around Gilston, it is beautiful.”


London – Double Densities with 2,200 Tower Blocks Not Enough – 5% of Green Belt Still Needed

Co-Star on Carter Jonas Report

Carter Jonas, the UK property consultancy, has outlined a three-pronged approach to address London’s housing crisis in its latest report, entitled ‘Solving London’s Housing Crisis’.

Through a combination of taller buildings, increased density and building innovation, Carter Jonas estimates that around 1.47 million homes could be delivered in the next 20 to 30 years.

Building up is essential to accommodate more homes in a smaller area and has the potential to deliver 820,000 additional homes. The proportion of residential supply approved through tall building developments has risen from 16% to 42% between 2004 and 2015, demonstrating a clear recognition that this efficient use of land could be instrumental in meeting housing targets.

By increasing the density of development within Housing Zones, Opportunity Areas and Intensification Areas, which cover 15% of the city, there is the scope to support more than 720,000 new homes – double the 360,000 units currently forecast.

London is the greenest city of its size with over a fifth of the capital’s land area classified as Green Belt. The second solution – Build Out – highlights that just a third of the city’s 15,300 acres of ‘non-green’ Green Belt land could accommodate 250,000 homes, at affordable price points. Green Belt boundaries should be reviewed over time by local authorities in line with the current National Planning Policy Framework and the recent Housing White Paper.

Carter Jonas estimates that by employing modern methods of construction to ‘Build Differently’ an extra 397,500 new homes could be delivered each year. Furthermore, if 10% of the 360,000 residential units planned across the Housing Zones, Opportunity Areas and Intensification Areas were developed as smaller high-quality units, then this could allow for up to an additional 36,000 units to be delivered.

Continuation of office to residential conversions offers at least a further 15,000-25,000 new homes, with 16,600 units currently granted planning and in progress (approx. 9m sqft of office space).

Tim Shaw, Head of Development, Carter Jonas, said: “There is not a single panacea that will solve London’s housing crisis, but there is a solution if we adopt an approach that allows for increased development density, relaxation of the green belt definition and innovation in construction techniques. We estimate that this approach could deliver the 1.5 million new homes London needs by 2050, so whilst the target is clearly very ambitious, it is achievable.

“Naturally these are needed to deliver a flurry of new high rise developments and the political challenge of reclassifying green belt land.

“What is clear is that we collectively need to embrace change using a targeted, innovative and effective plan – and the Government needs to give us the flexibility to do so. Aspects will be unpopular – the idea of 2,200 new towers won’t be to everyone’s taste – but the situation in London has reached such a critical point that we need to face up to this harsh reality if we are to have any hope of tackling the Capital’s housing needs.

Wealden v Lewes/South Downs Parts of Core Strategy Quashed because of Cumulative SAC Impact


The impact of the JCS would only be 190 vehicles a day through the Ashdown Forest SAC, however with Welden these topped 1,000 /daym the screening threshold. Natural England gave bad advice.

As out of time for Wealden only applies to SDNP area – which has no strategic sites – and almost no impact on Ashdown forest.  So the impact is to leave SNP in Lewes district without a plan unless it carries out a hugelty expensive study on development of marginal impact – what a waste of time.


Micro Assumptions in Macro Models – Any Scope for NK/PK Convergence?

With major orthodox figures such as Paul Romer attacking the ‘lazy’ approach to microfoundations in DGSE models and major figures such as Micheal Eichenbaum  questioning major pillars of DGSE (such as the Euler equation and by implication rational expectations) it is opportune to examine – in the light of recent theoretical developments – whether a stronger foundation for the economic behavior of individuals may emerge.

Romer made a number of points which apply to any system of modelling – including the identification problem which applies equally to SFC models.  The key point he also made however is that this masked the lack of progress of the DGSE approach to macro over the last 20 years including the ‘lazy’ approach to micro-foundations which ignored systems effects and emergent phenomenon,

Suppose an economist thought that traffic congestion is a metaphor for macro fluctuations or a literal cause of such fluctuations. The obvious way to proceed would be to recognize that drivers make decisions about when to drive and how to drive. From the interaction of these decisions, seemingly random aggregate fluctuations in traffic throughput will emerge. This is a sensible way to think about a fluctuation

Better modelling requires a better model of the agent and the interactions of agents.  Behavioral research and empirical observations are crucial but to be useful for forecasting these need to be abstracted  to a model.

Hence Romer’s call for taking microfoundations seriously.   Which is the purpose of this piece.  Having said that I hate the term. I refer here to ‘micro assumptions’ rather than ‘micro foundations’ as the use of the term ‘foundations’ represents a reductionist fallacy that excludes system level properties that only represent themselves at a system wide scale as emergent properties of individual agent decision making.  Precisely the point Romer was making.

To simplify matters I will only focus on one of the three legs of the DGSE framework the household sector.

In all modern versions of this there is a household consumption function which maximises NPV of utility over a lifetime using a Euler equation (which effectively defines an IS curve).  The use of rational expectations assumes perfect foresight where agents are immune of the ‘illusion’ that current price levels are a reliable guide to future action.

From a Ramsey consumption model the residual is savings.  This then feeds into a Koopmans-Kass derived Growth model (as used in RBC models the foundation of DGSE models) where the change in the capital stock =savings-depreciation=investment.  On this point the divergence between NK and PK is small as all PK models I know use a similar definition of the ‘capital stock’ derived from Kalecki.  Again the term is misleading – it is a flow not a stock – the classical term ‘capital advanced’ is far superior – it is only a stock if you freeze and roll up accounts after a fixed but arbitrary period- such as a calendar year.  Capital here is defined in purely monetary terms and as such and with only a single ‘representative agent’ in DGSE models no complications from capital theory and the in-applicability of marginal theories of distribution – which PKs might otherwise object to – arise.   So the real difference between NK and PK models is over the household consumption function. Here many PK models have a lot to answer for making very simplistic and non-founded assumptions such as savings being a fixed proportion of income.  This isn’t good enough.

Recent years have seen a number of theoretical developments which offer the potential to replace the RE/Representative Agent/Euler equation approach.

Firstly the growing use of extrapolation expectations – that agents extrapolate current conditions to the future.  A hypothesis which has growing empirical support as Noah Smith summarises.

The problem with the Euler equation is that under equilibrium the (interest) savings rate implied by the Euler equation should be the same as the money market interest rate (to be strict minus any Wicksell effect).  They are not, not only are they not positively correlated many many studies show they are negatively correlated.  The same phenomenon  underlies the ‘equity premium puzzle’ and the ‘risk free puzzle’ (of which more of below as it now seems there is a solution to these puzzles).

The Euler equation implies that an increased interest rate should lead to higher savings and deferment of consumption from the present to the future.  However two factors seems to be at work, firstly households are liquidity constrained not simply by a budget constraint.  If debtors receive a reduction in income  from higher interest rates they may prioritise essential current consumption over savings. As Keynesian’s stress MPS and MPC are different decisions.  Secondly interest rates in an inflation targeting monetary regime are highly correlated with inflation and so consumers may buy now to avoid higher prices later.

The solution to both of these issues seems to lie in abandoning a representative consumer and dividing households into several groups.   This can be done for example by class (those relying on wages only and those on investment income) and by degree of indebtedness.  By these factors liquidity constraints can vary but also by the degree to which future savings are ‘planned’.

Of course few households will plan there savings through optimizing their savings through a Ramsey model.  But many will invest in pension funds etc. which will carry out sophisticated financial models.  For everything else ‘fast and frugel’ hueristics (rules of thumb) can be assumed based on current spending patterns.

Here Extrapolative Expectations comes in.  As a recent paper by Glaser and Nathenson points out it only takes an assumption that a minority of house price buyers will extrapolate current house price trends to capture speculation, bubbles and momentum.  As Noah Smith summarises.

Glaeser and Nathanson’s model makes one crucial assumption — that investors rely on past prices to make guesses about demand, but fail to realize that other buyers are doing the exact same thing. When everyone is making guesses about price trends based on other people’s guesses about price trends, the whole thing can become a house of cards. The economists show how if homebuyers think this way, the result — predictably — is repeated housing bubbles and crashes. 

Of course it is the division of agents into categories in this was which was precisely the foundation for Minky’s Financial Instability hypothesis.

For an agent to fully investigate current and future price movements is costly.  Information takes time to gather and time to filter noise, and the time to filter noise increases with the amount gathered leading to exponential costs.  ‘Sticky information’  (Mankiw /Reis) models are a form of bounded rationality based on extrapolative expectations.  Indeed once you allow for this you get a non-vertical phillips curve.  Keynsianism is seemingly vindicated.

The second major advance concerns the interpretation of the utility function given historical time.  Here I will refer to the work of Ole Peters of the Sante Fe institute.

Current utility theory has a ‘many worlds’ approach to events similar to that which has bogged down theoretical physics.    So if you take many tosses of a coin you then assemble them into an ‘ensemble’ from which you can estimate likelihood and hence logarithmic utility.   Peters has shown this to be flawed by a replacement approach which places events and decisions in historical time based on past events.  This approach replaces utility but is mathematically equivalent to logarithmic utility.  Most excitingly it offers a basis for estimating ‘rational leverage’ of both households and firms based on past outcomes and future likelihoods – and a solution to the equity premium and other finance theory puzzles.

We resolve the puzzle by rejecting the underlying axiom that the expectation value of profit should be used to judge the desirability of [a] contract. Expectation values are averages over ensembles, but an individual signing [a] contract is not an ensemble. Individuals are not ensembles, but they do live across time. Averaging over time is therefore the appropriate way of removing randomness from the model, and the time-average growth rate is the object of interest.

As Peter’s points out Chocrane’s text book on finance manages to derive all of modern finance theory from a single basic equation, but one that makes a false assumption on ensemble utility, so if you correct that assumption you get the whole of finance theory  as a free gift.

So the proposal is to reconstruct the micro assumptions of household behavior based on extrapolative expectations and optimal leverage with liquidity constraints.

This requires division of households into groups and agents based on wealth and leverage.  Once you have wealth transfers and lending however this requires a balance sheet based approach which can model stocks and flows.  So the irony is that current trends on improving ‘microfoundations’ could bring NKs firmly towards the ideas and techniques pioneered in the PK community.

Courts – Affordble Housing SPG unlawful because it should have been a DPD

The Queen on the application of Skipton Properties and Craven District Council

Although the case is unusual in that the district had not saved its affordable housing policy the logic applies to all SPG which set affordable housing thresholds or %.  The effect is to reassert the Great Portland V Westminster principle which applied prior to the DP regs under the pre 2004 regime.

the correct analysis is that the NAHC 2016 contains statements in the nature of policies which pertain to the development and use of land which the Defendant wishes to encourage, pending its adoption of a new local plan which will include an affordable housing policy. The development and use of land is either “residential development including affordable housing” or “affordable housing”. It is an interim policy in the nature of a DPD. It should have been consulted on; an SEA should have been carried out; it should have been submitted to the Secretary of State for independent examination.

Will Oil ‘Gush’ from Unauthorised Exploratory borehole under High Weald?

The Times  My feeling is that unless there was a condition on the original permission preventing sidetracks then Surrey CC dont have a leg to stand on, they can simply claim they were drilling for oil – the purpose of their original consent.

An oil company has drilled a well in the green belt without permission and ignored repeated warnings that it would need consent, a council has said.

Angus Energy continued to drill at Brockham, Surrey, in January despite the county council writing to the company twice to say that it required planning approval.

Residents are calling on the council and the Environment Agency to hold an inquiry and to prosecute the company if it has broken the law.

Angus has an existing oil production site at Brockham, near Dorking, but had been told that its planning permission did not cover any new drilling. The company believes there could be far greater quantities of untapped oil 700 metres underground in the Kimmeridge layer of shale that runs across the Weald.

A significant investor in the Brockham site claimed last month that if oil flowed from the well, as was expected, further exploration was likely across the Weald, which straddles Surrey, Sussex and parts of Kent and Hampshire. David Lenigas, an Australian entrepreneur who claimed in 2015 that there could be 100 billion barrels of oil under the south of England, said that a flow test at Brockham could open up “the whole Weald basin for other players to go look. There is substantial amounts of oil in these Kimmeridge limestones.”

The council sent letters to the company in September and December last year stating that it would need to apply for planning permission to drill a new sidetrack, which is a well branching off the original borehole. Clayton Wellman, a Liberal Democrat councillor in Mole Valley, the district covering Brockham, said he was very concerned because Angus Energy also had a stake in the most controversial new oil exploration site in southern England, near Leith Hill in the Surrey Hills Area of Outstanding Natural Beauty.

“We are very worried because we do not know what else they are doing and whether they are doing things properly,” Mr Wellman said.

Roger Abbott, who lives less than a mile from the Brockham site, said that Angus Energy had told the local population that it was only exploring existing wells. He called on the county council and the Environment Agency to hold an inquiry “to assess whether Angus Energy can be trusted with the high-risk business of extracting oil from within the green belt” and to prosecute the company if it had broken the law. “The well should be shut down immediately pending that inquiry,” he said.

Keith Taylor, the Green MEP for the South East of England, said that “the drilling, without permission, of a new well is an outrageous breach of an already deeply strained trust” in the oil and gas drilling industry.

Angus Energy said that it did not require planning permission for the sidetrack well. A spokesman said: “Despite having every opportunity at a number of meetings and in extensive correspondence, the county council has not identified any way in which the sidetrack causes any planning harm.”

The council has suggested that Angus Energy could apply for retrospective planning permission.

The company spokesman said: “We have asked the county to provide further documentation concerning this matter. And, as always, we will work together on an agreement and consents if they are actually required.”

The Oil and Gas Authority, the government body which regulates the industry, said that it was looking into the concerns raised.

On Village Envelope Boundaries – Still Relevant Post #NPPF?


The form and nature of local plans has changed very little since the inception of the NPPF.  We still have much the same planning policy concepts and structure of local plans, as well as length 800+ pages not being uncommon.

One area that deserves a hard look is Village Envelope Boundaries.  Under the PPG world the purpose of such policies was clear – to mark the extent of the ‘open countryside’ which was protected and conversely the area where settlement infill policies applied conversely.  With that protection removed the debate shifted to whether these were  “[relevant] policies for the supply of housing” under para 49.  The outcome of the courts, subject to the current Supreme Court case is they are.

As a result the main problem most authorities have faced is a flood of applications for villages outside the Green Belt of inappropriate size and often inappropriate scale.

Other designations have fared less harshly under the courts, in particular settlement gaps, which have been held not to be necessarily ‘out of date’ through age alone if they serve a landscape purpose and housing growth can be met elsewhere.

The problem with ‘classic’ village envelopes is they can fall so easily out of date once you lack a 5 year housing supply.

The starting point I would suggest is to define policies for distribution of housing that

  1. Define the distribution of growth for villages both in absolute and relative terms
  2. That local plans have ‘plan b’ policies for circumstances of not having a 5 year supply, that can mean reserve sites or directing growth to particular areas, rather than simply repeating the original % split.

Then for villages the priority for policy is to define where growth is unacceptable on landscape and conservation terms and then define areas where organic growth is acceptable over time.  This of course is how villages historically grew, organically and slowly until the 20C.  A greater rate of growth is needed today but the same principles apply.

Many of the ‘unnaceptable’ areas can be defined as LGS, but if these are necessarily ‘extensive tracts’ some form of local landscape designation is needed.