Broxbourne Local Plan Lead – Green Belt Policy is Outdated

Herts Mercury

“Outdated” green belt policy needs addressing, according to one of the men responsible for a major planning blueprint.

The claim was made by Broxbourne Borough Council’s Paul Seeby, who is overseeing the area’s local plan, which will facilitate the building of 7,000 homes.

Green belt policy was established in the 1940s with the aim of preventing urban sprawl, but a shortage of housing has brought it into question.

The Government stipulates houses can only be built on green belt land if there are “very special circumstances”, but half of the sites laid out in the borough’s plan lie in the coveted land.

“There has been building taking place primarily on brownfield sites,” said Cllr Seeby.

“But building on green belt land shouldn’t be seen as a loss or taking something away.

“If you look at sites like High Leigh or Rosedale Park, a lot of green belt can be opened up for people to enjoy. It’s a trade-off, there’s not enough brownfield sites.

“I think we need to ask ourselves what we are preserving. A piece of grass isn’t necessarily rich in biodiversity.

“When we are trying to go into the green belt we need to look at ways to open it up so people can enjoy the countryside, albeit with a few houses on the site.”

He added: “The green belt boundaries were quite random. They took London and drew a line round it and you can’t access the land.

“I think [the policy] is outdated and we need to ensure that people have access to green space.”

Local plans are currently being drawn up by every district council in the country.

Around 45 per cent of the 16,000 homes to be built in East Herts lie on the green belt, while in North Herts the figure is around 42 per cent of 12,000 homes.

Richard Bullen, the honorary treasurer of the Hertfordshire branch of the Campaign to Protect Rural England, passionately disagreed with Mr Seeby.

“The primary purpose of the green belt is to stop communities being joined together,” he said.

“This stems back to what happened between the wars with houses being built on the side of roads and wherever they wanted.

“It’s called green only because the Ordnance Survey use the colour green to show countryside. It’s there to make sure that we don’t get urban sprawl.

“In 1945 London and Los Angeles were about the same size. In 1947 the green belt policy was introduced.

“If London was the size of LA now it would extend from Bedford to Brighton. The whole purpose of the green belt, to prevent urban sprawl, has been very successful.”

Part of the East Herts District Plan would see 10,000 homes built in Gilston as a new garden town. Spike Hughes, of Stop Harlow North, is campaigning against the development.

On the issue of green belt policy, he said: “I absolutely disagree with Mr Seeby.

“You keep hearing this ‘we are going to build on the green belt because it’s not nice green belt’. What a load of bull.

“If you look at the green belt around Gilston, it is beautiful.”

 

London – Double Densities with 2,200 Tower Blocks Not Enough – 5% of Green Belt Still Needed

CoStar on Carter Jonas Report

Carter Jonas, the UK property consultancy, has outlined a three-pronged approach to address London’s housing crisis in its latest report, entitled ‘Solving London’s Housing Crisis’.

Through a combination of taller buildings, increased density and building innovation, Carter Jonas estimates that around 1.47 million homes could be delivered in the next 20 to 30 years.

Building up is essential to accommodate more homes in a smaller area and has the potential to deliver 820,000 additional homes. The proportion of residential supply approved through tall building developments has risen from 16% to 42% between 2004 and 2015, demonstrating a clear recognition that this efficient use of land could be instrumental in meeting housing targets.

By increasing the density of development within Housing Zones, Opportunity Areas and Intensification Areas, which cover 15% of the city, there is the scope to support more than 720,000 new homes – double the 360,000 units currently forecast.

London is the greenest city of its size with over a fifth of the capital’s land area classified as Green Belt. The second solution – Build Out – highlights that just a third of the city’s 15,300 acres of ‘non-green’ Green Belt land could accommodate 250,000 homes, at affordable price points. Green Belt boundaries should be reviewed over time by local authorities in line with the current National Planning Policy Framework and the recent Housing White Paper.

Carter Jonas estimates that by employing modern methods of construction to ‘Build Differently’ an extra 397,500 new homes could be delivered over the same period (with Build Up and Build Out, this takes the total to the 1.47 million quoted above). Furthermore, if 10% of the 360,000 residential units planned across the Housing Zones, Opportunity Areas and Intensification Areas were developed as smaller high-quality units, then this could allow for up to an additional 36,000 units to be delivered.

Continuation of office to residential conversions offers at least a further 15,000-25,000 new homes, with 16,600 units currently granted planning permission and in progress (approx. 9m sq ft of office space).

Tim Shaw, Head of Development, Carter Jonas, said: “There is not a single panacea that will solve London’s housing crisis, but there is a solution if we adopt an approach that allows for increased development density, relaxation of the green belt definition and innovation in construction techniques. We estimate that this approach could deliver the 1.5 million new homes London needs by 2050, so whilst the target is clearly very ambitious, it is achievable.

“Naturally there are challenges – the need to deliver a flurry of new high rise developments and the political challenge of reclassifying green belt land.

“What is clear is that we collectively need to embrace change using a targeted, innovative and effective plan – and the Government needs to give us the flexibility to do so. Aspects will be unpopular – the idea of 2,200 new towers won’t be to everyone’s taste – but the situation in London has reached such a critical point that we need to face up to this harsh reality if we are to have any hope of tackling the Capital’s housing needs.”

Wealden v Lewes/South Downs – Parts of Core Strategy Quashed because of Cumulative SAC Impact

BAILII

The impact of the JCS alone would be only 190 vehicles a day through the Ashdown Forest SAC; however, combined with Wealden’s these trips topped 1,000 a day, the screening threshold. Natural England gave bad advice.

As the challenge was out of time except as against the South Downs National Park Authority, the quashing only applies to the SDNP area – which has no strategic sites and almost no impact on Ashdown Forest. So the effect is to leave the SDNP part of Lewes district without a plan unless it carries out a hugely expensive study of development with marginal impact – what a waste of time.

 

Micro Assumptions in Macro Models – Any Scope for NK/PK Convergence?

With major orthodox figures such as Paul Romer attacking the ‘lazy’ approach to microfoundations in DSGE models, and major figures such as Michael Eichenbaum questioning major pillars of DSGE (such as the Euler equation and, by implication, rational expectations), it is opportune to examine – in the light of recent theoretical developments – whether a stronger foundation for the economic behavior of individuals may emerge.

Romer made a number of points which apply to any system of modelling – including the identification problem, which applies equally to SFC models. The key point he also made, however, is that this masked the lack of progress of the DSGE approach to macro over the last 20 years, including the ‘lazy’ approach to microfoundations which ignored systems effects and emergent phenomena.

Suppose an economist thought that traffic congestion is a metaphor for macro fluctuations or a literal cause of such fluctuations. The obvious way to proceed would be to recognize that drivers make decisions about when to drive and how to drive. From the interaction of these decisions, seemingly random aggregate fluctuations in traffic throughput will emerge. This is a sensible way to think about a fluctuation.

Better modelling requires a better model of the agent and of the interactions of agents. Behavioral research and empirical observations are crucial, but to be useful for forecasting these need to be abstracted into a model.

Hence Romer’s call for taking microfoundations seriously, which is the purpose of this piece. Having said that, I hate the term. I refer here to ‘micro assumptions’ rather than ‘microfoundations’, as the use of the term ‘foundations’ represents a reductionist fallacy that excludes system-level properties that only manifest at a system-wide scale as emergent properties of individual agent decision making – precisely the point Romer was making.

To simplify matters I will focus on only one of the three legs of the DSGE framework: the household sector.

In all modern versions of this there is a household consumption function which maximises the NPV of utility over a lifetime using a Euler equation (which effectively defines an IS curve). The use of rational expectations assumes perfect foresight, where agents are immune to the ‘illusion’ that current price levels are a reliable guide to future action.
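To fix notation, here is that setup in sketch form (generic CRRA utility; the notation is textbook-standard rather than taken from any particular model):

```latex
% Household problem (sketch, CRRA utility):
\max_{\{c_t\}} \; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^{t}\, \frac{c_t^{1-\sigma}}{1-\sigma}
% First-order condition -- the Euler equation:
c_t^{-\sigma} = \beta\,(1+r_{t+1})\,\mathbb{E}_t\!\left[c_{t+1}^{-\sigma}\right]
```

A higher interest rate is supposed to tilt consumption towards the future – exactly the prediction that, as discussed below, the empirical studies fail to find.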

From a Ramsey consumption model the residual is savings. This then feeds into a Cass-Koopmans growth model (as used in RBC models, the foundation of DSGE models) where the change in the capital stock = savings − depreciation = net investment. On this point the divergence between NK and PK is small, as all PK models I know use a similar definition of the ‘capital stock’ derived from Kalecki. Again the term is misleading – it is a flow not a stock – the classical term ‘capital advanced’ is far superior; it is only a stock if you freeze and roll up accounts after a fixed but arbitrary period, such as a calendar year. Capital here is defined in purely monetary terms, and as such, and with only a single ‘representative agent’ in DSGE models, no complications from capital theory and the inapplicability of marginal theories of distribution – which PKs might otherwise object to – arise.

So the real difference between NK and PK models is over the household consumption function. Here many PK models have a lot to answer for, making very simplistic and unfounded assumptions such as savings being a fixed proportion of income. This isn’t good enough.
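For reference, the capital accounting just described, in standard notation (a sketch, not any specific model’s):

```latex
s_t = y_t - c_t, \qquad i_t = s_t, \qquad k_{t+1} - k_t = i_t - \delta k_t
```

The ‘stock’ is just these flows rolled up to an arbitrary accounting date – which is precisely the point about ‘capital advanced’.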

Recent years have seen a number of theoretical developments which offer the potential to replace the RE/Representative Agent/Euler equation approach.

Firstly, the growing use of extrapolative expectations – that agents extrapolate current conditions into the future – a hypothesis which has growing empirical support, as Noah Smith summarises.
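In its simplest form (one common parameterisation; variants abound in the literature):

```latex
\mathbb{E}_t\!\left[x_{t+1}\right] = x_t + \lambda\,(x_t - x_{t-1}), \qquad \lambda > 0
```

Agents project the recent trend forward; setting $\lambda = 0$ collapses back to static expectations.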

The problem with the Euler equation is that in equilibrium the savings (interest) rate implied by the Euler equation should be the same as the money market interest rate (to be strict, minus any Wicksell effect). They are not: not only are they not positively correlated, many studies show they are negatively correlated. The same phenomenon underlies the ‘equity premium puzzle’ and the ‘risk-free rate puzzle’ (of which more below, as it now seems there is a solution to these puzzles).

The Euler equation implies that an increased interest rate should lead to higher savings and deferment of consumption from the present to the future. However two factors seem to be at work. Firstly, households are liquidity constrained, not simply bound by a budget constraint: if debtors receive a reduction in income from higher interest rates they may prioritise essential current consumption over savings. As Keynesians stress, MPS and MPC are different decisions. Secondly, interest rates in an inflation-targeting monetary regime are highly correlated with inflation, and so consumers may buy now to avoid higher prices later.

The solution to both of these issues seems to lie in abandoning the representative consumer and dividing households into several groups. This can be done, for example, by class (those relying on wages only and those on investment income) and by degree of indebtedness. Liquidity constraints can vary with these factors, but so can the degree to which future savings are ‘planned’.

Of course few households will plan their savings by optimising through a Ramsey model. But many will invest in pension funds etc., which do run sophisticated financial models. For everything else, ‘fast and frugal’ heuristics (rules of thumb) can be assumed, based on current spending patterns.

Here extrapolative expectations come in. As a recent paper by Glaeser and Nathanson points out, it takes only an assumption that a minority of house buyers will extrapolate current house price trends to capture speculation, bubbles and momentum. As Noah Smith summarises:

Glaeser and Nathanson’s model makes one crucial assumption — that investors rely on past prices to make guesses about demand, but fail to realize that other buyers are doing the exact same thing. When everyone is making guesses about price trends based on other people’s guesses about price trends, the whole thing can become a house of cards. The economists show how if homebuyers think this way, the result — predictably — is repeated housing bubbles and crashes. 
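A toy simulation of the mechanism (my own sketch, not Glaeser and Nathanson’s actual model – the share of extrapolators and the trend weight are illustrative assumptions):

```python
# Toy sketch: a minority of buyers extrapolate the recent price trend,
# the rest bid fundamentals; the mix produces momentum and overshooting.
import numpy as np

rng = np.random.default_rng(0)
T = 200
frac_extrap = 0.3   # assumed share of trend-chasing buyers
weight = 1.5        # assumed strength of extrapolation

fundamental = 100 + np.cumsum(rng.normal(0.0, 0.5, T))  # slowly drifting fundamentals
price = np.empty(T)
price[:2] = fundamental[:2]

for t in range(2, T):
    trend = price[t - 1] - price[t - 2]            # last observed price change
    extrap_bid = price[t - 1] + weight * trend     # extrapolators project the trend
    fund_bid = fundamental[t]                      # fundamentalists ignore it
    price[t] = frac_extrap * extrap_bid + (1 - frac_extrap) * fund_bid

print("largest deviation from fundamentals:",
      round(float(np.max(np.abs(price - fundamental))), 2))
```

Even this crude mix generates persistent swings away from fundamentals; raise the extrapolating share and the overshoots turn into bubbles and crashes.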

Of course it is the division of agents into categories in this way which was precisely the foundation for Minsky’s Financial Instability Hypothesis.

For an agent to fully investigate current and future price movements is costly. Information takes time to gather and time to filter noise, and the time to filter noise increases with the amount gathered, leading to exponential costs. ‘Sticky information’ (Mankiw/Reis) models are a form of bounded rationality based on extrapolative expectations. Indeed once you allow for this you get a non-vertical Phillips curve. Keynesianism is seemingly vindicated.
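For reference, the core of the sticky information price level (sketched from Mankiw and Reis’s 2002 formulation, with notation simplified): each period only a fraction $\lambda$ of agents update their information, so current prices average over plans made on old information sets:

```latex
p_t = \lambda \sum_{j=0}^{\infty} (1-\lambda)^{j}\, \mathbb{E}_{t-j}\!\left[\, p_t + \alpha y_t \,\right]
```

Because stale expectations anchor current prices, output enters the short-run Phillips curve with a non-vertical slope.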

The second major advance concerns the interpretation of the utility function given historical time. Here I will refer to the work of Ole Peters of the Santa Fe Institute.

Current utility theory has a ‘many worlds’ approach to events, similar to that which has bogged down theoretical physics. So if you take many tosses of a coin you assemble them into an ‘ensemble’ from which you can estimate likelihood and hence logarithmic utility. Peters has shown this to be flawed and developed a replacement approach which places events and decisions in historical time, based on past events. This approach replaces utility but is mathematically equivalent to logarithmic utility. Most excitingly it offers a basis for estimating the ‘rational leverage’ of both households and firms based on past outcomes and future likelihoods – and a solution to the equity premium and other finance theory puzzles.

We resolve the puzzle by rejecting the underlying axiom that the expectation value of profit should be used to judge the desirability of [a] contract. Expectation values are averages over ensembles, but an individual signing [a] contract is not an ensemble. Individuals are not ensembles, but they do live across time. Averaging over time is therefore the appropriate way of removing randomness from the model, and the time-average growth rate is the object of interest.
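The distinction is easy to demonstrate numerically with the standard illustrative gamble (+50% on heads, −40% on tails – a textbook example, not Peters’ own code):

```python
# Ensemble vs time averages for a multiplicative coin-toss gamble:
# wealth multiplies by 1.5 on heads, 0.6 on tails.
import numpy as np

rng = np.random.default_rng(1)

ensemble_growth = 0.5 * 1.5 + 0.5 * 0.6      # 1.05: expected factor per round
time_avg_growth = (1.5 * 0.6) ** 0.5         # ~0.949: factor along one history

T = 10_000
factors = np.where(rng.random(T) < 0.5, 1.5, 0.6)
simulated = np.exp(np.log(factors).mean())   # per-round growth of a single long trajectory

print(f"ensemble average per round: {ensemble_growth:.3f}")  # > 1, looks attractive
print(f"time average per round:     {time_avg_growth:.3f}")  # < 1, wealth decays
print(f"one simulated history:      {simulated:.3f}")
```

The ensemble average says take the bet; almost every individual trajectory nonetheless decays – precisely the individual-versus-ensemble point in the quotation above.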

As Peters points out, Cochrane’s textbook on finance manages to derive all of modern finance theory from a single basic equation, but one that makes a false assumption about ensemble utility; so if you correct that assumption you get the whole of finance theory as a free gift.
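That single basic equation is the pricing relation at the heart of Cochrane’s Asset Pricing:

```latex
p_t = \mathbb{E}_t\!\left[\, m_{t+1}\, x_{t+1} \,\right]
```

Price equals the expected discounted payoff, with $m_{t+1}$ the stochastic discount factor. The expectation is an ensemble average; Peters’ correction swaps it for a time-average growth criterion while leaving the formal machinery intact.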

So the proposal is to reconstruct the micro assumptions of household behavior based on extrapolative expectations and optimal leverage with liquidity constraints.

This requires division of households into groups of agents based on wealth and leverage. Once you have wealth transfers and lending, however, this requires a balance sheet based approach which can model stocks and flows. So the irony is that current trends in improving ‘microfoundations’ could bring NKs firmly towards the ideas and techniques pioneered in the PK community.

Courts – Affordable Housing SPG Unlawful because it should have been a DPD

The Queen on the application of Skipton Properties v Craven District Council

Although the case is unusual in that the district had not saved its affordable housing policy, the logic applies to all SPG which set affordable housing thresholds or percentages. The effect is to reassert the Great Portland v Westminster principle which applied under the pre-2004 regime, prior to the DPD regs.

the correct analysis is that the NAHC 2016 contains statements in the nature of policies which pertain to the development and use of land which the Defendant wishes to encourage, pending its adoption of a new local plan which will include an affordable housing policy. The development and use of land is either “residential development including affordable housing” or “affordable housing”. It is an interim policy in the nature of a DPD. It should have been consulted on; an SEA should have been carried out; it should have been submitted to the Secretary of State for independent examination.

Will Oil ‘Gush’ from Unauthorised Exploratory Borehole under High Weald?

The Times

My feeling is that unless there was a condition on the original permission preventing sidetracks then Surrey CC don’t have a leg to stand on; the company can simply claim it was drilling for oil – the purpose of its original consent.

An oil company has drilled a well in the green belt without permission and ignored repeated warnings that it would need consent, a council has said.

Angus Energy continued to drill at Brockham, Surrey, in January despite the county council writing to the company twice to say that it required planning approval.

Residents are calling on the council and the Environment Agency to hold an inquiry and to prosecute the company if it has broken the law.

Angus has an existing oil production site at Brockham, near Dorking, but had been told that its planning permission did not cover any new drilling. The company believes there could be far greater quantities of untapped oil 700 metres underground in the Kimmeridge layer of shale that runs across the Weald.

A significant investor in the Brockham site claimed last month that if oil flowed from the well, as was expected, further exploration was likely across the Weald, which straddles Surrey, Sussex and parts of Kent and Hampshire. David Lenigas, an Australian entrepreneur who claimed in 2015 that there could be 100 billion barrels of oil under the south of England, said that a flow test at Brockham could open up “the whole Weald basin for other players to go look. There is substantial amounts of oil in these Kimmeridge limestones.”

The council sent letters to the company in September and December last year stating that it would need to apply for planning permission to drill a new sidetrack, which is a well branching off the original borehole. Clayton Wellman, a Liberal Democrat councillor in Mole Valley, the district covering Brockham, said he was very concerned because Angus Energy also had a stake in the most controversial new oil exploration site in southern England, near Leith Hill in the Surrey Hills Area of Outstanding Natural Beauty.

“We are very worried because we do not know what else they are doing and whether they are doing things properly,” Mr Wellman said.

Roger Abbott, who lives less than a mile from the Brockham site, said that Angus Energy had told the local population that it was only exploring existing wells. He called on the county council and the Environment Agency to hold an inquiry “to assess whether Angus Energy can be trusted with the high-risk business of extracting oil from within the green belt” and to prosecute the company if it had broken the law. “The well should be shut down immediately pending that inquiry,” he said.

Keith Taylor, the Green MEP for the South East of England, said that “the drilling, without permission, of a new well is an outrageous breach of an already deeply strained trust” in the oil and gas drilling industry.

Angus Energy said that it did not require planning permission for the sidetrack well. A spokesman said: “Despite having every opportunity at a number of meetings and in extensive correspondence, the county council has not identified any way in which the sidetrack causes any planning harm.”

The council has suggested that Angus Energy could apply for retrospective planning permission.

The company spokesman said: “We have asked the county to provide further documentation concerning this matter. And, as always, we will work together on an agreement and consents if they are actually required.”

The Oil and Gas Authority, the government body which regulates the industry, said that it was looking into the concerns raised.

On Village Envelope Boundaries – Still Relevant Post #NPPF?

 

The form and nature of local plans has changed very little since the inception of the NPPF. We still have much the same planning policy concepts and structure of local plans, as well as the same length, with 800+ pages not being uncommon.

One area that deserves a hard look is village envelope boundaries. In the PPG world the purpose of such policies was clear – to mark the extent of the ‘open countryside’, which was protected, and conversely the area where settlement infill policies applied. With that protection removed, the debate shifted to whether these were “[relevant] policies for the supply of housing” under para 49. The outcome in the courts, subject to the current Supreme Court case, is that they are.

As a result, the main problem most authorities have faced is a flood of applications at villages outside the Green Belt, of inappropriate size and often inappropriate scale.

Other designations have fared less harshly in the courts, in particular settlement gaps, which have been held not to be necessarily ‘out of date’ through age alone if they serve a landscape purpose and housing growth can be met elsewhere.

The problem with ‘classic’ village envelopes is they can fall so easily out of date once you lack a 5 year housing supply.

The starting point I would suggest is to define policies for the distribution of housing that:

  1. Define the distribution of growth for villages both in absolute and relative terms
  2. Include ‘plan B’ policies for circumstances where there is not a 5 year supply; these can mean reserve sites or directing growth to particular areas, rather than simply repeating the original percentage split.

Then for villages the priority for policy is to define where growth is unacceptable on landscape and conservation grounds, and then define areas where organic growth is acceptable over time. This of course is how villages historically grew – organically and slowly – until the 20th century. A greater rate of growth is needed today but the same principles apply.

Many of the ‘unacceptable’ areas can be designated as LGS, but where these are necessarily ‘extensive tracts’ (which LGS cannot cover) some form of local landscape designation is needed.

 

National Strategic Planning has a Future – If we keep its goals limited

Imagine the scenario – we have only a lightweight National Statement on Strategic Planning, carried out by the National Infrastructure Commission.

Its role would be expressly limited to those instances where strategic planning above the city/combined authority level is needed to coordinate with national infrastructure and economic development programmes (the Midlands Engine and the Northern Powerhouse).

It would only:

  1. Deal with the overspill of OAN from the major cities of London, Greater Manchester, South Sussex and Greater Bristol to broad locations on transport corridors
  2. Provide the national support to a strategy for the Oxford-MK-Cambridge-Northampton-Eastern Ports Arc linked to new rail capacity and existing rail capacity released on the WCML by HS2 (see the National Infrastructure Commission)
  3. Release the potential for auction of development rights along HS1 and HS2, Crossrail 2, the Varsity Line and WCML.
  4. Support the Northern Way and Midlands Engine including supporting Major Growth poles for HFE and research
  5. Ensure each of these objectives is achieved whilst securing the strategic purposes of the Green Belt and protecting nationally important landscapes.

 

Is this not the kind of national strategy / strategic planning the May government could stomach?

Indeed the true synergies of this approach in areas like the Atlantic Gateway, Crewe, Northampton and Lakenheath would be apparent. The economic growth potential would be increased and the environmental downsides decreased through taking a joined-up approach.

 

Getting Sober on Wicksell’s Wine Lake – The Financial Instability Hypothesis and the Pool of Funding

A very simple proposition.

The financial instability hypothesis, as set out by Minsky and elaborated by Kindleberger, Keen and others, is by far the best economic framework for explaining business cycles, and especially the major instabilities that come about in ‘balance sheet recessions’ (Koo) and debt-deflationary spirals (Fisher/Hoyt).

However the hypothesis is purely monetary.  It depends on Ponzi investors speculating on assets beyond their fundamental value.

This is unsatisfactory as it leaves the ‘fundamental value’ of goods and assets unexplained, and so is a partial rather than a general theory – not yet up to the wholesale replacement of lame-stream DSGE/NK models.

Let us focus for one moment on a potential ‘tipping point’ at the top of the business cycle.

Consider one flawed theory of what causes that tipping – very wrong, but in a very interesting way.

That being the ‘subsistence fund’/‘pool of funding’ explanation deriving from classical economics (the wages fund) and developed by Böhm-Bawerk, Wicksell and Strigl. This became one strand in Austrian business cycle theory, but by the late 1930s it had been completely taken over by the even more flawed monetary Austrian explanation whereby central banks create the business cycle.

The funding concept can be explained very simply by the example of a hurricane destroying all fixed assets in ‘Crusoe town’. To simplify, assume no fixed capital. Then, if it took one year to rebuild everything, the real ‘cost’ of rebuilding would be the sum of consumption goods needed to provide for the workforce over that year.
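In symbols (my notation, not Strigl’s): with $L$ workers each consuming $c$ of the consumption good per year, a rebuild taking $T$ years requires a subsistence fund of

```latex
F = c\,L\,T \qquad (T = 1 \text{ for Crusoe town})
```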

The Austrian/Swedish approach went further: consider a choice of techniques of different ‘roundaboutness’. That choice is determined by comparing the cost of the subsistence fund against the discounted present value of the final output. In this one-good, no-fixed-capital world a more roundabout technique will only ever be chosen if it is more productive.
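The comparison can be sketched as follows (again my notation): a technique of period $T$ yielding output $Q(T)$ is adopted over a shorter one of period $T'$ only if

```latex
\frac{Q(T)}{(1+r)^{T}} \;>\; \frac{Q(T')}{(1+r)^{T'}}
```

i.e. the extra output must more than cover the interest on the subsistence fund tied up for longer.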

Its most sophisticated elaboration was in Wicksell’s celebrated ‘wine model’, where there is a single good, wine, used to sustain workers. In this model Wicksell managed to retain the key elements of Böhm-Bawerk’s capital theory – a subsistence fund and the productivity of roundabout methods – even with compounding interest. However it omitted savings; once that, fixed capital or a capitalist share is introduced, the value of the wine depends on the interest rate. Like Sraffa’s system it cannot be closed without an exogenous interest rate. Subsequent authors (Sandelin) have, however, closed the system by adding a savings function. In this system the rate of saving, the rate of interest, the economic rate of depreciation and the length of the production period are determined simultaneously; however, at any one instant in time the pool of funding is fixed and hence causal. A savings function also enables the introduction of fixed capital by means of the reduction to a dated land and labour technique with depreciation (see Wicksell’s assessment of Åkerman’s model, and Sraffa’s correct treatment of depreciation). The one thing that cannot survive is a ‘marginal productivity of capital’, as in such a model price and real Wicksell effects are very clear.
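The kernel of the wine model in modern notation (a sketch – Wicksell’s own presentation differs): juice matures into wine worth $V(t)$ after $t$ years, and under continuous compounding the storage period is chosen so that

```latex
\max_{t}\; e^{-rt}\,V(t) \quad \Longrightarrow \quad \frac{V'(t^{*})}{V(t^{*})} = r
```

The subsistence fund is the value of the stock maturing in store; Sandelin’s ‘missing equation’ closes the system with a savings function that determines $r$ endogenously.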

The Austrian approach explains the tipping point of business cycles by a ‘drain’ on the subsistence fund reducing real wealth and consumption. From the Mises Institute:

When money is created out of “thin air” it leads to a weakening of the pool of funding. What is the reason for this? The newly created money doesn’t have any back-up behind it as far as the production of goods is concerned—it sprang into existence out of “thin air” so to speak. The holder of the newly created money can use it to withdraw final consumer goods from the pool of funding with no prior contribution to the pool. Hence this act of consumption, or nonproductive consumption, puts pressure on the pool of funding. (The consumption is nonproductive because the individual consumes goods without making any contribution to the pool of funding)….

when money is created out of “thin air” it diverts funding away from wealth producers who have contributed to the pool of funding toward the holders of the newly created money. For a previously given pool of funding this will imply that wealth producers will discover that the purchasing power of their money has fallen since there are now less goods left in the pool—they cannot fully exercise their claim over final goods since these goods are not there.

As the pace of money creation out of “thin air” intensifies it puts greater pressure on the pool of funding. This in turn makes it much harder to implement various projects as far as the maintenance and the improvement of the infrastructure is concerned. Consequently the flow of production of various final consumer goods weakens, which in turn makes it much harder to make provisions for savings. All this in turn further weakens the infrastructure and so undermines the flow of production of final consumer goods.

This exposes the deep misunderstanding of bank-created money in Austrian economics. Credit is backed: banks can create ‘money out of thin air’ only so long as it is backed by prior equity and/or future profits. As and when productive investments lead to the repayment of loans, the money is destroyed. The pool of funding is not depleted; indeed, through productivity improvements and economic growth it is enlarged.

However consider a Minskyian ‘Ponzi’ investment. In such a scenario the money is not destroyed; rather it drains the pool of funding, reducing aggregate demand. In the dyspeptic wine model, where the more wine you drink the more productive you are, the economy hits a sobering reality. Consumers drink less wine; with demand falling, debt repayments become unsustainable and the tipping point of the business cycle is reached.

Hence there is a rather deep connection between the different schools of heterodox economics on this issue and a way back in for capital and value theory in the purely monetary post-Keynesian school.

For the issues regarding land speculation in such a model see Gaffney and Cleveland under Further Reading.

Further Reading 

W.O. Coleman, ‘Wicksell and the Akerman Axe Model: A Re-examination’, Australian Economic Papers, December 1983

Knut Wicksell, Value, Capital and Rent (1970 reprint)

Richard Ritter von Strigl, Capital and Production (1934)

Bo Sandelin, ‘On Wicksell’s Missing Equation: A Comment’, History of Political Economy, Spring 1980, 12(1): 29–40

Eugen von Böhm-Bawerk, Capital and Interest

Mary M. Cleveland, ‘George, Wicksell and Gaffney: a three-factor model of the boom and bust cycle’, International Journal of Social Economics, 1990

Mason Gaffney, ‘Keeping Land in Capital Theory: Ricardo, Faustmann, Wicksell, and George’, The American Journal of Economics and Sociology, 2008