In response to a request from the London Assembly the GLA is to develop a ‘London-wide interactive digital model’ to assist planning for tall buildings.
Why hasn’t this been done before?
Three reasons. The first is who will pay for it. There are numerous providers of 3D lidar-based data, but maintaining a model for the whole of London will cost around £40,000 a year in subscriptions.
The second, related, reason is what standards to use. If the data is commercially sourced, then open sourcing it is out of the question.
The third reason is which platform to use. Many 3D GIS platforms, for example ArcGIS Pro and CityEngine, struggle with detailed models covering multiple square kilometres.
However, there are a growing number of exemplar city-wide 3D models which are open sourced.
The lesson I take from these case studies is: don't worry about the capacity issue or copyright, and focus on open source.
There are several reasons for this. Technology is rapidly evolving, and clients will use multiple platforms, whether GIS, game engine or web based. This requires a backend of open source data which can be translated on the fly.
When it comes to open sourcing city-wide 3D data there is only one real choice: CityGML, which supports up to five levels of detail. It can be converted for use in most geospatial applications, a task made easier if you use FME Server as your delivery engine. There is also an excellent open source database, 3D City DB, which runs on top of PostGIS.
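To make the CityGML idea concrete, here is a minimal sketch, using only Python's standard library, of inspecting a CityGML 2.0 fragment to count buildings and list the levels of detail present. The tiny inline document is invented for illustration, not real survey data; a production pipeline would of course read real files exported from 3D City DB or FME.

```python
# Sketch: count buildings and detect LODs in a CityGML 2.0 fragment.
# The sample document below is illustrative only.
import xml.etree.ElementTree as ET

CITYGML_SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<core:CityModel xmlns:core="http://www.opengis.net/citygml/2.0"
                xmlns:bldg="http://www.opengis.net/citygml/building/2.0">
  <core:cityObjectMember>
    <bldg:Building>
      <bldg:lod1Solid/>
      <bldg:lod2Solid/>
    </bldg:Building>
  </core:cityObjectMember>
  <core:cityObjectMember>
    <bldg:Building>
      <bldg:lod1Solid/>
    </bldg:Building>
  </core:cityObjectMember>
</core:CityModel>
"""

BLDG_NS = "{http://www.opengis.net/citygml/building/2.0}"

def summarise(citygml: str) -> dict:
    """Return the building count and the sorted set of LODs found."""
    root = ET.fromstring(citygml)
    count, lods = 0, set()
    for building in root.iter(f"{BLDG_NS}Building"):
        count += 1
        for child in building:
            tag = child.tag.removeprefix(BLDG_NS)
            if tag.startswith("lod"):
                lods.add(tag[:4])  # e.g. "lod1Solid" -> "lod1"
    return {"buildings": count, "lods": sorted(lods)}

print(summarise(CITYGML_SAMPLE))  # {'buildings': 2, 'lods': ['lod1', 'lod2']}
```

The same namespace-driven approach scales to the full CityGML building module; the LOD concept is what lets the same dataset serve both a coarse web viewer and a detailed download.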
There is a growing list of cities which have built CityGML models, including Brussels, Berlin and Montreal, some of which have true texture information.
If you are going down the open source route then you really need to capture your own 3D data, and sadly that means not using Ordnance Survey data apart from cadastral boundaries. TfL could commission a flight every year, including facade texture scanning.
How could this be developed into a business model? You could see one where boroughs get the data for free, and the public can view and analyse the data at a low level of detail through a web-based viewer. It would be paid for by developers, who could download the data at a higher level of detail than that viewable online.
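The tiered-access idea above can be sketched in a few lines. The tier names and the LOD caps are my own illustrative assumptions, not a proposed pricing scheme: free public viewing is capped at a coarse LOD, while boroughs and paying developers get the full-detail model.

```python
# Sketch of tiered access by CityGML level of detail (LOD 0-4).
# Tier names and caps are illustrative assumptions, not a real policy.
MAX_LOD_BY_TIER = {
    "public_web": 1,   # free browser viewer, coarse block model only
    "borough": 4,      # free for boroughs, full detail
    "developer": 4,    # paid download, full detail
}

def can_download(tier: str, requested_lod: int) -> bool:
    """True if the requested level of detail is allowed for this tier."""
    return requested_lod <= MAX_LOD_BY_TIER.get(tier, -1)

print(can_download("public_web", 2))  # False: detail reserved for payers
print(can_download("developer", 3))   # True
```

The point of the sketch is that one dataset, stored once at full detail, can serve every tier simply by filtering on LOD at delivery time.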
The Berlin model shows what can be done. Data can be downloaded for individual buildings or whole districts at an intermediate level of detail, with areas selected through an OpenStreetMap-based interface. Berlin has benefitted from obsessive local users who have digitised the footprint of every building; London's coverage is much patchier, with building-specific data tailing off in the suburbs: compare, for example, the Isle of Dogs with East Greenwich.
Web-based viewers for CityGML are still somewhat experimental. There are various commercial products and one open source option built on Cesium: the web client developed by the Technical University of Munich for 3D City DB. To be honest, much more work is needed, along with exceptionally fast web connections, especially to develop a client with analysis features for tall buildings. But again, we don't have to wait: if built on the right foundations, the technology and web speeds will catch up.
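As one example of the kind of tall-building analysis such a client might expose, here is a small sketch: flag every building above a height threshold and rank them. The building records are invented for illustration; a real client would pull heights from 3D City DB rather than hard-code them.

```python
# Sketch of a tall-buildings query a web client might offer.
# The sample buildings and heights below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Building:
    name: str
    height_m: float

def tall_buildings(buildings, threshold_m=30.0):
    """Return buildings exceeding the threshold, tallest first."""
    over = [b for b in buildings if b.height_m > threshold_m]
    return sorted(over, key=lambda b: b.height_m, reverse=True)

sample = [
    Building("riverside tower", 120.0),
    Building("terraced house", 9.0),
    Building("office block", 45.5),
]
for b in tall_buildings(sample):
    print(b.name, b.height_m)
```

Served against a city-wide open dataset, even a simple filter like this would let planners and the public see at a glance where tall-building clusters are forming.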
There is a danger of a vast commissioning process resulting in a disappointing product. Rather, the GLA/TfL should build gradually from an open source base, and financially support an academic institution to develop the web-based client (open source, of course), with awards for the best contributions. If built on an open source base, then companies like Uber and Google, who spend fortunes capturing their own data, may well support it, provided we have a robust model which can be replicated globally.