The Open Source Geospatial Foundation (OSGeo), founded in 2006, is a not-for-profit organization whose mission is to support and promote the collaborative development of open geospatial technologies and data.
The Open Knowledge Foundation (OKF) is a not-for-profit organization founded in 2004 and dedicated to promoting open knowledge in all its forms.
What follows is a shared response to some of the questions raised by the consultation on the future of the Ordnance Survey’s data licensing and pricing model. It was sent using Ernest Marples’ open UK Geographic Data Consultation response service. See also the Simply Understand digestible, short version of the consultation document. The response period closes this Wednesday, March 17th.
Geographic information is critical to making effective use of open government data. Everything happens somewhere; location is invaluable context for finding and analysing data.
The Making Public Data Public programme is part of a global trend among administrations to provide state-collected information to citizens, free of cost or constraints.
Within the UK, public authorities, universities and commercial entities have been unable to exchange Ordnance Survey-supplied geographic information with one another, even though they have all purchased a license to use the data. The restrictions on “derived works” produced using Ordnance Survey data have blocked innovation and collaboration among those who already have access. Whichever policy option is chosen, the restrictions on sharing derived works should be lifted.
Every resource found in the real world has a geographic context. Basic geographic data should be readily available in a machine-readable form suitable for sharing; existing data can be efficiently updated by a wide user base, and gaps in the geographic data can be quickly filled.
The OpenStreetMap project (OSM), a free and open source collaborative mapping project, recently signed up its 200,000th map editor. After a rapid response by contributors to the crisis in Haiti, OSM became the default map for fieldwork in disaster recovery. Providers of aerial imagery lifted restrictions on derived works (for example, tracing of shapes from aerial imagery), and this was key to enabling the rapid response.
As free and open source platforms for managing geographic data have matured, the knowledge and competence needed to work with geographic information have grown, and this expertise is now available in local authorities, public archives, and libraries.
As it stands, local authorities must produce and maintain data, which is then integrated into a standard national map by Ordnance Survey. New technology implies a change in the role of a National Mapping Agency, but it is still needed as a source of quality assurance for data produced by a federation of local authorities.
Basic geographic data collected by a National Mapping Agency should be available at marginal cost – i.e. at the cost of distribution, tending to zero where data is made available over the internet. This is the model recommended by the 2008 Treasury-sponsored study on Models of Public Sector Information Provision via Trading Funds.
Government should commit to supporting the National Mapping Agency outright, as a public good. This commitment can be justified on economic grounds: the available evidence suggests that the free publication of geographic information will produce a surplus of taxable economic activity, above and beyond the costs of collecting and maintaining that data.
Government should commit to supporting the cost of maintaining quality geodata centrally, rather than distributing the costs around public authorities’ budgets. New business activity enabled by open geodata will be taxable, at no net cost (and possibly even a net profit) to the government.
The analysis in the Cambridge study on Models of Public Sector Information Provision via Trading Funds suggested that for large-scale, high-specification OS data products “a change from average cost to a marginal cost regime would increase welfare. Specifically, gross benefits would be around £168m a year while net costs to government would be around £12m. Overall this implies an overall net benefit to society of £156m.”
Another reference is the recent New Zealand government report on Spatial Information in the New Zealand Economy:
“In 2008, the use and re-use of spatial information is estimated to have added $1.2 billion in productivity-related benefits… equivalent to slightly more than 0.6 per cent of GDP or GNP in 2008.”
“Had key barriers been removed it is estimated that New Zealand could have benefited from an additional $481 million in productivity-related benefits in 2008, generating at least $100 million in government revenue.”
It is important to recognise that the benefits go far beyond the traditional suppliers and consumers of geographic information. The economic and social value implicit in geographic information is only beginning to be realised. The market benefits of unlocking raw Ordnance Survey data will quickly return to the government more than it invests in supporting the agency.
As the consultation document states: “At the heart of [Option 3] is a shift towards government paying more”. Taxpayer-funded institutions already contribute more than 46% of Ordnance Survey’s revenue. Public authorities are suffering budget cuts due to the UK’s public sector debt crisis, and simply cannot afford to pay more.
An “updater pays” model can be used to defray the cost of maintenance of free and open mapping data. For example, a construction firm would pay to re-survey changes to the built environment that their work has caused. This can be viewed as a stamp tax on updates to geographic information.
Provided that the underlying data is made freely available, the Ordnance Survey may also continue to fund its activities in part by charging more than the marginal cost of distribution for “value-added” products, such as finished maps in print and digital forms.
The consultation document names a list of Ordnance Survey data products; of these, CodePoint (or a similar dataset which contains a mapping of postal codes to coordinate locations) and BoundaryLine (administrative and political boundaries) have the highest priority for civic applications. Without these two products available for re-use without constraint, the aims of the Making Public Data Public programme simply cannot be realised.
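To make the point concrete, here is a minimal sketch of the kind of civic re-use a CodePoint-style dataset enables: looking up the grid coordinates for a postcode. The file name and column layout used here are assumptions for illustration only, not the actual product format.

```python
# A minimal sketch: build a postcode -> (easting, northing) lookup from a
# CodePoint-style CSV. The file name and column names are hypothetical.
import csv

def load_postcode_index(path):
    """Read a CSV and return a dict mapping normalised postcodes to coordinates."""
    index = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumed columns: "postcode", "easting", "northing"
            key = row["postcode"].replace(" ", "").upper()
            index[key] = (float(row["easting"]), float(row["northing"]))
    return index

if __name__ == "__main__":
    postcodes = load_postcode_index("codepoint_sample.csv")  # hypothetical file
    # Prints the (easting, northing) pair for a postcode, or None if absent.
    print(postcodes.get("SW1A1AA"))
```

With such a lookup in hand, any civic dataset that carries a postcode column can be placed on a map and analysed spatially, which is exactly the kind of re-use the Making Public Data Public programme envisages.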
The small- and mid-scale raster products (images of maps showing less detail) can be generated from data at the largest scale. But these kinds of added-value data products are not critical to the aims of the Making Public Data Public project in the way that administrative areas and postal code locations are.
The Ordnance Survey should publish the raw data immediately, using common standards-based Internet technologies. Services providing annotation, printing, and download of map images can remain paid-for commodities for those without access to GIS expertise to work with the data.
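As an illustration of what “standards-based Internet technologies” could look like in practice, the sketch below issues an OGC Web Feature Service (WFS) GetFeature request over plain HTTP. The endpoint URL and layer name are hypothetical placeholders, not a real Ordnance Survey service.

```python
# A minimal sketch of standards-based raw data access: an OGC WFS GetFeature
# request over HTTP. The endpoint and layer name below are hypothetical.
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "https://data.example.gov.uk/wfs"  # hypothetical endpoint
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "boundaries:district",    # hypothetical layer name
    "outputFormat": "application/json",   # ask for GeoJSON, if the server supports it
    "maxFeatures": "10",
}

# Fetch the features and report how much data came back.
with urlopen(BASE_URL + "?" + urlencode(params)) as response:
    features = response.read()

print(len(features), "bytes of feature data received")
```

The point of the sketch is that once the raw data sits behind ordinary, documented web standards, anyone with basic scripting skills can retrieve and re-use it without specialist GIS software.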
Data should be made available under an open license which ensures freedom to use, alter and redistribute the data.
The license chosen should accord with the terms of the Open Knowledge Definition.
In short,
Option 1 (things stay the same) is a non-option, given that geographic information is so critical to unlocking public data.
Option 2 (all OS raw data available at marginal cost) is the preferable option.
Option 3 (staged release of less commercially valuable products) seems like an unhealthy compromise.
Absolutely agreed that Boundary Line and Code Point (Royal Mail notwithstanding) must form part of OS Free without let or hindrance. The MPDP project and most third sector and public sector activities of social value are dependent upon this.
Correction: the CODES process, by which OS took data from external sources such as local authorities to leverage apparent benefits from so doing, has been run down owing to the inability or unwillingness of local authorities to provide ‘as built’ (as opposed to planned) data. Hence OS collect the data and do not rely on LAs as stated here. Sad but true.
Given that large scale data accounts for some 80% of all OS income, as paid by consulting engineers etc at present, the suggestion that the burden can be shifted from buyer to supplier seems perverse and will ultimately be borne by (i.e. charged on to) the developer anyway, often working in or on behalf of the public sector which then picks up an inflated bill. In addition data quality will suffer.
No question that the current process is not the most efficient and that some form of out-sourcing model for GI data capture would be better for UK plc.
Modesty has prevailed but worth noting that the ‘Cambridge’ study referred to was prepared by at least one member of OKF. It demonstrates a profound misunderstanding of the demand for large scale geographic data in assuming a high price elasticity of demand for such data. While this may be true for route network (likely to change again with Google Navigation and Ovi Maps for free) and address data, there is really no evidence to support such an assumption for topographic data. As Nigel Shadbolt himself said at a recent meeting, there is certainly a need for greater research into the economics of GI before any such case or policy decisions are made.
Resolution of the ‘derived data’ issue is one of the great missed opportunities in the consultation, along with defining the public task for OS and consideration of how long-term real-terms costs to the user could be further reduced (remembering that they have already fallen 30% in recent years) by focusing OS on technical data issues such as quality and abandoning sales, marketing etc.
We shall see soon enough…