The Open Geospatial Consortium Data Quality Working Group has compiled a survey to assess global spatial data quality. Those who work with and create geospatial data are invited to participate. A background article describing the work of the Data Quality Working Group and the goals of the survey is below. This article was originally written by 1Spatial for the GITA newsletter, Network.
A global opportunity for spatial data quality measurement and reporting
Data quality is a massive concern for those involved in information technology and the software business globally. The Data Warehousing Institute has estimated that data quality problems cost U.S. businesses more than $600 billion per year.1 Closer to home for those working with spatial data, PIRA (Commercial Exploitation of Europe’s Public Sector Information, 2000)2 estimated that in 1999 it would have cost the European Community countries €36bn to replace their geographical information assets, an amount estimated to be growing at €4.5bn per annum. The corresponding cost for the US was estimated at $375bn, growing at $10bn per annum. These figures will almost certainly have accelerated in the aftermath of 9/11, with the increased focus on homeland security and geospatial intelligence. The growing emphasis and activity around Spatial Data Infrastructures is also driving a number of regional and national initiatives using geographical information assets, or geospatial data, and its interaction with non-spatial or ‘regular’ data.