

Over the last decade or so private companies have been acquiring vast quantities of geospatial information, and as technology progresses the way in which this data is collected, stored and distributed changes too. Collaborative data collection, more commonly known as crowdsourcing, has also taken off with volunteers populating data portals such as OpenStreetMap.

Traditionalists question the quality of this crowdsourced data, but its strength, particularly in the mapping world, lies in the ability of enthusiastic volunteers to provide specific details on areas that are not covered by commercial or public sector mapping organisations. Citizen cartography is taking off, and while professional surveyors may be horrified at the thought of volunteers mapping the world, there is no getting away from the fact that it is already happening.

The commercial world has been quick to grasp the business opportunity, with commercial mapping companies using crowdsourcing to help create up-to-date, detailed maps. These businesses recognise the advantages crowdsourcing offers in terms of rapid, detailed updates at very low cost, and they are not about to reject data generated by their “non-expert” users when it can be used to improve their product offerings at little expense. Business is business.

This leads to questions about the accuracy of crowdsourced data and how to manage it. Some navigation companies handle the matter by having mapping specialists verify user-generated data before it is incorporated into official updates. Other mapping companies assess the credibility of contributors over time and check how many other users agree with the reported data.
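To make that second approach concrete, here is a minimal sketch in Python of how a reputation-plus-consensus check might look. Everything in it (the credibility score, the thresholds, the class and field names) is an illustrative assumption rather than any particular company's actual process.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: reputation-weighted acceptance of a crowdsourced map edit.
# Thresholds and scoring are illustrative assumptions, not a vendor's real pipeline.

@dataclass
class Contributor:
    user_id: str
    accepted_edits: int = 0   # edits later confirmed by specialists or consensus
    rejected_edits: int = 0   # edits later overturned

    @property
    def credibility(self) -> float:
        """Simple track-record score in [0, 1]; starts neutral at 0.5."""
        total = self.accepted_edits + self.rejected_edits
        return 0.5 if total == 0 else self.accepted_edits / total


@dataclass
class MapEdit:
    feature_id: str
    reporter: Contributor
    confirmations: list = field(default_factory=list)  # other users who agree


def should_accept(edit: MapEdit,
                  credibility_threshold: float = 0.8,
                  min_confirmations: int = 3) -> bool:
    """Accept an edit if the reporter has a strong track record,
    or if enough independent users agree with it."""
    trusted_reporter = edit.reporter.credibility >= credibility_threshold
    enough_agreement = len(edit.confirmations) >= min_confirmations
    return trusted_reporter or enough_agreement


if __name__ == "__main__":
    veteran = Contributor("veteran", accepted_edits=40, rejected_edits=2)
    newcomer = Contributor("newcomer")

    edit = MapEdit("road-123", reporter=newcomer,
                   confirmations=[veteran, Contributor("u2"), Contributor("u3")])
    print(should_accept(edit))  # True: three other users confirm the newcomer's edit
```

The point of such a scheme is simply that trust can be earned over time or borrowed from agreement among many users, rather than resting on professional credentials alone.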

Previously, users acquired their data from public sector sources such as our national mapping agency, the Chief Directorate: National Geo-spatial Information. When acquiring data from such sources, users are generally assured of the quality of the data they are accessing – it is accurately collected by mapping specialists and is maintained and updated accordingly. This is not necessarily the case when it comes to data provided by collaborative and/or commercial data providers.

Google itself is upfront about not providing any guarantees as to the quality of its spatial data. Ed Parsons makes the point that people all over the world are using Google’s spatial data because it is good enough for their purposes. Why, he asks, should Google spend billions of dollars providing a high-quality data product when the majority of its users don’t have exacting data quality requirements?

Moving data collection beyond the hands of the experts may lead to an increase in suspect data, but there is no getting around the fact that other people are interested in this data and have a use for it. Entrepreneurs are often able to see opportunities and applications inherent in “non-expert” mapping data that mapping specialists would turn their noses up at. This does not mean there is no need for quality data. Of course there is. Decision makers, policy makers and a variety of users across all our economic sectors require high-quality spatial data in order to do their work. But who says we have to have one or the other? Surely there is a place for both in this rapidly changing world of ours.
