How to help a layperson find the most appropriate web map?
ISBN 978-85-88783-11-9
Authors
Cheng, X.¹
¹Wuhan University, Email: cxq1028@163.com
Abstract
Maps on the web are booming, and people can easily access them through personal computers, mobile phones, and tablets. The public has become used to using maps to represent and analyze what they see and what they think. However, as web maps multiply, it becomes more and more difficult to identify the map that best fits a user's requirements. There have been several milestones in the development of web maps, such as the establishment of the OGC specifications (Web Map Service, etc.), the application of Scalable Vector Graphics, and the introduction of tiled web maps, but what really propelled their rise were Google Maps and OpenStreetMap (OSM). With the release of Google Maps in 2005, map-based mashups became popular; people with a little coding knowledge could make and publish their own maps, and map making became unprecedentedly easy. The OSM project, born in 2004, became the most popular and representative application of what is commonly called Volunteered Geographic Information (VGI), which is characterized by public participation and collaboration. OSM provides open vector data, including roads, buildings, waterways, places, and land use, licensed under the Open Data Commons Open Database License (ODbL). So, to some extent, Google Maps changed the way we produce maps, and OSM gave us control over map content; together they made web maps prosper.

Now, instead of lacking the data and software necessary for making a map, people get stuck searching for and identifying the map that adequately meets their needs. This problem has two aspects: search and quality evaluation.

Finding a collection of diverse, high-quality web maps is a significant challenge because common search engines (Google, Bing, etc.) cannot handle the spatial characteristics of web maps. We can, however, develop a customized web crawler that integrates spatial semantics. This crawler actively grabs and analyzes map-related web pages and URLs by detecting keywords such as "GetMap", "MapServer", "WMS", "OpenStreetMap", and "Tile" combined with "Map" (a keyword filter of this kind is sketched below). Once the URLs are collected, web requests following a given schema retrieve the map images for quality evaluation in the next step (see the second sketch below).

Web maps usually carry descriptive information or metadata that can be used as criteria for quality evaluation, but these metadata typically contain only external information such as map extent, projection, scale, tile schema, service type, service provider, and map update time. There is a dearth of content-specific metadata, such as color, style, legibility, information density, capability, performance, and image quality, for telling one web map on the Web apart from another. For this reason, it is imperative to analyze the content of the map image itself. We try to introduce content-based image processing techniques, for example color image segmentation and geographic feature extraction and recognition, to extract as many critical measures as possible that identify the defining characteristics of web maps (a segmentation baseline is sketched below). These critical measures can be regarded as an enrichment or enhancement of traditional metadata.
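For illustration, a minimal keyword filter of the kind the crawler relies on might look like the following Python sketch. The seed URL, the link-extraction regex, and the exact keyword list are assumptions for illustration; the abstract does not prescribe a concrete implementation.

```python
# Hypothetical sketch: scan a page for URLs that look map-related by matching
# the keywords named above. Seed URL and regexes are illustrative assumptions.
import re
import urllib.request

MAP_KEYWORDS = re.compile(r"GetMap|MapServer|WMS|OpenStreetMap|Tile", re.IGNORECASE)

def find_map_urls(page_url):
    """Fetch a page and return link/source URLs matching map-service keywords."""
    with urllib.request.urlopen(page_url) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    candidates = re.findall(r'(?:href|src)=["\']([^"\']+)["\']', html)
    return [u for u in candidates if MAP_KEYWORDS.search(u)]

if __name__ == "__main__":
    for url in find_map_urls("https://example.com/maps"):  # hypothetical seed page
        print(url)
```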
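For a WMS endpoint, the "given schema" for retrieving a map image could be a standard GetMap request, as in the sketch below. The endpoint URL, layer name, and bounding box are placeholders, not values from the study.

```python
# Hypothetical sketch: build a WMS GetMap request for a discovered endpoint
# and save the returned image for the quality-evaluation step.
from urllib.parse import urlencode
import urllib.request

def fetch_wms_image(endpoint, layer, bbox, out_path):
    """Request one 512x512 PNG from a WMS endpoint; bbox = (minx, miny, maxx, maxy)."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "", "SRS": "EPSG:4326",
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": 512, "HEIGHT": 512, "FORMAT": "image/png",
    }
    with urllib.request.urlopen(f"{endpoint}?{urlencode(params)}") as resp:
        with open(out_path, "wb") as f:
            f.write(resp.read())

# Example (placeholder endpoint and layer):
# fetch_wms_image("https://example.org/wms", "roads", (-180, -90, 180, 90), "map.png")
```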
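For the color image segmentation step, one plausible baseline is k-means clustering of pixel colors, whose cluster shares give a crude proxy for the "color" and "information density" measures mentioned above. The choice of k-means (and k = 8) is our illustrative assumption, not necessarily the method used in the study.

```python
# Sketch: cluster a map image's pixels with k-means and report the dominant
# colors and their shares. k-means and k=8 are illustrative assumptions.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

def dominant_colors(image_path, k=8):
    """Return (RGB color, pixel share) pairs, most dominant first."""
    pixels = np.asarray(Image.open(image_path).convert("RGB")).reshape(-1, 3)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    shares = np.bincount(km.labels_, minlength=k) / len(pixels)
    return sorted(zip(km.cluster_centers_.astype(int).tolist(), shares.tolist()),
                  key=lambda cs: -cs[1])

for color, share in dominant_colors("tile.png"):  # placeholder image path
    print(color, f"{share:.1%}")
```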
Keywords
Web Map; Quality Evaluation; Color Image Segmentation