Remote Sensing and Deep Learning to Understand Noisy OpenStreetMap

Usmani, Munazza; Bovolo, Francesca; Napolitano, Maurizio
2023-01-01

Abstract

The OpenStreetMap (OSM) project is an open-source, community-based, user-generated street map and data service. It is the most widely used crowdsourcing project of its kind. Although the geometrical features and annotation tags in OSM are usually precise (particularly in metropolitan areas), volunteer mapping is sometimes inaccurate. Despite the appeal of using OSM semantic information together with remote sensing images to train deep learning models, the quality of the crowdsourced data is inconsistent. High-resolution remote sensing image segmentation is a mature application in many fields, such as urban planning, map updating, and city sensing. Typically, supervised methods trained with annotated data can learn to anticipate object locations, but misclassification may occur due to noise in the training data. This article combines Very High Resolution (VHR) remote sensing data with computer vision methods to deal with noisy OSM data. It addresses OSM misalignment (positional inaccuracy) with respect to satellite imagery and uses a Convolutional Neural Network (CNN) approach to detect buildings missing from OSM. We propose a translation method to align the OSM vector data with the satellite data. This strategy increases the correlation between the imagery and the building vector data, reducing the noise in the OSM data. A series of experiments demonstrates that our approach plays a significant role in (1) resolving the misalignment issue, (2) instance-semantic segmentation of buildings whose information is missing from OSM (never labeled, or constructed between image acquisitions), and (3) change detection mapping. The high precision (0.96) and recall (0.96) demonstrate the viability of combining high-resolution satellite imagery and OSM for building detection and change detection with a deep learning approach.
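The abstract describes the alignment step only at a high level. As a purely illustrative aid, the Python sketch below shows one minimal way such a translation search could look: given an already rasterized OSM building mask and an image-derived building indicator (for example, a band or an edge response), it grid-searches integer pixel shifts that maximize the normalized cross-correlation between the two. The function name, the search range, and the use of normalized cross-correlation are assumptions made here for illustration; they are not the authors' exact procedure.

```python
import numpy as np

def best_translation(osm_mask, image_feature, max_shift=10):
    """Illustrative sketch (not the paper's implementation): exhaustively
    search integer pixel shifts (dx, dy) that maximize the normalized
    cross-correlation between a rasterized OSM building mask and an
    image-derived building indicator."""
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Shift the OSM mask by (dx, dy) pixels (wrap-around is fine for a toy).
            shifted = np.roll(np.roll(osm_mask, dy, axis=0), dx, axis=1)
            a = shifted - shifted.mean()
            b = image_feature - image_feature.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            score = (a * b).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_shift = score, (dx, dy)
    return best_shift, best_score

# Toy usage: a synthetic mask shifted by (dx=3, dy=-2) should be recovered.
rng = np.random.default_rng(0)
mask = (rng.random((64, 64)) > 0.8).astype(float)
feature = np.roll(np.roll(mask, -2, axis=0), 3, axis=1)
print(best_translation(mask, feature, max_shift=5))
```

An exhaustive search over a small shift window is enough for this kind of sketch because OSM positional offsets are typically on the order of a few pixels at VHR resolution; more refined schemes (sub-pixel or per-building alignment) would be needed in practice.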
Files in this record:
remotesensing-15-04639.pdf (open access; license: Creative Commons; size: 7.53 MB; format: Adobe PDF)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11582/340428