Code and data supporting the results of the paper "Learning from vector data: enhancing vector-based shape encoding and shape classification for map generalization purposes", published in the CaGIS Special Issue "Machine Learning in Cartography".
To recreate the results using Jupyter notebooks:
- Polygon encoding notebook for polygons
- Polyline encoding notebook for polylines
- Template matching notebook for the case study
For model comparison using Python:
- main.py for polygons
--> Adjust the hyperparameters, select the features by commenting/uncommenting them in the feature list, and run the script.
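The comment/uncomment pattern described above might look like the following sketch; the feature names and hyperparameter values here are illustrative placeholders, not the actual identifiers used in main.py:

```python
# Hypothetical sketch of the feature-selection pattern described above.
# Commenting out an entry excludes that feature from the run.
features = [
    'turning_function',
    'fourier_descriptors',
    # 'centroid_distances',   # commented out -> excluded from this run
    'vertex_coordinates',
]

# Illustrative hyperparameters; edit these values before running main.py.
hyperparameters = {
    'learning_rate': 1e-3,
    'batch_size': 64,
    'epochs': 50,
}
```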
- Link to repository of the building data set from Yan et al. (2021)
- Coastline data: manual selection from © European Marine Observation and Data Network (EMODnet)
- Case study data: Buildings from the area of the case study in Beijing, © OpenStreetMap contributors
- Building Templates: Manual selection from the dataset of Yan et al. (2021) (see above)
- Dense building data: modified data from Yan et al. (2021), with the buildings placed in a smaller area, used for map-based normalization
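Map-based normalization, as used with the dense building data, can be sketched as follows: all vertex coordinates are scaled by the extent of the whole map rather than per building, so relative building sizes are preserved. This is an illustrative sketch only; the paper's exact normalization may differ.

```python
def normalize_map_based(polygons):
    """Scale all vertex coordinates by the extent of the entire map
    (illustrative sketch; the paper's exact implementation may differ)."""
    # Collect every vertex of every polygon to get the map extent.
    xs = [x for poly in polygons for x, _ in poly]
    ys = [y for poly in polygons for _, y in poly]
    min_x, min_y = min(xs), min(ys)
    # One common scale factor for the whole map, not per building.
    scale = max(max(xs) - min_x, max(ys) - min_y)
    return [[((x - min_x) / scale, (y - min_y) / scale) for x, y in poly]
            for poly in polygons]
```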
- Train the models as described above with the polygon encoding notebook.
- Predict the classes for the buildings in the case study and export the results as a CSV file (example for the 'RNN' model):

```python
import pandas as pd

pred_case_study = {'index': [i for i in range(len(predictions))],
                   'osm_id_ret': id_list,
                   'clas_prediction': predictions,
                   'shape_prediction': shape_predictions}
predictionsDF = pd.DataFrame(pred_case_study)
predictionsDF.to_csv('predictions_case_study-RNN.csv')
```
- Repeat for the other models if desired.
- Open the template matching notebook, load the CSV file, and run the code.
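Loading the exported predictions back into the template matching notebook could look like the following sketch. The file name comes from the export step above, and the column names match the dictionary keys used there; the shape class labels and IDs in the stand-in file are purely illustrative:

```python
import pandas as pd

# A tiny stand-in file is written first so this sketch is self-contained;
# in practice the file comes from the export step above.
demo = pd.DataFrame({'index': [0, 1],
                     'osm_id_ret': [101, 102],
                     'clas_prediction': ['I-shape', 'L-shape'],
                     'shape_prediction': [0.91, 0.87]})
demo.to_csv('predictions_case_study-RNN.csv', index=False)

# Load the predictions exported in the previous step.
predictions = pd.read_csv('predictions_case_study-RNN.csv')

# Map each building ID to its predicted class for template matching.
class_per_building = dict(zip(predictions['osm_id_ret'],
                              predictions['clas_prediction']))
```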
Martin Knura (2024): Learning from vector data: enhancing vector-based shape encoding and shape classification for map generalization purposes, Cartography and Geographic Information Science, 51:1, 146-167, DOI: 10.1080/15230406.2023.2273397