# SHAPClab_UrbanPerception_StreetAccessibility

This repository documents the basic data and code of an urban science research project, to be published in the ISPRS Journal of Photogrammetry and Remote Sensing. The work explores a new method that relies on the spatial perception of streets to assess urban spatial quality at a large scale, and an empirical study is used to validate the approach. A machine learning scoring model is built from street view images of the study area. Perceptions are scored on six dimensions: beautiful, affluent, safe, lively, depressing, and boring, and the six dimensions are further grouped into positive and negative perceptions. The 20% of street scenes with the highest positive perception scores are treated as high-quality street spaces, and the 20% with the highest negative perception scores as low-quality street spaces. Finally, street accessibility is superimposed to identify the street spaces that residents are most likely to traverse and that are of high (or low) quality.
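As a rough illustration of this thresholding and overlay step, the sketch below classifies street scenes by quantile with pandas. The file name, the column names (positive_score, negative_score, accessibility), and the 20% cut-off applied to accessibility are assumptions for illustration only, not the published pipeline.

```python
# Minimal sketch of the thresholding described above.
# File name and column names (positive_score, negative_score, accessibility)
# are hypothetical placeholders, not the repository's actual outputs.
import pandas as pd

scores = pd.read_csv("street_perception_scores.csv")

# Top 20% of positive perception -> high-quality street space;
# top 20% of negative perception -> low-quality street space.
pos_cut = scores["positive_score"].quantile(0.80)
neg_cut = scores["negative_score"].quantile(0.80)
scores["high_quality"] = scores["positive_score"] >= pos_cut
scores["low_quality"] = scores["negative_score"] >= neg_cut

# Overlay accessibility: keep the streets most likely to be traversed by residents.
# The 20% accessibility cut-off here is only an assumption for illustration.
acc_cut = scores["accessibility"].quantile(0.80)
high_quality_accessible = scores[scores["high_quality"] & (scores["accessibility"] >= acc_cut)]
low_quality_accessible = scores[scores["low_quality"] & (scores["accessibility"] >= acc_cut)]
```

In the actual study, the accessibility values come from the Depthmap analysis listed in the repository structure below.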
The research was carried out at SHAPC-lab. The academic homepage of the lab director and corresponding author, Jie He, is:
http://faculty.hitsz.edu.cn/hejie
For more information related to the research, please follow the laboratory's WeChat Official Account.

The work relies on the following dataset, software, and Python libraries:

- ADE20K dataset
- Depthmap software
- ArcGIS software
- Pillow
- time (Python standard library)
- urllib (Python standard library)
- TensorFlow
- TensorBoard
- Keras
- NumPy
- Matplotlib
The repository is organised as follows:

- ADE20K: image semantic segmentation training data
- ADE20K_Process: pre-processed image semantic segmentation training data
- make_dataset: pre-processing code for the image semantic segmentation training data
- SegNet_ImageSemanticSegmentation: image semantic segmentation training and application (see the model sketch after this list)
- Streetview_Download: street view image download and pre-processing (see the download sketch after this list)
- StreetView_Score: street view image perception scoring
- Depthmap: urban street accessibility processing data
- GIS_file: urban perception and street view collection point data
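For orientation only, the following is a minimal SegNet-style encoder-decoder sketch in Keras; it is not the model trained in SegNet_ImageSemanticSegmentation. It replaces the pooling-index unpooling of the original SegNet with plain upsampling, and the input size and class count are placeholders.

```python
# Minimal SegNet-style encoder-decoder sketch in Keras (not the repository's exact model).
# Input size and number of output classes are placeholders.
from tensorflow.keras import layers, models

def build_segnet(input_shape=(256, 256, 3), n_classes=151):
    inputs = layers.Input(shape=input_shape)

    # Encoder: convolution + batch norm blocks followed by max pooling.
    x = inputs
    for filters in (64, 128, 256):
        x = layers.Conv2D(filters, 3, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
        x = layers.MaxPooling2D(2)(x)

    # Decoder: upsampling + convolution blocks (the original SegNet
    # reuses the encoder's pooling indices instead of plain upsampling).
    for filters in (256, 128, 64):
        x = layers.UpSampling2D(2)(x)
        x = layers.Conv2D(filters, 3, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)

    # Per-pixel class probabilities.
    outputs = layers.Conv2D(n_classes, 1, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model
```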
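Similarly, the sketch below shows one way to download and pre-process a street view image with the urllib and Pillow dependencies listed above. The URL and output size are hypothetical placeholders; this does not reproduce the Streetview_Download scripts or the request format of any particular map API.

```python
# Minimal street view download and pre-processing sketch using urllib and Pillow.
# The URL and image size are hypothetical placeholders.
import urllib.request
from io import BytesIO
from PIL import Image

def download_street_view(url, save_path, size=(512, 512)):
    """Fetch one street view image and resize it for later segmentation."""
    with urllib.request.urlopen(url) as response:
        image = Image.open(BytesIO(response.read())).convert("RGB")
    image = image.resize(size)
    image.save(save_path)

# Example usage with a placeholder URL:
# download_street_view("https://example.com/streetview?lat=22.6&lng=114.0", "scene_0001.jpg")
```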