Systematic Transportation and Recreation Environment Evaluations using Technology (STREET)
Evaluating neighborhood physical disorder and pedestrian safety across select cities in the Americas by conducting virtual audits using Google Street View.
Many health behaviors and outcomes are shaped by where people live: the physical environment, neighbors, and societal norms, including neighborhood disorder and street design features.
Systematic social observation techniques such as street audits allow researchers to understand neighborhood conditions without relying on surveys or existing databases. In this process, trained researchers visit study locations in person to record information.
Virtual audits are an emerging technique that simulates systematic social observation using existing imagery such as Google Street View. Instead of visiting locations in person, trained raters “visit” the same locations (where Google Street View imagery is available). This technique retains much of the rigor and many of the advantages of in-person street audits while being faster and less expensive to complete. Virtual audits are also useful for systematically rating streets within cities and comparing measurements across cities.
Deep learning is another novel method that relies on neural networks trained on human-labeled images to automatically label large numbers of images from sources such as Google Street View. These methods can multiply the efforts of human raters to label features of interest on all streets in a city that have publicly available images, nationally or globally depending on the focus of the study. The public health applications of these methods are only recently being developed, but they offer a promising opportunity to expand data collection efforts.
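The scaling idea above can be sketched in a few lines: a model trained on a small human-labeled sample is applied to every available street image. This is an illustrative Python sketch only; `trained_model` is a hypothetical stand-in for a real neural network, not part of any study's codebase.

```python
def trained_model(image_id):
    """Stand-in classifier. A real neural network would inspect image
    pixels; here we pretend even-numbered images contain a sidewalk."""
    return {"sidewalk": image_id % 2 == 0}

def label_all_streets(image_ids, model):
    """Apply a trained classifier to every street image in a city,
    multiplying the reach of a small human-labeled sample."""
    return {image_id: model(image_id) for image_id in image_ids}

# Label a (toy) set of six street images in one pass.
labels = label_all_streets(range(6), trained_model)
```

In practice the same loop would run over millions of Google Street View images, which is what makes the approach far cheaper per street than human rating.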
STREET aims to understand neighborhood physical disorder and pedestrian safety across select cities in the Americas by conducting virtual audits. STREET builds on prior data collection efforts from the Rio das Pedras Community Health Diagnosis, and SALURBAL.
STREET also supports the efforts of the recently funded BEPIDL (Built Environment, Pedestrian Injury and Deep Learning) study, which is focused on pedestrian collisions and safety perceptions in Bogotá, Colombia. The BEPIDL study will use Google Street View images to identify built environment features by training neural networks with human-labeled images.
Partners: University of Texas-Rio Grande Valley, University of Washington-School of Public Health, SALURBAL, Universidad de los Andes, University of Maryland, University of Utah, and Pedro Gullón Tosio, MD, PhD, Marie Skłodowska-Curie Research Fellow, Universidad de Alcalá.
The Computer Assisted Navigation Visual Assessment System (CANVAS) platform is a website developed by the Built Environment and Health Research Group at Columbia University. CANVAS facilitates using Google Street View to conduct virtual audits of neighborhoods and street segments.
CANVAS’s management component allows a researcher to select street segments for the auditor to view and standard audit instruments to complete while assessing the street segments. In addition, CANVAS captures and stores the inputted data, manages study progress, computes basic statistical measures of data reliability, and allows for data export to standard statistical packages.
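The reliability measures mentioned above typically compare two raters' answers on the same segments. As a minimal sketch (not CANVAS's actual implementation), percent agreement and Cohen's kappa for a multiple-choice item could be computed like this:

```python
from collections import Counter

def percent_agreement(rater1, rater2):
    """Share of segments on which two raters gave the same answer."""
    matches = sum(a == b for a, b in zip(rater1, rater2))
    return matches / len(rater1)

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters over the same segments."""
    n = len(rater1)
    p_observed = percent_agreement(rater1, rater2)
    counts1, counts2 = Counter(rater1), Counter(rater2)
    # Expected agreement if raters answered independently at their base rates.
    p_expected = sum(counts1[k] * counts2[k] for k in counts1) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical answers from two raters on four street segments.
rater1 = ["yes", "yes", "no", "no"]
rater2 = ["yes", "yes", "no", "yes"]
```

Here the raters agree on 3 of 4 segments, but kappa is lower than raw agreement because some matches are expected by chance alone.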
Based on the study design, street segments from selected geographic areas are assigned to trained raters. CANVAS defines a street segment as the length of street and sidewalk that runs between the green and red markers in Google Street View.
Raters systematically rate characteristics of the street segment by answering multiple choice questions on CANVAS, based on the Google Street View window. Raters assess buildings and other features that are visible from the street segment.
Follow our progress on our ResearchGate project page.
The BEPIDL study will also rely on virtual audits to develop a sample of labeled Google Street View images for training neural networks to label all of Bogotá's streets that have Google Street View images. One difference for this study is that the virtual audits will focus on static images drawn from segments and intersections, though many of the same pedestrian features identified by STREET will be labeled. One subsample of the human-labeled images will be used to train the neural networks, another for validation, and a third for testing, with the goal of achieving at least 85% labeling accuracy for each feature. If this level of accuracy is not achieved initially, the process is repeated with additional human-labeled images for training. Once the minimum accuracy is achieved, the neural networks are then used to label the remaining street segments and intersections.
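The iterate-until-accurate workflow described above can be sketched as a simple loop. This is an illustrative Python sketch under stated assumptions: `train_and_evaluate` is a hypothetical stand-in for training a neural network and measuring validation accuracy, and the numbers are invented for demonstration.

```python
ACCURACY_TARGET = 85  # minimum per-feature labeling accuracy, in percent

def train_and_evaluate(n_labeled):
    """Stand-in for training on n_labeled human-labeled images and
    measuring accuracy (%) on a held-out validation subsample. Here
    accuracy simply improves with more training data (illustrative only)."""
    return min(99, 60 + 5 * (n_labeled // 500))

def label_until_accurate(initial_labeled=1000, batch=500):
    """Retrain with additional human-labeled images until validation
    accuracy reaches the target; the model would then be applied to
    all remaining street segments and intersections."""
    n_labeled = initial_labeled
    accuracy = train_and_evaluate(n_labeled)
    while accuracy < ACCURACY_TARGET:
        n_labeled += batch  # raters label another batch of images
        accuracy = train_and_evaluate(n_labeled)
    return n_labeled, accuracy
```

In a real study this loop would run once per feature of interest, since each feature must independently clear the 85% threshold before city-wide labeling.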
The BEPIDL study will create neighborhood profiles of road safety from the labeled features, which will then be examined with respect to pedestrian collisions, safety perceptions, and measures of social equity. A final part of the study is to create a risk map of pedestrian safety using Bayesian spatiotemporal methods.
Remigio RV, Zulaika G, Rabello RS, Bryan J, Sheehan DM, Galea S, Carvalho MS, Rundle A, Lovasi GS. A Local View of Informal Urban Environments: a Mobile Phone-Based Neighborhood Audit of Street-Level Factors in a Brazilian Informal Community. Journal of Urban Health. 2019. 18: p. 1-2.
Ndjila S, Lovasi GS, Fry D, Friche AA. Measuring Neighborhood Order and Disorder: a Rapid Literature Review. Current Environmental Health Reports. 2019. 6(4): p. 316-326.
Fry D, Mooney SJ, Rodríguez DA, Caiaffa WT, Lovasi GS. Assessing Google Street View Image Availability in Latin American Cities. Journal of Urban Health. 2020.
Gullón P, Bilal U, Sánchez P, Díez J, Lovasi GS, Franco M. A comparative case study of walking environment in Madrid and Philadelphia using multiple sampling methods and street virtual audits. Cities & Health. 2020: 1-9.
This work is supported by a generous gift from Dana and David Dornsife to the Drexel University Dornsife School of Public Health, and by funding from SALURBAL and the Wellcome Trust.