Obtains Area Deprivation Index (ADI) scores for US addresses
This script uses a 4-step procedure to obtain ADI scores for US addresses:
- Load addresses
- Obtain latitude/longitude coordinates for each address
  - Utilizes the Google Maps Geocoding API
- Use each address's latitude/longitude coordinates to look up its corresponding US Census FIPS code
  - Utilizes the free & public US FCC Area API
- Use each address's FIPS code to look up its ADI scores
  - Utilizes a pre-downloaded CSV file from the University of Wisconsin–Madison Neighborhood Atlas
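The steps above can be sketched as a single function. This is a hypothetical outline, not the actual structure of main.py: the function name and the idea of passing the three lookups in as callables are illustrative only.

```python
# Hypothetical sketch of the 4-step pipeline; names are illustrative,
# not taken from main.py.
def get_adi_for_address(address, geocode, lookup_fips, lookup_adi):
    """Chain the three lookups for one address record."""
    lat, lon = geocode(address)       # Step 2: Google Maps Geocoding API
    fips = lookup_fips(lat, lon)      # Step 3: FCC Area API
    return lookup_adi(fips)           # Step 4: local Neighborhood Atlas CSV
```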
This script was written using Python 3.12.4. UCI MIND does not guarantee that it will work on older Python versions.
We highly recommend using a virtual environment to run this script, as it relies on some third-party Python packages. After cloning the repository, open a terminal window in the repository's root folder and run the following commands to prepare your virtual environment:
```shell
# Create a Python virtual environment (only need to do this once):
python -m venv .venv

# Activate the virtual environment:
# (Windows: .ps1 for PowerShell or .bat for Command Prompt)
.\.venv\Scripts\Activate.ps1

# If PowerShell reports "running scripts is disabled on this system", you need to enable
# running external scripts. Open PowerShell as admin and run this command (only need to do this once):
Set-ExecutionPolicy RemoteSigned

# While in the virtual environment, upgrade pip and install packages (only need to do these once):
python -m pip install --upgrade pip
pip install -r requirements.txt
```
The files `pyproject.toml` and `.vscode\settings.json` were used in the development process for this script. They provide settings for automatic code formatting using VS Code's Ruff extension. You can safely ignore these files if they do not apply to your dev environment.

- If you have Python programmers on your team, they can learn more about Ruff here: https://docs.astral.sh/ruff/
Addresses are expected to be in a spreadsheet file named `addresses.csv`, located in the same folder as `main.py`. This file contains all the addresses that will be queried for latitude/longitude coordinate data, FIPS codes, and ADI rankings, with one address per row. Running the script once will automatically create this file, but you can just as easily create one yourself with a simple text editor. `addresses.csv` should contain these column headers:

```
street,apt_num,city,state,zip
```
Then, populate the sheet with addresses as you see fit using your favorite spreadsheet program or text editor.
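A file in this format can be read with Python's standard `csv` module. The helper below is a minimal sketch (the function name is ours, not from main.py) that assumes the column headers shown above:

```python
# Minimal sketch: load addresses.csv into a list of dicts,
# one per address row, keyed by the column headers.
import csv

def load_addresses(path="addresses.csv"):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))
```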
To fetch latitude and longitude coordinates, this script requires a Google Cloud API key attached to an account with the Google Maps Geocoding API enabled. This requires a credit card, but the Google Maps Platform currently offers $200 of monthly credit for using its APIs; at $5 per 1,000 requests, that covers 40,000 requests per month before any costs are incurred.
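For reference, a Geocoding API request is an HTTPS GET against Google's documented JSON endpoint with `address` and `key` query parameters. The helper below only builds the URL; its name and the free-form address formatting are our own illustration, not code from main.py:

```python
# Sketch of building a Google Maps Geocoding API request URL.
# The endpoint and the "address"/"key" parameters come from Google's
# Geocoding API documentation; the helper itself is illustrative.
from urllib.parse import urlencode

def geocoding_url(street, city, state, zip_code, api_key):
    free_form = f"{street}, {city}, {state} {zip_code}"
    query = urlencode({"address": free_form, "key": api_key})
    return f"https://maps.googleapis.com/maps/api/geocode/json?{query}"
```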
This API key should be stored in a file called `secrets.json`, also located in the same folder as `main.py`. Running the script once will automatically create this file, but you can just as easily create one yourself with a simple text editor. `secrets.json` should follow this format:

```json
{
    "google_cloud_api_key": "YOUR_API_KEY_HERE"
}
```
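Reading a file in this format takes one call to the standard `json` module. A minimal sketch (the helper name is ours, not main.py's):

```python
# Minimal sketch: pull the API key out of secrets.json.
import json

def load_api_key(path="secrets.json"):
    with open(path, encoding="utf-8") as f:
        return json.load(f)["google_cloud_api_key"]
```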
By default, this script queries for FIPS codes from the most recent US Census (2020 as of writing). If you're running this script in the future (or if you're running retrospective studies), the variable `CENSUS_YEAR` near the top of `main.py` must be changed so the correct FIPS codes can be requested; refer to the FCC Area API documentation for more info. `CENSUS_YEAR` should be set to align with the version of the ADI dataset you wish to use.
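To show how the census year enters the lookup, the sketch below builds an FCC Area API request URL. The parameter names (`lat`, `lon`, `censusYear`, `format`) follow the FCC's public Area API documentation; the helper itself is illustrative, not code from main.py:

```python
# Sketch of an FCC Area API request URL; the API maps a coordinate
# to census block FIPS codes for the requested census year.
from urllib.parse import urlencode

CENSUS_YEAR = 2020  # keep in sync with the census vintage of your ADI dataset

def fcc_area_url(lat, lon, census_year=CENSUS_YEAR):
    query = urlencode({"lat": lat, "lon": lon, "censusYear": census_year, "format": "json"})
    return f"https://geo.fcc.gov/api/census/area?{query}"
```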
Download your desired ADI dataset from the Neighborhood Atlas website, selecting these options: 12-digit FIPS codes, All States, and your desired year/version. Your download should be a .zip file containing one .txt file and one .csv file. Place both files in a folder named `adi-data` in the same folder as `main.py`. Running the script once will create this folder for you, or you can create it yourself.
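Once downloaded, the CSV can be indexed by FIPS code for fast lookups. In this sketch the column names (`FIPS`, `ADI_NATRANK`, `ADI_STATERNK`) are assumptions — check the header row of the file you actually downloaded:

```python
# Sketch: map each 12-digit FIPS code to its row of ADI rankings.
# Column names below are assumptions; verify them against your download.
import csv

def load_adi_table(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["FIPS"]: row for row in csv.DictReader(f)}
```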
The folder `adi-data` and the files `secrets.json` and `addresses.csv` are untracked in this Git repository because they contain information specific to you.
Once you have your addresses, Google API key, and ADI data, you can run the script in a terminal. Assuming you set up a virtual environment using the steps above, the script can be run with these commands in this repository's root directory:
```shell
# Activate your virtual environment (assuming your terminal is PowerShell on Windows)
.\.venv\Scripts\Activate.ps1

python main.py

# Just to be tidy
deactivate
```
The script will process each address one at a time. When complete, each address and all of its associated location data will be written to a timestamped CSV file for manual review.
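A timestamped filename can be produced with `datetime.strftime`; the exact pattern main.py uses may differ, so treat this as a sketch:

```python
# Sketch of a timestamped output filename, e.g. results_2024-07-01_130501.csv.
# The prefix and timestamp format are illustrative.
from datetime import datetime

def output_filename(now=None):
    stamp = (now or datetime.now()).strftime("%Y-%m-%d_%H%M%S")
    return f"results_{stamp}.csv"
```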
We encourage programmers to modify this script to better integrate into your tech stack. For example, instead of using manually-edited CSVs for input and output, you can use a database API to fetch and upload location data.
To support our work and ensure future opportunities for development, please acknowledge the software and funding. The project was funded by The University of California, Irvine's Institute for Memory Impairments and Neurological Disorders (UCI MIND) grant, P30AG066519.