Automated and manual (see below) processes for sourcing data for the stop-covid19-sfbayarea project.

This is a script that scrapes COVID-19 data from the San Francisco Department of Public Health (SFDPH) website and appends it to a CSV file used by the SF COVID-19 dashboard.
To install the dependencies for this project, run the following commands in your terminal:

```shell
python3 -m venv env
source env/bin/activate
pip install -r requirements.txt
```
For now, run `python3 scraper.py` to use this tool. In a nutshell, it fetches an HTML page from the SFDPH website, extracts the relevant data from the page, and appends a new row with that data to a CSV file in the `data` directory.
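The fetch-parse-append flow can be sketched as below. This is a simplified illustration, not the actual `scraper.py`: the URL, CSV path, and the regex patterns are placeholders and must be checked against the live SFDPH page.

```python
import csv
import re
import urllib.request
from datetime import date

# Hypothetical values -- the real URL and path live in scraper.py.
SFDPH_URL = "https://www.sfdph.org/dph/alerts/coronavirus.asp"
CSV_PATH = "data/covid19_sf.csv"

def parse_counts(html):
    """Pull cumulative case and death counts out of the page HTML.

    The patterns below are placeholders; the real page's wording (and
    therefore the regexes) must be verified against the live site.
    """
    cases = re.search(r"Total Positive Cases:\s*([\d,]+)", html)
    deaths = re.search(r"Deaths:\s*([\d,]+)", html)
    if not (cases and deaths):
        raise ValueError("page layout changed; update the regexes")
    return (int(cases.group(1).replace(",", "")),
            int(deaths.group(1).replace(",", "")))

def scrape_and_append():
    """Fetch the SFDPH page and append today's numbers to the CSV."""
    with urllib.request.urlopen(SFDPH_URL, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    cases, deaths = parse_counts(html)
    with open(CSV_PATH, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), cases, deaths])
```

Appending (mode `"a"`) rather than rewriting the file means each daily run adds one row, so the CSV accumulates a time series.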
In the data folder, find the spreadsheet of reporting agencies. Each agency tends to publish only its current cumulative numbers, so we use the Wayback Machine to recover the data those sites showed on each day since they started reporting (volunteers will have to determine the first date on a case-by-case basis).
- Select an agency from the CA County Health Sites COVID-19 file (view only; create an issue to add or correct info there).
- Open the COVID-19 County Data Input file.
- Add (+) a new worksheet tab.
- Rename it to the area represented by the website.
- Visit the website and note all the datapoints being tracked by that agency, e.g. confirmed cases, deaths, number of tests conducted, etc.
- Visit the Wayback Machine and enter the agency's website URL.
- Iterate through each day, starting with the current one, and transpose the datapoints into the appropriate columns in the data file.
- When you reach a date before the agency started reporting, you may get a 404 error. You're done! If you have time, start over with another agency on the list.
- This data needs to be updated manually every day for the time being. Let us know in Slack if you want to do the daily updates.
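The "iterate through each day" step can be partly automated with the Wayback Machine's public availability API, which returns the capture closest to a given date. A minimal sketch (the helper names here are our own, not part of the project):

```python
import json
import urllib.parse
import urllib.request
from datetime import date, timedelta

def build_api_url(site_url, day):
    """Build an availability-API query for the capture closest to `day`."""
    query = urllib.parse.urlencode({
        "url": site_url,
        "timestamp": day.strftime("%Y%m%d"),
    })
    return "https://archive.org/wayback/available?" + query

def closest_snapshot(site_url, day):
    """Return the URL of the snapshot closest to `day`, or None."""
    with urllib.request.urlopen(build_api_url(site_url, day), timeout=30) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

def iterate_back(site_url, start=None, max_days=365):
    """Yield (date, snapshot_url) pairs, walking backward one day at a time."""
    day = start or date.today()
    for _ in range(max_days):
        yield day, closest_snapshot(site_url, day)
        day -= timedelta(days=1)
```

Note the API returns the *closest* capture, which may be the same snapshot for several consecutive days, so volunteers should still check the capture timestamp before transcribing numbers.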