Run a query against BigQuery from the command line and write the output to a CSV file. You need Python 3 with pip (the Python package installer) installed, or you can build the Docker image and run it (see instructions below). Before building the Docker image, add your JSON credentials from Google to the root of the project and name the file key-file.json.
How to use the query runner:
- Git clone the repository, cd into the project folder, and install virtualenv:
pip3 install virtualenv
- Create the virtual environment:
virtualenv -p python3 venv
- Activate the virtual environment:
source venv/bin/activate
- Install the requirements:
pip3 install -r requirements.txt
- Set the environment variable pointing to the path of your JSON service account file, or use the one in the project folder called key-file.json:
export GOOGLE_APPLICATION_CREDENTIALS=key-file.json
- Run the command to write the output to a CSV file:
python3 bin/bigquery_runner.py path/to/your/file.sql
The results are written to a CSV file in the csv_results folder.
Example command:
python3 bin/bigquery_runner.py sql_queries/chicago_crime_preview_10.sql
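For orientation, a runner like this can be sketched as below. This is a minimal sketch, not the project's actual bin/bigquery_runner.py: the function names and the derived output filename are assumptions, though the google-cloud-bigquery Client and query APIs are real.

```python
"""Hypothetical sketch of a BigQuery-to-CSV runner (not the project's code)."""
import csv
import os
import sys


def write_csv(rows, fieldnames, out_path):
    """Write an iterable of dict rows to a CSV file with a header row."""
    os.makedirs(os.path.dirname(out_path) or ".", exist_ok=True)
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)


def run_sql_file(sql_path, out_dir="csv_results"):
    # Imported here so write_csv above stays usable without the
    # google-cloud-bigquery package installed.
    from google.cloud import bigquery

    # Client() picks up GOOGLE_APPLICATION_CREDENTIALS automatically.
    client = bigquery.Client()
    with open(sql_path) as f:
        query = f.read()
    result = client.query(query).result()  # blocks until the job finishes
    fieldnames = [field.name for field in result.schema]
    out_path = os.path.join(
        out_dir, os.path.splitext(os.path.basename(sql_path))[0] + ".csv"
    )
    write_csv((dict(row) for row in result), fieldnames, out_path)
    return out_path


if __name__ == "__main__" and len(sys.argv) > 1:
    print(run_sql_file(sys.argv[1]))
```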
Add the SQL files with the queries you want to run to the sql_queries folder in the project.
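For illustration, a file like the chicago_crime_preview_10.sql example might contain a query such as the following. The exact contents of the project's file are an assumption; bigquery-public-data.chicago_crime.crime is a real BigQuery public dataset.

```sql
SELECT *
FROM `bigquery-public-data.chicago_crime.crime`
LIMIT 10;
```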
Build the Docker image:
docker build --rm -t bigquery-runner .
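The build step assumes a Dockerfile in the project root. A minimal sketch of what it might look like is below; the base image and ENTRYPOINT path are assumptions, not the project's actual Dockerfile.

```dockerfile
FROM python:3-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# key-file.json must be in the project root before building (see above).
COPY . .
ENV GOOGLE_APPLICATION_CREDENTIALS=key-file.json
ENTRYPOINT ["python3", "bin/bigquery_runner.py"]
```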
Run the Docker image, providing the path to the folder where you wish to store your CSV file:
docker run -v /path/to/your/folder:/app/csv_results bigquery-runner sql_queries/chicago_crime_preview_10.sql
Example:
docker run -v ${PWD}/csv_results:/app/csv_results bigquery-runner sql_queries/chicago_crime_preview_10.sql
If you want to use your own Google application credentials, add the GOOGLE_APPLICATION_CREDENTIALS environment variable to the run command, with its value set to the path of your JSON file:
docker run -e GOOGLE_APPLICATION_CREDENTIALS="your-key-file.json" -v ${PWD}/csv_results:/app/csv_results bigquery-runner sql_queries/lowest_crime_rate_per_day.sql