

Google Analytics Settings Database

This is not an officially supported Google product.

This repository contains code for a Google Cloud Function that loads Google Analytics 4 settings into a set of BigQuery tables. By default, the function is scheduled to run daily. This creates a daily backup of Google Analytics settings that can be used for a variety of purposes, including restoring settings, auditing setups, and having an extensive change history across accounts.

Requirements

Implementation

Downloader Function

  1. Navigate to your Google Cloud project and enable the Google Analytics Admin API.

  2. Navigate to IAM & Admin > Service Accounts and create a new service account.

    • Give the service account a name. This guide will use analytics-settings-database as an example.
    • Grant the service account the BigQuery Admin role.
    • Grant the service account the Cloud Functions Invoker role.
  3. Click on your newly created service account and navigate to "Keys".

  4. Create a new JSON key. Make note of the file that was downloaded to your device.
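    The console steps above (creating the service account, granting roles, and downloading a key) can also be sketched with gcloud. This is a sketch only: PROJECT_ID is a placeholder, and the account name simply mirrors the example used in this guide.

    ```shell
    # Sketch only: replace PROJECT_ID with your own project ID.
    gcloud iam service-accounts create analytics-settings-database \
      --project=PROJECT_ID \
      --display-name="analytics-settings-database"

    # Grant the roles named above (BigQuery Admin, Cloud Functions Invoker).
    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="serviceAccount:analytics-settings-database@PROJECT_ID.iam.gserviceaccount.com" \
      --role="roles/bigquery.admin"
    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="serviceAccount:analytics-settings-database@PROJECT_ID.iam.gserviceaccount.com" \
      --role="roles/cloudfunctions.invoker"

    # Create and download a JSON key (step 4).
    gcloud iam service-accounts keys create credentials.json \
      --iam-account="analytics-settings-database@PROJECT_ID.iam.gserviceaccount.com"
    ```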

  5. Open cloud shell and enter the following to create the cloud function:

    
    rm -rf analytics-settings-database && git clone https://github.com/google/analytics-settings-database.git && cd analytics-settings-database && bash deploy_function.sh
    
    
    • Follow the steps outlined in the deploy script to create the HTTP function.
    • Once the function has been created, navigate to Cloud Functions and click on the cloud function.
    • Edit the function and under "Runtime, build, connections and security settings", change the service account to the one you recently created.
    • Click next.
    • Click "+" to create a new file. Name this file credentials.json and add the contents of the key file you downloaded earlier after creating your service account.
    • Click deploy. Your cloud function should now be operational.
  6. If Cloud Shell is not already open, open it and enter the following to create your BigQuery dataset and tables. This automatically creates a dataset named "analytics_settings_database" and populates it with the required tables.

    
    cd analytics-settings-database && bash deploy_bq_tables.sh
    
    
    • Once the script has run, your new tables should be visible in BigQuery.
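    If you prefer to confirm the deployment from the command line, a quick check is possible with the bq CLI (the dataset name comes from the deploy script above; the table names you see will depend on the script):

    ```shell
    # List the tables created in the analytics_settings_database dataset.
    bq ls analytics_settings_database
    ```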
  7. Navigate to Cloud Scheduler.

    • Create a job.
    • Give the schedule a name. This guide will use analytics-settings-database as an example.
    • Enter a frequency. You can enter whatever frequency you would like, but this guide will use 0 22 * * * as an example to run the scheduler daily at 10 PM (or 22:00).
    • Enter your timezone.
    • Set the target type to HTTP.
    • Set the URL to your cloud function's URL.
    • Set the Auth header to "Add OIDC token" and select your service account.
    • Save the schedule.
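    The scheduler steps above can be sketched with gcloud as well. This is a sketch only: the job name, time zone, function URL, and service-account email below are placeholders, not values from this repository.

    ```shell
    # Sketch only: replace REGION, PROJECT_ID, and YOUR_FUNCTION with your own values.
    gcloud scheduler jobs create http analytics-settings-database \
      --schedule="0 22 * * *" \
      --time-zone="America/New_York" \
      --uri="https://REGION-PROJECT_ID.cloudfunctions.net/YOUR_FUNCTION" \
      --oidc-service-account-email="analytics-settings-database@PROJECT_ID.iam.gserviceaccount.com"
    ```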
  8. Copy your service account email and grant it access to the GA4 accounts whose settings you want to download.

Upon completing the implementation process, the settings for your Google Analytics accounts that the API can access will be loaded into BigQuery daily at 10 PM. The frequency with which this happens can be adjusted by modifying the Cloud Scheduler Job created during the deployment process.

Property Overview Table

The property overview table serves as an example of how the various settings tables can be combined to create a useful report on the status of various settings for your properties. You can either create a scheduled query based on the property_overview.sql query or create a cloud function workflow.

Scheduled Query Deployment
  1. Open cloud shell and enter the following to create the scheduled query:

    
    cd analytics-settings-database/report_tables/property_overview && bash deploy_query.sh
    
    
  2. Once the deploy script finishes, the scheduled query is created.

  3. Setup is now complete. Your new property overview table will be populated when your scheduled query runs.

Cloud Function Workflow
  1. Open cloud shell and enter the following to create the cloud function:

    
    cd analytics-settings-database/report_tables/property_overview && bash deploy_function.sh
    
    

    When the script completes, the following should be created:

    • A property overview cloud function.
    • A property overview table.
    • A property overview workflow.
  2. Navigate to your workflows and edit your new workflow.

  3. Add a cloud scheduler trigger and save your workflow.

  4. If you previously created a Cloud Scheduler job to trigger your downloader function, navigate to the Cloud Scheduler page and delete the job that was previously used with your cloud function.

  5. Setup is now complete and your new property overview table should be populated whenever your workflow is scheduled to run.
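If you wire up the Cloud Scheduler trigger for the workflow by hand, the shape of the call can be sketched as follows. Note that triggering a workflow execution uses an OAuth token rather than the OIDC token used for direct function calls; PROJECT_ID, REGION, and the workflow and job names below are placeholders.

```shell
# Sketch only: replace PROJECT_ID, REGION, and the workflow name with your own values.
gcloud scheduler jobs create http property-overview-trigger \
  --schedule="0 23 * * *" \
  --uri="https://workflowexecutions.googleapis.com/v1/projects/PROJECT_ID/locations/REGION/workflows/property_overview/executions" \
  --oauth-service-account-email="analytics-settings-database@PROJECT_ID.iam.gserviceaccount.com"
```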

Contributors

  • b-kuehn

Issues

Missing fields in destination tables

There are currently fields missing; here is what I found:

ga4_conversion_events, missing fields:

  • default_conversion_value (RECORD)
    • value (FLOAT)
    • currency_code (STRING)

I also hit an error in ga4_custom_metrics where "measurement_unit" should perhaps be a STRING. I didn't test that because I don't need the custom metrics, so I just removed them.

Error when creating cloud scheduler

Hey there!

I did my best to avoid submitting a request but I wanted to reach out. The steps I've taken include deleting my first staging project and creating a new staging project just to make sure I didn't miss a step in the process.

I am getting two errors at the last stage of the script when it tries to create a cloud scheduler. See the errors below and the screenshot here https://d.pr/i/BrTjGp .

ERROR: (gcloud.app.create) The project [myproject-name] already contains an App Engine application in region [us-central].  You can deploy your application using `gcloud app deploy`.
ERROR: (gcloud.scheduler.jobs.create.http) argument --uri: Bad value []: Must be a valid HTTP or HTTPS URL.

I am using a brand new project to test out the script prior to installing on a production project.

I hope this is enough information to get started. Let me know if I should provide more details.

Thank you.
