Disclaimer: This is not an official Google product. There is an alternative deployment architecture available here.
This solution creates an automated data pipeline that runs daily to extract the previous day's conversion data via an SA360 data transfer, generate new conversions with calculated order profit as revenue (based on a margin data file), and upload the new conversions back into Search Ads 360 (SA360), where they can be leveraged for Custom Bidding and/or reporting.
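The core idea (join each conversion to its product margin, then emit a new conversion whose revenue is the calculated profit) can be sketched in a few lines of Python. This is only an illustrative sketch: the column names (`product_id`, `revenue`, `margin`) are assumptions, and the real pipeline performs this join in BigQuery via a scheduled query.

```python
# Illustrative sketch of the margin-based profit calculation.
# Field names are assumptions; the production logic runs in BigQuery.

def apply_margins(conversions, margins):
    """Return new conversion rows whose revenue is the calculated profit."""
    out = []
    for conv in conversions:
        margin = margins.get(conv["product_id"])
        if margin is None:
            continue  # no margin data for this product: skip rather than guess
        profit = round(conv["revenue"] * margin, 2)
        out.append({**conv, "revenue": profit})
    return out

conversions = [
    {"conversion_id": "c1", "product_id": "sku-1", "revenue": 100.0},
    {"conversion_id": "c2", "product_id": "sku-2", "revenue": 50.0},
]
margins = {"sku-1": 0.30, "sku-2": 0.45}  # margin as a fraction of revenue

print(apply_margins(conversions, margins))
```

Conversions without margin data are skipped here; your transformation query may instead apply a default margin.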
The architecture of the solution is shown below:
The pipeline is built within a Google Cloud project instance and uses the following cloud products and technologies:
- BigQuery
  - Function:
    - SA360 to BigQuery Connector
    - Google Merchant Center (GMC) Data Transfer
    - Store the margin data file
    - Data storage
    - Data transformation
- Google Cloud Functions
  - Function:
    - Execute the script that uploads new conversions via the SA360 API
- Google Cloud Scheduler
  - Function:
    - Trigger the Cloud Function
- Google Cloud Storage *(optional)*
  - Function:
    - Store upload script execution/error logs
- Python 3.7
- Standard SQL
- Campaign Manager 360 (CM360)
  - Create the Offline Floodlight Tag
  - Grant service account access
- Search Ads 360 (SA360)
  - Grant service account access
1. **Create a Google Cloud Project instance.** You may create a new project instance if needed, or utilize an existing project, in which case you may skip this step.
2. **Enable required APIs** (or enable each API as you perform the corresponding step).
   - Documentation to enable APIs
   - Use the Cloud Console API Library to enable the following APIs:
     - BigQuery API
     - BigQuery Storage API
     - BigQuery Data Transfer API
     - Cloud Functions API
     - Cloud Storage API
     - Cloud Pub/Sub API
     - Campaign Manager API
     - Search Ads API
3. **Create a keyless service account.** Create a service account for your project *(note: do not generate keys)*.
   - Link to GCP
4. **Grant service account product permissions.**
   *(NOTE: it is recommended to utilize the CM360 API for offline conversion uploads whenever possible. If your use case cannot be accommodated via CM360, you may proceed with SA360.)*
   - **(Option A)** Add the service account email to your CM360 account. Add the generated service account email as a user to your Account/Advertiser(s), granting user role permissions according to the granularity of access you are comfortable with. At minimum the user account will need Insert Offline Conversions / Edit Offline Conversions permissions for the Account/Advertiser(s) used in this project.
   - **(Option B)** Add the service account email to your SA360 account. Add the generated service account email as a user to your Agency/Advertiser(s), granting Agency User or Advertiser User level access (allowed to upload conversions) according to the granularity of access you are comfortable with. At minimum the account will need Advertiser User access to the specific target advertisers used in this project.
5. **(Optional) Create a Cloud Storage bucket for upload logs.**
   - Follow the steps outlined here to create a GCS bucket named `conversion_upload_log`.
6. **Create BigQuery datasets to segment data.**
   - Documentation to create a BQ dataset.
   - Create datasets for SA360, GMC, and business data:
     - SA360 (1 per required advertiser):
       - Name: `<advertiser_name>`
       - Default table expiration: 7 days to Never (table data is appended daily; at your discretion)
     - Google Merchant Center (GMC):
       - Name: `<account_name>_GMC_feed`
       - Default table expiration: 7 days to Never (table data is appended daily; at your discretion)
     - Business data (margin file):
       - Name: `business_data`
       - Default table expiration: Any
7. **Create BigQuery Data Transfers.**
   - Create the following data transfers:
     - SA360 (1 per advertiser, as needed) [link]
       - Display Name: Any
       - Schedule: Daily (recommended to run in the early morning, e.g. 4 AM EST)
       - Dataset ID: the relevant SA360 advertiser dataset created in Step 6
       - Agency/Advertiser ID: both IDs can be found in SA360
     - Google Merchant Center [link]
       - Display Name: Any
       - Schedule: Daily (recommended to run in the early morning, e.g. 4 AM EST)
       - Dataset ID: the Google Merchant Center dataset created in Step 6
       - Merchant ID: the ID can be found in GMC
       - For this project only the "Products & product issues" option is required and should be checked.
8. **Upload margin data into BigQuery.**
   - Manually upload the margin data (`.csv` file format recommended) into the `business_data` dataset.
   - A data transfer from Google Cloud Storage may also be used to automatically pull a specified file, refreshing the target table on a set schedule.
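Before uploading the margin file, it can help to sanity-check it locally. A hedged sketch using only the standard library; the header names (`product_id`, `margin`) are assumptions, so match them to your actual margin file schema:

```python
# Validate a margin .csv before loading it into the business_data dataset.
# Column names here are illustrative assumptions, not a required schema.
import csv
import io

def parse_margin_csv(text):
    """Parse margin rows, rejecting margins outside the 0..1 range."""
    rows = {}
    for row in csv.DictReader(io.StringIO(text)):
        margin = float(row["margin"])
        if not 0.0 <= margin <= 1.0:
            raise ValueError(f"margin out of range for {row['product_id']}")
        rows[row["product_id"]] = margin
    return rows

sample = "product_id,margin\nsku-1,0.30\nsku-2,0.45\n"
print(parse_margin_csv(sample))  # {'sku-1': 0.3, 'sku-2': 0.45}
```

The same check could run inside the GCS-triggered transfer path if you automate the refresh.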
9. **Create a Scheduled Query.**
   - Create a scheduled query to run the transformation queries for each advertiser.
   - Example configuration:
     - Scheduled Query Name: `<advertiser_name/id> Profit Gen` or Any
     - Destination Dataset: the respective advertiser dataset created in Step 6
     - Destination Table: `conversion_final_<sa360_advertiser_id>`
     - Write Preference: `WRITE_TRUNCATE`
     - Query String: reference the query code in the `sql_query` folder.
10. **Create the Delegator Cloud Function.**
    - Create a Cloud Function with the following configurations:
      - Step 1 configuration:
        - Function Name: `cloud_conversion_upload_delegator`
        - Region: us-central1
        - Trigger: Pub/Sub
        - Authentication: Require authentication
        - Advanced:
          - Memory allocated: 2 GB
          - Timeout: 540 seconds
          - Service Account: App Engine default service account
      - Step 2 configuration:
        - Runtime: Python 3.7
        - Entry point: `main`
        - Code: reference the code in the `conversion_upload_delegator` folder
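The delegator's actual code lives in the `conversion_upload_delegator` folder; the sketch below only shows the general shape of a Pub/Sub-triggered background function with the `main` entry point, decoding the scheduler payload (keys mirror the sample payloads later in this README). The body comment marks where the real fan-out logic would go.

```python
# Shape of a Python 3.7 Pub/Sub background function: the event carries a
# base64-encoded payload published by Cloud Scheduler. Simplified sketch,
# not the deployed delegator code.
import base64
import json

def main(event, context):
    """Decode the Cloud Scheduler payload delivered via Pub/Sub."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    table_name = payload["table_name"]
    topic = payload["topic"]
    # ...real delegator: read rows from BigQuery table_name,
    # then publish batches to `topic` for the upload node...
    return table_name, topic

# Local simulation of the Pub/Sub event envelope:
msg = {"table_name": "conversion_final_123", "topic": "SA360_conversion_upload"}
event = {"data": base64.b64encode(json.dumps(msg).encode("utf-8"))}
print(main(event, None))  # ('conversion_final_123', 'SA360_conversion_upload')
```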
11. **Create the Upload Cloud Function.** For this step you have the option of standing up either the CM360 upload node or the SA360 upload node.
    *NOTE: It is recommended to utilize the CM360 API for offline conversion uploads unless your use case can only be supported by the SA360 API.*
- **(Option A)** - Create CM360 Cloud Function. [Create cloud function](https://cloud.google.com/functions/docs/deploying/console) with the following configurations:
  - Step 1 configuration:
    - **Function Name:** ```cm360_cloud_conversion_upload_node```
    - **Region:** us-central1
    - **Trigger:** Pub/Sub
    - **Authentication:** Require authentication
    - **Advanced:**
      - **Memory allocated:** 256 MB
      - **Timeout:** 540 seconds
      - **Service Account:** App Engine default service account
  - Step 2 configuration:
    - **Runtime:** Python 3.7
    - **Entry point:** ```main```
    - **Code:** Reference code in the ```CM360_cloud_conversion_upload_node``` folder.
- **(Option B)** - Create SA360 Cloud Function. [Create cloud function](https://cloud.google.com/functions/docs/deploying/console) with the following configurations:
- Step 1 configuration:
- **Function Name:** ```sa360_cloud_conversion_upload_node```
- **Region:** us-central1
- **Trigger:** Pub/Sub
- **Authentication:** Require authentication
- **Advanced:**
- **Memory allocated:** 256 MB
- **Timeout:** 540 seconds
- **Service Account:** App Engine default service account
- Step 2 configuration:
- **Runtime:** Python 3.7
- **Entry point:** ```main```
- **Code:** Reference code in the ```SA360_cloud_conversion_upload_node``` folder.
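For orientation, the CM360 upload node ultimately submits conversions through the Campaign Manager API's `conversions.batchinsert` method. The sketch below only builds a plausible request body as a plain dict; the IDs and row fields are hypothetical placeholders, and the deployed logic lives in the `CM360_cloud_conversion_upload_node` folder.

```python
# Hedged sketch of a conversions.batchinsert request body for the
# Campaign Manager API. Values are placeholders, not real IDs.

def build_batchinsert_body(rows, floodlight_activity_id,
                           floodlight_configuration_id):
    """Map BigQuery rows to CM360 offline conversion resources."""
    conversions = []
    for row in rows:
        conversions.append({
            "floodlightActivityId": floodlight_activity_id,
            "floodlightConfigurationId": floodlight_configuration_id,
            "ordinal": row["ordinal"],
            "timestampMicros": row["timestamp_micros"],
            "gclid": row["gclid"],
            "quantity": 1,
            "value": row["revenue"],  # calculated profit used as revenue
        })
    return {"conversions": conversions}

body = build_batchinsert_body(
    [{"ordinal": "1", "timestamp_micros": 1700000000000000,
      "gclid": "example-gclid", "revenue": 30.0}],
    floodlight_activity_id="123",
    floodlight_configuration_id="456")
print(body["conversions"][0]["value"])  # 30.0
```

In the deployed function, this body would be passed to an authenticated API client; the Floodlight IDs come from the scheduler payload's `cm360_config`.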
12. **Standup Cloud Scheduler Job(s).**
    - Create a Cloud Scheduler job per target advertiser. Note that each advertiser has its own scheduled job, and the Frequency values should be staggered by 5 minutes within the same hour.
    - Example configuration:
      - Name: Any
      - Description: Any
      - Frequency: for example, starting every day at 6 AM, staggered by 5 minutes:
        - `0 6 * * *`
        - `5 6 * * *`
        - `10 6 * * *`
        - etc.
      - Timezone: as per your preference
      - Target: Pub/Sub
      - Topic: `conversion_upload_delegator`
      - Payload: samples in the section below.
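When you have many advertisers, the staggered Frequency values can be generated programmatically. A small sketch (5-minute offsets within one hour, per the recommendation above; the function name is made up for illustration):

```python
# Generate staggered cron expressions, one per advertiser, so the jobs
# fire 5 minutes apart within the same hour.

def staggered_crons(hour, count, step=5):
    """Return `count` cron expressions staggered by `step` minutes."""
    return [f"{minute} {hour} * * *" for minute in range(0, count * step, step)]

print(staggered_crons(hour=6, count=3))
# ['0 6 * * *', '5 6 * * *', '10 6 * * *']
```

Note that `count * step` must stay under 60 for all jobs to fit within the hour.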
CM360 sample payload:

```
{ // configurable in install.sh
  "dataset_name": <DS_BUSINESS_DATA>,
  "table_name": <CM360_TABLE>,
  "topic": <CM360_PUBSUB_TOPIC_NAME>,
  "cm360_config": {
    "profile_id": <CM360_PROFILE_ID>,
    "floodlight_activity_id": <CM360_FL_ACTIVITY_ID>,
    "floodlight_configuration_id": <CM360_FL_CONFIG_ID>
  }
}
```

SA360 sample payload:

```
{
  "table_name": "conversion_final_<SA360_advertiser_id>",
  "topic": "SA360_conversion_upload" // hardcoded
}
```
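The payloads above are plain JSON with `<...>` placeholders that `install.sh` fills in. A hedged sketch of building them in Python (the helper names and example IDs are made up for illustration):

```python
# Build the Cloud Scheduler payloads as Python dicts, then serialize to
# JSON. Placeholder values here are illustrative, not real IDs.
import json

def sa360_payload(advertiser_id):
    return {
        "table_name": f"conversion_final_{advertiser_id}",
        "topic": "SA360_conversion_upload",  # hardcoded, per the sample above
    }

def cm360_payload(dataset, table, topic, profile_id, activity_id, config_id):
    return {
        "dataset_name": dataset,
        "table_name": table,
        "topic": topic,
        "cm360_config": {
            "profile_id": profile_id,
            "floodlight_activity_id": activity_id,
            "floodlight_configuration_id": config_id,
        },
    }

payload = sa360_payload("21700000001")
print(json.dumps(payload))
```

Whatever produces these payloads, the keys must match what the delegator function expects to decode.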
The notebook uses the synthesized data and can be run in less than 30 minutes to help you comprehend the core concept and familiarize yourself with the code.

We recommend three broad phases to productionalize the solution:
- Phase 1 - use the notebook to validate account access, etc.;
- Phase 2 - use the test module to further test with the full stack of services; and finally,
- Phase 3 - operationalize the solution in your environment.

We provide synthesized test data in the `test_solution` folder to test the solution. Please use `install.sh` with the proper parameters to install the demo module.