16 Nov 2025 | 4 min read
In this guide we’re going to show how we load Google Ads data into BigQuery using GCP’s BigQuery Data Transfer Service.
As always, the official docs are the first place to go for information about the service:
"The BigQuery Data Transfer Service automates data movement into BigQuery on a scheduled, managed basis."
They also go on to mention the modes of using BigQuery Data Transfer:
"You can access the BigQuery Data Transfer Service using the:
We prefer Terraform, which manages transfer configs via the same underlying API you would use from Python or Java - but in this guide we’ll show how to load Google Ads data into BigQuery using the bq command-line tool.
We’ll assume you have basic knowledge of a shell such as bash or zsh, and are familiar with the gcloud CLI.
We’ll also assume a basic understanding of Google Cloud Platform (GCP).
We’re using a service account to run commands - if you want to do the same, follow these instructions to create one.
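If you’d rather do this from the CLI, a minimal sketch with gcloud looks like this (the account name is just an example chosen to match the SA variable further down):
# Create the service account we’ll use for the rest of this guide
gcloud iam service-accounts create example-name \
--project=example-project-123456 \
--display-name="Google Ads BigQuery transfer"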
Ensure the user running the commands has at least Viewer access to the Google Ads account.
Instructions on how to grant Google Ads account access to a user can be found here.
These are the variables (with examples) we’ll be using in later commands. You can drop them into a .env file and source it, or run them directly in the shell - whichever you prefer:
PROJECT=example-project-123456
CUSTOMER_ID=123-456-789
SA=example-name@example-project-123456.iam.gserviceaccount.com
DATASET=example_dataset
LOCATION=europe-west2
ROLE=bq_transfer_updater
TRANSFER_NAME=my_google_ads_transfer
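If you went with the .env approach, load the variables into your current shell before running anything else:
# Load the variables into the current shell session
source .env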
Ensure your user (e.g. the service account) has access to the target Google Ads account. Without this you won’t be able to create the data transfer:
example-name@example-project-123456.iam.gserviceaccount.com
Enable the BigQuery and BigQuery Data Transfer APIs in your project so you can use them:
gcloud services enable \
--project=$PROJECT \
bigquery.googleapis.com \
bigquerydatatransfer.googleapis.com
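If you want to double-check they’re enabled before moving on, a quick grep over the enabled services does the job:
# Should list both bigquery.googleapis.com and bigquerydatatransfer.googleapis.com
gcloud services list --enabled --project=$PROJECT | grep bigquery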
We need a dataset to load our Google Ads data to. We can create one via the bq command:
bq mk \
--dataset \
--location=$LOCATION \
--description "Google Ads dataset. Tables populated by BigQuery Data Transfer ${TRANSFER_NAME}." \
$PROJECT:$DATASET
Create a custom IAM role with just the data transfer permissions, since that’s all we’ll need:
gcloud iam roles create $ROLE \
--project=$PROJECT \
--title="BQ Transfer Updater" \
--permissions="bigquery.transfers.update,bigquery.transfers.get"
Grant the custom role we just created to the service account so it has permissions to operate on data transfers:
gcloud projects add-iam-policy-binding $PROJECT \
--member="serviceAccount:${SA}" \
--role="projects/${PROJECT}/roles/${ROLE}"
To ensure our data transfer can load data to our chosen dataset, we need to explicitly grant our service account access to the dataset - we can do this via the following bq command:
bq query --nouse_legacy_sql \
"GRANT \`roles/bigquery.dataEditor\` \
ON SCHEMA \`${PROJECT}.${DATASET}\` \
TO 'serviceAccount:${SA}';"
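To verify the grant, the dataset’s access list should now include the service account - assuming the usual dataset metadata output, it’ll show up in the "access" array:
# Inspect the dataset metadata; look for the service account under "access"
bq show --format=prettyjson $PROJECT:$DATASET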
Let’s first create a PARAMS variable for readability, where we’ll pass the Google Ads account ID (i.e. customer_id):
PARAMS='{"customer_id":"'"${CUSTOMER_ID}"'","exclude_removed_items":"false"}'
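It’s worth sanity-checking that the variable expanded into valid JSON - assuming you have jq installed:
# Pretty-print PARAMS; jq will error out if the JSON is malformed
echo "$PARAMS" | jq .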
Then we can pass this to the transfer setup, which will create the data transfer:
bq mk \
--transfer_config \
--data_source=google_ads \
--project_id=$PROJECT \
--target_dataset=$DATASET \
--display_name=$TRANSFER_NAME \
--params=$PARAMS \
--refresh_window_days=7 \
--service_account_name=$SA
Upon success we should see the transfer config ID output to the terminal - save it to a variable:
TRANSFER_RUN_CONFIG=projects/123456789012/locations/europe-west2/transferConfigs/123a456b-0000-1234-567f-c89abc01d012
If you missed the output, run the following bq ls command to find it:
bq ls \
--transfer_config \
--transfer_location=$LOCATION \
--project_id=$PROJECT
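If you’d rather grab it programmatically, something along these lines should work - treat it as a sketch, since it assumes bq’s global --format=json flag and the camelCase field names (name, displayName) of the transfer config resource, plus jq being installed:
# Look up the transfer config whose display name matches ours and save its resource name
TRANSFER_RUN_CONFIG=$(bq ls --transfer_config \
--transfer_location=$LOCATION \
--project_id=$PROJECT \
--format=json \
| jq -r --arg name "$TRANSFER_NAME" '.[] | select(.displayName == $name) | .name')
echo "$TRANSFER_RUN_CONFIG"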
The Data Transfer Service supports a maximum of 180 days per backfill, so anything beyond that you’ll have to run yourself.
To trigger a transfer for a particular date, say 03 October 2023, we run this:
bq mk \
--transfer_run \
--run_time=2023-10-03T00:00:00-00:00 \
$TRANSFER_RUN_CONFIG
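To check how that run is getting on, you can list the runs for the transfer config as well (flag names per the bq transfer-run listing docs - adjust if your bq version differs):
# List the latest attempt of each run for this transfer config, with its state
bq ls \
--transfer_run \
--run_attempt='LATEST' \
$TRANSFER_RUN_CONFIG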
We check that the numbers line up with the Google Ads UI, as many users will question the validity of our data if they don’t:
What we’ve noticed at the time of writing is that the transfer for each date takes a while to complete, so to speed things up we run a bq command in a loop that triggers a run for each date in a date range.
We start by defining our date range:
START_DATE="2023-07-10"
END_DATE="2023-07-20"
SLEEP_SECONDS=30
Note that this will create data transfers for dates 10 July 2023 up to and including 19 July 2023 (not 20 July 2023).
Here's the script to run each daily transfer:
current_date="$START_DATE"
while [[ "$current_date" < "$END_DATE" ]]; do
echo "Scheduling run for ${current_date}T00:00:00-00:00"
bq mk \
--transfer_run \
--run_time="${current_date}T00:00:00-00:00" \
$TRANSFER_RUN_CONFIG
echo "Sleeping for ${SLEEP_SECONDS} seconds..."
sleep $SLEEP_SECONDS
# Increment date by one day
current_date=$(date -j -v +1d -f "%Y-%m-%d" "$current_date" +%Y-%m-%d)
done
echo "All runs scheduled from $START_DATE to $END_DATE."
The reason for running the loop rather than scheduling a backfill is that the Data Transfer Service schedules backfill jobs roughly 30 minutes apart. If you have large historical volumes, that can take ages.
See below: the green runs were scheduled and completed via the bq command, whereas those still pending were scheduled via the backfill with date range option:
You should now see a list of Google Ads views and tables in BigQuery:
That is the full workflow for getting Google Ads data into BigQuery using the BigQuery Data Transfer Service.
Once the service account is set up and the transfer config is created, the pipeline runs on its own and backfills are easy to trigger as needed.
At that point you can focus on modelling and reporting instead of manually exporting spreadsheets or troubleshooting broken connectors.
This is the foundation we use before building any marketing analytics stack, because clean and repeatable data movement is the part that keeps everything else stable.