You need Google Ads data in BigQuery. The task sounds straightforward until you realize there are four fundamentally different ways to do it, each with distinct trade-offs around cost, sync frequency, and maintenance burden.
The right choice depends on constraints you might not have considered yet. Before comparing dashboards and feature lists, you need to answer one question that eliminates half your options immediately.
The Developer Token Question
A Google Ads developer token is a 22-character alphanumeric string required to make Google Ads API calls. You get one from the API Center in a Google Ads Manager Account, but getting it approved is where teams hit a wall.
Access levels progress from Test Account (no production access) through Explorer (limited), Basic (production, 15,000 daily operations), and Standard (production, unlimited). The stated review time is 3-5 business days. The reality is often weeks to months.
Common rejection reasons include vague use case descriptions, website issues, and (critically) third-party tool usage. Using open-source tools like Airbyte with your developer token can violate Google’s API policies and get your application rejected.
Here’s what this means for your options:
| Approach | Developer Token Required |
|---|---|
| BigQuery Data Transfer Service | No |
| Google Ads Scripts | No |
| Fivetran | No (they use their own) |
| dlt | Yes |
| Airbyte | Yes |
| Custom API pipeline | Yes |
If you don’t have a developer token and don’t want to wait months for approval, your realistic options are Data Transfer Service, Scripts, or a managed connector like Fivetran. That constraint alone shapes your decision.
Approach 1: BigQuery Data Transfer Service
Google’s native solution is fully managed and free, with no connector costs. You only pay standard BigQuery storage and query rates.
Setup happens entirely in the BigQuery Console. Navigate to Data Transfers, create a new transfer, select Google Ads as the source, enter your Customer ID, configure the schedule, and authorize via OAuth. For Performance Max campaigns, enable PMax Campaign Tables.
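If you'd rather script the setup than click through the console, the Data Transfer Service client library can create the same transfer config. Treat this as a sketch: the project, dataset, and customer ID are placeholders, and the data source ID and parameter name are my best reading of the connector, so confirm them with list_data_sources before relying on them.

```python
# Sketch: creating the Google Ads transfer config with the
# google-cloud-bigquery-datatransfer client instead of the console UI.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("my-gcp-project")  # placeholder project

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="google_ads_data",  # placeholder dataset
    display_name="Google Ads daily transfer",
    data_source_id="google_ads",               # assumed ID; verify via list_data_sources
    params={"customer_id": "1234567890"},      # assumed parameter name
    schedule="every 24 hours",
)

created = client.create_transfer_config(
    parent=parent,
    transfer_config=transfer_config,
)
print(f"Created transfer config: {created.name}")
```

The OAuth grant that authorizes access to the Ads account still has to happen somewhere; the console flow is the simplest place to complete it, even if you create configs in code for repeatability.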
The transfer creates date-partitioned tables named ads_[ReportName]_[CustomerID]. Standard reports cover campaigns, ad groups, ads, keywords, search queries, geo performance, demographics, and various metadata. Custom GAQL reports are now supported for queries beyond standard fields.
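Once the tables land, querying them is ordinary BigQuery SQL. The sketch below joins a campaign stats table to its campaign metadata table. The table names follow the ads_[ReportName]_[CustomerID] pattern and the columns follow the flattened GAQL field names, but both (and the _DATA_DATE / _LATEST_DATE snapshot filter) should be verified against your own dataset before you reuse this.

```python
# Sketch: daily campaign performance from the transferred tables.
# Table and column names below are assumptions based on the documented
# naming pattern; check them against your own dataset.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

query = """
SELECT
  c.campaign_name,
  s.segments_date,
  SUM(s.metrics_clicks) AS clicks,
  SUM(s.metrics_impressions) AS impressions,
  SUM(s.metrics_cost_micros) / 1e6 AS cost
FROM `my-gcp-project.google_ads_data.ads_CampaignBasicStats_1234567890` AS s
JOIN `my-gcp-project.google_ads_data.ads_Campaign_1234567890` AS c
  ON c.campaign_id = s.campaign_id
WHERE c._DATA_DATE = c._LATEST_DATE  -- latest snapshot of campaign metadata
GROUP BY campaign_name, segments_date
ORDER BY segments_date DESC
"""

for row in client.query(query).result():
    print(dict(row))
```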
Limitation: minimum sync frequency is once every 24 hours. The refresh window is configurable from 0-30 days (default 7), and backfills run about 35 minutes apart per date. Google Ads statistics may lag up to 3 hours.
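Within that refresh window you can also trigger backfill runs on demand rather than waiting for the schedule. A sketch with the same client library; the transfer config name and dates are placeholders:

```python
# Sketch: request backfill runs for a date range inside the supported
# refresh window. The transfer config name below is a placeholder.
import datetime

from google.cloud import bigquery_datatransfer
from google.protobuf import timestamp_pb2

client = bigquery_datatransfer.DataTransferServiceClient()
config_name = "projects/123456/locations/us/transferConfigs/abcd1234"  # placeholder

start = timestamp_pb2.Timestamp()
start.FromDatetime(datetime.datetime(2025, 1, 1, tzinfo=datetime.timezone.utc))
end = timestamp_pb2.Timestamp()
end.FromDatetime(datetime.datetime(2025, 1, 8, tzinfo=datetime.timezone.utc))

response = client.start_manual_transfer_runs(
    request={
        "parent": config_name,
        "requested_time_range": {"start_time": start, "end_time": end},
    }
)
for run in response.runs:
    print(run.name, run.state)
```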
Historical data is the bigger gap. Data Transfer Service only loads data from the point you set it up, plus the backfill window. If you need two years of campaign performance to build trend reports or train bidding models, this approach won’t get you there on its own. The Google Ads API (via dlt or Scripts) can pull historical data, so teams that need a full backlog often end up running a one-time historical load through a different method, then switching to Data Transfer Service for ongoing syncs.
Schema updates are another consideration. When Google updates API versions (the v21 to v22 transition in January 2026 renamed several metrics), your downstream queries may break. This isn’t unique to Data Transfer Service, but with managed connectors you’re somewhat insulated from the change.
When to choose this: You need Google Ads data in BigQuery, daily updates are sufficient, and you want a simple setup with zero licensing costs. This covers the majority of analytics use cases. For an MCC managing multiple accounts, set up MCC-level transfers rather than individual account transfers.
Approach 2: Fivetran
Fivetran offers a fully managed ELT connector with 15-minute minimum sync frequency. Setup is straightforward: create an account, add the Google Ads connector, sign in via OAuth, select accounts, configure reports, and set your conversion window (1-90 days, default 30).
The data coverage is similar to Data Transfer Service with some additions. Custom reports support GAQL queries. Recent features include SEARCH_TERM_KEYWORD_STATS and Google Ads v22 support. You also get pre-built dbt models for faster transformation.
Costs are where Fivetran gets complicated.
Fivetran uses Monthly Active Rows (MAR) pricing. A March 2025 change shifted from account-wide to per-connector MAR tiering, eliminating bulk discounts. Plans range from Free (500K MAR/month) to Business Critical (roughly $1,067/million MAR), with a $5 base charge per standard connection and $12,000 minimum annual contract.
Marketing data is particularly problematic because rows update frequently. Attribution windows mean conversions get attributed retroactively, and ad-level data creates high row counts. Reddit feedback reports 4-8x cost increases under the new pricing, with users describing costs going “from $20/month to $2,000/month” as they scaled.
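A rough, entirely hypothetical back-of-envelope calculation shows the shape of the problem; this is not Fivetran's billing logic, just how ad-level granularity and retroactive updates compound:

```python
# Back-of-envelope MAR sketch with made-up numbers. MAR counts unique rows
# inserted or updated during the month, so restated attribution rows count
# on top of the genuinely new ones.
daily_rows = 20_000            # hypothetical ad-level rows per day (ads x segments)
days_in_month = 30
attribution_window_days = 30   # prior days re-touched by late conversions

new_rows = daily_rows * days_in_month                  # fresh rows this month
restated_rows = daily_rows * attribution_window_days   # older rows updated again
monthly_active_rows = new_rows + restated_rows

price_per_million = 1_067      # rough top-tier rate cited above, USD
estimated_cost = monthly_active_rows / 1_000_000 * price_per_million

print(f"MAR ~ {monthly_active_rows:,} rows, ~ ${estimated_cost:,.0f}/month at that tier")
```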
Syncs run incrementally over the past 3 days, with a daily rollback sync capturing changes outside that window. The GEO_TARGET table runs full syncs on the 1st and 15th of each month, which can spike your MAR unexpectedly.
When to choose this: You need sync frequency faster than daily, you’re already in the Fivetran ecosystem for other connectors, and your budget can absorb unpredictable costs. Watch MAR counts carefully on marketing data and get estimates before committing. For a broader comparison of managed and open-source connectors, see my comparison of Fivetran, Airbyte, and dlt.
Approach 3: Google Ads Scripts
Google Ads Scripts run JavaScript within the Google Ads interface and can export data directly to BigQuery. No developer token required.
Scripts authenticate via the logged-in Google Ads account and run within Google’s infrastructure. They have their own execution limits (30 minutes maximum runtime) rather than using the external API quotas.
Setup involves navigating to Google Ads → Tools → Bulk Actions → Scripts, enabling the BigQuery Advanced API, and pasting a script that queries data and loads to BigQuery. Schedule daily, preferably 3am or later for data accuracy.
Capabilities vs the full API:
| Feature | Scripts | Full API |
|---|---|---|
| Developer Token | Not required | Required |
| Execution Time | 30 min limit | No limit |
| Scheduling | Hourly/Daily/Weekly | Full control |
| Custom Logic | JavaScript | Any language |
A basic script structure queries the Google Ads reporting API, formats results, and streams them to a BigQuery table. You get full control over which fields to pull and how to structure the output.
The limitations matter for larger operations. The 30-minute execution limit becomes problematic for accounts with high data volumes. Scheduling is hour-level only (no specific minute scheduling). And you’re responsible for updating scripts when Google changes field names.
When to choose this: You need custom logic without API complexity, you’re managing a limited number of accounts, you don’t have server infrastructure, and daily or weekly reports are sufficient. Scripts work well as a middle ground between the simplicity of Data Transfer Service and the complexity of API-based solutions.
Approach 4: dlt (Data Load Tool)
dlt is an open-source Python library for data loading that offers a code-first approach to Google Ads extraction. Full control over the pipeline, automatic schema evolution, and incremental loading, all at infrastructure cost only.
Limitation: dlt requires a developer token. If you can get one approved, you gain significant flexibility.
The Google Ads source is a verified dlt source, and you work with actual Python code you can customize. A basic pipeline:
```python
import dlt
from dlt.sources.google_ads import google_ads

# Pipeline definition: where the data goes (BigQuery) and into which dataset.
pipeline = dlt.pipeline(
    pipeline_name="google_ads",
    destination="bigquery",
    dataset_name="google_ads_data",
)

# Source definition: one or more GAQL queries, each landing in its own table.
source = google_ads(
    customer_id="1234567890",
    queries=[
        {
            "query": "SELECT campaign.name, metrics.clicks FROM campaign",
            "table_name": "campaign_performance",
        }
    ],
)

load_info = pipeline.run(source)
```

dlt handles pagination, rate limiting, schema inference, and incremental loading automatically. You deploy the pipeline wherever Python runs: Cloud Functions, Cloud Run, Airflow, or a simple cron job.
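As one deployment sketch, a Cloud Functions entry point only needs a few lines. The module and function names below are hypothetical (the pipeline code above saved as a module), and Cloud Scheduler would call the HTTP trigger on your chosen cadence:

```python
# main.py: hypothetical Cloud Functions entry point wrapping the pipeline above.
# Assumes the pipeline code is importable as run_google_ads_pipeline().
import functions_framework

from ads_pipeline import run_google_ads_pipeline  # hypothetical module


@functions_framework.http
def sync_google_ads(request):
    load_info = run_google_ads_pipeline()
    return str(load_info), 200
```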
When to choose this: Your team is Python-proficient, you have (or can get) a developer token, you want infrastructure-only costs, and you need custom control over the extraction logic. dlt works particularly well when you’re already using it for other sources and want consistency across your ingestion layer. For a complete walkthrough, see my hands-on guide to dlt.
Comparison Summary
| Factor | Data Transfer Service | Fivetran | Google Ads Scripts | dlt |
|---|---|---|---|---|
| Cost | Free | $$-$$$ (MAR) | Free | Free (infra only) |
| Developer Token | Not needed | Not needed | Not needed | Required |
| Min Sync Frequency | 24 hours | 15 minutes | Hourly | Custom |
| Setup Complexity | Low | Low | Medium | Medium-High |
| Custom Reports | GAQL support | GAQL support | Full control | Full control |
| Maintenance | Low | Low | Medium | High |
| Best For | Most use cases | High-frequency needs | Custom logic | Python teams |
Common Pitfalls Across All Approaches
Attribution window challenges affect every approach (see comparing attribution models for deeper coverage). Conversions can be attributed retroactively for up to 90 days depending on your settings. Whatever sync frequency you choose, your data will change after the fact. Build your reporting to handle this. Avoid treating yesterday’s numbers as final.
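One practical pattern is to reload a trailing window on every run and overwrite it in the reporting layer, so late-attributed conversions replace earlier snapshots of the same days. A sketch with placeholder table names:

```python
# Sketch: replace a trailing attribution window on each run instead of
# appending only new days. Table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")
WINDOW_DAYS = 30  # match your longest attribution window

# 1. Clear the window in the reporting table.
client.query(f"""
    DELETE FROM `my-gcp-project.reporting.ads_daily`
    WHERE report_date >= DATE_SUB(CURRENT_DATE(), INTERVAL {WINDOW_DAYS} DAY)
""").result()

# 2. Re-insert the freshly synced window from the raw layer, so late-arriving
#    conversions overwrite earlier snapshots of the same days.
client.query(f"""
    INSERT INTO `my-gcp-project.reporting.ads_daily`
    SELECT * FROM `my-gcp-project.google_ads_data.ads_daily_staging`
    WHERE report_date >= DATE_SUB(CURRENT_DATE(), INTERVAL {WINDOW_DAYS} DAY)
""").result()
```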
Schema complexity catches teams off guard. Data Transfer Service creates dozens of tables. Understanding the relationships between campaign, ad group, and ad-level data requires reading Google’s documentation carefully.
Developer token approval remains the major barrier for API-based solutions. If you need dlt or custom pipelines, start the application process early. Provide specific, detailed use cases in your application. Avoid mentioning third-party tools.
GA4 + Google Ads misattribution is a known issue. When joining GA4 data with Google Ads data in BigQuery, gclid attribution can produce unexpected results. This isn’t specific to any loading approach but becomes your problem once the data lands in BigQuery.
Making the Decision
Start with your constraints:
- Do you have a developer token? If not and you need to move quickly, your options are Data Transfer Service, Fivetran, or Scripts.
- Is daily sync sufficient? For most analytics and reporting, yes. Data Transfer Service covers this for free.
- Do you need faster than daily? Fivetran offers 15-minute syncs, but watch the costs carefully on marketing data.
- Do you need custom logic? Scripts give you JavaScript control without API complexity. dlt gives you Python control with more flexibility but requires the developer token.
- What’s your budget tolerance? Free (Data Transfer Service, Scripts) vs infrastructure-only (dlt) vs managed-with-fees (Fivetran).
- Do you need historical data? Data Transfer Service won’t backfill beyond 30 days. If you need months or years of history, you’ll need Scripts or an API-based tool for the initial load, even if you use Data Transfer Service going forward. A mixed strategy (one-time backfill + ongoing managed sync) is common and perfectly reasonable.
For most teams, Data Transfer Service is the right default. It’s free, daily data covers the majority of reporting needs, and setup takes minutes. The cases for Fivetran (sub-daily syncs) and dlt (Python-based ingestion with full customization) are real but narrower than you’d expect.
Don’t overthink the initial choice. You can always migrate later, and the data structures across approaches are similar enough that your downstream transformations won’t need major rewrites. The broader build vs buy question applies here too.