Google Ads Scripts are JavaScript programs that run inside the Google Ads platform. They have access to your account data and can export directly to BigQuery. They don’t require a developer token. They don’t require server infrastructure. They authenticate via the logged-in Google Ads account and execute within Google’s own infrastructure.
This positions them between DTS and a full dlt pipeline: more control than the Data Transfer Service, less infrastructure than dlt, and no API approval queue.
How They Work
Scripts live at Google Ads → Tools → Bulk Actions → Scripts. You write JavaScript in the built-in editor, connect BigQuery as an advanced service, and schedule execution.
The authentication model is the key difference from other approaches. Scripts authenticate as the Google Ads user who created them. They run inside Google’s infrastructure, against Google’s own reporting API. From Google’s perspective, this is internal — it’s the same as you querying data through the UI, just automated. This is why no external developer token is needed.
The BigQuery integration requires enabling the BigQuery Advanced API in the script’s settings. Once enabled, you can query the BigQuery service directly from your JavaScript code.
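For example, a first-run guard can check that the target dataset exists and create it if it doesn't. A sketch, assuming the BigQuery advanced service is already enabled; the project and dataset names are placeholders:

```javascript
// Ensure the destination dataset exists before the first load job runs.
function ensureDataset(projectId, datasetId) {
  try {
    // Throws if the dataset does not exist (or is not visible to this user).
    return BigQuery.Datasets.get(projectId, datasetId);
  } catch (e) {
    // First run: create the dataset.
    return BigQuery.Datasets.insert({
      datasetReference: {projectId: projectId, datasetId: datasetId}
    }, projectId);
  }
}
```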
A Basic Export Pattern
A minimal script that queries campaign performance and writes to BigQuery:
```javascript
function main() {
  var startDate = getStartDate();
  var endDate = getEndDate();

  var report = AdsApp.report(
    "SELECT campaign.name, campaign.id, " +
    "metrics.clicks, metrics.impressions, metrics.cost_micros, " +
    "metrics.conversions, segments.date " +
    "FROM campaign " +
    "WHERE segments.date BETWEEN '" + startDate + "' AND '" + endDate + "' " +
    "AND campaign.status = 'ENABLED'"
  );

  var rows = report.rows();
  var tableData = [];

  while (rows.hasNext()) {
    var row = rows.next();
    tableData.push({
      campaign_name: row['campaign.name'],
      campaign_id: row['campaign.id'],
      clicks: parseInt(row['metrics.clicks'], 10),
      impressions: parseInt(row['metrics.impressions'], 10),
      // Cost is in micros: divide by 1,000,000 for actual currency
      cost: parseFloat(row['metrics.cost_micros']) / 1000000,
      conversions: parseFloat(row['metrics.conversions']),
      date: row['segments.date']
    });
  }

  writeToBigQuery(tableData);
}

function writeToBigQuery(data) {
  var projectId = 'your-project-id';
  var datasetId = 'google_ads';
  var tableId = 'campaign_performance';

  var schema = {
    fields: [
      {name: 'campaign_name', type: 'STRING'},
      {name: 'campaign_id', type: 'STRING'},
      {name: 'clicks', type: 'INTEGER'},
      {name: 'impressions', type: 'INTEGER'},
      {name: 'cost', type: 'FLOAT'},
      {name: 'conversions', type: 'FLOAT'},
      {name: 'date', type: 'DATE'}
    ]
  };

  BigQuery.Jobs.insert({
    configuration: {
      load: {
        destinationTable: {
          projectId: projectId,
          datasetId: datasetId,
          tableId: tableId
        },
        schema: schema,
        // WRITE_TRUNCATE replaces the table on every run, so it only ever
        // holds the trailing seven-day window. Use WRITE_APPEND to accumulate history.
        writeDisposition: 'WRITE_TRUNCATE',
        sourceFormat: 'NEWLINE_DELIMITED_JSON'
      }
    }
  }, projectId, Utilities.newBlob(
    data.map(function(row) { return JSON.stringify(row); }).join('\n'),
    'application/octet-stream'
  ));
}

function getStartDate() {
  var date = new Date();
  date.setDate(date.getDate() - 7);
  // GAQL date literals require yyyy-MM-dd format
  return Utilities.formatDate(date, AdsApp.currentAccount().getTimeZone(), 'yyyy-MM-dd');
}

function getEndDate() {
  var date = new Date();
  date.setDate(date.getDate() - 1);
  return Utilities.formatDate(date, AdsApp.currentAccount().getTimeZone(), 'yyyy-MM-dd');
}
```

The schema definition lives in the script. You control which fields to pull, how to name columns, and what data types to assign. This is fundamentally different from the Data Transfer Service's fixed schema.
Note the cost conversion. Google Ads reports cost in micros, millionths of the account currency: a cost_micros value of 1,250,000 means 1.25 in your currency. This is one of the classic Google Ads gotchas, and it's your responsibility to handle it in the script. There's no automatic conversion.
Execution Limits
Scripts have a hard 30-minute execution limit per run. This is the defining constraint of the approach.
For small accounts (a few hundred campaigns, modest data volume), 30 minutes is ample. For large accounts with thousands of campaigns, multiple ad groups per campaign, and ad-level or keyword-level data, you can hit the limit. When you do, the script times out mid-execution and your BigQuery table may contain partial data.
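You can at least make the failure deliberate instead of letting the runtime kill the script mid-write. A minimal guard, assuming `AdsApp.getExecutionInfo().getRemainingTime()` (which Google Ads Scripts exposes) and a hypothetical 120-second safety margin:

```javascript
// Stop collecting rows while there is still time to finish the BigQuery load,
// so a long run truncates deliberately instead of timing out mid-write.
var SAFETY_MARGIN_SECONDS = 120; // arbitrary buffer; tune to your load times

function collectRows(report) {
  var rows = report.rows();
  var tableData = [];
  var i = 0;
  while (rows.hasNext()) {
    // Checking every 1,000 rows keeps the overhead negligible.
    if (i++ % 1000 === 0 &&
        AdsApp.getExecutionInfo().getRemainingTime() < SAFETY_MARGIN_SECONDS) {
      Logger.log('Stopping early at ' + tableData.length + ' rows.');
      break;
    }
    tableData.push(rows.next());
  }
  return tableData;
}
```

A deliberate early stop still leaves a partial window, but it's logged and detectable, which a hard timeout is not.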
There are two ways to work within the limit:
Reduce scope per run. Instead of pulling all ad groups in one execution, pull by campaign and schedule multiple script runs or stagger coverage across the week (see the sharding sketch after this list). This adds coordination complexity.
Use aggregated queries. Pull campaign-level data rather than ad-level or keyword-level data. Aggregation reduces row counts dramatically. Most reporting use cases can work at campaign level; you only need the granular data for specific optimization analyses.
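A sketch of the staggering approach: shard campaigns across the week by ID so each daily run covers roughly one-seventh of the account. The modulo scheme and shard count are illustrative choices, not a documented pattern:

```javascript
// Each weekday handles the campaigns whose ID falls in that day's shard.
// Any stable function of the ID works; modulo is the simplest.
function isInTodaysShard(campaignId, shardCount) {
  var todaysShard = new Date().getDay() % shardCount; // 0..shardCount-1
  return parseInt(campaignId, 10) % shardCount === todaysShard;
}

// In the row loop of the main script:
//   if (!isInTodaysShard(row['campaign.id'], 7)) continue;
// Pair this with WRITE_APPEND so the daily shards accumulate in one table.
```

Client-side filtering still iterates every report row, so for very large accounts it's better to push the shard into the query itself with a campaign.id IN (...) clause built from a pre-sharded campaign list.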
Scheduling options are hourly, daily, or weekly — not minute-level. You can’t schedule a script for 3:17am; you pick 3am or 4am. For daily refreshes, schedule at 3am or later to ensure the previous day’s data is complete (Google Ads statistics can lag up to 3 hours).
Compared to the Alternatives
| | Data Transfer Service | Scripts | dlt |
|---|---|---|---|
| Developer token required | No | No | Yes |
| Custom field selection | No (fixed schema) | Yes (GAQL) | Full control |
| Custom logic | No | JavaScript | Python |
| Sync frequency | Daily | Hourly/daily/weekly | Any |
| Scale ceiling | Unlimited | 30-min runtime | Infrastructure-dependent |
| Schema management | Google-managed | Script-managed | Auto-inferred |
| Maintenance burden | Low | Medium | High |
Scripts occupy a specific niche. They’re more flexible than DTS (you define the fields, logic, and schema — no ClickType inflation trap or fixed schema surprises) and simpler than dlt (no developer token, no Python, no infrastructure). But they’re bounded by JavaScript and the 30-minute wall.
When Scripts Are the Right Choice
Scripts fit well when:
- Custom extraction logic or field filtering is needed but no server infrastructure is available
- Developer token approval is pending or rejected — Scripts use a different auth path
- Account volume is modest enough to complete within the 30-minute limit
- Hourly, daily, or weekly scheduling is sufficient (no sub-hour requirements)
- The team works in JavaScript; Python teams may prefer dlt once a developer token is available
For enterprise-scale accounts, high data volume, sub-hourly requirements, or teams that want Python and version control, the developer token approval process becomes worthwhile. See dlt Google Ads Pipeline for what becomes available with API access.