For organizations with ad-hoc analytics users, governance controls may deliver more savings than technical optimizations. A single analyst running unfiltered SELECT * queries can rack up a bill that dwarfs the savings from careful partitioning and clustering work. The patterns below operate at the query, table, and project levels, and each is designed to catch a different failure mode.
Query-Level Controls
max_bytes_billed sets a hard limit per query. Queries that would exceed this limit fail before scanning data — you don’t pay for the scan, you just get an error message:
```sh
# Fails with an error before any data is scanned if the query
# would bill more than 10 GB (10737418240 bytes)
bq query --maximum_bytes_billed=10737418240 \
  'SELECT * FROM large_table'
```

The limit lives in the job configuration rather than in the SQL text, so each client exposes it under its own name. This is the single most important cost control for ad-hoc users. Set it in every tool and connection:
In dbt, set maximum_bytes_billed in your profile so that no single model can scan more than the specified limit:
```yaml
my_project:
  target: prod
  outputs:
    prod:
      type: bigquery
      method: oauth
      project: my-project
      dataset: production
      maximum_bytes_billed: 107374182400  # 100 GB
```

This catches misconfigured incremental models that accidentally run full refreshes. An incremental model that should scan 5 GB but suddenly tries to scan 500 GB (because someone ran --full-refresh on the wrong model, or the incremental logic has a bug) will fail safely instead of generating a surprise bill.
In Looker/BI tools, set max_bytes_billed in the connection configuration. This prevents dashboard queries from scanning more than expected, which protects against someone building an unfiltered explore on a multi-TB table.
For individual analysts, encourage setting the limit in their BigQuery console settings or client configurations. Even a generous limit (1 TB) prevents the worst-case scenarios.
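These limits are raw byte counts, which are easy to mistype by a factor of 1,024. A tiny helper (illustrative only, not part of any BigQuery client library) keeps the values consistent across tools; note that the examples above use binary units, so 10 GB means 10 × 1024³ bytes:

```python
# Compute maximum_bytes_billed values for the configs above.
# Binary units (1 GB = 1024**3 bytes), matching the examples in the text.

UNITS = {"KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}

def bytes_billed_limit(value: float, unit: str) -> int:
    """Return a byte count suitable for a maximum_bytes_billed setting."""
    try:
        return int(value * UNITS[unit.upper()])
    except KeyError:
        raise ValueError(f"unknown unit: {unit!r}") from None

if __name__ == "__main__":
    print(bytes_billed_limit(10, "GB"))   # matches the 10 GB query limit above
    print(bytes_billed_limit(100, "GB"))  # matches the 100 GB dbt profile limit
```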
Partition Filter Requirements
The require_partition_filter option on partitioned tables forces every query to include a filter on the partition column. Queries without one fail with an error instead of scanning the entire table:
```sql
ALTER TABLE `project.dataset.events`
SET OPTIONS (require_partition_filter = true);
```

This is the second most important governance control. Apply it to every partitioned table that’s accessible to analysts or BI tools. The partition pruning note covers this in detail, but the governance angle is distinct: it’s not about optimization, it’s about preventing accidents.
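Server-side enforcement is the real control, but a client-side lint can give friendlier errors before a query is ever submitted. A naive sketch (illustrative only; a production version would parse the SQL properly) that rejects a query string when its WHERE clause never mentions the partition column:

```python
import re

def has_partition_filter(sql: str, partition_column: str) -> bool:
    """Naive pre-submission check: does the WHERE clause mention the
    partition column at all? Complements require_partition_filter,
    which remains the actual enforcement on the server side."""
    where_clause = re.search(r"\bWHERE\b(.*)", sql, re.IGNORECASE | re.DOTALL)
    if not where_clause:
        return False
    return re.search(
        rf"\b{re.escape(partition_column)}\b",
        where_clause.group(1),
        re.IGNORECASE,
    ) is not None

print(has_partition_filter("SELECT * FROM events", "event_date"))  # False
print(has_partition_filter(
    "SELECT * FROM events WHERE event_date = '2025-01-01'",
    "event_date",
))  # True
```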
In dbt, include this in your model configuration:
```sql
{{ config(
    partition_by = {"field": "event_date", "data_type": "date"},
    require_partition_filter = true
) }}
```

Project-Level Quotas
As of September 2025, new BigQuery projects default to a 200 TiB daily quota. For existing projects, configure custom quotas through the Admin SDK or Cloud Console.
Consider implementing quotas at multiple levels:
Per-user daily limits: Prevent individual analysts from monopolizing budget. A 10 TiB daily limit per user is generous for ad-hoc work but prevents runaway queries.
Per-project limits: Cap spend on development vs production projects. Development projects should have much lower limits than production because they shouldn’t be scanning production-sized tables (and if they are, that’s a design problem).
Alert thresholds: Notify before hitting hard limits. A budget alert at 80% of your monthly target gives you time to investigate before the bill arrives.
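The alerting math above is simple enough to sketch. Assuming the illustrative 10 TiB per-user quota and 80% threshold from this section (swap in your own numbers), a monitoring job could flag users approaching the hard limit like this:

```python
# Flag users approaching their daily scan quota before the hard limit trips.
# Quota and threshold values are the illustrative ones from the text.

DAILY_QUOTA_TIB = 10.0
ALERT_FRACTION = 0.8  # notify at 80% of the limit

def users_to_alert(scanned_tib_by_user: dict) -> list:
    """Return users whose daily scanned TiB meets or exceeds the alert threshold."""
    threshold = DAILY_QUOTA_TIB * ALERT_FRACTION
    return sorted(u for u, tib in scanned_tib_by_user.items() if tib >= threshold)

usage = {"analyst_a": 2.5, "analyst_b": 8.1, "etl_bot": 9.9}
print(users_to_alert(usage))  # ['analyst_b', 'etl_bot']
```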
The hierarchy of cost controls, from most to least granular:
- max_bytes_billed per query/connection — catches individual runaway queries
- require_partition_filter per table — catches unfiltered scans
- Per-user daily quotas — catches sustained individual overspend
- Per-project daily quotas — catches systemic overspend
- Budget alerts — catches everything else, but only after the money is spent
Layer all five. Each catches failures that the others miss.
Authorized Views for SELECT * Protection
Instead of granting analysts direct table access (which allows SELECT * on wide tables), create views that expose only the columns they need:
```sql
CREATE VIEW `project.views.user_summary` AS
SELECT user_id, signup_date, user_type
FROM `project.raw.users`;
```
```sql
-- Grant view access, not table access
GRANT `roles/bigquery.dataViewer`
ON VIEW `project.views.user_summary`
TO 'user:analyst@company.com';
```

For this to work without any table grants, also authorize the view on the source dataset so it can read the underlying table on the analyst’s behalf. This pattern serves double duty:
- Cost control: Users can only scan the columns in the view, not the entire underlying table
- Data governance: Sensitive columns (email, PII, financial data) are excluded without complex row/column-level security
The trade-off is maintenance. Every new column an analyst needs requires updating the view. But for wide tables (50+ columns) where analysts regularly need only 5-10 columns, the cost savings from preventing full-width scans are substantial.
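One way to keep that maintenance cheap is to generate the view DDL from a column allowlist, so adding a column an analyst needs becomes a one-line change in version control. A minimal sketch, reusing the hypothetical table and view names from the example above:

```python
# Generate authorized-view DDL from a column allowlist so the view
# definition lives in version control and column additions are one-liners.

def view_ddl(view: str, table: str, columns: list) -> str:
    """Build a CREATE OR REPLACE VIEW statement exposing only `columns`."""
    cols = ",\n  ".join(columns)
    return (
        f"CREATE OR REPLACE VIEW `{view}` AS\n"
        f"SELECT\n  {cols}\n"
        f"FROM `{table}`;"
    )

ddl = view_ddl(
    "project.views.user_summary",
    "project.raw.users",
    ["user_id", "signup_date", "user_type"],
)
print(ddl)
```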
For organizations with more sophisticated needs, BigQuery’s column-level security and data masking features (available in Enterprise and Enterprise Plus editions) provide finer-grained control without the maintenance burden of authorized views.
Service Accounts for Cost Attribution
Create separate service accounts for each integration: dbt, Looker, Fivetran, Census, custom scripts. This enables precise cost attribution by system.
When every integration uses the same service account (or worse, a shared personal account), INFORMATION_SCHEMA.JOBS_BY_PROJECT shows one giant blob of costs that you can’t decompose. With separate service accounts, you can answer questions like:
- “How much does our dbt pipeline cost per day?”
- “What percentage of compute is driven by Looker dashboards?”
- “Did Fivetran’s sync costs change after we added a new source?”
The setup takes about 30 minutes, and the attribution data keeps paying off for as long as the accounts stay separate.
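Once the accounts are separated, attribution is a simple roll-up. A sketch of the post-processing, assuming rows shaped like INFORMATION_SCHEMA.JOBS_BY_PROJECT output (user_email, total_bytes_billed) and the published $6.25/TiB on-demand rate; verify the rate for your region and pricing model:

```python
from collections import defaultdict

# Roll up estimated on-demand cost per principal from job metadata rows.
# The service-account emails below are hypothetical examples.

ON_DEMAND_PER_TIB = 6.25  # USD; check the current rate for your region
TIB = 1024**4

def cost_by_principal(jobs: list) -> dict:
    """Sum estimated on-demand cost per user_email across job rows."""
    totals = defaultdict(float)
    for job in jobs:
        totals[job["user_email"]] += job["total_bytes_billed"] / TIB * ON_DEMAND_PER_TIB
    return dict(totals)

jobs = [
    {"user_email": "dbt-runner@my-project.iam.gserviceaccount.com",
     "total_bytes_billed": 2 * TIB},
    {"user_email": "looker@my-project.iam.gserviceaccount.com",
     "total_bytes_billed": TIB // 2},
    {"user_email": "dbt-runner@my-project.iam.gserviceaccount.com",
     "total_bytes_billed": TIB},
]
print(cost_by_principal(jobs))
```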
Reservation Assignments for Capacity Isolation
If you’re on Editions pricing, reservation assignments isolate teams from each other. Without isolation, a heavy workload from one team consumes slots that another team needs, causing their queries to queue.
Assign different teams or workloads to separate reservations:
```sql
-- Production pipeline reservation
CREATE RESERVATION `project.region-us.prod_pipeline`
OPTIONS (
  slot_capacity = 0,
  edition = 'ENTERPRISE',
  autoscale_max_slots = 300
);

-- Ad-hoc analytics reservation
CREATE RESERVATION `project.region-us.adhoc_analytics`
OPTIONS (
  slot_capacity = 0,
  edition = 'STANDARD',
  autoscale_max_slots = 100
);
```

This pattern ensures that a heavy dbt run doesn’t starve dashboard queries, and a rogue ad-hoc analysis doesn’t slow down production pipelines. Each reservation scales independently within its own limits.
Implementation Priority
Start with max_bytes_billed and require_partition_filter — highest-impact controls with least friction. Add per-user and per-project quotas, authorized views, and reservation assignments as data volumes and team size grow.