When you link a GA4 property to BigQuery, a dataset named analytics_<property_id> appears in your project. Inside it, four distinct table types serve different purposes. Using the wrong one for your use case means missing data, paying unnecessary costs, or reporting numbers that will never be final.
The Four Table Types
| Table | Purpose | Key Limitation |
|---|---|---|
| events_YYYYMMDD | Daily export, source of truth | 10+ hour latency |
| events_intraday_YYYYMMDD | Streaming near-real-time data | Missing attribution fields |
| pseudonymous_users_YYYYMMDD | User export keyed by device ID | Optional, must enable separately |
| users_YYYYMMDD | User export keyed by custom user_id | Optional, must enable separately |
Daily Tables: Your Source of Truth
The events_YYYYMMDD tables contain fully processed event data with complete user attribution. They’re what you build your reporting models on.
Timing: Daily tables typically arrive mid-afternoon in your property’s timezone. There’s no guaranteed SLA for standard properties — “mid-afternoon” is a description of typical behavior, not a commitment. Plan your pipeline scheduling accordingly.
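Because there is no SLA, a common pattern is to poll for the daily table's existence rather than trigger downstream jobs on a fixed clock. A minimal sketch (the project and dataset names are placeholders — substitute your own):

```sql
-- Returns a row only once yesterday's daily table has been created.
-- Poll this before kicking off downstream models.
SELECT table_name
FROM `my-project.analytics_123456789.INFORMATION_SCHEMA.TABLES`
WHERE table_name = CONCAT(
  'events_',
  FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
)
```

Note that the table appearing only means the initial export has landed; it can still receive late events for up to 72 hours afterward.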
Late data window: These tables continue updating for up to 72 hours after their date. Mobile app SDK batching and Measurement Protocol hits commonly arrive with delays. If you run an incremental model against the previous day’s data first thing in the morning, you’ll miss events that haven’t arrived yet. Use a lookback window in your incremental strategy:
```sql
{% if is_incremental() %}
  WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
{% endif %}
```

What's in them that intraday tables lack: traffic_source (first-touch attribution), user_ltv (lifetime value metrics), and is_active_user. If your analysis depends on any of these, you must use daily tables.
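In context, that lookback sits inside a dbt incremental model. A minimal sketch, assuming a staging model named stg_ga4__events that already parses event_date into a DATE column (the model, column, and partition names here are illustrative, not prescribed):

```sql
{{ config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    partition_by={'field': 'event_date', 'data_type': 'date'}
) }}

SELECT
  event_date,
  event_name,
  user_pseudo_id
FROM {{ ref('stg_ga4__events') }}
{% if is_incremental() %}
  -- Reprocess the last 7 days so late-arriving events are picked up
  WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
{% endif %}
```

With insert_overwrite, each date partition in the lookback window is replaced wholesale on every run, so reprocessing recent days does not create duplicate rows.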
Limits: Standard GA4 properties can export up to 1 million events per day via the daily export; GA4 360 properties raise that limit to 20 billion events per day.
Intraday Tables: Real-Time Monitoring Only
The events_intraday_YYYYMMDD tables provide data within minutes of event occurrence. They’re updated continuously throughout the day as events stream in.
The deletion timing: Intraday tables are automatically deleted once the corresponding daily table completes. You cannot rely on them for historical analysis — they exist only for the current day’s monitoring.
What’s missing: Three fields are never populated in intraday tables:
- traffic_source — no first-touch attribution
- user_ltv — no lifetime value data
- is_active_user — no active user flag
This means session-level attribution models and user acquisition analysis that depend on traffic_source will produce incorrect results if run against intraday data. The queries execute without errors; they just return empty or misleading values.
Accuracy: Expect 0.5–2% discrepancy between intraday and final daily data under normal conditions, with edge cases reaching 20% or more. The intraday data represents an in-progress state of processing, not a final count.
Cost: Streaming ingestion costs $0.05 per GB. A GA4 event row averages on the order of 1 KB, so a site with 1 million daily events streams roughly 1 GB/day — about $1.50/month for intraday exports. Daily batch loading is free; you pay only for storage and queries. The cost is modest but worth knowing before enabling intraday exports across multiple properties.
When to use intraday: Same-day dashboards, anomaly alerting, real-time operational monitoring. Not for any analysis where accuracy matters or where attribution fields are required.
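As an example of a monitoring-only query, this counts today's events by hour from the intraday table (the project and dataset names are placeholders):

```sql
SELECT
  TIMESTAMP_TRUNC(TIMESTAMP_MICROS(event_timestamp), HOUR) AS event_hour,
  event_name,
  COUNT(*) AS events
FROM `my-project.analytics_123456789.events_intraday_*`
-- CURRENT_DATE() defaults to UTC; pass your property's timezone if it differs,
-- e.g. CURRENT_DATE('America/New_York')
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_hour, event_name
ORDER BY event_hour, event_name
```

Counts from this query will drift from the final daily numbers by the 0.5–2% discrepancy noted above, so treat them as operational signals, not reportable metrics.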
User Tables: Optional Identity Exports
The pseudonymous_users_YYYYMMDD and users_YYYYMMDD tables export user-level data rather than event-level data. They are disabled by default and require explicit enablement in the BigQuery linking settings.
pseudonymous_users_YYYYMMDD: Keys on user_pseudo_id — the device/browser cookie identifier. One row per device that was active. Contains audience memberships, lifetime value metrics, and predictive scores (purchase probability, churn probability) for each device.
users_YYYYMMDD: Keys on your custom user_id — the identifier you set via gtag('set', 'user_id', '...') or equivalent. Only populated for events where user_id was explicitly set.
When to use them: These tables become useful when you need user-level aggregations without building them from the event tables. If you’re accessing predictive scores from GA4, this is the only BigQuery location where they appear. For most analytics engineering work, building user-level models from the events tables directly gives more control and flexibility.
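For example, pulling per-device predictive scores from the pseudonymous users export might look like the sketch below. The project and dataset names are placeholders, and the nested field names (predictions.purchase_score_7d, predictions.churn_score_7d) are assumptions that should be verified against the current export schema for your property:

```sql
SELECT
  pseudo_user_id,
  predictions.purchase_score_7d,
  predictions.churn_score_7d
FROM `my-project.analytics_123456789.pseudonymous_users_*`
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
```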
Caveat: If your team is considering enabling the “User-provided data” feature in GA4 admin, be aware of the critical interaction with user_id export. That feature permanently disables user_id in BigQuery exports with no reversal option.
Choosing Between Daily and Intraday
The choice isn’t either/or. Enable both exports and use them for different purposes.
Day N: Events occur throughout the day → Intraday tables update within minutes (for monitoring)
Day N+1: Daily export completes (~10-16 hours after midnight) → Intraday table deleted → Production reporting runs against daily table
Day N+3: Late data window closes → Daily table is now stable and final → Historical queries produce consistent results

Practical guidance:
- Run same-day monitoring dashboards against intraday tables. Accept the accuracy limitations explicitly in the dashboard.
- Run all production reporting, attribution analysis, and KPI metrics against daily tables.
- If your stakeholders need same-day data for decisions (not just monitoring), document the known intraday limitations prominently so they understand what they’re looking at.
For the query patterns used to access these tables efficiently, including _TABLE_SUFFIX filtering and date range syntax, see GA4 BigQuery Query Patterns.