Note

Cross-Platform Ad Metric Comparability

Why only five metrics can be meaningfully compared across ad platforms, how to handle platform-specific metrics, and conversion configuration details that determine what your “conversions” column actually means.

dbt · google ads · data modeling · analytics

Unified ad reporting models that UNION data from Google, Meta, and LinkedIn benefit from a narrow metric set. Adding more platform-specific metrics produces NULL-heavy columns and complicates interpretation for stakeholders.

The Five Universal Metrics

Fivetran’s dbt_ad_reporting package (v2.4.0, covering 11 platforms) normalizes exactly five metrics across all platforms:

  1. Clicks
  2. Impressions
  3. Spend
  4. Conversions
  5. Conversions value

These are the only metrics with reasonable cross-platform equivalents, even though their definitions aren’t identical across platforms. They answer the core questions that justify cross-platform reporting in the first place: How much did we spend? What did we get for it? How many people saw and engaged with our ads?
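Concretely, the unified layer is just a UNION ALL over per-platform reports, each projected down to the same five columns. A minimal sketch (the `ref` names follow Fivetran's per-platform package naming, but treat the model and column names as illustrative):

```sql
-- Sketch of a unified campaign model: each platform is projected down to
-- the same five metrics before the UNION, so no NULL-heavy columns appear.
select 'google_ads' as platform, date_day, campaign_name,
       clicks, impressions, spend, conversions, conversions_value
from {{ ref('google_ads__campaign_report') }}

union all

select 'facebook_ads' as platform, date_day, campaign_name,
       clicks, impressions, spend, conversions, conversions_value
from {{ ref('facebook_ads__campaign_report') }}

union all

select 'linkedin_ads' as platform, date_day, campaign_name,
       clicks, impressions, spend, conversions, conversions_value
from {{ ref('linkedin_ads__campaign_report') }}
```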

Why Only Five?

Platform-specific metrics don’t have equivalents elsewhere:

  • LinkedIn social actions: reactions, comments, shares, viral impressions, follower gains
  • Meta engagement: ThruPlays, video completion rates, frequency, relevance ranking
  • Google competitive metrics: Quality Score, Impression Share, Search Lost IS (budget), Search Lost IS (rank)

Trying to force these into a unified schema produces columns that are NULL for most rows. A quality_score column that’s populated only for Google rows and NULL for everything else doesn’t help anyone. It clutters the unified model and confuses stakeholders who see NULLs and wonder if something is broken.

The clean design: keep unified models to just the common five. Platform-specific metrics stay in platform-level intermediate or mart models. If stakeholders want to see ThruPlays alongside impressions, build a platform-specific Meta dashboard rather than cluttering the cross-platform view.

Passthrough Variables: The Middle Ground

For teams that want platform-specific columns in the unified output without permanently modifying the package, Fivetran’s package supports passthrough metrics. Each passthrough metric has three properties:

  • name: The source column name in the platform’s intermediate model
  • alias: The column name in the unified output
  • transform_sql: An optional SQL expression applied during the UNION

The key pattern for passthrough metrics: use transform_sql: "null" to explicitly place NULLs where a metric doesn’t apply to a given platform. This is better than letting the UNION fail or silently dropping columns, because the NULL is intentional and documented.

```yaml
# dbt_project.yml
vars:
  ad_reporting__facebook_ads_passthrough_metrics:
    - name: frequency
      alias: facebook_frequency
    - name: video_thruplay_actions
      alias: facebook_thruplay_actions
  ad_reporting__google_ads_passthrough_metrics:
    - name: impressions_share
      alias: google_impression_share
      transform_sql: "null" # placeholder for non-Google platforms
```
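What this configuration amounts to in the compiled model, roughly: each leg of the UNION either passes the metric through under its alias or emits an explicit NULL in its place. An illustrative expansion (a sketch, not the package's actual generated SQL):

```sql
-- Facebook leg: its own passthrough metric passes through under its alias;
-- the Google-only metric is an explicit NULL rather than a missing column.
select date_day, spend,
       frequency as facebook_frequency,
       cast(null as numeric) as google_impression_share
from {{ ref('facebook_ads__campaign_report') }}

union all

-- Google leg: the mirror image.
select date_day, spend,
       cast(null as numeric) as facebook_frequency,
       impressions_share as google_impression_share
from {{ ref('google_ads__campaign_report') }}
```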

Use passthrough variables sparingly. Every passthrough column that’s NULL for most platforms is a column that will generate questions. If you find yourself adding more than 2-3 passthrough metrics, you probably need a platform-specific dashboard instead.

Conversion Configuration

The conversions column is where the five “universal” metrics get deceptively complex. Each platform defines conversions differently:

Facebook/Meta: Conversions come from configurable action types. The Fivetran package defaults to fb_pixel_purchase and lead_grouped, but your configuration might use different action types depending on your business. If you’re an ecommerce company, you probably want purchase and maybe add_to_cart. If you’re B2B, you might want lead and complete_registration.

LinkedIn: Combines external_website_conversions with one_click_leads into a single conversions column. This is a LinkedIn-specific combination that doesn’t map cleanly to how Google or Meta count conversions.
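As a sketch of that combination (the two conversion columns are as described above; the surrounding model name is illustrative):

```sql
-- LinkedIn leg: two platform-specific conversion types are collapsed
-- into the single shared conversions column.
select
    date_day,
    coalesce(external_website_conversions, 0)
      + coalesce(one_click_leads, 0) as conversions
from {{ ref('linkedin_ads__campaign_report') }}
```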

Google and Microsoft: Provide aggregate conversion totals directly. These are simpler to work with but still reflect whatever conversion actions you’ve configured in the platform.

The critical implication: your unified conversions column contains structurally different things depending on the platform. A “conversion” from Meta might be a fb_pixel_purchase. A “conversion” from LinkedIn might be an external_website_conversion. A “conversion” from Google might be any of the conversion actions you’ve configured.

Document Your Conversion Definitions

This documentation deserves to live in the dbt model itself, not in a wiki or a Confluence page. Add a description to your unified model’s conversions column that specifies exactly what feeds into it per platform:

```yaml
# schema.yml
models:
  - name: mrt__marketing__campaign_report
    columns:
      - name: conversions
        description: |
          Unified conversion count. Sources differ by platform:
          - Google Ads: All configured conversion actions (aggregate total)
          - Facebook Ads: fb_pixel_purchase + lead_grouped action types
          - LinkedIn Ads: external_website_conversions + one_click_leads
          These are NOT directly comparable. Use blended ROAS or
          UTM-based attribution for cross-platform conversion analysis.
```

(Note the `|` literal block scalar: a folded `>` scalar would collapse the per-platform list into a single line in dbt docs.)

When a stakeholder asks “why doesn’t this match the Meta dashboard?”, the answer should be one dbt docs click away. The Meta dashboard might be counting different action types, or using a different attribution window, or including view-through conversions that your configuration excludes. Having the definition documented prevents the multi-day investigation that otherwise follows.

Comparability Limits

The five metrics are approximations, not exact equivalents. Spend is the most reliable; impressions and clicks have definitional differences but are directionally consistent; conversions and conversions_value are configured differently per platform and attributed over different windows. The unified view is useful for budgeting, trend analysis, and downstream patterns like blended ROAS and budget pacing, but not for exact cross-platform conversion comparison.