
Ad Platform Metric Divergence

Why impressions, clicks, and conversions mean different things on Google, Meta, and LinkedIn — and why pretending they're equivalent produces misleading cross-platform reports.


Even the most fundamental ad metrics — impressions, clicks, conversions — are defined differently by each platform. Cross-platform models that treat these metrics as equivalent without documentation produce misleading comparisons.

Campaign Hierarchy Mismatch

Before you can even compare metrics, you have to map campaign structures to a common schema. Each platform organizes its ad hierarchy differently:

| Level  | Google Ads | Meta Ads | LinkedIn Ads   |
|--------|------------|----------|----------------|
| Top    | Campaign   | Campaign | Campaign Group |
| Middle | Ad Group   | Ad Set   | Campaign       |
| Bottom | Ad         | Ad       | Creative       |

LinkedIn inverts everything: what LinkedIn calls a “Campaign” is what Google and Meta call an “Ad Group.” What LinkedIn calls a “Campaign Group” is what everyone else calls a “Campaign.” And at the bottom, LinkedIn uses “Creative” where Google and Meta use “Ad.”

This isn’t a minor naming inconvenience. Any cross-platform model has to map these hierarchies to a common schema before a single UNION ALL can work. In the intermediate layer, LinkedIn’s “Campaign Group” gets renamed to “Campaign” and LinkedIn’s “Campaign” gets renamed to “Ad Group.” Without this remapping, you’ll be comparing Google campaigns against LinkedIn ad groups and wondering why the numbers look strange.
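The remapping is usually a rename in the intermediate SQL layer, but the idea can be sketched in Python. This is illustrative only: the `HIERARCHY_MAP` structure and `unify` helper are hypothetical names, not part of any real package.

```python
# Map each platform's native level names onto a common
# campaign / ad_group / ad schema, per the table above.
HIERARCHY_MAP = {
    "google":   {"campaign": "campaign", "ad_group": "ad_group", "ad": "ad"},
    "meta":     {"campaign": "campaign", "ad_set": "ad_group", "ad": "ad"},
    "linkedin": {"campaign_group": "campaign", "campaign": "ad_group", "creative": "ad"},
}

def unify(platform: str, record: dict) -> dict:
    """Rename a record's platform-native level keys to the common schema."""
    mapping = HIERARCHY_MAP[platform]
    return {mapping.get(key, key): value for key, value in record.items()}

row = {"campaign_group": "Brand Q3", "campaign": "Exec audience", "creative": "v2"}
print(unify("linkedin", row))
# {'campaign': 'Brand Q3', 'ad_group': 'Exec audience', 'ad': 'v2'}
```

Only after every platform's rows pass through this kind of rename does a UNION ALL produce rows that are comparable level-for-level.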

Impression Counting

“Impressions” sounds like a simple concept — how many times was the ad shown? But each platform applies different standards:

LinkedIn counts an impression when 50% of ad pixels are in-view for at least 1 second. This is the IAB viewability standard. An ad that loads below the fold and is never scrolled to doesn’t count.

Google counts an impression when the ad is served, regardless of whether the user actually saw it. A search ad that appears on page 2 of results that the user never scrolls to still registers as an impression.

Meta counts an impression when the ad enters the viewport, but without LinkedIn’s one-second threshold.

10,000 LinkedIn impressions represent a stricter standard of ad visibility than 10,000 Google impressions, but a unified report treats them as equivalent. There is no way to retroactively apply LinkedIn’s viewability standard to Google data; the best approach is to document the difference and surface it to stakeholders.

Click Definitions

Clicks have the same comparability problem. LinkedIn counts social actions (likes, shares, comments) as clicks by default. Google and Meta don’t — their click metrics only count clicks that navigate the user somewhere.

If you’re comparing CTR (click-through rate) across platforms without adjusting for this, LinkedIn will appear to have a higher engagement rate than it actually does relative to the others. The “clicks” column in your unified model contains structurally different things depending on which platform produced the row.

For LinkedIn, you can request clicks excluding social actions through the API (using externalWebsiteClicks or landingPageClicks instead of total clicks). If your extraction tool supports it, configure it to use the narrower click metric for better cross-platform comparability. If not, document the difference and flag it in your model.
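To see how much the definition matters, here is a small sketch with hypothetical numbers. The `landingPageClicks` field name comes from the LinkedIn API as mentioned above; the counts themselves are invented for illustration.

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate; guards against zero impressions."""
    return clicks / impressions if impressions else 0.0

# Hypothetical LinkedIn campaign: total clicks include social actions
# (likes, shares, comments); landing_page_clicks counts navigation only.
total_clicks = 450
landing_page_clicks = 300
impressions = 20_000

print(f"CTR with social actions: {ctr(total_clicks, impressions):.2%}")
print(f"CTR, landing-page only:  {ctr(landing_page_clicks, impressions):.2%}")
# 2.25% vs 1.50% -- same campaign, different "click" definition
```

The same campaign looks 50% more engaging under the broader definition, which is exactly the distortion a unified CTR column inherits silently.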

Attribution Windows

Each platform’s conversion numbers reflect a different lookback period:

| Platform   | Default Click Window | Default View Window | Notes                 |
|------------|----------------------|---------------------|-----------------------|
| Google Ads | 30 days              | None (search)       | Longest click window  |
| Meta       | 7 days               | 1 day               | Reduced post-iOS 14   |
| LinkedIn   | 90 days              |                     | Extremely long window |

A Google Ads “conversion” can credit a click from up to 30 days earlier. A Meta “conversion” only counts clicks from the last 7 days. LinkedIn’s 90-day window means it claims credit for purchases made up to three months after the ad click.

If all three platforms claim credit for the same purchase, the total will exceed actual revenue. Each platform is telling its own version of the truth, and your warehouse is the only place where you can see that the numbers don’t add up. This is the fundamental attribution bias problem that makes blended metrics more trustworthy than platform-specific ones.
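A toy calculation makes the over-counting concrete. All the figures below are hypothetical; the point is only that overlapping lookback windows let platform-reported totals exceed reality.

```python
# Hypothetical month: 100 real purchases, each platform counting any
# purchase that falls inside its own attribution window.
actual_purchases = 100

platform_claimed = {
    "google":   80,   # 30-day click window
    "meta":     55,   # 7-day click / 1-day view window
    "linkedin": 70,   # 90-day window
}

claimed_total = sum(platform_claimed.values())
print(f"Platforms claim {claimed_total} conversions for "
      f"{actual_purchases} actual purchases "
      f"({claimed_total / actual_purchases:.0%} of reality)")
```

Only a warehouse-side blended view, anchored to an independent conversion source, can expose that the per-platform numbers sum past 100%.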

Conversion Time vs. Impression Time

Meta’s June 2025 attribution change introduced another layer of complexity. On-Meta events (leads generated on Facebook forms, for instance) are now attributed to impression time. Off-Meta events (purchases on your website) are attributed to conversion time. This means a single Meta campaign report mixes two different temporal attribution methodologies in the same dataset.

For your intermediate models, this means Meta conversions can’t be cleanly compared to Google conversions on a day-by-day basis. A conversion appearing on “Tuesday” in Meta data could mean “the impression happened Tuesday” or “the conversion happened Tuesday,” depending on where the conversion event occurred.

Timezone Pre-Aggregation

Each platform pre-aggregates daily totals in its own timezone (typically the ad account timezone). Your warehouse likely stores everything in UTC. A campaign that spends $100 between 10pm and 2am in the ad account’s timezone will show that spend split across two different UTC days.

Fivetran’s dbt_ad_reporting package documents this challenge explicitly in its DECISIONLOG. The guidance: accept small timezone-related discrepancies (1-3%) as normal rather than chasing exact reconciliation. If your cross-platform spend totals are off by more than 3% from the sum of individual platform UIs, the problem is likely a pipeline bug, not a timezone issue.
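That tolerance rule is easy to encode as a data test. A minimal sketch, with invented spend figures; the 3% threshold follows the guidance above.

```python
def spend_discrepancy(warehouse_total: float, platform_ui_total: float) -> float:
    """Relative gap between warehouse spend and the platform UI's own total."""
    return abs(warehouse_total - platform_ui_total) / platform_ui_total

# 1-3% drift is expected from timezone pre-aggregation; beyond that,
# suspect a pipeline bug rather than a timezone artifact.
TOLERANCE = 0.03

gap = spend_discrepancy(warehouse_total=9_820.0, platform_ui_total=10_000.0)
print(f"gap = {gap:.1%}, within tolerance: {gap <= TOLERANCE}")
# gap = 1.8%, within tolerance: True
```

Run per platform and per month, a check like this separates "normal timezone drift" from "something is actually broken" without chasing exact reconciliation.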

Making Divergence Visible

A practical pattern: add a metric_notes documentation model or column that records, for each platform, what “impressions” means, what “clicks” includes, and which conversion events are counted. Putting the definition in the dbt model description rather than a separate wiki keeps it accessible when stakeholders ask why numbers differ.
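In a dbt project this lives in model descriptions, but the shape of the lookup is simple enough to sketch in Python. The `METRIC_NOTES` structure and `explain` helper are hypothetical; the definitions paraphrase the sections above.

```python
# One entry per (platform, metric): what the number actually measures.
METRIC_NOTES = {
    ("linkedin", "impressions"): "50% of pixels in-view for >=1 second (IAB standard)",
    ("google",   "impressions"): "Counted when served, regardless of viewability",
    ("meta",     "impressions"): "Counted on viewport entry, no time threshold",
    ("linkedin", "clicks"):      "Includes social actions unless landingPageClicks is used",
    ("google",   "clicks"):      "Navigation clicks only",
    ("meta",     "clicks"):      "Navigation clicks only",
}

def explain(platform: str, metric: str) -> str:
    """Answer 'what does this number mean on this platform?' at query time."""
    return METRIC_NOTES.get((platform, metric), "undocumented -- add a note")

print(explain("linkedin", "impressions"))
```

The fallback string doubles as a nudge: any platform/metric pair without a documented definition is a comparison waiting to mislead someone.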

The five metrics that can be compared across platforms — clicks, impressions, spend, conversions, and conversions_value — are the best available proxies, not exact equivalents, and should be documented as such.