The Math Never Adds Up
Pull up your ad accounts right now. Your Facebook Ads dashboard probably shows a 4x or 5x ROAS. Your Google Ads account is showing something similar. If you run email, Klaviyo is claiming its share too. Add up all the revenue each platform says it drove, and the total is almost certainly 150% to 300% of your actual revenue.
This is not a rounding error. It is not a tracking glitch you can fix with a better pixel. It is structural. Every ad platform is built to measure its own contribution to a sale, and every ad platform has a strong financial incentive to make that number look as large as possible. They are not lying, exactly. They are just each telling you their version of the truth - and those versions overlap enormously.
The result is that most brands are making budget allocation decisions based on numbers that are fundamentally broken. They scale the channel that shows the best platform-reported ROAS, cut the one that looks weak, and never discover that the "weak" channel was the one actually driving the customer to buy in the first place.
How Every Platform Overcounts in the Same Direction
The mechanism is straightforward once you see it. It comes down to attribution windows and the definition of a "conversion."
Facebook's default attribution window is 7-day click, 1-day view. This means: if someone clicks your Facebook ad anytime in the last seven days, or even just sees your ad in the last 24 hours, and then converts - Facebook takes credit. The visitor does not need to have come back through Facebook. They could have Googled your brand name, found you organically, and checked out. Facebook still claims the sale.
Google's default is a 30-day click window for most campaigns. So that same customer who clicked a Google Shopping ad three weeks ago and finally bought today after seeing your Facebook retargeting ad - Google claims it too.
Now compound this across a typical customer journey: a prospect sees a Facebook prospecting ad on Tuesday, clicks a Google retargeting ad on Thursday, opens a Klaviyo email on Saturday, and converts. All three platforms take full credit for that one sale. Your actual revenue: $150. Platform-reported revenue: $450.
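The arithmetic above can be sketched in a few lines. The figures are the hypothetical ones from the example, not real account data:

```python
# Sketch of the triple-counting in the example above (hypothetical numbers).
# One $150 sale touched three platforms; each claims it in full.
actual_revenue = 150.00

platform_claims = {
    "facebook": 150.00,  # touch inside its 7-day-click / 1-day-view window
    "google": 150.00,    # click inside its 30-day window
    "klaviyo": 150.00,   # email open before the conversion
}

reported_total = sum(platform_claims.values())
overcount = reported_total / actual_revenue

print(f"Platform-reported revenue: ${reported_total:.0f}")  # $450
print(f"Actual revenue:            ${actual_revenue:.0f}")  # $150
print(f"Reported as a share of actual: {overcount:.0%}")    # 300%
```

Run this against your own numbers (each platform's claimed revenue vs. your store's actual revenue) and the overlap stops being abstract.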
Every platform is an attorney arguing for its own client. The client is always the same sale. Your job is to be the judge.
The problem gets worse during high-intent periods like Black Friday, when customers are already likely to buy regardless of which ad they saw last. Platform-reported ROAS spikes precisely because the denominator (ad spend) often stays flat while the numerator (claimed conversions) balloons. The platforms look like heroes. The incremental contribution of those ads may be a fraction of what they claim.
The Attribution Models, Ranked by Actual Usefulness
Most platforms now offer you a choice of attribution models. Here is what they actually tell you, and what they do not:
| Attribution Model | Best For | Blind Spot |
|---|---|---|
| Last Click | Bottom-funnel search campaigns where the final click is the decision point. Fast and cheap to implement. | Ignores every touchpoint that built awareness and intent. Systematically starves top-funnel channels. |
| Linear / Time Decay | Multi-channel brands that want to acknowledge the full funnel without doing real math. | Arbitrary weightings with no statistical basis. You are distributing credit by feel, not by evidence. |
| Data-Driven (Google) | High-volume accounts with 3,000+ conversions per month. Genuinely attempts to model actual contribution. | Only models within Google's walled garden. Cannot see Facebook, email, or organic touchpoints. |
| Platform-Reported ROAS (any) | A starting point for channel-level performance direction. Directionally useful if consistent over time. | Systematically inflated by overlap. Never use as the primary metric for budget decisions. |
The honest summary: no single in-platform attribution model tells you the truth. They are all looking through a keyhole and describing what they see as if it were the whole room.
What I Do Instead: A Practical Framework
After managing ad accounts with budgets ranging from $5K/month to $500K+/month, here is the attribution framework that actually holds up.
1. MER as the North Star
Marketing Efficiency Ratio: total revenue divided by total ad spend. Calculated with real numbers from your Shopify dashboard or payment processor - not from any ad platform.
Your MER does not care about attribution windows. It cannot be inflated by overlap. If you spent $50K in March and made $200K in revenue, your MER is 4.0. That is your ground truth. When platform-reported ROAS diverges significantly from MER, you have an attribution problem worth investigating. When they roughly track together, you can trust your directional signals.
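The calculation, and the divergence check it enables, fits in a few lines. The revenue and spend figures are the hypothetical March numbers from the example:

```python
# Minimal MER check (hypothetical numbers from the example above).
# MER = total revenue / total ad spend, using real store revenue
# from Shopify or your payment processor, never platform-reported figures.
total_revenue = 200_000   # actual March revenue
total_ad_spend = 50_000   # all channels combined

mer = total_revenue / total_ad_spend
print(f"MER: {mer:.1f}")  # 4.0

# Compare against blended platform-reported ROAS: a large gap signals
# attribution overlap worth investigating.
platform_reported_revenue = 340_000  # hypothetical sum of each platform's claims
blended_platform_roas = platform_reported_revenue / total_ad_spend
print(f"Blended platform ROAS: {blended_platform_roas:.1f}")       # 6.8
print(f"Platform inflation vs. MER: {blended_platform_roas / mer:.0%}")  # 170%
```

When that last ratio sits near 100%, platform data is roughly trustworthy for direction. When it drifts toward 200% or 300%, the dashboards are arguing with reality.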
2. UTMs + GA4 for Path Analysis
Tag every single link with UTM parameters - every ad, every email, every post. Then use GA4 to understand the actual paths customers take before converting. GA4's path exploration report will show you what touchpoints appear most often in converting sessions, and in what order. It is not perfect - it cannot track cross-device or offline - but it is a better view of reality than any single platform's self-reported data.
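Consistency is the hard part of UTM discipline, so it helps to generate links from one helper instead of typing parameters by hand. A minimal sketch using Python's standard library; the `tag_url` helper and the naming convention in the example are assumptions, not a standard:

```python
# Hypothetical helper for consistent UTM tagging across ads, emails, and posts.
from urllib.parse import urlencode, urlparse, urlunparse

def tag_url(url: str, source: str, medium: str,
            campaign: str, content: str = "") -> str:
    """Append UTM parameters so GA4 can attribute the session to this link."""
    params = {
        "utm_source": source,      # e.g. facebook, google, klaviyo
        "utm_medium": medium,      # e.g. paid_social, cpc, email
        "utm_campaign": campaign,  # your campaign naming convention
    }
    if content:
        params["utm_content"] = content  # creative or variant identifier
    parts = urlparse(url)
    query = parts.query + ("&" if parts.query else "") + urlencode(params)
    return urlunparse(parts._replace(query=query))

print(tag_url("https://example.com/product", "facebook", "paid_social",
              "spring_prospecting", "video_a"))
```

One helper, one naming convention, every link - which is what makes GA4's path exploration report legible later.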
3. Incremental Lift Tests for Top-Funnel Spend
The only way to know if a channel is actually causing sales rather than just witnessing them is to turn it off for a portion of your audience and measure the difference. Both Facebook and Google offer native lift studies. For other channels, a geo-based holdout - run the campaign in five markets, withhold it in five comparable markets, compare conversion rates - gives you genuine incrementality data.
This is especially important for broad awareness spend. Upper-funnel click-through rates look terrible compared to retargeting, but that does not mean the awareness campaign is not working. An incremental lift test will tell you if it is actually moving the needle or just claiming credit for people who would have bought anyway.
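The readout from a geo holdout is simple arithmetic once the test has run. A sketch with hypothetical conversion rates for five test and five control markets (a real readout should also check statistical significance before acting):

```python
# Sketch of a geo-holdout readout (hypothetical market-level conversion rates).
test_rates = [0.031, 0.028, 0.034, 0.030, 0.029]     # campaign running
control_rates = [0.024, 0.026, 0.023, 0.025, 0.027]  # campaign withheld

test_mean = sum(test_rates) / len(test_rates)
control_mean = sum(control_rates) / len(control_rates)

# Incremental lift: conversion improvement attributable to the campaign,
# not just correlated with it.
lift = (test_mean - control_mean) / control_mean
print(f"Test markets:    {test_mean:.4f}")
print(f"Control markets: {control_mean:.4f}")
print(f"Incremental lift: {lift:.1%}")  # 21.6% with these numbers
```

A lift near zero means the channel is witnessing sales, not causing them - no matter what its dashboard claims.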
The Signals That Don't Lie
Beyond MER and lift testing, there are a handful of signals that attribution cannot distort. Watch these closely:
- Branded search volume: If you scale Facebook prospecting and branded search queries increase in Google Search Console, your prospecting is working - people saw the ad, got curious, and Googled you. This is one of the clearest signs of genuine upper-funnel impact.
- New customer rate: Platform-reported ROAS says nothing about whether you are acquiring new customers or just re-selling to existing ones. Track your new-to-file rate separately. A ROAS of 8x that is 90% returning customers is a very different business than a ROAS of 3x that is 70% new customers.
- Contribution margin after ad spend: ROAS without margin context is meaningless. A brand with 70% gross margins can profitably run at 2x ROAS. A brand with 30% margins needs 4x just to break even. Model your real numbers in something like the Noble Growth Ad Calculator to understand what ROAS you actually need, then back-calculate from MER.
- Revenue trend vs. spend trend: If you increase total spend by 20% and revenue increases by 20%, the marginal efficiency is roughly flat - you are not losing or gaining. If revenue grows by 35%, you have found a channel with room to scale. If revenue grows by 8%, you are pushing into diminishing returns.
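Two of the checks above reduce to formulas worth writing down. A sketch with hypothetical inputs - note that ROAS = 1/margin is the floor where gross margin exactly covers ad spend; real break-even sits higher once other variable costs (fulfillment, payment fees, returns) are included, which is why a 30%-margin brand may need 4x rather than the bare 3.3x:

```python
# 1) Gross-margin-only break-even ROAS: revenue * margin == ad spend
#    at break-even, so ROAS = 1 / margin. Other variable costs push
#    the real threshold higher.
def break_even_roas(gross_margin: float) -> float:
    return 1 / gross_margin

print(f"70% margin -> {break_even_roas(0.70):.1f}x floor")  # 1.4x
print(f"30% margin -> {break_even_roas(0.30):.1f}x floor")  # 3.3x

# 2) Marginal efficiency: revenue growth relative to spend growth.
#    ~1.0 is flat, above 1 means room to scale, well below 1 means
#    diminishing returns.
spend_growth = 0.20    # hypothetical: spend up 20%
revenue_growth = 0.35  # hypothetical: revenue up 35%
print(f"Marginal ratio: {revenue_growth / spend_growth:.2f}")  # 1.75
```

Back-calculate your required MER from the first formula, then use the second to decide whether the next dollar of spend is still earning it.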
The Uncomfortable Conclusion
Attribution will never be perfectly solved. The customer journey is too nonlinear, too cross-device, too influenced by offline touchpoints - a podcast they heard, a friend's recommendation, a billboard they drove past - for any pixel-based system to capture fully. Any vendor selling you "complete attribution" is selling you confidence, not truth.
What you can do is triangulate. Use MER as your ground truth. Use platform data for directional signals and channel-level optimization. Use lift tests to validate your major bets. Watch the proxy metrics that attribution cannot touch.
The brands that get this right do not panic when Facebook's ROAS drops from 4x to 3x. They check whether MER moved. They look at branded search trends. They run a holdout and find out if that channel is actually driving incremental revenue or just following people around who were going to buy regardless.
The ones who get it wrong keep chasing the platform-reported number - scaling whatever looks best in the dashboard, cutting whatever looks weak - until they have optimized themselves into a corner and cannot figure out why total revenue is flat despite "great" ROAS across every channel.
Every platform claims your sale. Your job is to decide who earned it.
Attribution keeping you up at night?
We audit your measurement setup, build a MER dashboard, and design holdout tests that tell you which channels are actually driving growth - not just claiming it.
Get my free audit →