
Your marketing report proves correlation. Not causation.

Adela Mincea · 6 min read

The campaign ran. Sales went up. The report says the campaign drove the sales. It doesn't. It shows they happened at the same time. The difference between correlation and causation is the difference between a budget decision and a guess.

[Featured image: Same time, not same cause.]

The measurement problem

The campaign ran and sales increased. The platform attributed those sales to the campaign. None of that confirms the campaign caused the sales. It confirms they occurred during the same window. These are not the same thing, and the difference has a direct cost.

A business runs a Google Ads campaign for four weeks. Revenue is up 22% compared to the same period last year. The platform reports a ROAS of 6.4x. The marketing team presents this as evidence the campaign is working.

The question nobody asks: what would have happened if the campaign had not run?

If revenue was going to grow 22% anyway, because of the season, because of a competitor going offline, because email drove the lift, then the 6.4x ROAS is not a performance number. It is a coincidence the platform measured and called a result.
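
To make the counterfactual concrete, here is a minimal sketch of the arithmetic in Python. The 22% growth and the 6.4x ROAS come from the example above; the spend and prior-period revenue figures are assumptions for illustration only.

```python
# Counterfactual check: how much of the attributed ROAS survives once
# baseline growth is accounted for? Spend and prior-period revenue are
# assumed for illustration; the 22% growth and 6.4x ROAS come from the
# example above.

spend = 25_000                                  # assumed campaign spend
attributed_roas = 6.4                           # what the platform reports
attributed_revenue = spend * attributed_roas    # 160,000 per the platform

last_year_revenue = 727_000                     # assumed prior-period revenue
observed_revenue = last_year_revenue * 1.22     # up 22% year over year

# Scenario: the 22% growth was coming anyway (season, competitor offline,
# email). The counterfactual equals the observed outcome.
counterfactual_revenue = last_year_revenue * 1.22

incremental_revenue = observed_revenue - counterfactual_revenue  # 0
incremental_roas = incremental_revenue / spend                   # 0.0

print(f"Attributed ROAS:  {attributed_roas:.1f}x")
print(f"Incremental ROAS: {incremental_roas:.1f}x")
```

Under that scenario the platform still reports 6.4x, and the incremental return is zero. The gap between the two numbers is the entire question.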

What correlation actually tells you

When two things move together, they are correlated. The campaign ran. Sales increased. Those two facts are real.

Correlation tells you that two things happened at the same time. It says nothing about which one caused the other, whether either caused the other, or whether a third factor caused both.

In marketing, correlation is what every platform report shows you. The platform records when an ad was shown or clicked, and when a conversion occurred. If both happened, the platform attributes the conversion to the ad. That is not causation. That is proximity: a measurement of timing, not mechanism.

The attribution model decides how much credit to assign. Last click, data-driven, linear: these are methods of distributing credit across the user journey. None of them prove that the ad caused the purchase. They distribute credit according to a rule.
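
A short sketch makes the point visible. The journey and the $100 conversion value are hypothetical; what matters is that each rule distributes the same credit by a different convention, and neither establishes that any touchpoint caused the purchase.

```python
# Credit assignment under two common attribution rules. The journey and
# conversion value are hypothetical.

journey = ["display", "email", "paid_search"]   # ordered touchpoints
conversion_value = 100.0

def last_click(touchpoints, value):
    """All credit goes to the final touchpoint before conversion."""
    credit = {t: 0.0 for t in touchpoints}
    credit[touchpoints[-1]] = value
    return credit

def linear(touchpoints, value):
    """Equal credit to every touchpoint on the path."""
    share = value / len(touchpoints)
    return {t: share for t in touchpoints}

print("last click:", last_click(journey, conversion_value))
print("linear:    ", linear(journey, conversion_value))
```

Same journey, same purchase, two different "performance" numbers. The rule changed; the world did not.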

Where the error becomes expensive

The error is not academic. It shapes budget decisions.

If a campaign is attributed with $80,000 in revenue at a ROAS of 8x, the natural conclusion is that scaling the campaign will produce more revenue at a similar ratio. The campaign is "working." Put more in.

But if that campaign was largely capturing branded search (people who were already going to buy, typing the brand name, clicking the ad instead of the organic result) then scaling it produces more of the same. You acquire customers who were already acquired. The incremental revenue from additional spend is much lower than the attributed revenue suggests, possibly near zero.

The platform shows 8x. The real return on incremental spend might be 1.2x. Budget is allocated to the 8x number. The P&L absorbs the actual economics.
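
The arithmetic behind that gap, as a sketch. The 85% organic-recapture share is an assumption chosen to reproduce the 8x-versus-1.2x example; the real share is exactly what a holdout test would measure.

```python
# The gap between attributed and incremental return, using the numbers
# above. The organic-recapture share is assumed for illustration: the
# fraction of "attributed" buyers who would have converted through the
# organic result anyway.

attributed_revenue = 80_000
attributed_roas = 8.0
spend = attributed_revenue / attributed_roas        # 10,000

organic_recapture = 0.85   # assumed: 85% would have bought regardless
incremental_revenue = attributed_revenue * (1 - organic_recapture)
incremental_roas = incremental_revenue / spend

print(f"Platform shows:     {attributed_roas:.1f}x")
print(f"Incremental return: {incremental_roas:.1f}x")   # 1.2x
```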

This gap between attributed performance and incremental performance is where most marketing budget is lost.


Why the report can't tell you what you need to know

Marketing platforms are built to attribute. Attribution is not the same as causal measurement.

A platform can tell you: this ad was shown, this user converted, here is the credit assignment. It cannot tell you: this ad caused the conversion, and the conversion would not have occurred without it.

To know whether a campaign caused an outcome, you need a counterfactual: a version of events where the campaign did not run. Platforms do not provide counterfactuals. They provide attribution.

The practical consequence: every marketing report is showing you correlations formatted to look like causes. The 6.4x ROAS is not a causal claim. It is an attribution. Those two things are often confused, especially when the report looks definitive.

What causal measurement actually requires

Getting from correlation to causation in marketing requires one of three approaches.

Holdout testing. A portion of the target audience does not see the campaign. The difference in conversion rates between exposed and unexposed groups is the incremental lift. This is the cleanest method. It requires either platform-level holdout capability or a controlled test design.
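
A minimal sketch of the lift calculation, with hypothetical counts:

```python
import math

# Incremental lift from a holdout test: compare conversion rates between
# the exposed group and the unexposed holdout. All counts are hypothetical.

exposed_n, exposed_conv = 50_000, 1_250    # saw the campaign
holdout_n, holdout_conv = 50_000, 1_000    # did not

cr_exposed = exposed_conv / exposed_n      # 2.5%
cr_holdout = holdout_conv / holdout_n      # 2.0%

lift = cr_exposed - cr_holdout             # absolute lift: 0.5 points
incremental = lift * exposed_n             # ~250 conversions caused

# Two-proportion z-test: is the lift distinguishable from noise?
pooled = (exposed_conv + holdout_conv) / (exposed_n + holdout_n)
se = math.sqrt(pooled * (1 - pooled) * (1 / exposed_n + 1 / holdout_n))
z = lift / se

print(f"Incremental conversions: {incremental:.0f}, z = {z:.2f}")
```

The platform would attribute all 1,250 conversions to the campaign. The holdout says roughly 250 of them were caused by it.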

Geo experiments. The campaign runs in some markets but not others, matched for underlying demand. The difference in outcomes between markets gives an estimate of causal impact. Less clean than holdouts, but practical for businesses that cannot run user-level tests.
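
A difference-in-differences sketch with hypothetical market revenue; the control markets' growth stands in for the counterfactual:

```python
# Difference-in-differences over matched markets. Revenue and spend
# figures are hypothetical. Treatment markets ran the campaign in the
# test period; control markets, matched for underlying demand, did not.

treatment_pre, treatment_post = 400_000, 460_000   # campaign markets
control_pre, control_post = 410_000, 430_000       # matched holdout markets

# Control-market growth estimates what would have happened anyway.
treatment_change = treatment_post - treatment_pre  # +60,000
control_change = control_post - control_pre        # +20,000

causal_impact = treatment_change - control_change  # +40,000 incremental

spend = 30_000   # assumed campaign spend in treatment markets
print(f"Estimated incremental revenue: {causal_impact:,}")
print(f"Incremental ROAS: {causal_impact / spend:.2f}x")
```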

Marketing mix modeling. A statistical model that attempts to decompose revenue into its drivers (paid media, organic, seasonality, pricing, macro conditions) using time-series data. It requires historical data and methodological rigor. It gives an estimate of contribution, not a precise causal claim, but it is far more informative than platform attribution alone. A toy version of the idea appears below.
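
The sketch below uses synthetic data and ordinary least squares to show the decomposition idea only. A real MMM needs two or more years of data, adstock and saturation transforms, and out-of-sample validation, none of which this toy includes.

```python
import numpy as np

# Toy marketing mix model: decompose weekly revenue into baseline,
# seasonality, and two paid channels via ordinary least squares.
# All data is synthetic and purely illustrative.

rng = np.random.default_rng(0)
weeks = 104
search_spend = rng.uniform(5_000, 15_000, weeks)
social_spend = rng.uniform(2_000, 10_000, weeks)
season = 1 + 0.3 * np.sin(2 * np.pi * np.arange(weeks) / 52)

# "True" data-generating process, known here only because we made it up.
revenue = (50_000 * season            # baseline demand
           + 2.0 * search_spend       # true search contribution
           + 0.8 * social_spend       # true social contribution
           + rng.normal(0, 5_000, weeks))

# Design matrix: intercept, seasonality, channel spends.
X = np.column_stack([np.ones(weeks), season, search_spend, social_spend])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)

print(f"Estimated revenue per $ of search spend: {coef[2]:.2f}")
print(f"Estimated revenue per $ of social spend: {coef[3]:.2f}")
```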

None of these are standard in most marketing operations. Most businesses run the campaign, read the platform report, and treat the attributed revenue as causal evidence. The budget decision follows from that.

The Digital Economic Review audits attribution quality as part of the analysis: whether your ROAS numbers reflect commercial reality or platform measurement artifacts.


The practical question

The gap between correlation and causation shows up most visibly in two places.

Branded search campaigns. High ROAS, low incremental value. The people clicking branded ads were going to buy. Pausing branded campaigns usually reveals that most of the attributed revenue continues through organic. The campaign was measuring existing demand, not generating new demand.

Remarketing campaigns. High conversion rates, high attributed ROAS. The audience being targeted already showed purchase intent. The platform attributes the conversion to the remarketing ad. In many cases, those conversions would have occurred without it.

This is not an argument for pausing these campaigns. It is an argument for knowing what they are actually contributing, rather than what the platform says they are contributing. The two numbers can be very different.

The discipline behind the question

A marketing report that shows correlation is not a bad report. It is an honest one, if you understand what it is showing.

The discipline is in knowing that the report cannot answer the causal question, and building the measurement practice that can.

Most businesses don't. They treat attributed revenue as causal evidence, allocate budget based on ROAS, and wonder why scaling high-ROAS campaigns produces diminishing returns. The diminishing returns were always there. The attribution model was not showing them.

Causal measurement is harder than attribution. It requires test design, controlled conditions, and willingness to accept that the answer might be lower than the platform reports. It is also the only way to know whether the spend is actually driving the outcome, or just measuring it.


Adela Mincea is a Marketing Economist based in Cluj-Napoca. The Digital Economic Review covers attribution quality, ROAS stability, and CAC by channel: the analysis that sits between what the platform reports and what the business actually experiences.

About the author

Adela Mincea is a marketing economist, paid media strategist, and certified trainer. She helps growing businesses make marketing profitable before scaling it by validating margins, acquisition economics, and pricing power before deploying paid media and AI-enabled systems.


The Marketing Economist

One concept from economics. One marketing decision it changes.

One issue per month. Each one takes an economic concept and applies it to a real marketing decision, the kind that affects budget, margin, or growth.