Why no one is happy with digital marketing analytics — and how we got here

“What did marketing actually achieve?”

It’s the question at the end of every quarter. Or every month. And the answer never quite satisfies anyone in the room.

Executives roll their eyes at vanity metrics, marketers feel misunderstood, and analysts stay silent.

Dashboards grow more colourful and interactive but not more helpful.

Why is it so hard to tell a compelling story about marketing success? Why do we keep running in circles?

Let’s unpack this from the perspective of each role in the room.

The C-level wants clear impact

The executive team — CEO, CFO, CRO — doesn’t have time for nuance. They want to know:

  • Is our investment paying off?
  • Are we growing?
  • Where should we allocate budget next?

Their ideal marketing report has a single number. Revenue up and to the right. ROI. CAC down. Pipeline value. Something that fits into a deck for the board meeting.

But they rarely get it. Instead, they’re given clicks, impressions, engagement rates, and conflicting numbers from GA4 and advertising platforms. 

And, of course, none of these match the CRM data.

Marketing and analytics don’t answer their questions. So they stop trusting marketers and analysts.

Marketers want credit for everything

Marketing, on the other hand, knows that brand awareness, trust, and content influence happen long before a form submission or a sale.

They want credit for:

  • Building visibility in a new market
  • Growing brand recognition
  • Nurturing buyers over long sales cycles
  • Generating leads
  • Closing deals and transactions

They know not everything can be tracked, but they also need analytics to justify their budget. 

So, they ask for dashboards, more events, and better attribution.

But even then, the numbers rarely match their internal narrative because:

  • Analytics tools weren’t made for measuring brand.
  • Advertising platforms’ attribution reports are one-sided.
  • Sales and marketing data are siloed.
  • Most customers can’t be tracked anyway.

Analysts are stuck in the middle

Digital analysts (often technical, sometimes outsourced, and always under pressure) are told to “show marketing’s impact.”

But:

  • The data is incomplete or dirty.
  • The definitions of success change every quarter.
  • No one agrees on which metrics matter.
  • Web analytics tools and advertising platforms tell only their part of the story.
  • Analysts are often too deep in the technical details to speak the language of the business.

So they build what they’re asked to build. More dashboards. More events. More charts. Data we can collect and visualise.

Because that’s what marketers ask for — not because it answers the real C-level questions.

We keep running in circles

But at the end of the quarter, the question remains: “What did marketing achieve?”

And no one’s happy with the answer: clicks, form submissions, engagement.

And, of course, they could also ask: “What did analytics achieve?”

The answer: more tags, events, and dashboards. But not insightful analyses, answers to the critical questions, or business recommendations.

So we try to fix it.

We buy better tools.

We build more dashboards.

We invest in data warehouses.

We pretend that if we tag everything, the truth will reveal itself. But it doesn’t, because the problem isn’t a lack of data.

It’s the lack of agreement on what success means and how it’s measured.

And lack of trust in data. And lack of analysis.

Marketing does what it can. Analytics does what it must. And C-level waits for a story that makes sense of it all.

Still waiting.

What needs to change (beyond dashboards and tracking)

We’ve been through this loop enough to know it’s not just about better reporting.

We need a new foundation for marketing measurement — one built on realism, integration, and experimentation.

Stop chasing a single source of truth

Marketers often ask: “Which number should we trust — GA4 or Meta Ads?” 

The honest answer? Neither and both. Each platform reports what it can see. 

GA4 misses a lot because of consent requirements, device switching, and its attribution rules. Ad platforms overstate their own contribution.

Instead of debating which tool has correct data, accept that there is no single source of truth.

Integrate data sources and store them centrally

To get a more complete view, bring all your data together:

  • GA4 event and session data
  • CRM and sales pipeline data
  • Campaign cost and conversion data from all ad platforms

Store this in a data warehouse to give analysts the freedom to query, model, and combine data across sources without being tied to one tool’s interface.

(And, hopefully, you do have analysts in your team.)
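
To make that concrete, here is a minimal sketch of the integration step in Python, assuming hypothetical daily extracts. The file names and column names are illustrative, not a prescribed schema:

```python
# A minimal sketch of pulling the three sources into one table.
# All file names and columns below are assumptions for illustration.
import pandas as pd

# Daily GA4 sessions and conversions per channel (e.g. a BigQuery export)
ga4 = pd.read_csv("ga4_daily.csv", parse_dates=["date"])          # date, channel, sessions, conversions
# Daily spend per channel, pulled from each ad platform's API
spend = pd.read_csv("ad_spend_daily.csv", parse_dates=["date"])   # date, channel, cost
# Closed-won revenue per day from the CRM
crm = pd.read_csv("crm_revenue_daily.csv", parse_dates=["date"])  # date, revenue

# One wide fact table: a row per date and channel, with revenue joined on date
facts = (
    ga4.merge(spend, on=["date", "channel"], how="outer")
       .merge(crm, on="date", how="left")
       .fillna({"sessions": 0, "conversions": 0, "cost": 0})
)
facts.to_parquet("marketing_facts.parquet")  # then load into the warehouse
```

The exact shape matters less than the principle: every source lands in one queryable place, on a shared date and channel key.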

Use models that reflect how marketing works

Once the data is centralised, don’t just count events. 

Use econometric models or machine learning algorithms to analyse how different variables affect business outcomes over time. This allows you to:

  • Measure long-term brand effects
  • Separate the influence of seasonality, pricing, and promotion
  • Understand the marginal ROI of each channel

It’s not easy, but it’s how mature teams move from gut feeling to informed decision-making.
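
As a rough illustration of what such a model can look like, here is a deliberately simplified marketing-mix-style regression. It reuses the hypothetical fact table from the sketch above; the adstock decay, the weekend dummy, and every column name are assumptions, and a production model would need proper treatment of seasonality, pricing, and saturation:

```python
# A simplified marketing-mix-style sketch, not a production model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def adstock(series, decay=0.5):
    """Carry part of yesterday's effect forward to model lagged brand impact."""
    out = np.zeros(len(series))
    for t, s in enumerate(series):
        out[t] = s + (decay * out[t - 1] if t > 0 else 0.0)
    return out

df = pd.read_parquet("marketing_facts.parquet")              # hypothetical fact table
spend = df.pivot_table(index="date", columns="channel",
                       values="cost", aggfunc="sum").fillna(0)

X = spend.apply(adstock)                                     # adstock each channel's spend
X["weekend"] = (X.index.dayofweek >= 5).astype(int)          # crude seasonality control
y = df.groupby("date")["revenue"].first().reindex(X.index).fillna(0)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())  # coefficients ≈ incremental revenue per unit of adstocked spend
```

Even a toy model like this forces the conversation that dashboards avoid: which effects are lagged, what else moves revenue, and how confident we really are in each coefficient.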

Validate with triangulation, not trust

Don’t expect GA4 to match Meta’s ad reports. 

Compare models with multiple data sources, test assumptions, and look for patterns, not for perfect matches. 

When your model says Meta drives 30% of sales and Meta says 70%, you’re starting a conversation — not exposing a failure.
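
In practice, triangulation can be as plain as lining up every estimate of the same quantity and looking at the spread. The numbers below are illustrative placeholders, not benchmarks from any real account:

```python
# Triangulating one question: what share of sales does Meta actually drive?
# All values are made-up placeholders for illustration.
meta_share_by_source = {
    "Meta Ads platform report": 0.70,   # platform-attributed share
    "GA4 last-click":           0.25,
    "MMM coefficient":          0.30,   # e.g. from the regression sketch above
    "Holdout experiment":       0.28,   # e.g. from a channel-pause test
}

estimates = meta_share_by_source.values()
print(f"Meta's contribution: somewhere between {min(estimates):.0%} and {max(estimates):.0%}")
# Three estimates clustering near 30% against one 70% outlier is exactly the
# pattern worth discussing: a conversation starter, not a data failure.
```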

Run bold experiments

The ultimate test isn’t in dashboards — it’s in reality. Try:

  • Turning off a channel for a few weeks
  • Allocating more budget to promising channels
  • Running A/B tests on creative, messaging, or media mix

Yes, it’s risky. Yes, it could hurt short-term results. But without experiments, you’ll never escape the trap: attribution models built on cookies and IDs, and marketing mix models that can’t answer the big questions on their own.
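
To show how simple the readout of such a test can be, here is a sketch of evaluating a channel pause across a random set of regions. The file, the columns, and the use of a plain Welch’s t-test are all assumptions; a real geo experiment needs careful region matching and enough runtime:

```python
# A minimal readout of a hypothetical channel-pause (holdout) test.
import pandas as pd
from scipy import stats

df = pd.read_csv("geo_test_daily.csv")   # columns assumed: region, group, revenue
holdout = df.loc[df["group"] == "holdout", "revenue"]   # channel paused here
control = df.loc[df["group"] == "control", "revenue"]   # channel kept running

lift = control.mean() - holdout.mean()   # daily revenue the channel appears to drive
t_stat, p_value = stats.ttest_ind(control, holdout, equal_var=False)  # Welch's t-test
print(f"Estimated incremental revenue per day: {lift:,.0f} (p = {p_value:.3f})")
```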

Why it doesn’t happen (yet)

The problem isn’t technical. It’s cultural.

Marketing departments fear losing momentum. Analysts fear being ignored. Leadership fears complexity.

So, the status quo wins. We build dashboards, track more, and chase the same unsatisfying answers.

Because changing course (collecting better data, integrating it, modelling it, validating it, experimenting with budget) feels risky, difficult and expensive.

But here’s the irony: continuing as we are is the biggest risk of all.

If we can’t prove marketing’s value, someone else will decide for us. Budget cuts, reorganisation, layoffs… We’ve all seen it.

Toward a shared story

So maybe we don’t need more dashboards. Maybe what we need is a shared story:

A story executives believe.

A story marketers feel represents their work.

A story analysts are proud to support with clean, tested, realistic data.

That story won’t come from GA4, Adobe Analytics, Piwik PRO, attribution tools, or tagging “everything.”

It comes from doing the hard work: defining success, collecting the correct data, analysing with humility, and testing bravely.

Until then, we’ll keep running in circles.