I’ve been working with digital analytics long enough to remember a time before Looker Studio, which back then was still called Data Studio, and before tools like Supermetrics could pull data from almost anywhere with a few clicks.
I even remember when Google Tag Manager was something new and when Universal Analytics was launched in beta.
In those days, everything required effort, from implementing tracking without a tag manager to building a dashboard. Not only technical effort, but conceptual effort as well.
You had to decide what you actually wanted to show, because every chart, every metric, and every data source added complexity. That forced prioritisation, and quite often it forced better conversations. You could not track or visualise everything, so you had to agree on what mattered.
Today, those constraints are mostly gone. And with them, something else quietly disappeared.
When dashboards multiplied
Dashboards multiplied for understandable reasons.
Several things happened at the same time:
- visualisation tools became free or inexpensive
- data pipelines were productised and easy to connect
- GA4 became harder to use for many everyday questions
- BigQuery data streams made raw data feel both accessible and powerful
When the core analytics interface stopped serving teams well, they naturally looked for alternatives.
Raw data promised flexibility. Dashboards promised control.
It became easy, and often necessary, to build separate reporting layers for marketing, product, sales, and management.
On paper, this looked like progress. More flexibility, fewer platform limitations, and more tailored views for different stakeholders. In practice, dashboards multiplied much faster than shared understanding.
Each dashboard solved a local reporting problem. Very few were designed to support an actual decision. Metrics looked clean and precise, even though they never really were.
Precision without truth
Analytics metrics have always been approximations.
Sampling, attribution models, consent limitations, missing users, imperfect identifiers. None of this is new. What has changed is how numbers are presented, and how easily that turns into authority.
Modern dashboards look exact.
Two decimal places, clean trend lines, neat comparisons, sometimes even forecasts.
The visual language suggests accuracy and certainty, even when the underlying data is anything but.
Precision, however, is not the same as truth.
And metrics are never bullet-proof, no matter how much management wants them to be.
When metrics are detached from their assumptions, they gain more authority than they deserve. Numbers stop being inputs to discussion and slowly start acting like answers. And once a metric looks official enough, questioning it feels unnecessary or even uncomfortable.
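To make that concrete, here is a minimal sketch with entirely hypothetical numbers, showing how much statistical uncertainty can hide behind a rate displayed to two decimal places. It accounts for sampling noise only; consent gaps, attribution choices, and missing users would widen the range further.

```python
# Hypothetical figures: a conversion rate a dashboard would render as "3.42%".
import math

conversions = 412      # hypothetical monthly conversions
sessions = 12_037      # hypothetical monthly sessions

rate = conversions / sessions
# Standard error of a proportion: sqrt(p * (1 - p) / n)
se = math.sqrt(rate * (1 - rate) / sessions)
# Rough 95% interval (normal approximation)
low, high = rate - 1.96 * se, rate + 1.96 * se

print(f"Dashboard shows: {rate:.2%}")                          # 3.42%
print(f"Sampling uncertainty alone: {low:.2%} to {high:.2%}")  # 3.10% to 3.75%
```

Even before the messier sources of error are counted, the honest range is wider than the displayed precision implies.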
This is how analytics quietly loses its critical role.
When answers disappeared
As dashboards multiplied, answers became harder to find.
When something changes today, the first reaction is often not “why did this happen?”, but “which dashboard is correct?”.
Conversations shift from interpretation to reconciliation. Screenshots replace reasoning.
Time is spent aligning numbers instead of understanding behaviour.
At the same time, it is worth being honest about how few decisions analytics actually drives. In many enterprises, marketing budgets are still allocated largely based on previous experience, internal beliefs, and organisational memory rather than on current data.
Analytics is often brought in after decisions are already directionally set, to support, explain, or legitimise them.
When that is the role analytics plays, clarity is difficult to achieve by definition.
How self-service accelerated confusion
Self-service analytics was supposed to democratise data, and in many ways it did.
But it also removed a shared point of reference.
When everyone can create their own reports, several things tend to happen over time:
- definitions start to drift
- assumptions remain implicit
- context disappears
- responsibility becomes unclear
Two teams can talk about performance while looking at entirely different realities, using the same words but meaning different things.
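Here is a hedged illustration with invented events and invented team conventions: two teams both report a “conversion rate”, computed over different denominators.

```python
# Hypothetical events and hypothetical team definitions of "conversion rate".
events = [
    {"user": "a", "type": "session_start"},
    {"user": "a", "type": "purchase"},
    {"user": "b", "type": "session_start"},
    {"user": "b", "type": "session_start"},
    {"user": "c", "type": "session_start"},
]

purchases = sum(1 for e in events if e["type"] == "purchase")
sessions = sum(1 for e in events if e["type"] == "session_start")
users = len({e["user"] for e in events})

# Marketing divides by sessions; product divides by users.
print(f"Marketing's conversion rate: {purchases / sessions:.0%}")  # 25%
print(f"Product's conversion rate: {purchases / users:.0%}")       # 33%
```

Neither number is wrong. They answer different questions, and nothing in the word “performance” reveals which one is on the slide.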
Meetings still move forward because numbers are present. Slides still look convincing. Decisions still get made. What is missing is not data, but alignment. The result is a subtle and dangerous illusion: agreement without understanding.
“Just in case” analytics
Another shift happened quietly along the way.
Because storage is cheap and tools promise future flexibility, organisations started collecting everything. Every click, every interaction, every potentially interesting dimension. Not because it supports a known decision today, but just in case it might be useful one day.
This kind of “just in case analytics” feels responsible and forward-looking. In reality, it often increases noise, risk, and cognitive load.
Data accumulates much faster than meaning. Years later, teams struggle to remember why certain events exist, what they were meant to represent, or whether they still describe anything relevant at all.
Collecting data is easy. Maintaining meaning over time is not.
Why this isn’t a tooling problem
It is tempting to blame the tools.
GA4 is complex. Dashboards are fragmented. AI promises insights and delivers confidence. Surely the next platform or feature will fix it.
But this is not a tooling problem.
Analytics tools are very good at collecting data, visualising patterns, and scaling access.
They are not designed to define decisions, surface assumptions, expose uncertainty, or create shared understanding.
Those responsibilities do not disappear when tools improve. They become more important.
From measurement to meaning
When analytics as a function (not as a tool) genuinely helps decisions, a few things are usually true:
- fewer metrics matter
- assumptions are explicit (see the sketch after this list)
- uncertainty is acknowledged
- not every question has an answer, and that is accepted
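What “explicit” looks like in practice varies, but one minimal sketch, with a hypothetical structure and an invented metric entry, is to store every metric next to the decision it serves, its known caveats, and an owner:

```python
# A hypothetical metric registry entry: the caveats and the owner
# travel with the number, not just the query behind it.
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    name: str
    definition: str         # what the number means, in plain language
    decision: str           # the decision this metric is meant to inform
    assumptions: list[str]  # known gaps and modelling choices
    owner: str              # who answers when the number is questioned

conversion_rate = MetricDefinition(
    name="conversion_rate",
    definition="purchases divided by sessions, per calendar week",
    decision="weekly budget reallocation across paid channels",
    assumptions=[
        "consent-declined users are missing entirely",
        "attribution follows the platform's default model",
        "'session' means whatever the analytics tool says it means",
    ],
    owner="marketing analytics team",
)
```

None of this requires new tooling. It requires agreeing, in writing, on what each number is for.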
Clarity does not come from more dashboards or more data streams. It comes from agreeing what the data is for, and just as importantly, what it cannot do.
That agreement requires effort and judgment. It rarely survives environments where tools promise to make thinking unnecessary.
A familiar feeling
If this feels familiar, you are not alone.
Most organisations do not suffer from a lack of data. They suffer from a lack of shared understanding about what their data actually represents, and how it should be used.
And if that sounds like your organisation, the problem isn’t missing data. Or missing tools.