Isn't it fascinating? The very lifeblood of modern business – data – is often entrusted to departments that seem to operate under their own economic laws. Meanwhile, it's the actual IT systems that generate most of an organization's revenue and profit, yet they receive a fraction of the fanfare. For years, the analytics function has felt less like a profit center and more like an expensive experiment where spreadsheets spontaneously combust into million-dollar losses, all under the guise of "insights." But fear not, because just as we started questioning the ROI on all that 'big data,' along came AI – the dazzling magician promising to finally pull a revenue-generating rabbit out of the analytical hat.
The Price of Promise
Walk the halls of any enterprise today and you'll hear the gospel of data-driven decision making. "Data is the new oil!" they proclaim with religious fervor. This metaphor is, of course, completely accurate – both require massive infrastructure investments before delivering value, both create environmental hazards when mishandled, and both are rapidly being replaced by newer, cleaner alternatives.
The chorus has a new verse now: "AI will revolutionize everything!" The same teams that struggled to deliver actionable insights from basic analytics are suddenly implementing transformative AI solutions. These newly minted "AI visionaries" speak with the precision and clarity of quantum physicists explaining string theory to kindergartners.
A typical scenario unfolds with clockwork predictability:
Company invests millions in data infrastructure and specialized talent
Everyone agrees that being "data-driven" is mission-critical
Two years pass with little to show for it beyond increasingly baroque dashboards
Leadership quietly questions the return on investment
The solution? Add AI – like putting a jet engine on a skateboard to improve your commute. Guaranteed to solve all your speed problems.
This isn't to suggest data initiatives lack value. The profound disconnect lies in how we measure contribution. The data team often presents findings as the sole reason for any positive trend, claiming credit for the revenue that reliable IT systems have been quietly generating all along. Meanwhile, the fundamental question – "Are we getting value commensurate with our investment?" – gets lost in dashboards showing impressive metrics with increasingly tenuous connections to actual business outcomes.
The Architecture of Excess
We build data platforms like they're meant to survive nuclear winter, complete with layers of technology that solve problems the business doesn't know it has. It's like we're preparing for an alien invasion that will be thwarted only by perfectly governed customer contact information. Meanwhile, the marketing team just wants to know which ad campaign is performing best this week.
Enter AI – the perfect excuse to add yet another architectural layer. Organizations that couldn't effectively organize customer contact information now implement "AI-driven customer intelligence." This approach makes perfect business sense, much like adding a fifth wheel to a car with four flat tires to improve performance.
Boardrooms echo with panicked whispers: "But what if our competitors are doing something we're not?" This AI FOMO drives investments based not on value but on the terror of appearing technologically backward. After all, nothing says "strategic vision" quite like chasing whatever technology your competitors might possibly be considering.
Data governance deserves special mention in our architectural cathedral:
It has evolved into its own sovereign nation with a constitution thicker than War and Peace
Its language consists primarily of acronyms that would make the Pentagon blush
Its border security ensures no "ungoverned" data crosses its boundaries without proper documentation
The actual business users remain outside, wondering if they need a passport and three forms in triplicate just to glance at their own information
And now we're adding AI governance to the mix, often before mastering basic data governance. This sequential approach is highly efficient, much like learning to run a marathon before figuring out how to walk.
The Language Barrier
Perhaps the most insidious cost driver is linguistic. Data professionals have developed vocabularies so specialized they might as well be speaking in encrypted code. The AI era has supercharged this linguistic arms race. The most remarkable transformation isn't in technology but in people – the speed at which professionals with limited statistical backgrounds have rebranded themselves as AI experts is truly the most impressive algorithm of all.
Witness this entirely hypothetical but somehow universally recognized conversation:
Product manager: "Can we see which customers are most likely to upgrade?"
Data team: "We'll need to implement a feature store with proper data lineage tracking, deploy a gradient-boosting classification model through our MLOps pipeline, and visualize the outputs through our self-service analytics platform."
AI team (interrupting): "Actually, we should leverage a multi-modal large language model with zero-shot learning capabilities to analyze customer sentiment across touchpoints, then use reinforcement learning from human feedback to optimize our recommendation engine."
Product manager (silently): "I just wanted to send an email to the right people."
The translation gap widens exponentially, and with it, the budget required to bridge it.
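For contrast, the request in that exchange is roughly this much work. The sketch below is a minimal, hypothetical illustration, not anyone's production system: it assumes a customer table with a few made-up usage columns and an "upgraded" flag from past campaigns, and simply ranks the remaining customers by upgrade likelihood so someone can send that email.

```python
# Minimal sketch: rank customers by likelihood to upgrade.
# Assumes a hypothetical CSV with columns: customer_id, logins_last_30d,
# seats_used, support_tickets, upgraded (1/0 where the outcome is known,
# blank where it is not). All names are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression

customers = pd.read_csv("customers.csv")
features = ["logins_last_30d", "seats_used", "support_tickets"]

# Train on customers whose upgrade outcome is already known.
history = customers.dropna(subset=["upgraded"])
model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["upgraded"])

# Score everyone else and hand marketing the top of the list.
candidates = customers[customers["upgraded"].isna()].copy()
candidates["upgrade_score"] = model.predict_proba(candidates[features])[:, 1]
email_list = candidates.sort_values("upgrade_score", ascending=False).head(500)
print(email_list[["customer_id", "upgrade_score"]])
```

No feature store, no MLOps pipeline, no multi-modal anything; whether that simple version is good enough is exactly the question the jargon tends to bury.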
The Self-Fulfilling Salary Prophecy
This specialized language creates another remarkable economic phenomenon: compensation expectations completely detached from value creation. The mystique around data science has created a market where salaries often reflect perceived complexity rather than actual impact.
The AI era has accelerated this process:
Yesterday's "data analyst" is today's "AI architect" through the magical process of updating their LinkedIn profile
Salary expectations increase in direct proportion to the number of incomprehensible terms one can include in a job description
The speed of this transformation is remarkable – from analyzing spreadsheets to "designing neural architectures" in approximately the time it takes to complete a Coursera certificate
There's a self-reinforcing cycle at work here that would make economic theorists proud. The more complex the work appears, the higher the perceived value. The higher the compensation, the more complex the solutions must become to justify it. This cycle continues until someone impolitely asks about ROI, at which point the conversation quickly pivots to the next emerging technology.
Breaking the Cycle
How might organizations approach data and analytics differently? Perhaps a radical thought: What if we measured the data team not by the sheer volume of data they manage or the elegance of their technical solutions, but by – and I recognize this might sound utterly revolutionary – the actual business value they create?
And what if we applied the same standard to AI initiatives? Instead of asking "How can we use AI?" we might ask "What business problems are we trying to solve, and is AI actually the most efficient solution?" I recognize that such pragmatism borders on heresy in some circles.
Consider these alternatives:
Value creation over asset accumulation: Data functions measured by tangible business value delivered, not technical complexity managed
Integration over specialization: Data governance embedded in business processes rather than existing as a separate kingdom with its own aristocracy
Simplicity over complexity: "Customers who buy product A are three times more likely to cancel if we don't contact them within 30 days" instead of elaborate visualizations of statistical distributions (a worked sketch of that kind of statement follows this list)
Technology appropriateness over technological superiority: Sometimes a well-designed Excel spreadsheet or, dare I say it, a conversation with actual customers provides better insights than the latest neural network
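To make the "simplicity over complexity" point concrete: the plain-language claim above is nothing more than a ratio of two cancellation rates. The snippet below is a hedged sketch with hypothetical table and column names, not a prescribed method.

```python
# Minimal sketch: compare cancellation rates for product-A buyers who were
# contacted within 30 days versus those who were not. Column names are
# illustrative assumptions.
import pandas as pd

# Hypothetical columns: customer_id, cancelled (bool), days_to_first_contact
orders = pd.read_csv("product_a_orders.csv")

contacted = orders[orders["days_to_first_contact"] <= 30]
not_contacted = orders[
    (orders["days_to_first_contact"] > 30) | orders["days_to_first_contact"].isna()
]

rate_contacted = contacted["cancelled"].mean()
rate_not_contacted = not_contacted["cancelled"].mean()

print(f"Cancellation rate, contacted within 30 days: {rate_contacted:.1%}")
print(f"Cancellation rate, not contacted:            {rate_not_contacted:.1%}")
print(f"Relative risk without contact: {rate_not_contacted / rate_contacted:.1f}x")
```

A sentence and a ratio like this can move a retention budget faster than any dashboard of distributions, which is the whole argument.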
The Path Forward
So, the next time you hear about a multi-million dollar data initiative – or an AI transformation that promises to revolutionize your business – ask the tough questions. Demand to see the receipts. The emperor of data-driven decision-making might be wearing considerably fewer clothes than we've been led to believe.
The uncomfortable truth is that many data functions would have been shut down long ago if they operated as standalone businesses. Their costs often exceed their demonstrable contributions, while the core IT systems that actually generate most revenue are expected to run flawlessly on ever-shrinking budgets. Yet organizations continue to invest in ever more elaborate data initiatives:
Partly because of genuine potential
Partly because of fear of falling behind
Partly because measuring actual impact is genuinely difficult
And partly because we've created organizations where asking "Is this worth it?" is career-limiting
The AI wave has only intensified this dynamic. Organizations fearful of missing the next technological revolution funnel resources into initiatives led by the same teams that struggled with basic analytics. This progression makes perfect sense, much like promoting someone who struggles with addition to teach calculus.
Moving forward requires a return to fundamentals. The most successful data organizations focus relentlessly on business outcomes, speak in terms their colleagues understand, and implement solutions that match the actual complexity of the problem – not the perceived complexity that justifies their existence.
In the end, the goal isn't to have the most advanced data capabilities or the latest AI models. The goal is making better decisions that lead to superior business outcomes. Everything else is just expensive ornamentation. Or in today's parlance, it's just another large language model confidently stating things that sound plausible but have no basis in reality – the ultimate evolution of business communication.