A story is doing the rounds in the business intelligence community that I think is worth paying attention to.
A visual analytics consultant, posting on the Tableau subreddit this week, watched his client demo everything they’d rebuilt using Claude and Google Sheets.
It had taken them one week. He’d been working with them for years. That’s not one bad engagement. That’s a signal.
Now, a sceptic would reasonably point out that this is one example, and I don’t know enough about that engagement to draw firm conclusions. But I don’t think it’s an isolated case, and the structural forces I’ve been watching for the past year suggest it’s going to become more common, not less.
For the past year or so I’ve been experimenting at the intersection of AI tooling and data visualisation. I should say upfront: I use Tableau and Power BI daily, I’m a Tableau Ambassador, and I’ve been working with BI tools for over 15 years. This isn’t a critique from the outside. But increasingly I’ve found myself bypassing these tools altogether, building dashboards directly on top of cloud data platforms like MotherDuck and Snowflake.
The results have honestly surprised me. Claude can connect to a database, make sense of messy structured and unstructured data, and surface useful, communicable information at a speed I couldn’t match myself. And I’ve been doing this for twenty years. Something is shifting.
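To make that concrete, here's a minimal sketch of the kind of loop I mean, using the DuckDB and Anthropic Python clients. The database, table and model id are all placeholders, and in practice you'd iterate on the questions rather than fire one shot.

```python
# A minimal sketch: pull a schema and a sample from MotherDuck with DuckDB,
# then ask Claude what's worth surfacing. Assumes MOTHERDUCK_TOKEN and
# ANTHROPIC_API_KEY are set; "my_db", "orders" and the model id are placeholders.
import duckdb
import anthropic

con = duckdb.connect("md:my_db")  # the "md:" prefix routes the connection to MotherDuck
schema = con.sql("DESCRIBE orders").df().to_string()
sample = con.sql("SELECT * FROM orders LIMIT 20").df().to_string()

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-sonnet-4-5",  # placeholder; use whichever model you have access to
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": f"Table schema:\n{schema}\n\nSample rows:\n{sample}\n\n"
                   "What trends or anomalies here would be worth surfacing "
                   "to a non-technical stakeholder?",
    }],
)
print(message.content[0].text)
```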
I think a lot of analytics professionals are currently falling into what I'd call the rendering trap: mistaking the output layer for where the value actually lives. Business intelligence tools (Tableau, Power BI, Looker) are rendering layers. Yes, they bundle in data preparation, security and semantic layers. But the overwhelming majority of their value lies in making vast amounts of information understandable.
That job is now contestable in a way it simply wasn’t three years ago. And this isn’t just my observation.
Mat Hughes at Convivial Tools recently mapped the structural forces bearing down on the BI market, and they’re converging from every direction. From above, AI-native analytics tools are consuming dashboards as context for reasoning rather than destinations for users. From below, data platforms like Snowflake and Databricks are realising they can deliver insights directly, without a separate BI layer sitting in between. And from the side, cheaper and more flexible alternatives are picking up the problems the major vendors have been slow to address: ease of use, cost, and the sheer weight of features many organisations simply never asked for.
The BI tool is the middleman in a market that’s now eliminating middlemen.
To be fair, the likes of Tableau/Salesforce and Microsoft are responding. Tableau Next is built around agentic AI, with conversational assistants and a new semantic layer designed to give the AI a proper understanding of your business data. Microsoft has Copilot embedded in Power BI and adjusted their file formats to be more AI-friendly. These are real investments, not vaporware.
But I keep coming back to the same questions.
Why would I pay for conversational AI locked inside Tableau when I can connect a frontier model directly to my data warehouse and get the same result, probably a better one, for a fraction of the cost?
Why maintain an expensive Power BI estate when Snowflake and Streamlit will do the job for a fraction of the price?
To be clear, I’m not saying the Claude+Snowflake+Streamlit solution comes at zero cost. Someone has to build and maintain it. But the economics have shifted enough that for many organisations that calculus now looks very different to what it did three years ago.
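For what it's worth, here's roughly what the cheap end of that stack looks like: one Streamlit script querying Snowflake. The credentials and the sales.orders table are invented for illustration, and a real deployment would pull secrets from st.secrets rather than hard-coding anything.

```python
# streamlit_app.py -- a sketch of a one-file Snowflake dashboard.
# Requires streamlit and snowflake-connector-python[pandas].
import streamlit as st
import snowflake.connector

@st.cache_data(ttl=600)  # cache the query result for ten minutes
def load_monthly_revenue():
    con = snowflake.connector.connect(
        account="my_account",      # placeholder
        user="my_user",            # placeholder
        password="my_password",    # placeholder: use st.secrets in practice
        warehouse="ANALYTICS_WH",  # placeholder
    )
    try:
        cur = con.cursor()
        cur.execute(
            "SELECT DATE_TRUNC('month', order_date) AS month, "
            "SUM(revenue) AS revenue "
            "FROM sales.orders GROUP BY 1 ORDER BY 1"
        )
        return cur.fetch_pandas_all()
    finally:
        con.close()

st.title("Monthly revenue")
df = load_monthly_revenue()
st.bar_chart(df, x="MONTH", y="REVENUE")  # Snowflake upper-cases column names
st.dataframe(df)
```

Run it with `streamlit run streamlit_app.py` and that's the whole estate: one query, a cache and two render calls.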
The problem is that vendors are building AI features inside their walled gardens at the precise moment the walls are coming down.
Vendors would reasonably push back on much of this and point to governance: security, enterprise trust layers, compliance audit trails, all that. For large regulated organisations, that's a convincing argument. But for the mid-market, where most of the churn risk probably sits, I'm not sure it's compelling enough to absorb a 40% price increase on a rendering layer that's becoming contestable from every direction.
If my argument is right, where does that leave the visual analytics professional?
The instinct for many will be to double down. Another certification. Deeper expertise in a specific platform. It's an easy trap to fall into, and the vendors have spent years encouraging it. But tying your identity to a single tool ties your career to that vendor's fortunes, and right now that's a particularly dangerous bet.
The data professionals I expect to do well are the ones who've stopped building their careers around a particular platform and started investing in skills that transfer regardless of what the rendering layer looks like.
I think there are three core skills that matter, and they’re worth naming properly.
Data and information communication has always been the core of the job: figuring out how to take something complex and make it legible to someone who doesn’t live in the data. That skill doesn’t go away. If anything it gets more valuable as AI makes it easier to generate outputs, because the gap between generating something and actually communicating something widens. Understanding visual hierarchy, narrative, how people read and interpret information. That’s still incredibly important.
Data modelling is where the value is quietly moving. At a recent data engineering conference I attended, it was the #1 skill that group of practitioners saw as vital to their careers. I think the same holds in the analytics discipline.
As the visualisation layer becomes cheaper and more commoditised, the real leverage sits upstream in how data is structured, defined and governed. Building semantic layers that are accurate and reusable, ensuring the metrics an organisation runs on are actually trustworthy. This was always where the value lived, honestly. Many analytics professionals just didn’t have to think about it because the BI tool abstracted it away.
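To ground that, here's a sketch using DuckDB, though the idea is tool-agnostic. The table and the 90-day rule are invented for illustration; the point is that "active customer" is defined once, upstream, and everything downstream, human or AI, inherits it.

```python
# A sketch of a governed metric definition living upstream of any tool.
import duckdb

con = duckdb.connect("analytics.duckdb")

# Illustrative source table so the sketch is self-contained.
con.sql("""
    CREATE TABLE IF NOT EXISTS orders (
        customer_id INTEGER, order_date DATE, order_status VARCHAR
    )
""")

# One definition of "active customer". Every dashboard, notebook or AI
# assistant queries this view instead of re-deriving the logic (slightly
# differently) each time.
con.sql("""
    CREATE OR REPLACE VIEW active_customers AS
    SELECT customer_id, MAX(order_date) AS last_order_date
    FROM orders
    WHERE order_status NOT IN ('cancelled', 'refunded')
    GROUP BY customer_id
    HAVING MAX(order_date) >= CURRENT_DATE - INTERVAL 90 DAY
""")

print(con.sql("SELECT COUNT(*) FROM active_customers").fetchone()[0])
```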
The third is context engineering. As AI takes on more of the analytical heavy lifting, the practitioner's job increasingly becomes defining the guardrails: what metrics make sense, what business context needs to be applied, what the AI should and shouldn't be trusted to conclude on its own. It sits at the intersection of data literacy, domain knowledge and understanding how these models actually reason.
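What does that look like in practice? Something like the sketch below. The definitions and rules are invented for illustration; the shape, a governed context that every AI interaction inherits, is the point.

```python
# A sketch of context engineering as guardrails rather than prompting tricks:
# the practitioner, not the model, decides what the AI may and may not conclude.
# All definitions and rules here are illustrative.
SEMANTIC_CONTEXT = """
Metric definitions (use these exactly; do not invent your own):
- active_customer: placed a non-cancelled order in the last 90 days.
- revenue: gross order value minus refunds, in GBP.

Business context:
- The financial year starts in April; "Q1" means April to June.

Boundaries:
- You may describe trends and flag anomalies in the data provided.
- You may NOT infer causes, forecast, or recommend actions; refer those
  questions to a human analyst instead.
"""

def build_messages(question: str, query_results: str) -> list[dict]:
    """Wrap a question and its query results in the governed context."""
    return [{
        "role": "user",
        "content": f"{SEMANTIC_CONTEXT}\nData:\n{query_results}\n\nQuestion: {question}",
    }]

# Every analytical question goes through the same wrapper.
print(build_messages("How did revenue trend in Q1?", "month,revenue\n2025-04,1200")[0]["content"])
```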
The job was never “use Looker.” It was never “build dashboards.” It was always something simpler and probably harder: get the right information to the right people in a form they can actually use.
The consultant in that Reddit post I referenced at the beginning still has a role. His client knows it too, which is why they want him around for the next phase of their data and analytics journey.
Given all of the above, the most rational move for a visual analytics professional right now is probably to loosen their grip on the vendor ecosystem and invest seriously in the skills that will matter regardless of what the tools look like. Communication, modelling, context engineering. Those aren't going anywhere.