Lessons from the bleeding edge!
What the Insight industry can teach us about AI and the future of knowledge work
For CEOs and senior leaders confronting the challenges of AI transformation, the question is no longer whether to act—it’s how to act with speed and clarity. Yet pinning down how AI will reshape an organization isn’t straightforward. This is a jagged frontier: some functions are being transformed overnight, while adjacent tasks—at least for now—remain untouched.
In this landscape of uneven disruption, one compass point is clear: look to industries already in the vanguard. The consumer insight industry, where we’ve spent decades, is one such frontier case. It is a natural early adopter of AI: it is highly structured, with many pattern-based, repeatable workflows; it handles high volumes of anonymized, structured, and comparatively accessible data, which makes it a lower-risk sandbox for AI experimentation; and it is under constant pressure to deliver more for less. In an era of relentless efficiency drives, the appeal of ‘another way’, one that is faster, smarter, and cheaper, is obvious; and in the case of AI, the promise sounds especially seductive.
We have previously discussed Andreessen Horowitz’s recent article (Faster, Smarter, Cheaper: AI Is Reinventing Market Research), which suggests that the demise of the traditional insights industry is, if not at hand, getting uncomfortably close. And to be fair, this narrative has gained traction in tech circles far beyond Andreessen Horowitz. The firm claims that “we’re seeing a crop of AI research companies replace the expensive human survey and analysis process entirely”, and in this ideal world, they argue, even the respondents are simulated. So much the better, they say: “The companies that adopt AI-powered research tools early will gain faster insights, make better decisions, and unlock a new competitive edge”.
“Crucially, success doesn’t mean achieving 100% accuracy. It’s about hitting a threshold that’s ‘good enough’ for your use case. Many CMOs we’ve spoken with are comfortable with outputs that are at least 70% as accurate as those from traditional consulting firms, especially since the data is cheaper, faster, and updated in real time.” (Andreessen Horowitz)
We are witnessing what might be called the “maximalist” approach to AI in research: a wholesale faith in automation that treats the presence of humans in the process as a source of inefficiency or friction. The solution? Remove them. Why keep messy, expensive humans in the loop when models are “good enough”? If AI can deliver 70% of the quality of human-led research, but at a fraction of the time and cost, then “good enough”, they argue, is now good enough.
The rise of generative AI and agent-based simulations promises a future of always-on, infinitely scalable, and hyper-efficient research. Andreessen Horowitz and other tech optimists frame this as a seismic shift: market research liberated from its traditional bottlenecks and biases, finally reinvented for the software age. Their thesis is clear: faster, smarter, cheaper. But the real risk is not bad AI; it is “good enough” AI, systems that produce persuasive results that go unchallenged because they sound plausible and arrive quickly.
And so, as an industry, we are facing our ‘Barbarians at the Gate’ moment. Pushing back, we argue that the siren voices are confusing data and information with knowledge and insight. But we’re not Luddites. AI is not the enemy. It is inevitable. We are embracing it. But we challenge the premise that it can, or should, replace all humans in the insight process. Insight is not just a throughput function. It is an interpretive, critical, creative process. Over-automating it risks replacing deep thinking with shallow mimicry: fast answers that go unchallenged because they sound convincing.
So, as an industry, we stand as both pioneer and test case for AI adoption—and offer critical lessons for any knowledge-intensive business seeking to deploy AI with purpose, speed and strategic discipline.
AI eats the easy parts first—but it’s the hard parts that matter most
AI has rapidly automated the predictable, patterned tasks once central to insight work—data gathering, analysis, dashboarding. What once took teams days or weeks now takes minutes. The appeal for CFOs and COOs is obvious: faster, cheaper, cleaner.
But here’s the paradox: the more successful the deployment, the greater the risk of displacing the human thinking that defined the value of the function. AI is confident in pattern recognition. But it struggles with ambiguity, emotional nuance, and creative synthesis: exactly where strategic insight lives.
Pretending that AI has absorbed this nuance—when it hasn’t—is a dangerous form of self-deception.
Don’t give your top thinkers the cold shoulder
Insight has always depended on intellectual leaps—the ability to connect dots others can’t see. The great “insight polymath” understands, as Isaiah Berlin once put it, “what fits with what, what springs from what, and what leads to what.”
Now AI is delivering 70% solutions at a fraction of the cost. The temptation is to quietly phase out expensive, creative thinkers. But beware: those final yards, the emotional, cultural, human leap, are often where competitive advantage lives. That’s what separates insight from trivia, and strategy from noise.
Democratize access—but don’t discard depth
AI-powered DIY research tools have revolutionized access to data. Insight is no longer hoarded in the hands of a few—it’s on desktops across the enterprise. This shift enables experimentation and supports a culture of curiosity.
But with decentralization comes dilution. Some organizations are losing sight of the discipline behind good research thinking. The risk is that speed trumps depth, and specialist craft—the ethnographic eye, the narrative instinct, the discipline of listening—is lost.
Powerful AI tools need skilled human stewards. Without them, the signal-to-noise ratio collapses.
Cutting junior roles may offer short-term gains—but carries long-term costs
Generative AI can now complete in seconds what once kept junior teams busy for days: summarizing findings, structuring reports, scanning datasets. The cost and time savings are compelling.
But here’s the catch: who will become tomorrow’s thought leaders if today’s entry points are eliminated? In insight, as in many knowledge fields, judgment is learned by proximity. You sit beside a seasoned researcher. You watch them interpret a pause, frame a question, read a room. AI isn’t learning those skills, and if humans aren’t either, we’re in trouble.
Beware fantasy thinking dressed as innovation
At industry conferences, the sales pitch is seductive: plug in the right tools and you can finally fire the messy humans. No more quirky consultants. Just seamless automation.
It’s a mirage. Even the best AI systems still need human judgment to frame problems, interpret results, and ask the next question—not just the obvious one.
We’ve seen the industry flirt with full automation. But the sobering realization is this: the craft of insight isn’t obsolete—it just needs reframing. Other knowledge sectors—legal, strategy, HR—should heed the warning. Don’t let your essential skills disappear under a pile of dashboards.
Reverse engineer from purpose—and protect what is human
The most forward-thinking insight teams aren’t clinging to the past, but neither are they surrendering to machines. They’re redesigning around a clear human-first principle.
They start by asking: what is our core contribution to the organization’s mission? What values, judgment, and emotional intelligence must endure? They embed those principles into AI workflows—not as an afterthought, but as a design foundation.
This is leadership. Not resisting AI but shaping it to serve a human end.
AI won’t kill your sector. But human neglect might.
AI could be existential for your business—not because it’s coming for you, but because your competitors may deploy it more wisely. In insight, as in many knowledge industries, it is a transformative force.
To protect what matters most, define it now. Codify your values. Nurture your craft. Embed human intelligence into your AI systems. Don’t wait for others to do it first.
Because the real question isn’t whether AI will change your business. It’s whether you’ll shape that change—or be shaped by it.