There’s a version of this conversation happening in every B2B company right now. Someone in leadership (usually someone who’s just come back from a conference or read a newsletter) wants to know why the sales team isn’t using AI yet.
It’s a fair question. AI tools are genuinely useful. I use them constantly in my own work. But there’s a specific way this conversation goes wrong, and it goes wrong in a predictable direction: people reach for AI as a solution to problems that are fundamentally about process, data quality, and structure. And AI doesn’t fix those. It amplifies them.
What AI does to a pipeline full of bad data
Imagine your CRM has 200 “active” opportunities. Some of them are genuinely live deals. Some of them closed six months ago. Some are prospects who gave a polite no and are still in the system because nobody wanted to mark them as lost. A handful were never real in the first place: they got added because someone had an optimistic conversation and wanted something to show for it.
That’s most CRMs I encounter. Not an indictment of anyone; it’s just what happens when there’s no stage discipline, no regular pipeline review, and no cultural norm around marking things as lost.
Now add AI to that. You can get AI to analyse your pipeline, identify patterns, predict close likelihood, draft follow-up sequences. The problem: every one of those analyses is built on data that’s somewhere between misleading and fictional. The AI will work very hard on your behalf to produce confident-sounding insights from a dataset that can’t support them.
The analysis might tell you your average deal velocity is 45 days, which sounds useful, except that number is distorted by 60 deals that have been sitting in “proposal sent” for seven months because nobody moved them to lost. The forecast is inflated. The close-rate calculations are wrong. The AI-generated “next step” suggestions are being applied to opportunities that aren’t real.
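The mechanics of that distortion are plain arithmetic. Here’s a toy sketch (every figure below is invented for illustration, not taken from a real CRM):

```python
# Toy illustration of how stale deals distort a days-in-pipeline average.
# Each tuple is (stage, days_open); all numbers are made up.
live_deals = [("qualified", 20), ("proposal sent", 35), ("negotiation", 50)] * 20  # 60 live deals
zombie_deals = [("proposal sent", 210)] * 60  # 60 deals parked for ~7 months, never marked lost

def avg_days_open(deals):
    """Mean days-open across a list of (stage, days_open) tuples."""
    return sum(days for _, days in deals) / len(deals)

print(f"Live deals only:   {avg_days_open(live_deals):.0f} days")
print(f"Including zombies: {avg_days_open(live_deals + zombie_deals):.0f} days")
```

The zombie deals more than triple the apparent average, and every downstream metric built on that average inherits the distortion. The AI doesn’t know which deals are real; it just averages what it’s given.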
You haven’t accelerated your revenue function. You’ve accelerated your ability to create the appearance of one.
The actual prerequisite
Before AI adds genuine value to a revenue function, a few things need to be true:
The pipeline reflects reality. Stage definitions mean something specific. A deal in “qualified” has actually been qualified, not just spoken to once. Things get moved to lost when they’re lost. The forecast is built from real data.
There’s a documented process. Not in someone’s head; written down, shared, applied consistently. Discovery questions, qualification criteria, exit criteria for each stage, proposal standards. Something you could hand to a new rep and have them follow.
The CRM is actually used. Not just for management oversight; as the operating system of the team. Notes go in. Activity gets logged. Deals get updated after every meaningful conversation. If reps are managing their real pipeline in a spreadsheet or their own notes, no AI layer will help.
If these three things are in place, AI tools can do genuinely impressive work. Analysis gets faster. Documentation gets better. Execution becomes more consistent. The good stuff compounds.
If they’re not in place, you’re building on sand.
Where this goes wrong in practice
The failure mode I see most often isn’t buying the wrong AI tool. It’s buying the right AI tool and aiming it at the wrong problem.
A team that can’t forecast accurately doesn’t need AI forecasting; it needs stage discipline and honest pipeline reviews. A team producing inconsistent proposals doesn’t need AI proposal generation; it needs a clear value proposition and a documented structure. A team that can’t qualify prospects reliably doesn’t need AI qualification; it needs a qualification framework and someone to enforce it.
The AI solutions to these problems exist. Some of them are quite good. But they all assume a baseline of process and data quality that many teams don’t have. Used without that baseline, they produce faster noise.
The businesses I’ve worked with that are getting genuine, measurable value from AI in their revenue function all had one thing in common: they were already doing the fundamentals reasonably well before they added the AI layer. Clean pipelines, documented process, consistent data entry. The AI made good things better. It didn’t create good things from scratch.
A practical way to think about it
Ask yourself two questions.
First: if you ran a report on your pipeline right now (average deal size, conversion rate by stage, average velocity, forecast accuracy over the last four quarters), would you trust the numbers?
Second: if you asked three different members of your sales team to explain your qualification criteria, would they give you roughly the same answer?
If yes to both, you’re probably in a position where AI tools will genuinely accelerate your revenue function. There’s real process to build on and real data to analyse.
If no (if the honest answer to either is “not really”), then the AI conversation is premature. The work isn’t integration. It’s foundations. And that’s fixable, but it’s a different project.
There’s nothing wrong with starting there. Most businesses do. The point is to know which project you’re actually in.
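The first of those two questions can be made concrete with a rough staleness check over a CRM export. This is a hypothetical sketch; the field names, dates, and the 90-day threshold are all invented, so adapt them to whatever your CRM actually exports:

```python
# Hypothetical staleness check over a CRM export (all records invented).
from datetime import date

deals = [
    {"stage": "qualified",     "value": 12_000, "last_updated": date(2024, 5, 1)},
    {"stage": "proposal sent", "value": 30_000, "last_updated": date(2023, 10, 3)},
    {"stage": "proposal sent", "value": 18_000, "last_updated": date(2024, 4, 20)},
]

STALE_AFTER_DAYS = 90  # arbitrary threshold: untouched this long = suspect
today = date(2024, 5, 15)

stale = [d for d in deals if (today - d["last_updated"]).days > STALE_AFTER_DAYS]
stale_value = sum(d["value"] for d in stale)
total_value = sum(d["value"] for d in deals)

print(f"{len(stale)} of {len(deals)} deals untouched for {STALE_AFTER_DAYS}+ days")
print(f"£{stale_value:,} of £{total_value:,} of pipeline value is suspect")
```

If a crude check like this flags a large share of your pipeline value, that’s your answer to the first question, and no AI layer on top of the same data will give you a better one.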
What I tell clients who want to start with AI
I tell them the same thing I tell anyone who comes in with a specific solution they want to apply: let’s understand the problem first.
If Discovery Week reveals clean data, consistent process, and a team that’s executing well but slowly, then yes, there’s a strong case for integrating AI tools into the workflow. We’ll do it properly, with clear use cases and genuine measurement.
If Discovery Week reveals what most Discovery Weeks reveal (inconsistent data, undocumented process, a pipeline that tells a story nobody fully believes), then we’ll fix those first. The AI conversation can happen once there’s something worth accelerating.
That’s not a conservative position on AI. It’s a practical one. The businesses getting the most from these tools didn’t get there by bolting AI onto a broken system. They got there by building something that worked first.
If you want to understand what the foundations actually look like — the pipeline architecture, process documentation, and data discipline that make AI genuinely useful — What RevOps Actually Means covers the full picture. And if you want to see how AI tools fit into a revenue function that’s already working, the AI & RevOps page covers how I use them in practice.