One of the first areas where business leaders have identified that AI can bring efficiency gains is go-to-market. It’s a brilliant use case, but is it built on quicksand?
Our intent data shows that many organisations are investigating or already using AI and large language models (LLMs) in the sales cycle, from powering lead scoring, to drafting targeted messaging, to automating workflows.
Used correctly, AI has the potential not just to reduce overheads and automate slow sales processes but also to improve targeting and speed to market, with the ability to better identify high-quality prospects based on factors such as circumstances, intent, and propensity to buy.
That’s the dream outcome of any sales leader or chief revenue officer (CRO) currently weighing AI system investments or investigating new LLM capabilities. There could, however, be a critical fault line on the path between that dream and reality: the quality of the data underlying the AI models. Our recent research shows that data quality is the factor keeping CROs up at night.
Bad dreams of dirty data
According to our research, 75% of revenue leaders consider data quality their biggest challenge, reflecting an understanding that – while their go-to-market strategies increasingly rely on agentic AI – their underlying data infrastructure hasn’t necessarily kept pace.
CROs realise that their shiny new AI systems could be undermined by feeding them “dirty” – that is, stale, inaccurate, or incomplete – data, impacting the organisation’s ability to win business and grow revenue.
Unfortunately, organisations’ historic sins in the handling and management of data are now coming back to bite them. Unless these issues are quickly addressed, a company’s data could become its single point of failure, costing it critical advantage over competitors.
For years there has been a lack of unified enterprise-wide data strategy and governance. Data sources are often disconnected and exist in departmental silos. Furthermore, much of their data is decaying faster than organisations realise, lacking quality and reliability, which dramatically impedes decision-making and execution.
Data decay due to C-Suite churn
This data decay is nowhere better illustrated than in the sales data underpinning companies’ go-to-market strategies. Our research shows that the rate of churn in the C-Suite means a vast amount of account data becomes obsolete within a matter of months.
In the UK, France, and Germany – Europe’s three biggest markets – more than half of data relating to C-Suite roles is out of date within two years because of business leaders leaving their roles. The CMO role sees the most churn, with 35% annual data decay, very closely followed by CROs (34%) and CFOs (32%). The CEO role appears to be the most stable, however still over a quarter of that data (26%) goes out of date every year.
Churn – and therefore data decay – is lower for US roles, pointing to a marginally more stable market. However, the pattern is the same, with CMOs (30% annual data decay) being the most volatile, followed by CROs (29%), CFOs (24%), and then CEOs (23%). C-Suite data lasts longer across the pond, but half of records will still become inaccurate in just 25 to 27 months.
All of this is to say that account maps age fast, organisation charts shift overnight, and buying committees can look different every quarter – leaving AI systems knocking on closed doors if their underlying data points them to a contact who is actually six months into a new role at a different company.
Reaching decision-makers within a narrow buying window
The challenges created by sales data volatility are compounded by the fact that the time to influence potential buying decisions is also becoming shorter. Our research shows that (perhaps driven by pressure to deliver change in their shorter tenures) 78% of decision-makers spend their budgets within their first 90 days.
This leaves an extremely tight window for a sales team to identify a prospect has entered a new role, update their system with the correct data, and set about outreach before the time to influence a buying decision has passed.
Designing a data-first approach
The companies that will thrive under these market conditions are not necessarily the ones with the best AI tools. They are the ones that are fluent in data – able to detect, interpret, and act quickly on live buying signals. Or, to put it another way, the ones using live data to adapt faster than their buyers can move.
The most forward-thinking companies are rightly building a data infrastructure before applying new AI capabilities. This approach also allows more flexibility in the tools they use to action the data, now and in the future – at a time when there is no universally adopted model and sales leaders are experimenting with combinations of different systems, including CRMs, marketing automation platforms (MAPs), business intelligence tools, and direct deployment of LLMs.
Taking a data-first approach means that, whichever tools or systems take dominance – including AI – the company will be able to draw on the best information, keep pace with the fast-moving business landscape, and ultimately gain an advantage over the competition.
Dominic Allon is CEO of Cognism