Matt Melymuka, Co-Founder & Managing Partner & Sales Tech Team Lead, PeakSpan Capital | PUBLISHED ON APR 12, 2023
“I keep saying that the sexy job in the next 10 years will be statisticians, and I’m not kidding.”
– Hal Varian, Chief Economist, Google
We live in the Information Age, defined by Merriam-Webster as “a time in which information has become a commodity that is quickly and widely disseminated and easily available especially through the use of computer technology.” Knowledge previously recorded on paper and living in libraries is now digitally domiciled and hovering in the cloud. Information is accessible from anywhere, at any time – and the sheer volume of it being captured has exploded. We’ve produced more data in the last two years than we did throughout the entirety of human history before that. The only thing growing as rapidly as data itself, fortunately, is our ability to process it. Computing power is increasing at an exponential rate (literally) – for context, the latest iPhone has over 1.5 billion times more processing power than the guidance computer NASA used to land man on the moon.
Data itself is a commodity – like a bag of grapes anyone can grab from their local grocer for a few bucks. But winemakers have shown that those same grapes, when harvested in a specific way, blended precisely with other ingredients, and fermented for a particular amount of time, can be transformed into a highly valued fine wine. Renowned vino producers, however, didn’t just get lucky the first time they made a batch. The masters will tell you their best product is the result of years of iteratively fine-tuning their process, maniacally focusing on understanding the variables at play and analyzing how they interact with each other to optimize the blend with each successive vintage.
Analogizing grapes to data might be a bit of a stretch, but hopefully the premise is clear. Leveraging data to make strategic decisions is important… no shit, Sherlock! But the reality is we are still in the early innings of the information revolution, and while more organizations are beginning to weave data-informed decisioning into the fabric of company culture, transforming behavior and ushering in a shift in mindset takes time. Regardless of how powerful or impactful a solution claims to be, buyer readiness to adopt requires trust. And the most effective way to garner trust is to deliver clear, tangible value.
The world isn’t suffering from a lack of insanely smart, overly educated individuals trying to change the world with big data analysis and machine learning. Countless new startups equipped with billions of dollars of investment are focused on the opportunity. But who cares how algorithmically amazing a software solution is or how much analytical horsepower a platform has if the value it provides to customers is nebulous? Innovation without impact is nothing more than the next bright shiny penny.
The platforms we believe are best positioned to gain traction over the next few years are those focused on harnessing the power of data and applying it pragmatically to influence specific, well-defined business outcomes. Part two of our thought map series examines three sub-segments comprising vendors leveraging data-centric approaches to help go-to-market teams answer some of the most fundamental and impactful questions they face today.
The rise and proliferation of product-led growth – and more broadly the increasing pervasiveness of self-serve models – has enabled individual business users across functional groups to get up and running with productivity applications, workflow optimization solutions and broader enterprise platforms with the swipe of a credit card. Classic top-down enterprise software buying and selling is not going anywhere for a while, but we’ve clearly entered an era where almost every category of software has one (or many) vendors attempting to disrupt the landscape with a bottom-up, friction-light GTM approach supporting the ability to purchase and deploy a solution with little or no direct human involvement. Nobody enjoys the feeling of “being sold to,” especially in an in-your-face, pop-up-ad kind of manner, but this has evolved in a much more nuanced way over the last few years. Now that prospective buyers can – and very often do – ingest massive amounts of information comparing product capabilities, pricing and value proposition before ever engaging with a given business directly, the role of the seller in the buying process needs to change.
Automation and seamless deployment models have enabled this new wave of procurement, and effective sales motions must embrace this dynamic and evolve to provide value to buyers across the modern path to purchase. The role of the seller in this inherently more buyer-led journey needs to shift from proactive engager and explicit orator of the value proposition to responsive enabler and implicit buyer assistant. When a prospect (or current customer) interfaces with the platform itself far more frequently than with a human being, understanding the needs of the buyer necessitates “listening” to the data produced as the user engages with the solution. A burgeoning new category has emerged, comprising businesses focused not on fighting this shift but on supporting it – strategically software-enabling these modern sellers to be better listeners. We believe this is one of the more pronounced changes in the buying motion over the last decade, and platforms that empower sellers to adapt and transition with the times in an elegant way will garner widespread adoption over the coming years.
Every company leverages some tool or vendor for sales data. Most use ZoomInfo – often in combination with other data providers – to supplement an email address with a variety of salient and not-so-salient data points that help piece together the puzzle of who the person on the other end of the email/phone call is. While data is a commodity, the vendors in this segment that we think are paving the way for a new type of truly engaging experience position themselves as much more than data providers; they act as thought partners to your GTM team.
Data in a silo can be tactically helpful, but arming reps with the most pertinent context to ensure every interaction with a prospect is engaging and productive is highly strategic.
Delivering these insights in the most digestible way, to the right individual, at the right time, without disrupting their existing workflow is as important as the information itself – any tools that introduce friction or overhead for end users will struggle to garner adoption. As the amount of data available continues to explode, being able to sift through the noise and surface the right piece of information in the context of the situation is tremendously impactful. One of only two shameless portfolio company plugs in this series goes here to Cognism, a global leader in sales intelligence, providing businesses with data, the tools to act on that data, and the contextual insights required to give sellers the best chance at having a great conversation. This focused, outcomes-based approach has catalyzed the business to grow over 10x in less than three years and will push them past $100M ARR in the near future. Context is queen, and the queen is valuable.
The finger-in-the-air guessing game to answer the age-old question of “Where will we end up this quarter?” no longer plays. Given the massive amount of data we now capture across the sales cycle, doesn’t it seem a bit archaic that most companies still track pipeline opportunities using just a handful (typically seven) of high-level, round-number percentage stages? “25% for qualified”… yeah, that seems fair. Every opportunity has specific nuances, and I guarantee some of those prospects in that 25% bucket are more qualified than others. Now that we are able to track more and more data related to rep activity and capture information from every interaction/touchpoint with a prospect – in an automated way – opportunities can be analyzed with much more than seven stages of granularity. There is a new crop of vendors leveraging innovative analytical approaches to surface signals gleaned from the breadcrumbs of data produced across the buyer journey to form a much more detailed view of pipeline health. Like grading a test on a scale from 1 to 100 versus using the five-tier “A” through “F” letter-grade range, these approaches seek to incorporate a multitude of variables to provide a highly refined view of where each engagement stands. Organizations adopting these solutions are also better positioned to become increasingly agile with budgeting and real-time with forecasting as they scale.
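To make the 1-to-100 idea concrete, here is a toy sketch of what granular opportunity scoring can look like: a handful of engagement signals blended through a logistic curve into a 0–100 health score. Everything here – the signal names, weights, and bias – is purely illustrative and hypothetical, not any vendor’s actual methodology; in practice the weights would be fit to historical win/loss data rather than hand-picked.

```python
import math
from dataclasses import dataclass

@dataclass
class OpportunitySignals:
    """Illustrative engagement signals captured across the buyer journey."""
    emails_exchanged: int        # two-way email threads with the prospect
    meetings_held: int           # discovery/demo calls completed
    stakeholders_engaged: int    # distinct contacts active on the account
    days_since_last_touch: int   # recency of the latest interaction

# Hypothetical weights -- a real model would learn these from closed-won
# and closed-lost history, not hard-code them.
WEIGHTS = {
    "emails_exchanged": 0.15,
    "meetings_held": 0.9,
    "stakeholders_engaged": 0.7,
    "days_since_last_touch": -0.08,  # staleness drags the score down
}
BIAS = -2.0

def health_score(s: OpportunitySignals) -> int:
    """Squash a weighted blend of signals into a 0-100 pipeline health score."""
    z = BIAS
    z += WEIGHTS["emails_exchanged"] * s.emails_exchanged
    z += WEIGHTS["meetings_held"] * s.meetings_held
    z += WEIGHTS["stakeholders_engaged"] * s.stakeholders_engaged
    z += WEIGHTS["days_since_last_touch"] * s.days_since_last_touch
    return round(100 / (1 + math.exp(-z)))  # logistic curve scaled to 0-100

# Two opportunities that would both sit in a coarse "25% qualified" bucket
# look very different once the underlying signals are scored.
active = OpportunitySignals(emails_exchanged=12, meetings_held=3,
                            stakeholders_engaged=4, days_since_last_touch=2)
stale = OpportunitySignals(emails_exchanged=2, meetings_held=1,
                           stakeholders_engaged=1, days_since_last_touch=30)
print(health_score(active), health_score(stale))
```

The point of the sketch is the contrast: two deals that a seven-stage model would lump together come out with very different scores once the automated touchpoint data is actually weighed.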
The local weather woman doesn’t just guess what tomorrow will bring based on gut feel; she listens to the data to make an informed prediction. Sales should be no different.
Stay tuned for part three of our Thought Map series – Strategic Knowledge Flow – coming soon! Please feel free to email Matt Melymuka at matt@peakspancapital.com or Andrew Bartusiak at andrew@peakspancapital.com for a downloadable version of the thought map below and this article.