Analytics are great, but they’ll never be as useful as asking customers
This article originally appeared on our parent company’s website on the CallidusCloud Blog.
In his autobiography, Mark Twain attributed a famous remark to Benjamin Disraeli: "There are three kinds of lies: lies, damned lies, and statistics."
Were he commenting on business life today, Twain might well have said, "There are lies, damned lies, statistics and analytics!" Analytics, particularly the predictive variety, is often presented as the new panacea: the cure-all for companies seeking to understand their customers and improve sales. There is no doubt that 2015 will see the analytics and big data train continue to roar down the track at an increasing rate. All that investment can't be wrong, can it?
There is another camp that argues it is better simply to ask customers what they want than to infer that understanding from complex analysis of data. The "customer knows best" advocates suggest that collecting feedback on what customers want, and measuring how well those needs are met, is the surest way to growth.
Well, here's the short answer: you will benefit from having both! However, it is important to be aware of the benefits and challenges each presents.
Data quality can derail both analytics and feedback: the old maxim of garbage in, garbage out applies. Poorly designed feedback programs may be unrepresentative of the customer base, providing unreliable results. Equally, analytics can only work on the data available. Incomplete or erroneous data will give false results that may result in poor decisions.
The timeframes of the data can also cause problems. Many companies still rely on what we call “the annual do you love us survey” for customer feedback. Often designed to appease senior managers and drive bonus plans, this type of feedback is neither use nor ornament. Few can remember details of an interaction with a company last year, so as a basis for understanding or meaningful action it is about as useful as a chocolate teapot. Effective feedback is collected in a timely manner and, most importantly, acted on promptly.
Analytics can also fall foul of inappropriate timeframes in the data underpinning analysis and predictions. For example, economies are known to operate on different time cycles, from the relatively short-term Kitchin inventory cycle (roughly 40 months) to the very long-term Kondratieff wave, which plays out over 40 to 60 years. Analysis framed on one cycle may miss effects operating on another and thereby skew the results.
Interpretation of the data often falls foul of poor or downright wrong assumptions. Many people mistake correlation for causation: just because two things move together does not mean one causes the other. UK phone company BT once produced research showing that the strongest correlate of phone use was cat ownership. That does not make buying everyone a cat a valid growth strategy. I recall a finance director questioning the link between training spend and profitability. Most believed that more training led to more competent people and therefore greater profitability. His argument was that profitable companies can afford to spend more on training. With the data available, it was impossible to establish which factor was the driver.
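The cat-ownership anecdote can be illustrated with a short, entirely hypothetical simulation: a hidden confounder (here labelled "affluence") drives both cat ownership and phone use, producing a strong correlation between two variables with no causal link in either direction.

```python
import random

random.seed(42)

# Hypothetical data: "affluence" is a hidden confounder that drives both
# cat ownership and phone use. Neither observed variable causes the other.
n = 1000
affluence = [random.gauss(50, 10) for _ in range(n)]
cats = [0.8 * a + random.gauss(0, 5) for a in affluence]       # driven by affluence
phone_use = [1.2 * a + random.gauss(0, 5) for a in affluence]  # also driven by affluence

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(cats, phone_use)
print(f"correlation(cats, phone_use) = {r:.2f}")
```

The correlation comes out strong even though buying a cat would not change anyone's phone bill: controlling for the confounder, not the headline correlation, is what matters.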
Many of the issues associated with interpreting data reflect the viewpoint of the person doing the interpreting. People in companies often share similar mental models of their customers and markets, which creates collective blind spots. Survey responses carry their own bias: people are more likely to respond after a particularly good or bad experience, so the extremes are over-represented and the middle of the distribution is under-represented. Feedback can also be intentionally biased by the respondent. When asked about their drinking habits, for example, people often understate their consumption, even when they know the true figures.
How can we use an understanding of the challenges of analytics and feedback to better understand customers as a basis for improving performance? Here are a few tips.
- The first is obvious: use both feedback and analytics to provide a richer picture of prospects and customers. In fact, feedback should be a data feed available to analytics.
- Put all data in one place. Not only does this make it easier to build holistic models, it also improves the customer experience: staff dealing with customers can tailor their interactions based on full knowledge of the relationship. Remember, many customers get frustrated when a company fails to use information it already holds, forcing them to repeat it or endure inappropriate interactions.
- Make actions the focus of both feedback and analytics. Building understanding is not an end in itself – it is merely a means to better decisions and actions that improve business performance and the customer experience. Analysis without action is for academics, not business people.
- With all the hype about "big data," don't lose sight of the value of little data. Addressing the concerns or issues of an individual can turn an unhappy customer into a loyal one and a vocal brand advocate: remember, retention happens one customer at a time.
- Try to be aware of the company's blind spots. Be willing to listen to the lone voice: they may just be the one with the right answer, or the one who has spotted the emergent trend that becomes the next big thing.
- Build rich models of inter-related data points and look for unexpected patterns. Investigate correlations further to see if they point to issues of significance.
- Test predictions against real outcomes to fine-tune the analytics engine and refine your feedback. Beware those who say things must stay the same to preserve benchmarks and trends. Tracking a trend that no longer matters to customers is a waste of time.
- Recognize the limitations of analytics and feedback. In 1886, Karl Benz had no research suggesting demand for a motor-powered vehicle, and Steve Jobs famously observed that people don't know what they want until you show it to them. That doesn't mean Apple doesn't work hard to understand the unspoken needs of its customers.
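The tip about testing predictions against real outcomes can be sketched in a few lines. Everything here is hypothetical, including the sales figures: two naive predictors (an overall average and a recent average) are scored against a holdout of actual results, and whichever tracks reality more closely is the one to keep tuning.

```python
def mae(predicted, actual):
    """Mean absolute error: the average gap between prediction and outcome."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

# Hypothetical monthly sales figures; the last four months are the holdout
# of real outcomes that the predictions are judged against.
history = [100, 102, 105, 110, 118, 124, 131, 140]
train, holdout = history[:4], history[4:]

overall_avg = sum(train) / len(train)   # long-run benchmark
recent_avg = sum(train[-2:]) / 2        # weighted toward recent behavior

pred_overall = [overall_avg] * len(holdout)
pred_recent = [recent_avg] * len(holdout)

print("overall-average MAE:", mae(pred_overall, holdout))
print("recent-average MAE:", mae(pred_recent, holdout))
```

On this toy series the recent-average predictor wins, which is the point of the exercise: the data decides which model survives, not habit or the desire to preserve an old benchmark.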
Jobs’ stance leads me to my final point. The saying “the harder I work, the luckier I get” has a corollary in analytics and feedback: “The more data I have, the more intuitive I become.” Intuition is nothing more than spotting patterns earlier than others. The more data types and points available, the easier it is to see a pattern. Use analytics and feedback to fine-tune and guide intuition, not to replace it entirely.