Three Serious Mistakes in CX Program Design
Three big blunders that ensure failure is your fate.
Otto hears that NPS is a hot topic in the industry. He is VP of Support Operations at a Silicon Valley B2B software start-up. He gets approval to begin an NPS program.
Not understanding that NPS measures the sum of the end-to-end customer journey, he crafts a survey that captures only two aspects of that journey: engaging with support and product quality. The survey's send list is pulled from the contacts in Otto's support operations.
Initial scores are good and new client acquisitions are exceeding expectations. Otto hypes the organization’s high scores from engagement with support. As the months go by, however, client retention becomes a major challenge. Revenue takes a big hit.
How can this happen when NPS is strong?
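The arithmetic behind the score makes the trap clear. NPS is the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6), so a score computed only over contacts who engage with support says nothing about the silent accounts drifting toward churn. A minimal sketch in Python, using entirely made-up numbers, shows how a narrowly sampled score can mislead:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical illustration: the surveyed support contacts are happy...
support_contacts = [9, 10, 9, 8, 10, 9, 7, 10]
# ...but the wider customer base, which was never surveyed, is not.
all_customers = support_contacts + [3, 5, 6, 4, 2, 6, 5, 3, 6, 4, 7, 8]

print(f"Surveyed NPS: {nps(support_contacts):+.0f}")  # +75: looks great
print(f"True NPS:     {nps(all_customers):+.0f}")     # -20: churn ahead
```

The surveyed slice reports a glowing +75 while the full base sits at -20. Otto's dashboard and Otto's revenue are measuring different populations.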
We have met a lot of Ottos
Our team has collectively worked on over one thousand client NPS and CX projects. In most cases we were asked to improve existing programs. Over time we learned to recognize the most serious design errors that companies had made when they first introduced these programs.
The errors are quite easy to explain and understand. They are much harder to fix, though it can always be done.
The feedback process covers only part of the overall customer experience
In Otto's case, the main feedback he got was about things going wrong. It is no surprise that a bad support experience or product quality issues can make customers angry and disloyal.
But what about the things that can make them happy? What about other critical items such as software updates? Do customers find the software easy to use? Are the training videos helpful? For the more sophisticated and expensive offerings, do customers get the ROI they expect?
Related to these questions is a bigger one. Are Otto’s measurement goals and the company’s strategic goals aligned? Perhaps the CEO wants the software to have a great reputation on LinkedIn. Does it? If the software is sold or implemented by consulting or implementation partners, what is their experience? How does it compare with the experience that users have with software available from competitors?
Moral of the story: capture all relevant information!
The data design phase is too ‘yesterday’
Imagine that Otto's measurement process digs deeply into what customers think about the support experience. Imagine, though, that he does not look at the wider situation or the full customer journey. He will be unable to answer important questions. Why, for example, are there so many support calls? What are the underlying and root causes? Was what was implemented for the customer the same as what was sold to the customer? Are commonly used workflows too cumbersome? Is it too difficult to find or to understand key features?
Otto’s blunder is to treat support data in isolation from other operational data. That is yesterday’s approach. Today we demand and achieve greater sophistication.
Take SaaS software. Throughout the customer lifecycle, SaaS operational data should alert companies to potential issues. Below-expectation usage of a product, for example, is a red flag for pending customer disloyalty. Customer experience professionals can detect signals like this with very short delay, investigate, and deliver solutions.
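As an illustration only (the account names, metrics, and threshold here are invented for the example), a signal like this can be as simple as comparing each account's recent product usage against its own baseline:

```python
from statistics import mean

def usage_alerts(accounts, drop_threshold=0.5):
    """Flag accounts whose latest usage has fallen well below their baseline.

    `accounts` maps an account name to its weekly active-usage counts,
    oldest first. An account is flagged when the latest week falls below
    `drop_threshold` times its historical average -- a crude stand-in for
    the kind of early-warning signal described above.
    """
    alerts = []
    for name, weekly_usage in accounts.items():
        history, latest = weekly_usage[:-1], weekly_usage[-1]
        baseline = mean(history)
        if baseline > 0 and latest < drop_threshold * baseline:
            alerts.append((name, baseline, latest))
    return alerts

# Hypothetical data: Acme's usage has collapsed; Globex looks healthy.
accounts = {
    "Acme":   [120, 115, 130, 125, 40],
    "Globex": [80, 85, 90, 88, 92],
}
for name, baseline, latest in usage_alerts(accounts):
    print(f"{name}: latest usage {latest} vs baseline {baseline:.0f} -- investigate")
```

A real program would use richer models than a fixed threshold, but the principle is the same: the operational data flags the account before the renewal conversation does.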
Make sure your processes embody today’s (or tomorrow’s) approaches, not yesterday’s!
Rushing the design-communication and roll-out phase
We have seen all too many cases where the result of a new NPS initiative is a nice set of scorecards and nothing more. Otto seems to have fallen into this trap. As a new system is rolled out, you must communicate and agree on who is going to do what with the results of the NPS implementation. Doing so sounds straightforward, but it is not.
Take customer suggestions that lead to an approved improvement process. At the simplest level, implementation can be routed to whichever department should logically deal with the improvement.
But then some key questions arise. Has that department been informed of the new process? Does it have the resources to deal with the improvement opportunities? Is it in their plan for the year? Who will follow up to ensure that the suggested action is taken?
We know that all readers will have answered surveys, suggested improvements, and then never heard anything back from the company concerned. In our view, that is worse than not getting customer feedback at all. Communication strategies need to involve customer communication too.
At a more sophisticated level, what happens when an improvement suggestion crosses organizational boundaries and requires multiple people to work on it? Who takes the lead? Who tells customers what is happening?
The purpose of any measurement and improvement system is to drive positive change. Change in turn should boost a company’s financial results. Plan change systematically, at the right systemic levels! Do not rush it!
The processes for communicating customer suggestions and prioritizing the resulting changes will not happen on their own. Nor will assigning project leaders to the planned improvements. Nor will dedicating funding to the more complex initiatives. Nor will creating a process for letting customers know what is happening.
Give each step careful thought and agreement before launch!
ABOUT INSIGHTS FROM OCX COGNITION
OCX Cognition delivers the future of NPS. We ensure customer experience success by combining technology and data science with programmatic consulting. In our Insights section, we present a comprehensive and evolving collection of resources based on our research and expertise, collected for CX leaders committed to delivering business outcomes.