Over the past fourteen years, ServiceTick (now part of the Davies Group) has delivered millions of Voice of Customer surveys and analysed hundreds of thousands of responses. We’ve worked with some of the UK’s leading banks, building societies, insurance companies, supermarkets and travel companies, as well as key regulatory bodies. We’ve delivered surveys by email, web, IVR, in-app and beacon technology. We’ve generated returns on investment worth hundreds of thousands of pounds and helped our clients develop stronger bonds with their customers.
It’s been a rewarding process all round.
But things haven't always worked out as they should. As well as learning what makes a world-class VoC programme tick, we’ve also seen examples of practice that leave a lot to be desired.
Here’s a quick round-up of the eight most frequent pitfalls we’ve encountered:
Setting out without a clear set of objectives - If you're not absolutely certain about what you're trying to achieve, it's going to be tough to build the right survey. A clear set of objectives will influence the methodology you use, the way you structure the questionnaire and the actions you take as a result. Our clients use VoC surveys for a variety of reasons - customer recovery, service improvement, agent performance management, proposition development, brand and culture change - but they all have a clear goal in mind. Above all, your questions should reflect and measure the values and behaviours that you are trying to build in your organisation.
Holding your VoC provider at arm's length – You can treat your supplier as a partner or a provider. As a partner, you will get more engagement, more insight and better results all round. You will have an objective observer able to give you advice that is untarnished by internal politics. You will also have the benefit of everything we have learned from 14 years of VoC delivery. As a provider, you will have someone who delivers numbers but is not allowed to comment on what they mean. Worst of all, you may even be tempted to use the data to massage perceptions of CX performance when the reality is somewhat different. It’s sometimes tempting to deploy 'vanity metrics' to impress the board, but you only end up fooling yourself.
Using the wrong methodology - Real-time feedback can be delivered via any communication channel (telephone, SMS, email, web, webchat, in-app, social media), but your channel choice will have an impact on the outcome of the surveys. Our rule of thumb is to use the channel through which the customer contacted you in the first place, and to survey as soon as possible after the transaction. We’ve seen response rates fall by up to 70% when the survey request is delayed by 24 hours, and the quality of the feedback is less sharp a day or two after the event.
Asking the wrong questions - VoC surveys work best when they are tailored to a specific touchpoint on the customer journey. Questions should reflect as closely as possible the particular experience the customer has had. Asking exactly the same question set across new business, renewal, admin and claims processes will limit the insight you can generate. By all means keep a core (and unchanging) set of KPI questions – NPS, CSAT, Effort – but use other questions to explore different aspects of service.
Worrying too much about the numbers - What all VoC metrics have in common is the goal of reducing the totality of customer experience to a single number. There is clearly a benefit in a 'one-number' solution (easier for business-wide communication, employee engagement and board reporting), but there is also an inherent risk, particularly from executive boards, who tend to see things only in terms of numbers going up or down. The key to getting the most out of any metric is to see it as a signpost: an indicator of how your brand, product, call-centre team or individual agent is performing over time. The score at any given point is not as crucial as understanding why it has gone up or down. This is why every survey must include a verbatim question that allows the customer to describe their experience in their own words. NPS and Effort questions will tell you what is happening; only verbatim responses will tell you why, and what you need to do about it.
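For readers who want the arithmetic behind the 'one number', NPS is conventionally calculated from a 0–10 likelihood-to-recommend question: the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). A minimal sketch in Python (the figures are made-up for illustration, not client data):

```python
# Sketch: Net Promoter Score from 0-10 survey responses.
# Promoters score 9-10, passives 7-8, detractors 0-6.
# NPS = %promoters - %detractors, so it ranges from -100 to +100.

def nps(scores):
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# 10 illustrative responses: 5 promoters, 2 passives, 3 detractors
responses = [10, 9, 9, 8, 7, 6, 3, 10, 2, 9]
print(nps(responses))  # 20.0
```

Even this tiny example shows how much detail the single number hides: the passive 7 and the detractor 2 collapse into one figure, which is exactly why the verbatim question matters.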
Responding to trends that are not statistically significant - By the same token, when introducing a VoC survey to a new client, we often find that team leaders or agents become fixated on the numbers even when the movements are not statistically significant. An agent worrying that one of their last five responses did not give them the maximum score is an unnecessary distraction. To avoid this, we spend time bedding in new VoC programmes, training team leaders and ensuring staff understand and are engaged with what is happening.
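To make the sampling point concrete, here is a minimal sketch (not ServiceTick's methodology, just the textbook normal-approximation margin of error for a proportion) of why five responses tell an agent almost nothing:

```python
# Sketch: approximate 95% margin of error for a top-score rate,
# using the normal approximation for a proportion. Note the
# approximation is itself shaky at n = 5, which only reinforces
# how little five responses can tell you.
import math

def margin_of_error(p, n, z=1.96):
    # Half-width of the ~95% confidence interval for a
    # proportion p observed across n responses.
    return z * math.sqrt(p * (1 - p) / n)

# 4 of an agent's last 5 responses gave the top score (p = 0.8)
print(round(margin_of_error(0.8, 5), 2))    # 0.35 -> roughly 45% to 100%
print(round(margin_of_error(0.8, 500), 2))  # 0.04 -> far tighter with n = 500
```

With an uncertainty of plus or minus 35 percentage points, one sub-maximum score in five is noise, not a trend.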
Losing your audience – It’s vital to keep your surveys up to date and relevant. This may mean tweaking your questions occasionally to reflect changes in business strategy. It’s just as important to ensure that internal reporting – at both top-floor and shop-floor level – remains fresh and useful. Losing either of these audiences can sound the death knell for any programme. Communication, communication, communication.
Not taking action - Perhaps the most surprising pitfall is the one we have left till last. The only reason for running a VoC programme is to address whatever objective you have set (see the first pitfall above). If you don't take action on the feedback your customers are freely giving, you will never realise a return on investment. In 14 years I have never seen a VoC programme that did not throw up multiple opportunities to improve customer experience. But I have seen clients who did not act on the findings of their surveys.
Bill Bernbach, voted the most influential advertising executive of the 20th century, once said, "Logic and over-analysis can immobilise and sterilise an idea. It's like love - the more you analyse it the faster it disappears." So it is with VoC; its effect is only ever measurable by the action you take.