
Many companies have had a formal customer experience program in place for many years. They have accumulated thousands, and in some cases hundreds of thousands, of interviews. Employees have completed training, and new technologies have been implemented to improve the customer experience. All of our long-time clients now deliver a customer experience that is superior to what it was in the past. As I work with these clients, I have had several discussions about how to keep their programs both fresh and effective. I want to outline some suggestions that can help keep the feedback component of your program alive and well:

  • Review your questions. Are you asking the ones most relevant to your outcome measures? One way to determine relevancy is to run a simple correlation between your outcome measures (e.g., NPS, Overall Satisfaction) and the other questions on the survey, as sketched in the first code example after this list. Which questions have the strongest correlation? Which the weakest? I suggest eliminating those with weak correlation because they are not helping you. This process may lead to a shorter survey, which is always important. If none of the questions correlate strongly with your outcome measures, you need to start over with your questionnaire.
  • Are you missing input from certain customers or customer groups? If these customers have opted out of the survey process, can you bring them back by offering a different type of survey (relationship versus transactional) or by surveying them in a different way (phone versus email)? Customers who are given a choice in how they participate in a feedback process are more likely to respond in the first place and to keep responding.
  • Consider different survey approaches. If you are using exclusively phone or exclusively digital, consider mixing your options or perhaps exchanging one for the other altogether. For example, we have traditionally conducted phone surveys, but we are beginning to blend phone and email. One caveat is that the results may vary depending on the survey modality chosen.
  • Connect survey results with business outcomes. Anecdotes are fine, but blend in some hard analysis. For example, one of our clients compared their more loyal customers with their less loyal customers on measures such as revenue, length of time with the brand, and warranty repairs. In each case, they found that customers who gave a 9 or 10 on the loyalty metric had greater sales, longer tenure with the brand, and lower warranty costs (a comparison of this kind is sketched in the second code example after this list). This research gave their dealers a strong case for redoubling their efforts to improve the customer experience.
  • Always “market” your improvements to your customers. If customers are to stay engaged with the program, they need feedback about what you have done to make things better for them. I recommend that our B-to-B clients identify, every six months, one to three things they have implemented to improve the customer experience. Include these things in communications with customers (e.g., invoices) and certainly on your website. Customers may notice the improvements in your service on their own, but it never hurts to remind them that the improvements were intentional.
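
If your survey responses live in a spreadsheet or database, the question review described above can be automated. The sketch below uses Python and pandas; the column names ("nps", "q_ease_of_use", and so on) and the sample data are hypothetical placeholders rather than a prescribed schema, so adapt them to however your survey data is actually stored.

```python
import pandas as pd

def rank_questions_by_correlation(df: pd.DataFrame, outcome: str = "nps") -> pd.Series:
    """Correlate every survey question with the outcome measure and return
    the questions ranked from strongest to weakest relationship."""
    question_cols = [c for c in df.columns if c != outcome]
    correlations = df[question_cols].corrwith(df[outcome])
    return correlations.abs().sort_values(ascending=False)

# Made-up responses: three driver questions plus an NPS outcome (0-10 scales).
responses = pd.DataFrame({
    "nps":           [9, 10, 6, 8, 3, 7, 10, 5],
    "q_ease_of_use": [8, 9, 5, 8, 2, 6, 9, 4],
    "q_wait_time":   [7, 9, 6, 7, 3, 7, 10, 5],
    "q_brand_ads":   [5, 6, 5, 7, 6, 5, 6, 7],
})

# Questions near the bottom of this ranking are candidates for removal.
print(rank_questions_by_correlation(responses, outcome="nps"))
```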
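
Similarly, the loyal-versus-less-loyal comparison mentioned under "Connect survey results with business outcomes" can be run with a simple group-by once loyalty scores are joined to business data. Again, this is only a sketch under assumed column names ("loyalty_score", "annual_revenue", "years_with_brand", "warranty_cost") and made-up figures; your own systems will use different fields.

```python
import pandas as pd

def compare_loyalty_segments(df: pd.DataFrame) -> pd.DataFrame:
    """Split customers into a loyal segment (9 or 10 on the loyalty metric)
    and everyone else, then compare average business outcomes per segment."""
    segment = df["loyalty_score"].ge(9).map(
        {True: "loyal (9-10)", False: "less loyal (0-8)"}
    )
    outcomes = ["annual_revenue", "years_with_brand", "warranty_cost"]
    return df.assign(segment=segment).groupby("segment")[outcomes].mean().round(2)

# Made-up customer records joined from survey and financial systems.
customers = pd.DataFrame({
    "loyalty_score":    [10, 9, 7, 4, 10, 6, 8, 9],
    "annual_revenue":   [120_000, 95_000, 60_000, 40_000, 110_000, 55_000, 70_000, 90_000],
    "years_with_brand": [8, 6, 3, 2, 9, 4, 5, 7],
    "warranty_cost":    [500, 800, 1500, 2200, 400, 1800, 1200, 700],
})

print(compare_loyalty_segments(customers))
```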

I want to emphasize that the feedback or customer listening part of your program is only one piece of improving the customer experience. My suggestions are aimed at a very important part of that effort: how you listen to and respond to customer feedback. Keeping a customer experience program vital requires a willingness to continually change and to learn from what is working and what isn’t. If the questions are not working, change them. If the survey method is not working, mix it up. Most importantly, if customers provide you with good ideas on how to improve the customer experience, act on them. When customers and employees see improvements, the loop between customer feedback and action becomes a vital, living improvement process in your company.
