Customer Service Surveys – Bringing Sanity to the Survey

Nissan sent me a post-sale survey after I purchased a car. I answered the questions honestly: the sales rep did a great job, yet those servicing and supporting the sales rep dropped the ball, and I was not pleased. In the comment section, I was specific, praising my sales rep and identifying exactly who dropped the ball, where, and the issues it created. Not 30 minutes after completing the survey, I received a call from a senior director at the Nissan dealership who said that my sales rep was going to be fired and lose all his commissions for the month because the survey is solely his responsibility. The senior director went on to say, “This is an industry-wide practice and cannot be changed.”

I disagree!

As a business consultant, I have long fought against “Voice-of-the-Customer” surveys that measure things a customer service person, salesperson, or other front-line customer-facing employee does not control. If the customer-facing employee does not control all the facets that create a problem, then the survey should measure only what that employee can control. A car salesperson has a challenging job and relies upon a team to help close the deal, including a service department, a finance department, sales managers, and more. Yet all the blame for problems in the back office falls upon the sales rep.

When I traded my Kia Soul for a Nissan Juke, my salesperson at Peoria Nissan was excellent, and I wound up purchasing three vehicles from Peoria Nissan. I have since purchased a Rogue from Reliable Nissan and a Juke from Melloy Nissan, both in Albuquerque, NM. Tonna Yutze at Peoria Nissan is a great salesperson. Shawn Walker, at Reliable Nissan, is a young salesperson with great potential. Of all the cars I have purchased, these are the salespeople who made a sufficient impression that I remember their names long after the sale, which says more about the salesperson than any post-sale customer survey.

Quantitative data is useful but means nothing without proper context, support, purpose, and a properly designed survey analysis procedure. Even with all those tools in place, quantitative data can, at best, be misconstrued, confused, and convoluted by the researcher, the organization paying for the research, or the biases of those reading the research report.

Qualitative data is also useful, but the researcher’s bias plays a more active role in qualitative research, which otherwise suffers the same problems as quantitative research for many of the same reasons. Regardless, quantitative and qualitative data do not prove anything; they only support a conclusion. Hence, the human element remains the preeminent hinge upon which the data swings.

This leads to some questions that every business sending out a “Voice-of-the-Customer” survey instrument needs to investigate and answer continuously:

  1. Is the data being captured relevant, timely, and accurate?
  2. What is being measured by the survey instrument, and why?
  3. How is the information being used to improve upon that which is measured?
  4. Who benefits from the survey and why?
  5. Who is harmed by the survey, and why?

Even with all this taken into consideration, business leaders making decisions about “Voice-of-the-Customer” survey data need to understand that one person can make or break the service/sales chain, but it requires a team to support the customer-facing employee. Juran remarked, “When a problem exists, 90% of the time the solution is found in the processes, not the people.” Hence, when bad surveys come in, defend your people and check how your business is doing business, i.e., the processes.

The dynamics of “Voice-of-the-Customer” survey instruments require one more consideration: delivery. AT&T recently sent me a “Voice-of-the-Customer” survey via text message, collecting the barest of numerical (quantitative) data: three text messages, three data points, none of which got to the heart of the customer issue, and which barely rated the salesperson in the AT&T store I had previously visited. Recently, I received a call from Sprint, where the telemarketer wanted to know if I wanted to switch back to Sprint and why. The nasal voice, the rushed manner, and the disconnected mannerisms of the telemarketer left me with strong negative impressions, not of the telemarketer, but of Sprint. Nissan sends emails, and while the data collected captures aspects of the customer’s voice (qualitative) and numerical rankings (quantitative), my impression of Nissan has sunk over the use of the survey to fire hard-working sales professionals. My previous bank, Washington Mutual, had a good, not great, “Voice-of-the-Customer” survey process, but the customer service industry continues to make the same mistakes in survey delivery and application.

The how and why of “Voice-of-the-Customer” surveys, meaning the delivery and use of the survey data, leave a longer-lasting impression upon the customer than the survey itself. Thus, if you are a business leader who purchased an off-the-shelf “Voice-of-the-Customer” survey analytics package and cannot explain the how, why, when, where, what, and who questions in an elevator, the problem is not with the customer-facing employees doing your bidding. If the back-office people supporting the customer-facing people are not being measured and held accountable, then the survey is disingenuous at best and unethical at worst.

I recommend the following as methods to improve the “Voice-of-the-Customer” survey process:

  1. All business leaders using the customer service survey data must be able to answer the why, what, when, who, and how questions clearly to all who ask about the survey tool.
  2. If the survey has sections for the customer-facing employee mixed with other questions about the process, the customer-facing employee’s responsibility begins and ends with the specific questions about that employee’s performance. You cannot hold a customer-facing employee responsible for broken back-office processes!
  3. Revise, review, and research the data collected. Ask hard questions of those designing the survey. Know the answers and practice responding to those asking questions.
  4. Get the customer-facing employees involved in deciding what needs to be measured in a survey, and have them inform and answer the why. You will be surprised at what is discovered.
  5. Use the data to build, teach, train, coach, and mentor customer-facing employees, not to fire them. Use data to support your people, not to destroy them.
  6. If you cannot explain a process or procedure in an elevator speech, the process is too complicated. No matter what the process is, use the elevator as a tool to simplify your business organization, processes, procedures, and tools.
  7. Be a customer! As a customer, ask tough questions, drive for answers that work, and if the customer-facing employee is struggling, train through the chain of command!

Never forget, the value of the “Voice-of-the-Customer” survey is found in actionable data: improving cohesion between the front and back office, informing training talking points, and returning a customer to your business. Anything else promised is smoke and mirrors, a fake, a fallacy, and a sales ruse.

© 2019 M. Dave Salisbury

All Rights Reserved

The images used herein were obtained in the public domain; this author holds no copyright to the images displayed.

 


Published by

msalis1

Dual-service military veteran. Holds an MBA in Global Management and a Master’s degree in Adult Education and Training. Pursuing a PhD in Industrial and Organizational Psychology. Business professional with depth of experience in logistics, supply chain management, and call centers.
