• 03:58:39 pm on February 2, 2009

    Surveys Done Right – Part 3 – Survey Best Practices

    OK, on to part 3 – best practices for surveys.

    I have been working on and off on this topic for a while, updating it and making sure it was still valid. This is the knowledge I have gained over years of doing surveys. Enjoy, and do let me know what you think of it.

    Know What You Want. Before crafting a survey, it is essential to know what information you need to get out of it. Most surveys measure customer satisfaction; others try to determine how to improve service delivery; still others gauge the effectiveness of a specific service or program. The most important thing to know is that each of these goals should be single and unique – one survey per goal, documented before you release it.

    Determine Effectiveness. Survey methods will be unique for each channel (for example, Web, e-mail, phone) and customer segment (geography/postal code, age, gender, buying/service patterns). Each customer segment is unique and will react to channels and survey methods differently. For example, a customer registering a newly installed personal computer may or may not have a readily available Internet connection during the registration process. A survey that requires customers to go online to complete an installation satisfaction survey may be an annoyance. However, the addition of a simple question, such as “Connected to the Internet?” or “Store survey results until next Internet logon?”, keeps the process moving. Effective surveys are delivered at the conclusion of a service interaction via the same channel as the service provided.

    Adhere to the “KISS” Principle. The keep it short and simple (KISS) principle applies to these surveys. Customers are more likely to complete a survey when the time to completion is explicitly displayed at the beginning of the process, the purpose of the survey is clearly stated and the use of the information is defined. For example: “To help us improve our service to you in the future, please participate in our satisfaction survey. The survey will take less than one minute, and no information specific to you will be shared or distributed to any third party.” We recommend three to seven questions per survey, one common topic, and short, succinct wording. Furthermore, to make it easier to answer and to tally responses, multiple-choice answers are recommended. Writing questions is a complex and iterative process, and the most critical step in the survey process is to test them. Before implementation, survey questions should be tested in focus groups, followed by a small pilot run measured against a real customer base for a short period.

    Ensure Consistent Gathering. Hand-picking the “right” respondents to achieve higher scores is a common problem when gathering responses. For example, enterprises that discard participant data because the customer appears upset, not in the right frame of mind, biased or not really friendly rob themselves of critical insight and end up with skewed results. Customers who are the least “friendly” or sympathetic are often the most honest in their responses and central to a better understanding of where service delivery may be failing. Survey response gathering should be done consistently, either across all subjects in the target group (if feasible) or by following a predetermined random algorithm applied to all customers (for example, every fourth visitor to a Web site or caller into a call center).
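The “every fourth customer” rule above can be sketched in a few lines. This is a minimal illustration, not part of the original post; the function name and the contact counter are hypothetical, and any fixed interval works as long as it is applied to everyone without exception:

```python
def should_survey(contact_index: int, interval: int = 4) -> bool:
    """Apply one predetermined rule to every contact: survey every
    `interval`-th customer, with no hand-picking of respondents."""
    if interval < 1:
        raise ValueError("interval must be >= 1")
    return contact_index % interval == 0

# Contacts 0, 4, 8, ... are surveyed regardless of mood, channel or tone.
selected = [i for i in range(12) if should_survey(i)]
```

The point of the sketch is that selection depends only on the counter, never on a judgment call about the customer, which is what keeps the sample unbiased.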

    Read the Answers. More than 60 percent of enterprises generate a single data point from a survey – one that usually supports their original view of the level of customer service satisfaction – and ignore an analysis of the possible implications of the data. A well-designed survey will reveal trends, patterns and outright new information that are valuable to improving the customer service operation. This information should be used as part of an EFM system (read “Finally, A Definition for EFM”) to better understand, know and serve customers.
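The difference between a single data point and an actual analysis can be made concrete. A minimal Python sketch, with made-up scores on a 1–5 scale (the data and period labels are illustrative, not from the post): grouping responses by period exposes a trend that one overall average would hide.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: (period, satisfaction score, 1-5 scale)
responses = [("2008-Q3", 4), ("2008-Q3", 3), ("2008-Q4", 5),
             ("2008-Q4", 4), ("2008-Q4", 2), ("2009-Q1", 5)]

by_quarter = defaultdict(list)
for quarter, score in responses:
    by_quarter[quarter].append(score)

# One average per period reveals a trend; a single overall number hides it.
trend = {q: round(mean(scores), 2) for q, scores in sorted(by_quarter.items())}
```

The same grouping idea extends to segments, channels or individual questions – anywhere a single roll-up number would flatter the original assumption.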

    Act on the Information. Even the best surveys, designed and executed with the best of intentions, are a waste of effort unless a set of specific actions is put in place to respond to customer input. Alas, the majority of organizations fail to change the customer service organization or workflows based on survey results. Conducting a survey just to know where things stand will upset customers who believe that the survey is intended to improve things. An enterprise that surveys customers but fails to act on the data will actually damage the relationship more than if it had never surveyed them at all. Even if the only purpose of the survey is to capture the status of customer satisfaction, customers should be given the opportunity to offer specific input on how they would like to see interactions improved. In all cases, the end result should be the same: Whatever needs to be fixed should be fixed, and whatever should be improved must be improved.

    Repeat It. Improvements to the customer experience happen through an iterative process. Responses captured once will be inadequate for determining a trend. If we don’t know whether customers are more satisfied today than last quarter, we are bound to lose customers, and the purpose of the survey will be lost. At periodic business cycles (such as quarterly), as determined by the business, the survey should be repeated, and all the above steps should be done again – including knowing what we want from the survey.

    Let Customers Know. Improvements to customer service as a result of customer feedback should be communicated back to the customers. Let them know that their input is taken seriously and used. Few organizations that conduct surveys do this, yet it is a critical part of customer experience management and of EFM strategies, and it should be done to encourage future participation from customers and to improve customer loyalty.

    So, what do you think? Am I missing something? Of course, the bottom line is that you should only do this as part of an EFM strategy… right?

     

Comments

  • Esteban Kolsky 10:11 am on January 29, 2019

    Reblogged this on thinkJar, the blog! and commented:
    part 3, four part series on doing surveys right, reblogging before it disappears, 10 years later

    if this needs updating, ping me. glad to chat.

