• 06:14:49 pm on January 15, 2009

    Surveys Done Right – Part 1 – Point-of-Service Surveys

    I am going to break some very old rules of mine to write this post. Ever since I introduced the three-layer model for surveys while at Gartner (point-of-service, customer-satisfaction, planning) I have been getting requests for “sample questions”. I have maintained, and continue to maintain, that I cannot provide sample questions, since all questions need to be created according to the situation, the respondent base, the strategy and vision for your feedback initiative, and the standards and rules you set for your surveys. Of course, they also have to be personalized to the respondent and the situation, and written to match the delivery and collection channel. This is as basic as it gets when writing surveys. My concern is that when someone gets “sample questions” they become “THE questions” without further tinkering, and that is just wrong.

    So, the counterpoint to that is that I have seen the concept of point-of-service surveys implemented with some truly horrendous ideas. I have experienced “short” surveys of 10 questions asking all sorts of things, and questions so badly written that they are almost impossible to answer. Thus, as a public service (yes, I know I am a selfless philanthropist when it comes to surveys) I am going to break the rule and make this post about two things: a reminder of how point-of-service surveys should be done, and a set of sample questions (which I will regret for a long time, and possibly my grand-kids will as well).

    First, how does this work? Point-of-service surveys (also called point-of-delivery surveys) are SHORT (yes, that needs to be shouted): two to three questions aimed at discovering the efficacy (not the efficiency) of the service interaction. In other words: did we do a good job delivering service, and was it what you needed? The intent is to spot any problems during delivery and fix them before they become customer service issues or lead to dissatisfied customers. Simple, huh?

    Now, the main point of doing this is preventing service issues from becoming problems. Thus, the critical part is not doing the survey, but actually having processes in place to reach out to customers and fix their problems when either of these questions returns a negative answer. This is where most companies fail: they don’t have documented, specific processes in place to take care of negative answers quickly (yes, speed matters). The reason I am bringing this up: even if you copy the questions from the bottom of this post, please, please, please make sure you have the necessary processes in place before running these surveys.
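
    Since the process matters more than the survey itself, here is a minimal sketch of the triage step, in Python. Everything in it (the SurveyResponse shape, the open_follow_up_task helper, the four-hour SLA) is a hypothetical illustration to make the idea concrete, not a prescription:

        from dataclasses import dataclass
        from datetime import datetime, timedelta

        @dataclass
        class SurveyResponse:
            customer_id: str
            interaction_id: str
            answers: dict          # e.g. {"q1": True, "q2": False}; False = negative
            received_at: datetime

        # Assumption: pick whatever deadline matches "speed matters" for your business.
        FOLLOW_UP_SLA = timedelta(hours=4)

        def triage(response: SurveyResponse) -> None:
            """Route any negative answer into a documented follow-up process."""
            negatives = [q for q, ok in response.answers.items() if not ok]
            if not negatives:
                return  # nothing to fix; keep the data for trend reporting
            open_follow_up_task(
                customer_id=response.customer_id,
                interaction_id=response.interaction_id,
                failed_questions=negatives,
                due_by=response.received_at + FOLLOW_UP_SLA,
            )

        def open_follow_up_task(**details) -> None:
            # Placeholder: in practice this would create a case in your CRM or
            # ticketing system and notify the owning team.
            print("follow-up task:", details)

    The code is trivial on purpose: the hard part is organizational (who owns the follow-up, and how quickly they must act), not technical.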

    Final point I want to make, then we move on to the actual questions. Channel of delivery matters. If at all possible, keep the survey in the same channel where the transaction took place, and have it follow the interaction immediately. If the customer called, make it an IVR-driven survey right after the call (no, don’t place another call to follow up… it does not work that way). Email came in? Email goes out (as quickly as possible, not 2-3 days later). If you cannot make the channel of service also the channel of survey delivery (or you cannot survey immediately after the transaction), then your best bet is email surveys. No, not emails with links to online surveys; actual email surveys, where the questions sit inside the email and the customer can answer simply and quickly. OK, getting off the soap box now (yes, I am passionate about this “stuff” being done well).
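
    To make the embedded-email idea concrete, here is a minimal sketch using nothing but Python’s standard library. The survey endpoint, token, and addresses are hypothetical (one way to do it, not the only way): each answer is a plain link that carries the response, so a single click in the mail client records it, and a plain-text fallback link covers readers that don’t render HTML.

        from email.message import EmailMessage

        # Hypothetical endpoint: one GET request per click records the answer.
        SURVEY_URL = "https://example.com/survey"

        def build_survey_email(to_addr: str, token: str) -> EmailMessage:
            def link(question: str, answer: str) -> str:
                return (f'<a href="{SURVEY_URL}?t={token}&{question}={answer}">'
                        f'{answer.title()}</a>')

            msg = EmailMessage()
            msg["Subject"] = "Two quick questions about your recent request"
            msg["From"] = "feedback@example.com"
            msg["To"] = to_addr
            # Plain-text fallback for clients that do not render HTML.
            msg.set_content(
                "Two quick questions about your recent request.\n"
                "If this email does not display properly, take the survey here: "
                f"{SURVEY_URL}?t={token}\n"
            )
            # HTML part: the questions live inside the email; one click answers.
            msg.add_alternative(
                "<p>Did you receive the answer you needed? "
                f"{link('q1', 'yes')} / {link('q1', 'no')}</p>"
                "<p>Did we do a good job delivering the answer? "
                f"{link('q2', 'yes')} / {link('q2', 'no')}</p>",
                subtype="html",
            )
            return msg

        # Usage: send as soon as the interaction closes, not days later.
        # import smtplib
        # with smtplib.SMTP("smtp.example.com") as s:
        #     s.send_message(build_survey_email("customer@example.com", "abc123"))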

    CAVEAT: I know I said this before, but please, please, pleaseeeeee customize these for your situation and personalize them. Please?

    Question 1 (this one should never change): Did you receive the answer you needed?

    Question 2 (choose one from the four below based on what else you need to measure):

    Q2.1: Did we do a good job delivering the answer? (my favorite, but a little broad in meaning)

    Q2.2: Was our service cordial and polite? (in other words, who needs some training or talking to)

    Q2.3: Was our representative knowledgeable? (again, training or knowledge management issues)

    Q2.4: Was our representative prompt to answer your questions? (in other words, are we responding fast enough?)

    You get the idea: depending on which part of the interaction is critical, you can change the second question.
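
    If you drive these surveys from software, the fixed first question and the swappable second question are easy to express as configuration. A minimal sketch (structure and names are mine, purely illustrative):

        # Question 1 never changes; question 2 is picked by whatever part of
        # the interaction you need to measure.
        QUESTION_1 = "Did you receive the answer you needed?"

        QUESTION_2_OPTIONS = {
            "delivery": "Did we do a good job delivering the answer?",
            "courtesy": "Was our service cordial and polite?",
            "knowledge": "Was our representative knowledgeable?",
            "promptness": "Was our representative prompt to answer your questions?",
        }

        def build_survey(focus: str) -> list:
            """Return the two-question point-of-service survey for a given focus."""
            return [QUESTION_1, QUESTION_2_OPTIONS[focus]]

        # Example: build_survey("courtesy")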

    So, please don’t let me down. Customize, personalize, and (most important) let me know how it goes…

     

Comments

  • Beth W 11:22 pm on January 15, 2009

    Thought these ideas were great and will implement the same in our future surveys. Thank you for the succinct information.

    • Esteban Kolsky 4:06 am on January 16, 2009

      Thanks for stopping by… please stay tuned for the next article in this series (it will be out Monday). It will be about the customer satisfaction survey.

  • M Sarasti 10:52 pm on January 16, 2009

    Thanks Esteban! We met briefly at a Gartner conference in ’07, and, having discovered your blog this week, I’m very excited to continue the conversation here on this page!

    I did have a comment/question regarding “Q2.3:Was our representative knowledgeable and prompt to answer?”…

    As far as question design goes… shouldn’t we separate “was our representative knowledgeable?” from “was our representative prompt to answer”?

    As far as being able to pinpoint the problem (knowledge or speed), it would be difficult to determine from the way the question is worded. Granted, it’s likely these two components are related… nonetheless, shouldn’t we make sure we’re only asking about one thing so as not to confuse the respondent?

    Again, thanks for this amazing resource! It made me very happy to find that these discussions and ideas have a home…

    • Esteban Kolsky 12:24 am on January 19, 2009

      I do remember you, and it is very nice of you to stop by.

      You are correct in the description of how the question should be phrased. Unfortunately I pushed the publish button one second too soon, and I have already fixed it. Thanks for catching it!

      Looking forward to seeing you back soon… and feel free to drop me an email should you need anything else. My email is ekolsky [at] evergance [dot] com.

  • Rune 10:15 pm on January 21, 2009

    I agree that the concept of embedded e-mail surveys is a great idea instead of having an invitation e-mail with a link. As far as I know, though, there are quite a few technical issues with embedded surveys. I know that you embed a Google survey in e-mails. Do you have any data on e-mail clients and how well this works in practice?

    Thanks for a great post. Looking forward to the next in the series.

    • Esteban Kolsky 2:18 am on January 25, 2009

      Rune,

      That is an excellent question, and I do have something further to add. First, the only problem is with clients that don’t render HTML (about 30% of mail readers out there don’t enable HTML when reading email), and you can always include one of those “if you cannot read this email properly, take the survey here” links as a plain-text fallback in the email. In addition, the response rate for inline email surveys is about 3-5% higher than for emails with a link to a web site. Besides, for two simple questions it is better, simpler, and will yield better responses if you just do it via email and not via the web.

      If you need some more info, please let me know.

      Thanks for reading…

  • Surveys Done Right - Part 2 - Customer Satisfaction « eVergance Blog 2:05 am on January 25, 2009

    […] have been dreading writing this entry since I came up with the idea for the series (have you read part 1 yet?).  It is not that I don’t know what to say, or that I don’t want to do it.  It […]

  • Surveys Done Right - Part 4 - EFM Best Practices « eVergance Blog 7:08 am on February 10, 2009

    […] so far have covered point-of-service surveys, customer-satisfaction surveys, and best-practices for surveys.  On to the best way to implement […]

  • Surveys Done Right, Part 2 – Customer Satisfaction @ crm intelligence & strategy 8:30 am on August 3, 2010

    […] have been dreading writing this entry since I came up with the idea for the series (have you read part 1 yet?).  It is not that I don’t know what to say, or that I don’t want to do it.  It is simply […]

  • Esteban Kolsky 10:09 am on January 29, 2019

    Reblogged this on thinkJar, the blog! and commented:
    10 years later, and saving the content before it disappears.

    A four-part series on surveys (back when I was deep into this), with interesting ideas that still apply.

    Ping me if you want to chat about this.

