FutureNow Post
Tuesday, Feb. 17, 2009 at 5:28 pm

The Really Missing “Online Voice of Customer” Manual (Part 1)

By Bryan Eisenberg

Yesterday, I posted the Missing Google Analytics Manual. That was relatively easy to put together, since there are so many wonderful resources already written about it. As I tried to put together this post, however, I realized there is a real gap in the available knowledge base. I’ll be posting this as an ongoing series that I might turn into a best-practices whitepaper.

The “Voice of the Customer” (VOC) can be obtained in many ways: surveys, reviews, customer requests, interviews, focus groups, field reports, etc. To find the golden nuggets that can lead to improvement, you need to start with the segment of customers that likes you the least.

In the first case, those who like you the least are also the most likely to be biased in their observations; there are, of course, people who actively dislike your company, products, or services, but the stronger they feel, the less likely they are to respond to a VOC appeal anyway. Instead, those responding to a VOC appeal tend to be people who merely “unlike” you: quasi-neutral visitors, perhaps ticked off at some silly thing you’ve done, with a mild dislike at worst. They will also tend to warm toward you as soon as you make a genuine attempt to listen, because you’ve given them a chance to vent, especially about things you’re blind to. So VOC surveys that are meant to reinforce what you already think or feel, without a mechanism for surfacing genuinely painful insights into your systems’ flaws, tend not to yield much benefit.

Voice of customer programs have picked up in adoption over the last couple of years, especially in the last twelve months, with free options like 4Q, Kampyle, and GetSatisfaction. Of course, you can also use tools like SurveyMonkey, Zoomerang, and others to launch surveys. Paid options include ForeSee, iPerceptions, and OpinionLab.

Since there are so many tools already available, I am going to ignore issues involving “setup” of a VOC solution, and start instead with an exploration of various invitation-to-participate options.

Option 1: Intercept on Arrival – This approach engages visitors before they interact with your website and form any expectations. It is typically launched for a random sample of visitors (although I still see too many sites showing it to 100% of their visitors – and that is a bad practice), presenting an invitation to provide feedback after they finish their experience on the website. A segment of your audience can find this intrusive, especially repeat visitors if the survey invitation keeps popping up (because the cookies that were set are not found again). 4Q works this way, and 4Q’s research has shown an increase in conversion and brand impression from using VOC with this method. That seems to indicate that a website that listens to its audience instills greater trust in the brand, or at least that it gains more from the listening than it loses to the additional irritation of the “in-your-face” interception.
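The random-sample-plus-cookie mechanics described above can be sketched in a few lines of browser JavaScript. This is a minimal illustration only; the sample rate, cookie name, and function names are hypothetical, not taken from any particular vendor’s tool:

```javascript
// Hedged sketch of Option 1: intercept-on-arrival with a random sample
// and a cookie so repeat visitors are not re-invited. All names illustrative.

const SAMPLE_RATE = 0.05;          // invite roughly 5% of arrivals, never 100%
const COOKIE_NAME = "voc_invited"; // hypothetical cookie name

// Decide whether to show the invitation. `cookies` is a plain object of the
// visitor's cookies; `rand` is injectable so the decision can be tested.
function shouldInvite(cookies, rand = Math.random) {
  if (cookies[COOKIE_NAME]) return false; // already invited; don't nag again
  return rand() < SAMPLE_RATE;            // random sample of new arrivals
}

// In a real page you would then persist the cookie with a long expiry
// (so the invitation doesn't keep popping up) before launching the overlay:
//   document.cookie = COOKIE_NAME + "=1; max-age=31536000; path=/";
```

Injecting the random source keeps the decision testable, and the cookie’s long expiry is exactly what prevents the repeat-visitor annoyance noted above.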

Option 2: Intercept on Action/Behavior – This approach invites visitors to engage with a survey based on their previous actions on the website; an example would be launching a survey when someone abandons a shopping cart. These surveys are insightful only for that limited task, not for your audience as a whole, but they may provide tactical, actionable recommendations for resolving particular task issues. This can also feel intrusive, and if someone is already dissatisfied with a brand interaction, popping up a survey may feel like rubbing salt into the wound. When it’s done, it has to be implemented with the lightest of touches.
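As a rough sketch of how such a behavior-based trigger might be gated (every name here is illustrative; real survey tools wire this up for you):

```javascript
// Hedged sketch of Option 2: only intercept when there is an abandoned
// task to ask about, and only once per visitor (the "lightest of touches").

function shouldAskOnExit(session) {
  return session.cartItems > 0 &&   // something was in the cart…
         !session.orderCompleted && // …but checkout never finished…
         !session.surveyShown;      // …and we haven't already asked.
}

// Wiring this into a page might look like (illustrative only):
//   window.addEventListener("beforeunload", () => {
//     if (shouldAskOnExit(session)) launchSurvey("cart-abandonment");
//   });
```

Keeping the eligibility check in one pure function makes it easy to tighten the conditions later, for example to suppress the survey for visitors who just saw one on a previous visit.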

Option 3: Passive – While a passive invitation is non-intrusive to the customer experience, it skews toward those who actively want to provide feedback. You will often find it as an embedded link, a widget in the corner of a page, etc. People who have had a negative experience tend to be the ones who seek out these feedback mechanisms, which are usually used to deal with tactical issues that occur at the page level. Those who have had positive experiences tend not to leave as much feedback with this method, which often causes us to misjudge the size and scope of an issue. Response rates tend to be lowest in this format. There is also a sample bias toward buying modality, insofar as Humanistic personas will make up a larger proportion of this response group than of your general audience.

Each of the above options is viable; they are all worthwhile tools for the right job. The important thing to remember is that web analytics is meant to show us what has happened, while VOC is intended to help illuminate why. This is why it is important to tie analytics and VOC tools together. Voice of customer is driven by the need to have actual customer feedback woven into your future customer interactions. You can also collect feedback based on what people click and interact with on your website (think buttons in a Flash demo, filling in a calculator, etc.).

The reason we want to collect this information is because we want our customers to have greater satisfaction, an improved experience, and a visit where they achieve what they came to accomplish. The insights provided by VOC should help us in our continuous improvement efforts by helping us align our goals with the customers’ goals and identifying possible friction points.

Voice of the Customer programs are meant to capture the open-ended dialog, because that is where we often see the deeper insights. Like every analytics approach, you gain the most when you can segment by areas that you have already identified as potential weak points through the use of other analytics metrics or usability studies.

In fact, 4Q has just released some segmentation features for its free survey tool and plans to add more (full disclosure: I am an advisor to iPerceptions, the company behind 4Q). You can also do this by offering a segment-specific survey at a given point in the visitor’s experience, using one of the approaches outlined earlier in this article, and observing what possible solutions to the problem may be uncovered.

Next post I’ll cover research design and what kind of questions are best to ask.

Add Your Comments

Comments (25)

  1. Any hints on tools for number 2 and 3? I’m particularly interested in #2, but not sure what is out there that supports this.

  2. Bryan – great summary. I have been trying to get a CEO to stump up the cash for Voice of Customer (VOC), as I see it as vital. Too many unknown unknowns without it – we basically don’t know anything about visitor satisfaction, and yet spend heaps of cash trying to improve engagement with new features. 4Q is really good and I run it on a number of sites, but the POV questions are not appropriate to the Web 2.0 virtual world/gaming site I am currently working on. Still, the advanced segmentation functionality is a great addition. How I would love to be able to afford full iPerceptions, ForeSee or OpinionLab. Perhaps one day.

  3. Great post Bryan. I am 100% with you that an encompassing guide to VOC is badly needed. And I can’t think of a better person to write one than you ;-)

    One thing you wrote that called my attention that I do not agree completely:

    “Rather, those responding to a VOC appeal would be those who “unlike” you: they’ll be quasi-neutral, perhaps ticked-off at some silly thing you’ve done, a mild dislike at worst.”

    I am not sure that those are the people most willing to help you. Based on my personal experience, I feel like I can spend hours helping Amazon, or Google, or LinkedIn… sites I am addicted to. I think this is mainly because I know that any improvement I manage to push into their products might improve my life and save me time in the future. So I guess there is an important factor of ‘how much I use this site’, not only ‘how bad/good was my experience’. IMHO it is not about the customer helping the website; it is about the customer helping himself save some time the next time he comes back.

  4. You mention that, as a way to act on visitor behavior onsite, one possible theory might be to have a survey pop up when someone abandons a cart. This is a great idea in theory, but what will actually prompt users to complete the survey? If they abandoned the cart, they’re probably annoyed for some reason, and it seems like they might just get more annoyed with things popping up at them when they’re trying to exit the site. For this to work, there would need to be some kind of incentive. Even though it promotes greater customer usability in the long run, if a visitor doesn’t see any advantage in completing a survey right now, they probably won’t bother to do it. Any ideas?

  5. We’ve tried all three methods above, with varying levels of success. I don’t want to comment directly on a particular product (it may have changed since our implementation), but we’ve had more success with launching surveys on a particular action, on site exit, and on site entry (meaning our completion rate was between 2-5% of the total population the survey was exposed to) than we have with the passive “click here to offer feedback” method.

    Obviously there are good reasons to run a survey using each of the methods (you don’t want an entry study to ask questions about their experience on the site).

    We currently use Keynote for our survey tool. It can be used for all three methods described above. While it has its quirks, its flexibility in designing questions lets one use the survey as more than a blunt “metrics”-type tool.

    I may be wrong, but the tools mentioned in Bryan’s article are all pre-configured with a set of questions (some have limited flexibility for changing the questions, but not the type of question). These tools are great for general VoC-as-metrics initiatives, but limiting for more focused efforts.

    We’ve needed to ask multi-select, ranking, single-choice, and open-ended questions, as well as to create different paths through the survey based on the response to a particular question.

    To Rachel’s question: we’ve almost never provided an incentive for a survey designed to launch at random (meaning it was not sent to a specific group of people). While it is true that someone may be cranked off enough with the cart experience to ignore the survey, chances are they will fill it out. Or at least enough people will fill it out to go forward with A/B testing.

    Make the invitation to the survey welcoming and humble: “We see that you’ve left our shopping cart before completing your order. We’d like to hear from you about your experience so that we can continue to improve our site. Please click “continue” to answer a few questions (2-4 minutes).”

    Think of it this way: the person already navigated your site, selected a product, and added it to a checkout cart – they are engaged! It’s like waiting in line at the store, then stuffing the item in the impulse aisle and high-tailing it out of the store.

    This is a great topic Bryan, will you provide any thoughts on how metrics and VoC can work together?

  6. To clarify what Daniel was asking, this is really meant to focus on feedback and surveys and not reviews. I’ll cover reviews in another post.

    David – I do plan on talking about how VoC and metrics can work together.

  7. [...] Customer” (VOC), as Bryan Eisenberg points out there are at least 2 other methods including passive and action/behavior triggers that are worth exploring and [...]

  8. I used to want to stay away from hearing negative comments from my customers, but that is where you get the most information for improving your site or service. Hearing “great job” or “excellent service” sounds great and can boost your ego and self esteem, but it does nothing for finding real problems. Face the hard reality!

  10. Surveys on site exit, from what I hear, are pretty powerful. While they do somewhat tick off visitors as they leave, they don’t do much harm, because those visitors were already leaving anyway. So even if 1 in 1,000 fills out the survey, that’s information we didn’t have before.

  12. Great site information, Bryan!

    YourNetBiz

  13. [...] Sahlin shared The Really Missing "Online Voice of Customer" Manual (Part 1 … — 16:14 via [...]

  14. Great marketing information! I’ll definitely use this to determine the voice of my customers.

  15. Very interesting and insightful!

  16. Wonderful article which raises a lot of points!

  17. Very insightful, I’m gonna bookmark it! Thank you Bryan!

  19. Some good food for thought – something we need to think about to improve our website.



Bryan Eisenberg, founder of FutureNow, is a professional marketing speaker and the co-author of the New York Times and Wall Street Journal bestselling books Call to Action, Waiting For Your Cat to Bark, and Always Be Testing. You can friend him on Facebook or Twitter.

More articles from Bryan Eisenberg
