Everybody asks: How to calculate self-service success?

I have written a lot about the decline of self-service success, based on SSPA benchmark data, and one of the most common inquiries I receive is how to track this metric.

[Chart: Self-Service Success 2003-2009]

The truth is, accurate tracking of self-service success isn’t easy, and it doesn’t help when knowledgebase vendors make ridiculous claims like “80% of consumers who access one of our self-service sites find what they need.”  How do they figure that?  I’ll tell you: anyone who views content and then leaves the website is counted as a successful visit, even if they left because they were disgusted by the poor quality content. If this is how you calculate self-service effectiveness, your numbers are meaningless.

Here is how I recommend figuring self-service success, and how SSPA members calculate the metric to enter in our benchmark.  There are three approaches, each more detailed than the last:

  1. The easiest and most direct way is to give customers a prompt on every knowledgebase article that says, “This article solved my problem. Yes/No.”  The problem with this approach is response rate. My research tells me that the average response rate for these prompts is under 5%, and some members tell me that their response rate for article prompts is less than 1%.  If you can’t capture enough responses to have a good sample size, go to step 2.
  2. The next approach is to send a survey to every customer who accesses your self-service site asking whether they were successful.  The response rate for these surveys can average as high as 30% (according to members), though the current benchmark average response rate for self-service experience surveys is only 7%.  If you still need more details, go to step 3.
  3. The most detailed approach is to use cross-application reporting to see how many customers who accessed self-service had an assisted support interaction afterward.  In other words, did any of the self-service customers call or email you, or create a support incident online, within 24 hours of the self-service attempt? If so, these can be counted as unsuccessful attempts. (A minimal sketch of this correlation follows the list.)
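
To make step 3 concrete, here is a minimal sketch of the correlation in Python. It assumes you can export self-service visits and assisted-support contacts as (customer, timestamp) records; the record layout and names here are hypothetical, and the window is a parameter so the 48-hour variant mentioned below is a one-argument change:

    from datetime import timedelta

    def self_service_success_rate(visits, contacts, window_hours=24):
        # visits:   list of (customer_id, visit_time) for self-service sessions
        # contacts: list of (customer_id, contact_time) for calls, emails,
        #           and support incidents created online
        window = timedelta(hours=window_hours)

        # Index assisted-support contacts by customer for quick lookup.
        contacts_by_customer = {}
        for customer_id, t in contacts:
            contacts_by_customer.setdefault(customer_id, []).append(t)

        # A visit counts as unsuccessful if the same customer makes an
        # assisted contact within the window after the self-service attempt.
        successes = 0
        for customer_id, visit_time in visits:
            followups = contacts_by_customer.get(customer_id, [])
            if not any(visit_time <= t <= visit_time + window for t in followups):
                successes += 1

        return successes / len(visits) if visits else 0.0

Calling the same function with window_hours=48 gives the stricter variant some companies prefer.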

There are obviously challenges. For consumer companies, which don’t require a logon to access self-service, steps 2 and 3 may not be possible. Other companies think the ’24 hour’ rule in step 3 isn’t long enough, and look at the following 48 hours instead. In other words, your mileage may vary.

How do you calculate self-service success?  If there are other accurate approaches, I’d love to know! As we speak, Michael Israel, our field service research expert, is working on the overhaul of the SSPA benchmark questionnaire, including beefing up many of the definitions of metrics and how to calculate them. This would be a great time to identify any emerging best practices for calculating self-service effectiveness.  If you have any thoughts, please add a comment or drop me an email.  And as always, thanks for reading!


25 Comments on “Everybody asks: How to calculate self-service success?”

  1. Nikhil Govindaraj Says:

    Great post, John. No magic bullet here, I’m afraid. If you are like me, you often go to a web site just to look for that 800 number. If self-service is available as an option I would definitely try it. What would be really great is if we could correlate that phone call with the self-service attempt that preceded it. Maybe a unique code that fast-forwards me through the queue and rewards me with a lower wait time 🙂

  2. jragsdale Says:

    I love the idea of a priority code! It would be easier for click-to-chat than for phone calls.

  3. David Kay Says:

    (1) I know Intuit has experimented with the priority code for self-service users in its Accounting Professionals Division — it might be worthwhile talking with them about their experiences.

    (2) I cover this topic as part of an article coming out this week in the SSPA publication — stay tuned! But I agree generally with your comments.

    (3) The problem with the document surveys isn’t sample size per se — big companies get a statistically significant number, even at a tiny percentage. The problem is, it’s a biased sample — it only counts people who decided to take the time to answer, and they don’t seem to represent the population well.

    (4) Amen re: cross-system reporting and integration. We shouldn’t have to tell anyone we’re opening a case after self-service, or a community thread view…

    (5) Increasingly, at least for companies that don’t absolutely require a login, Google and Bing *are* the de facto self-service portal. And don’t searchable community posts fit into web success rates? Calculating web success and call deflection in THIS world seems increasingly important, and even harder than in the traditional web portal world. This is my current obsession — first hints are in the last line of the article to be published by SSPA this week 🙂

    Best,
    David

  4. Haim Toeg Says:

    John – this is an interesting topic and I know many have struggled with it. There are a few points I would like to make. When measuring an on-going initiative like self-service, the key measurement should not be success, which is binary and hard to define, let alone quantify. Rather, it is probably better to discuss improvement – it is a continuous goal, much easier to quantify, and hence gives us the ability to reward successful contributors through pay and recognition.

    As far as the metrics you suggest, it is not surprising that the response rate for item 1 on your list is so low. Usually, and especially in the enterprise world, we are asking customers to do some further work before they can try the resolution and verify it works. A good amount of time may pass before a final conclusion is reached. Similarly, point 2 may be premature and produce poor results; I am surprised that you quote a 30% response rate, as my gut feeling would have been much lower. Last, the correlation method can be used to judge failure of the self-help system, but it would be much more difficult to deduce the success of the system from it. Given that, I do not have better metrics to add, but I would think that repeat visits indicate users find the service valuable; that is why they return.

    Last, we can’t really talk about self service without bringing the community support offering into the picture. Ultimately, they represent two adjacent segments of the customer support experience continuum.

    In “Collective Wisdom” Francoise Tourniaire and David Kay dedicate quite a few pages to the myths surrounding self service and discuss at length different methods for measuring it and estimating its usefulness. I highly recommend that anyone seriously involved with such a project get this book and study it thoroughly.

    • David Kay Says:

      Haim – I couldn’t agree more about continuous improvement being easier to measure and (ultimately) more important than absolute success. I mean, it’s not like we’re going to pull the plug on self-service to reduce the power bill, right? For most of us, the question is: what’s the incremental benefit of incremental improvement? Or, to put it more bluntly, if you have a headcount to fill, is it better for you and customers to stick someone on the queue, or is it better to have them work on knowledge, customer experience, or communities? Incremental improvement helps you answer the question.

      You also bring up an excellent point about the lag time between solution delivery and confirmed resolution. In an ideal world, you’d survey 14-30 days after the self-service experience, and allow people to say “I’m not sure yet.”

      ps – thanks for plugging Collective Wisdom! This is an ongoing area of work for me (and, I expect, for FT).

      • Haim Toeg Says:

        Always happy to help!

        Another thing we need to consider relating to the success of any self-service initiative is the need to reach critical mass on at least three different planes: first, contributors, bringing a variety of experiences and problems into the system; second, entries, in both quantity and flow, so that users are motivated to visit knowing the likelihood of finding something valuable and fresh is high; and third, users, otherwise the whole effort is in vain.

        Because of this, any self-service project needs to run on its own fumes at first and not expect rapid adoption rates until certain barriers are crossed. So, by all means, measure everything, just don’t expect results too early.

    • jragsdale Says:

      Great input, as always! Thanks, Haim! The 30% response rate seems high to me too, but that’s the average in the benchmark. Someday I’ll slice and dice the numbers; my guess is that it is much higher for B2B and that B2C numbers are much lower.

      I sure don’t mean that self-service success rate is the most important or only number to track. But I do think it is a good ‘back of the envelope’ look at how good your implementation is. If your number is much lower than average, your site probably needs a good audit.

      As for factoring community into the discussion, I don’t know where to begin as far as metrics tracking – there aren’t many ‘best practice’ metrics for community management available that I’m aware of. Yet!

      • Haim Toeg Says:

        John – thanks for the feedback. I think we are all saying the same thing re self-service management. All I am suggesting is that we don’t call the metric “success,” since success is much harder to achieve than improvement, and much easier to misinterpret by people who don’t necessarily take the time to understand the complexity and unknowns.

        As far as measuring community contribution, I think the first thing we need to do is explore and define the place it takes within the support experience continuum. Once we are able to do that it is much easier to create metrics.

        Maybe this is a good topic for a new blog post where we can all hash it out?

  5. Anne Wood Says:

    This is good timing for me, John, as I’m just commencing my self-help project. I’m looking at metrics which include deflecting Contact Us requests back to our knowledge base if appropriate – thus combining Help & Support and Contact Us for information-only contacts but allowing ‘action’ requests to escalate immediately, either to our contact numbers or to our email channel.

    I’m including several elements on the self-help article: ‘Rate this content’, ‘Did this help?’, plus the ability for the customer to contribute to or amend the content. I’m also enabling the customer to post ‘favourite’ content onto social media sites such as Twitter, Digg, Delicious, and StumbleUpon. I’m hoping that this combination, along with measurement from the email and voice channels, will give me sufficient evidence of self-help adoption.

    Lastly, we’re looking at surveys in general on our website but not specifically on Help & Support.

    • Allen Bonde Says:

      Anne (and others!) – as you know, this is one of my favorite topics. And I think you are on the right track by including a combination of metrics that span “usage,” “experience” and “outcomes.” Usage-type metrics like traffic, or hits on an article, tend to be easy to measure, but only semi-useful in showing progress in solving the actual issue. I view content ratings as more of an experience metric than an outcomes metric: the user thinks it’s useful, maybe even to the point of sharing with others, but how that information is applied, whether it works in all environments, etc., is still often uncertain.

      As for deflection, this is a classic SS metric and a true “outcome,” but, as discussed, tricky to really measure. I recall some interesting work a few years ago from ServiceXRG, which aimed to define the steps or gates in declaring a true deflection. It went something like:

      deflection =
      problem was resolved +
      prior intent to call +
      entitled to call +
      no call (email) made

      Pretty good formula if you ask me, and still very applicable even in the Web 2.0 world.
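
      As a rough sketch (my rendering, not ServiceXRG’s actual implementation, and with made-up field names), those gates translate directly into a predicate over one self-service visit:

          def is_true_deflection(visit):
              # 'visit' is a hypothetical record combining survey answers and
              # contact logs for one self-service session; all four gates must hold.
              return (visit["problem_resolved"]            # found a working answer
                      and visit["intended_to_call"]        # would have called otherwise
                      and visit["entitled_to_support"]     # was entitled to call
                      and not visit["contacted_support"])  # and no call or email was made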

      Now, as for how this fits communities or peer support, content (and contributor) ratings become even more important. And given my current work on social media marketing, I am thinking that cross-links and even up-sell and better customer engagement also become key measures of self-service success – when there is a community component in addition to a KB.

      Allen

      • Anne Wood Says:

        Allen, thanks for that formula. It’ll be a few months before I can use it to measure the success of my forthcoming project (the one I shared with you in London) but I will use it. Also looking at Twitter, CoTweet, etc. as we’re getting (some) traffic via that route. Lots of excitement there, but when you compare it to ‘traditional’ contacts it’s a mere drop in the ocean. However, not something to be under-estimated.

    • David Kay Says:

      Anne –

      I’m so excited to hear that you’re opening the door to customers’ contributing and amending content! I hope you can share your experiences with that at an upcoming TSW conference. This is a huge opportunity that we, as an industry, are generally letting slip through our hands.

      –David

      • Anne Wood Says:

        Hi David. When I have some evidence I will most certainly share my experience. It will be three months before my self-help project goes live, so watch this space.


    • Anne, I am also at the start of a self help journey, newly appointed to come in and make massive changes to the effectiveness of our self help for QuickBooks. I like the way you’re engaging customers to interact with the self help. Escalating right from an article that didn’t resolve their problem will create a really great “no dead end” experience!

      I’d love to compare notes as we both go down the self help journey!

  6. jragsdale Says:

    OK, I lied! The 30% response rate was for PHONE surveys. The industry average response rate for email surveys is 17%–still pretty high.

  7. Ho Says:

    Interesting statistics.

  8. Dianne West Says:

    We’ve had success with third-party surveys that can identify trends to indicate what percentage of your Web Self Service traffic 1) found what they were looking for, 2) would have called customer care had the information not been available on the support site, and 3) have no need to contact customer service after the self-service event. These surveys, in conjunction with call center “call driver” analysis, Web Support traffic paths, and individual article surveys, can help determine whether success is being achieved and where the gaps are. There is no single measure that can be relied on in a multi-channel environment!


  9. Thanks for this article! This is just what I was looking for as I embark on a journey to rethink the way we do self help. What’s perplexing to me is all of the failed attempts at moving article resolution rate… it makes me question the measure, or the lack of other measures. I would like to know what to benchmark against – what is world class in self help resolution? 40% is dismal – 60% of customers are left without a resolution and needing to call! It seems like the industry can do better than that… maybe we need to challenge the status quo and push for world-class online resolutions?

    • David Kay Says:

      Amen to that, Anna. I like to say that our current self-help success rates are as if WalMart, instead of hiring friendly octogenarian “greeters” at its front doors, hired bouncers who threw half of the shoppers out of the store!

      There’s lots we can do better:
      – content captured in the customer context
      – task-based design
      – optimized experiences for high-frequency, high-value scenarios
      – integrated communities (and I love Live Community!)

      Best,
      David


  10. Btw, I would love to connect with you about the research you are doing to figure out what SSPA should focus on for metrics around self service. Please get in touch with me. I’m happy to share what Intuit’s doing and participate in whatever way possible to help shape this work.

  11. John McKay Says:

    And then what happened?


  12. […] Self-service. Even if employees can be trained to use CMS tools, they will never be intuitive enough for customers attempting self-service. This is one of the reasons we are seeing self-service success continually trending down. […]


