The great thing about keeping score is that you can track your progress. In 2010, we started asking customers who contacted support what they thought about the interaction. We were thrilled to end that year with just seven out of a hundred unhappy with the service (84% happy, 9% OK).
But I’m really proud to announce that we’ve dramatically raised our game in 2011. We’ve gotten the frown ratio down to just three out of a hundred (90% being happy, 7% being OK). That’s less than half of what it was just the year before!
(If you look at just the last six months of 2011, the numbers were better still: 92% happy, 5% OK, 3% frowns.)
Part of this came from hiring a bigger team, which lowered the average number of emails each person has to answer. We’ve gone from needing each person to answer about 80 emails per day to around 40 (again, on average—there were and are significant swings at times). That means we can spend more time on each response, and the result is more happy customers.
Gains have also come from analyzing the data: finding out what made people unhappy and trying to do better. We also now follow up with more frowns and try hard to “flip ‘em” by doing what we can to turn a bad experience into a great one.
I’m so proud of our support team for what they’ve accomplished this year. Thank you Ann, Chase, Emily, Joan, Kristin, Merissa, and Michael.
Justin Jackson on 03 Jan 12
Could you share some examples of things that you found in the data that were negatively affecting the customer experience?
I think that insight would be helpful for all of us trying to improve the way we serve customers.
Also, how did you structure the way you received the data? (how were you able to observe trends?)
DHH on 04 Jan 12
Justin, key findings were that frowns often happened around feature requests (one solution: suggest other ways of getting the same result) and user mistakes like deleting things (one solution: make it easier to restore things). We found these just by reading through the comments on the frowns.
ep on 04 Jan 12
Shouldn’t this post rather be named “…our customers who have contacted support and clicked on feedback…”? Because this certainly does not reflect the overall happiness of your entire customer base and probably not even 100% of the customers who contacted support. How do you account for these?
Rich on 05 Jan 12
Good point, @ep. How would you measure satisfaction of customers contributing on your Answers site?
Louis Corso on 05 Jan 12
Congratulations! Those are some really awesome statistics for your team. :)
I have to say, we recently started approaching feature requests more from the “how can they do X with what we have” approach at ZURB too, and it’s worked wonders. I believe it’s a combination of the customer both feeling listened to and more often than not having a solution that will work for them even if they don’t get what they ask for.
Keep it up!
Adrian on 10 Jan 12
@ep: Don’t be so technical. This isn’t law school.