The team at Software Advice, an online consultancy for customer relationship management software, has published the results of a survey on customer self-service channels. The report contains interesting information on the effectiveness of a range of self-service technologies and how companies measure performance. Virtual assistants are included in the study.
To obtain the results, Software Advice surveyed 170 professionals within the customer service departments of firms across a broad range of industries. To select those 170 participants, Research Now, a third-party research partner of Software Advice, narrowed a larger pool down to professionals who had actually implemented self-service channels in their businesses and who had direct knowledge of how those businesses measured the success of the channels.
It turned out that the most commonly offered customer self-service channels are FAQs and knowledge bases. Interactive Voice Response (IVR) phone systems were the next most common channel. The least commonly offered self-service channel turned out to be virtual agents / virtual assistants. Surprisingly, though, over 50% of those surveyed indicated that their companies had virtual assistants. I would have expected the percentage to be lower, given that virtual assistants are still an emerging technology. Then again, the prescreening narrowed the participants down to those who are already fairly advanced in their use of self-service.
The next major point of inquiry was whether the survey participants monitored the effectiveness of their various self-service channels. They were also asked what metrics they used to monitor performance and how effective they considered those metrics to be. About 60% of respondents said that they formally tracked the effectiveness of virtual assistants. I would have expected the number to be closer to 90% or more. To the best of my knowledge, most virtual assistant vendors offer out-of-the-box metrics with their solutions. One type of easily implemented metric is a simple yes/no survey at the end of a chat session that asks users whether the virtual assistant answered their question. This user satisfaction metric was indeed the measure that survey respondents employed most frequently. A second metric could be whether the assistant found a response to the question in the knowledge database (or on the company website) or came up empty. This type of metric is generally captured as part of the virtual assistant's conversation log file.
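As a concrete sketch, both metrics could be rolled up from a conversation log with a few lines of code. The record format and field names below are hypothetical, not taken from any particular vendor's product:

```python
# Sketch: computing the two virtual assistant metrics described above
# from a conversation log. The session record format and field names
# ("answer_found", "user_satisfied") are hypothetical; real vendor
# logs will differ.

def summarize_sessions(sessions):
    """Return (user satisfaction rate, answer-found rate) for a log.

    Each session is a dict with:
      - "answer_found": True if the assistant found a response in the
        knowledge base, False if it came up empty
      - "user_satisfied": True/False from the end-of-chat yes/no
        survey, or None if the user skipped the survey
    """
    answered = sum(1 for s in sessions if s["answer_found"])
    surveyed = [s for s in sessions if s["user_satisfied"] is not None]
    satisfied = sum(1 for s in surveyed if s["user_satisfied"])

    answer_rate = answered / len(sessions) if sessions else 0.0
    satisfaction_rate = satisfied / len(surveyed) if surveyed else 0.0
    return satisfaction_rate, answer_rate


log = [
    {"answer_found": True, "user_satisfied": True},
    {"answer_found": True, "user_satisfied": None},   # survey skipped
    {"answer_found": False, "user_satisfied": False},
    {"answer_found": True, "user_satisfied": True},
]
sat, found = summarize_sessions(log)
print(f"User satisfaction: {sat:.0%}, answer found: {found:.0%}")
```

Note that the two rates use different denominators: the answer-found rate covers every session, while the satisfaction rate only covers sessions where the user actually answered the survey, which is one reason survey-based measures can be less reliable.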
Of the respondents who said they tracked user surveys related to virtual assistant interactions, just shy of 75% said they were satisfied that this was an effective performance gauge. To me, that means there is still room for improving how we measure the reliability of virtual assistants and true customer satisfaction levels. It would be beneficial to have a more accurate, less intrusive method than asking customers whether the assistant gave them a useful answer.
A final area explored by the survey was the overall effect of self-service channels on the performance of live customer contact centers. As would be hoped, it turns out that when customers have access to self-service channels, fewer of them call the support desk. As a result, live customer support personnel can take the time to improve the service they give to customers who do call in. By lessening the burden on live support agents, self-service channels helped the majority of survey respondents experience measurable improvements in the following areas:
- Speed to answer calls
- Cost per contact
- First-level resolution rate
- First-call resolution rate
- Cost per incident
The Software Advice report is proof that self-service channels are the way to go, right? Well, interestingly, the report references a 2013 Zendesk survey that indicates the majority of consumers would still rather speak to a human than use online self-service channels. It’s important to remind ourselves that we still have a steep hill to climb to convince consumers that calling our support centers should be a last resort. As Millennials and Digital Natives comprise more of the consumer population, this preference for contact with real human support personnel may change. But regardless of how user preferences evolve, our virtual assistant technologies need to continually improve to meet consumer expectations. The only way for us to make sure our technologies are effective is to measure their results, and reports like the one from Software Advice provide insights on how to do just that.