Feeling angry, depressed, or maybe happy as a clam? It may not be long before an intelligent virtual agent you're conversing with can accurately gauge your emotional state. Advances in technology are enabling artificially intelligent software to pick up on subtle cues in human speech patterns. In the article "Teaching Computers to Hear Emotions," IEEE reports on recent work by interns at Microsoft Research on spoken language software systems. Their work shows that software can achieve a surprising success rate at predicting a speaker's emotional state by examining variations in the loudness and pitch of the speaker's voice.
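The article doesn't detail how those loudness and pitch features are computed, but a minimal sketch of the general idea is straightforward: measure per-frame RMS energy for loudness, and use an autocorrelation peak as a crude pitch estimate. The `loudness_and_pitch` helper below is hypothetical, not the researchers' actual method.

```python
import numpy as np

def loudness_and_pitch(signal, sample_rate, frame_size=1024):
    """Estimate per-frame loudness (RMS energy) and pitch (autocorrelation lag).

    A toy illustration of the kind of low-level features emotion
    classifiers examine; real systems use far richer representations.
    """
    loudness, pitch = [], []
    for start in range(0, len(signal) - frame_size, frame_size):
        frame = signal[start:start + frame_size]
        # Loudness: root-mean-square energy of the frame.
        loudness.append(np.sqrt(np.mean(frame ** 2)))
        # Pitch: lag of the strongest autocorrelation peak beyond lag 0.
        ac = np.correlate(frame, frame, mode="full")[frame_size - 1:]
        lag = np.argmax(ac[20:]) + 20  # skip implausibly short lags
        pitch.append(sample_rate / lag)
    return np.array(loudness), np.array(pitch)

# Synthetic example: a 220 Hz tone standing in for recorded speech.
sr = 16000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
loud, f0 = loudness_and_pitch(tone, sr)
```

Variation in these two feature tracks over the course of an utterance is the raw material an emotion classifier would work from.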
The implications for intelligent virtual agent technologies are just beginning to be explored. Obviously, a digital customer support agent that can sense when a customer is losing patience or becoming angry will be better equipped to serve that customer effectively. Recognizing the onset of negative emotions could prompt the virtual support agent to take a different approach with the customer. In some cases, a change in emotional state may be a signal for the chatbot to escalate the conversation to a human support agent.
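The escalation decision described above could be as simple as watching a running average of negative-emotion predictions. The rule and its threshold below are hypothetical, purely to illustrate the hand-off logic:

```python
def should_escalate(emotion_scores, threshold=0.6, window=3):
    """Decide whether to hand off to a human agent.

    emotion_scores: per-utterance probabilities (0..1) that the speaker
    is angry or frustrated, as produced by some emotion classifier.
    Escalate when the mean over the last `window` utterances exceeds
    `threshold` (both values are illustrative, not from the article).
    """
    if len(emotion_scores) < window:
        return False
    recent = emotion_scores[-window:]
    return sum(recent) / window > threshold

# A conversation that grows steadily more heated:
history = [0.2, 0.5, 0.7, 0.8]
should_escalate(history)  # mean of last 3 is ~0.67 > 0.6, so escalate
```

In practice a production system would combine this kind of signal with conversation context (repeated questions, explicit requests for a human) rather than rely on emotion scores alone.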
As speech recognition and spoken language systems become more sophisticated at picking up on human emotional cues, the applications in the realm of virtual agents, digital support representatives, and artificial intelligence in general are limitless.