Falling in Love with Your Virtual Agent (Maybe)

The New Yorker ran an article last month called “Can Humans Fall in Love with Bots?” It’s a rather sensational title intended, I suppose, to grab attention (and yes, I latched onto the hook for this post!). The New Yorker piece, though, covers the topic of virtual agents, chatter bots, and the universe of conversational virtual assistants broadly, not just from an attachment standpoint.

The article was written by Betsy Morais, a frequent contributor to The New Yorker on technology topics. The anchor for her piece is the Spike Jonze movie “Her,” which I wrote about previously. Morais viewed a pre-release version of the film, and she uses this sci-fi fantasy story as a jumping-off point to explore the current state of virtual agent technology as it exists in the real world today.

Morais references discussions she’s had with Fred Brown, CEO of virtual agent technology company Next IT. Next IT created a virtual assistant called Jenn for Alaska Airlines. It turns out that Jenn attracts users who engage her in conversations that go beyond just asking about airline tickets and flight status. Brown told Morais that data shows people often converse with Jenn for extended periods, especially late at night. They probe her with questions about her likes and dislikes. Some users even seem to be trying to flirt with her, despite her obvious virtual nature.

Morais also spoke with Nova Spivack, who sits on the board of Next IT and who was involved in the research work that eventually led to the creation of Siri. Spivack seems to express the opinion that today’s virtual agents are realistic and believable enough that people might be lured into developing an emotional attachment to them. But Morais is skeptical. Her own interactions with virtual agents haven’t been as impressive. She references Sgt. Star, used by the U.S. Army to answer questions from potential recruits. While the Sergeant can answer most of her questions, she suggests that he clearly doesn’t display a full understanding of the intent of a person’s questions, especially when you try to drill down deeper into a conversation or ask questions about the Sergeant’s own emotional response to certain aspects of Army life. (As a side note, I wrote about the technology behind Sgt. Star in an earlier post.)

According to Morais, it turns out that Spike Jonze was inspired to create the film “Her” based on conversations he had with the AIML-based ALICE bot years ago. ALICE’s ability to understand context and adjust her conversation to the desires and habits of dialog partners is extremely limited. But Morais’s discussions with Nuance’s Gary Clayton lead her to be more optimistic about the future capabilities of virtual agents. Clayton believes that as virtual agents gain more access to your personal data, they will be better equipped to proactively assist you. He gives the example of a virtual assistant that knows you’re driving on the highway, notices that you’re suddenly exceeding the speed limit, and asks you if you’re OK. (It may also know that you didn’t sleep well the night before or that you’ve just come from a stressful meeting.)
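
Just to make Clayton’s example concrete, here’s a minimal, purely hypothetical sketch of that kind of proactive check. The signal names, thresholds, and the should_check_in function are all invented for illustration; they don’t reflect any real Nuance assistant API.

```python
# Purely hypothetical illustration of a proactive, context-aware check.
# None of these names come from a real assistant API.

def should_check_in(speed_mph, speed_limit_mph, hours_slept, stressful_meeting):
    """Return True if the assistant should ask the driver whether they're OK."""
    speeding = speed_mph > speed_limit_mph + 10       # noticeably over the limit
    frazzled = hours_slept < 6 or stressful_meeting   # extra personal context
    return speeding and frazzled

# Example: a tired driver going 82 in a 65 zone right after a tense meeting.
if should_check_in(speed_mph=82, speed_limit_mph=65,
                   hours_slept=5, stressful_meeting=True):
    print("You're going quite a bit over the limit -- is everything OK?")
```

In a real assistant, those hard-coded thresholds would presumably give way to signals fused from the car, the phone, and your calendar, which is exactly the kind of personal data access Clayton is talking about.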

As our virtual agents become more capable and indispensable, it’s not hard to imagine a day when we’ll develop emotional attachments to them. Morais touches on the question: will our virtual agents be able to love us back? But that’s fodder for an even more esoteric discussion. Read Morais’ article for yourself. Just be warned that if you’re planning on going to see “Her,” the article gives away the ending.
