Virtual Health Coach for Medical Procedures from 22otters

At Opus Research’s recent Intelligent Assistants Conference, I ran into Ann Thyme-Gobbel and Charles Jankowski from 22otters. 22otters offers an interesting mobile app that provides a virtual health coach for patients preparing for a routine medical procedure. Currently the app is designed to support people preparing for a colonoscopy. Anyone who’s had one knows that the prep is the worst part of the whole experience. So who wouldn’t want a health coach to help you get through it?

22otters can adapt their virtual coach to support a whole range of medical procedures. But let’s take the colonoscopy prep as an example. The health coach assistant provides a complete checklist of steps to guide you through the preparation process. Each step is plotted on a calendar based on your appointment date and time. The health coach proactively alerts you when each step in your regimen needs to begin and uses voice prompts to explain what you need to do. For example, on the day prior to your procedure, the assistant tells you to drink only clear liquids until the procedure is complete.
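The scheduling idea here, plotting each prep step on the calendar as an offset from the appointment time, can be sketched in a few lines of Python. The step names and offsets below are hypothetical stand-ins for illustration, not 22otters' actual regimen:

```python
from datetime import datetime, timedelta

# Hypothetical prep steps, each defined as an offset before the appointment.
# A real regimen would come from the provider's instructions.
PREP_STEPS = [
    ("Stop eating solid food", timedelta(hours=24)),
    ("Begin clear liquids only", timedelta(hours=24)),
    ("Drink first dose of laxative prep", timedelta(hours=18)),
    ("Drink second dose of laxative prep", timedelta(hours=6)),
]

def build_schedule(appointment, steps=PREP_STEPS):
    """Plot each prep step on the calendar relative to the appointment."""
    return sorted(
        (appointment - offset, instruction) for instruction, offset in steps
    )

appointment = datetime(2014, 10, 20, 9, 0)  # procedure at 9:00 AM
for when, instruction in build_schedule(appointment):
    print(f"{when:%a %b %d %I:%M %p}: {instruction}")
```

An app like this would then hang a proactive alert (and a voice prompt) on each computed time, rather than asking the patient to work out the timing from a paper handout.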

If you need specific instructions on how to prepare your laxative concoction, the health assistant will walk you through the steps using both voice explanations and helpful graphics. The app also provides alarms to remind you when to take doses of liquids or other medications. As you follow each of the steps, you can verify that you’ve completed them. Your doctor can validate your overall status by looking at a portal that shows exactly where you are in the prep process and how well you’ve kept to the instructions.

You can take your health coach with you wherever you go and ask it clarifying questions. Procedures such as a colonoscopy are complicated for the patient, but all the steps are routine. It’s extremely helpful to have all the information in an easy-to-use health coach app, rather than having it in static paper handouts. The proactive alerts help to ensure that you don’t miss a step or forget something important in the process. Having the virtual health coach also reduces your need to pick up the phone and call the doctor.

Thyme-Gobbel and Jankowski showed me another app called HerStory that offers breast cancer patients an opportunity to record audio clips about their own personal experiences. Users of the app can access these crowdsourced survivor stories to receive encouragement and helpful tips from other women who’ve gone through the same experience. This “share your story” feature could potentially be added to any of the health coach apps.

I can imagine a whole range of routine procedures where both patients and healthcare providers could benefit from a health coach. Detox protocols offered by practitioners of functional medicine come to mind, as well as procedures that patients need to follow after certain types of surgeries. As intelligent assistants become more specialized, the virtual health coach offered by 22otters shows how a focused use case can deliver compelling benefits to both patients and healthcare providers.

Life After the Cambrian Explosion in AI

Dag Kittlaus, co-founder and CEO of Viv, recently authored a piece in TechCrunch called “A Cambrian Explosion of AI Is Coming.” Here’s a quick summary of what I took away as the major points of the article. Kittlaus postulates that intelligent assistant technologies are in their adolescence.

The iPhone and Siri were both groundbreaking products that “set a new bar for simplicity.” The iPhone was a complete market success and initiated a paradigm shift in the way we interact with digital technology. Siri didn’t have the same success, but it marked the beginning of the next paradigm. Intelligent assistants will simplify how we access applications and in some cases make services completely transparent.

Once intelligent assistants are at the point where they understand our needs and can independently find and access services to fill those needs, a whole universe of easily accessible services will open up to us. We won’t have to even think about how to interact with an app, or even what app to use. All that will be managed for us behind the scenes by our assistant.

Intelligent assistants will be able to more smartly anticipate our needs by connecting the dots between actions we take and other actions we might wish to take as a result. For example, if we’ve just booked a date on an online dating site, we might like to reserve a table for dinner, find out what entertainment is scheduled for that night, and pre-order something for our date. Kittlaus sees this emerging ability of AI to do what he calls “deep linking” as the beginning of a new, powerful marketplace where people will be able to target all kinds of products and services to people who are receptive to their message.

I agree with Dag’s vision. However, I wouldn’t emphasize the commercial aspect of the intelligent assistant-enabled marketplace. Yes, there will no doubt be lots of companies that will leverage an intelligent assistant’s knowledge of your intent and current needs to market their wares to you. But if the future of an AI-empowered world is primarily about more effectively targeting ads, that seems like a pretty dismal world. In fact, it sounds like something out of a Philip K. Dick novel, where in his bizarre future persistent advertising drones slide through the cracks of a rolled-down car window to bombard the driver, who has just complained about the onset of baldness, with a hair growth formula.

I’d like to think that it will be a world in which our intelligent assistants will keep us from being pestered by advertisements we don’t want. Maybe there isn’t even a need for advertising anymore, because your assistant will understand you and anticipate what you’re looking for at any given moment and present you with the best choices to address your need. Of course, this anticipation could become a double-edged sword. As a prominent panelist mentioned during a discussion at the recent Intelligent Assistants Conference sponsored by Opus Research, do we want to live in a world where as soon as you think about how good a high-calorie muffin might taste, a muffin drops out of the sky into your hands and is charged to your credit card? Will the intelligent assistant also know that you’re trying to lose weight and prevent the muffin from dropping? Do we even want our intelligent assistants making those choices for us?

The AI Cambrian Explosion will happen, and possibly sooner than we think. The conversations are already underway about how this technology may impact our lives.

Wrap up of Opus Research’s Intelligent Assistants Conference

Earlier this week, Opus Research hosted the first ever Intelligent Assistants Conference. As with all first-time events, you have to take a leap of faith that everything will go off smoothly. In fact, the 1-day conference was a huge success, bringing together customers of enterprise intelligent assistants, technology providers, and companies eager to learn more about the business value of intelligent assistants. I was extremely impressed with the company case studies presented by representatives from Hyatt Hotels, Coca-Cola, Domino’s, Schlage, and Windstream Communications. I wrote a guest post for the Opus Research blog that summarizes how each company is employing intelligent assistant / virtual agent technology to both drive down costs and improve customer experience.

The day was also filled with lively panel discussions on topics ranging from the history of artificially intelligent chatterbots to the future of mobile personal assistants in the image of Spike Jonze’s Her. Dan Miller of Opus was able to attract a stellar group of panelists to the event and the conversations were fabulously entertaining for people interested in this space. The room at the Palace Hotel in downtown San Francisco stayed packed all day and the audience was never stumped for questions.

I came away from the event firmly convinced that enterprise intelligent assistants aren’t just for early adopters anymore. There are powerful use cases where this technology can be applied today, and companies that take the leap stand to gain an edge over their competition. In terms of how personal intelligent assistants will impact the consumer market, so much is happening so quickly that it’s hard to guess where the market will be in a year or two. The growing adoption of wearables, where the form factor aligns better with speech-based communication than with text and typing, may push more usage of intelligent assistants. Or a completely new user interface might emerge.

Regardless of how things evolve, it seems certain that people will always want to engage in conversation. Intelligent assistants can satisfy that desire, while at the same time helping people solve problems and more easily navigate their way through life. I look forward to Opus Research’s 2nd Annual Intelligent Assistants Conference next year. With the pace at which this technology is evolving, there’s bound to be a lot to talk about.


Is the Apple Watch the Killer App Siri’s Been Waiting On?

When Apple finally unveiled the long anticipated Apple Watch this week, Siri came along for the ride. With the general disillusionment over Apple’s intelligent assistant, it wasn’t a foregone conclusion that the company would include Siri in their first flagship wearable product. But Siri is baked into the watch and it appears “she’ll” be able to do pretty much what she’s been able to do from the iPhone platform since her appearance with the 4S.

Could the Apple Watch give new life to Siri? I’m thinking it might just do that. We’re all so adept now at typing on our smartphone keyboards and browsing, reading, and generally interacting with our touch screens that, truth be told, we rarely need to use our voices to ask Siri (or Google Now, or Cortana) for help.

But what will happen when we’re wearing a watch? What if we’re so comfortable with the watch that we leave our phone in our purse or in our briefcase, and what if we want to send a text to a friend? If Siri’s reliable enough to listen to us dictate our message and shoot it off to our friend, are we going to bypass that option to stop, dig out our phone, and type out a message? Maybe. Old habits die hard. But if Siri can do it for us, I’m thinking we might rely on her more and more.

Once we start relying on Siri to write and send our messages from our watch, we may begin to ask her to update our calendar, or reserve a dinner table, or buy flowers for a friend, or recommend a good movie, or tell us how to rebook a flight if the one we’re waiting on just got canceled (once she’s able to do such things). Heck, she’s right there and it’s a lot easier than fooling with our phone.

There used to be lots of talk about killer apps. Are wearables the killer app that intelligent assistants have been waiting on? Wait; that analogy doesn’t really make sense. What I mean to say is: Are wearables the killer platform that will bring intelligent assistants into the mainstream? I guess we won’t find out until sometime in 2015.

Opus Research Hosts Intelligent Assistants Conference Next Week

Opus Research is hosting the first ever Intelligent Assistants Conference next week in sunny San Francisco. The team at Opus Research has done a great job at assembling a host of companies that are actually using intelligent assistants (aka virtual assistants or virtual agents) to improve customer service. If you’re interested in the growing marketplace for web self-service and advanced customer care technologies, this would be a great event to attend.

Some of the companies that will be present to tell their intelligent assistant stories include Coca-Cola, Hyatt Hotels, Domino’s Pizza, Windstream Communications, and Schlage. Hearing how they’re employing intelligent assistants and what’s working best for them should provide a wealth of useful information. It will also be interesting to hear how these companies developed their business cases to support the investment in intelligent assistants.

The virtual assistant vendor community will be well represented too. There will also be some speech technology and intelligent assistant luminaries participating in panel discussions. Norman Winarsky, Vice President of SRI Ventures, and Liesl Capper, Leader of Watson Life Product Strategy at IBM (formerly founder of MyCyberTwin), are just two examples.

I hope to be writing about the conference and posting some of the outcomes to this blog and perhaps in a guest post or two to the Opus Research blog page.

It’s still not too late to sign up, so check out the details on the Opus Research event page for the Intelligent Assistants Conference. The official hashtag for the conference is #IACSF.

Visual IVR – Giving Virtual Agents a Run for Their Money

What is Visual IVR (Interactive Voice Response)? Before SpeechTek 2014, I must admit that I’d never heard the term Visual IVR. It turns out that Visual IVR is an important, relatively new technology for providing remote support to customers and supporting other customer interactions. Visual IVR is a far cry from the old, dreaded phone tree (Press 1 for Sales, Press 2 for Something Else You Don’t Want). So how does Visual IVR relate to intelligent assistant (virtual agent) technologies? We’ll get to that in just a minute.

While at SpeechTek, I had the opportunity to speak with Steve Herlocher, Chief Marketing Officer of Jacada, a provider of Visual IVR. Herlocher gave me a demonstration of how the Jacada technology works on a smartphone. I tried the demo myself and you can do the same by accessing the Jacada Visual IVR demo from their website. Here’s an example of how the Jacada solution would support a customer who wants to check up on the status of something she recently ordered.

First, the user calls a customer support number. She’s prompted to Press 1 if she’s calling from a smartphone. So far it sounds like the dreaded phone tree, right? But after the user presses 1, she’s transported into a completely different experience, like Alice in Wonderland sliding down the rabbit hole. Instead of having to navigate through more frustrating options and finally getting connected to a human, the user immediately receives an SMS message on her phone. When she opens the message and clicks on the link, she’s taken to a website that fits perfectly on her phone screen, regardless of what type of mobile device or operating system she’s using (thanks to HTML5). The user interface prompts her to select from several common functions. She selects the button that says “Check Order Status.” Next, she types in her order number. In a flash, the system looks up her order and shows her the current status and expected delivery date. She can click another link to go straight to the shipper’s website and view all the most recent tracking data.
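Stripped of the SMS hand-off and the HTML5 front end, the “Check Order Status” step in this flow boils down to a simple lookup. Here’s a rough sketch of what the back end of such a screen might do; the order data, field names, and tracking URL are all made up for illustration and have nothing to do with Jacada’s actual API:

```python
# Toy order database standing in for the retailer's back-end system.
ORDERS = {
    "A1001": {"status": "Shipped", "expected_delivery": "2014-09-18",
              "tracking_url": "https://example-shipper.test/track/A1001"},
    "A1002": {"status": "Processing", "expected_delivery": "2014-09-22",
              "tracking_url": None},
}

def check_order_status(order_number):
    """Look up an order and return the text a Visual IVR screen would show."""
    order = ORDERS.get(order_number)
    if order is None:
        return "Order not found. Please check the number and try again."
    lines = [f"Status: {order['status']}",
             f"Expected delivery: {order['expected_delivery']}"]
    if order["tracking_url"]:
        # Deep link straight to the shipper's tracking page.
        lines.append(f"Track your shipment: {order['tracking_url']}")
    return "\n".join(lines)

print(check_order_status("A1001"))
```

The point of the sketch is how little dialog this use case actually requires: one identifier in, a status screen out, which is exactly why a visual flow can beat a conversational one here.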

In the demo that Herlocher showed me, the Visual IVR was engaged with the customer while they were waiting for their support call to connect them with a human agent. During that wait time, the Visual IVR offered the customer various support options. It can also use the hold time to recommend upsells that might interest the customer or to run a quick survey. According to Herlocher, Visual IVR supports other channels, such as social media sites, chat and email.

How does Visual IVR relate to virtual agents? Based on what I saw of the Jacada solution, Visual IVR might very well be a better option for some use cases than a virtual agent. As Herlocher said during our discussion, it’s all about usability. The Jacada demo, where a user needs to confirm her order status and track shipment information, is a great example of a situation that lends itself well to Visual IVR. The user has access to easy-to-follow self-help options right from her phone or tablet. She can quickly get the information she needs, so there’s really no reason to engage in dialog with an agent. The Visual IVR seems like a more direct way to get the customer the help she needs.

On the other hand, I can imagine situations where a combination of intelligent assistant and Visual IVR could work well together. Just last week I reviewed Nuance’s branded personal assistant “Dom” for Domino’s Pizza. While I don’t think Dom and the supporting application can be labeled Visual IVR, the Dom assistant works in tandem with a very rich user interface that streamlines the pizza ordering and check out process. It wasn’t that long ago that executing a complex transaction like ordering a pizza would have been unthinkable from a smartphone. The user experience on smartphones has been transformed by new UI technologies.

If a user hesitates while interacting with a Visual IVR solution, for example, a virtual agent might speak up to ask if the user needs help with the next step. Or a virtual agent might be used to guide the user through each of the Visual IVR screens. For right now, though, I agree with Herlocher that there’s probably no pressing need to force a quick marriage between virtual agent technologies and Visual IVRs. Each has its preferred use case. For companies looking for ways to improve both customer support and customer interactions, it makes a lot of sense for them to investigate both technologies before choosing a direction.