Expect Labs’ MindMeld Powers Voice-Enabled Apple Watch App

I first wrote about Expect Labs over a year and a half ago, when their MindMeld speech recognition and natural language processing technology drove an innovative social listening app tied into Facebook. Since that time, Expect Labs has pivoted their offering into a voice-focused software-as-a-service platform.

MindMeld now provides app owners and developers with what Expect Labs calls an intelligent voice interface. MindMeld uses semantic mapping technology to create a knowledge graph of the application’s content. It then uses the graph to improve the accuracy of its NLP engine in understanding precisely what users are asking when they talk to a voice-enabled interface.
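To make the idea concrete, here’s a minimal sketch of how a knowledge graph of an app’s content could be used to nudge speech recognition toward in-domain phrases. Everything here, the KnowledgeGraph class, the scoring weights, the rescore function, is a hypothetical illustration of the general technique, not Expect Labs’ actual API.

```python
# Hypothetical sketch: biasing a voice interface toward an app's own
# vocabulary with a small knowledge graph. Not the MindMeld API.

from dataclasses import dataclass, field


@dataclass
class KnowledgeGraph:
    """A toy knowledge graph: entities plus weighted relations between them."""
    entities: set = field(default_factory=set)
    edges: dict = field(default_factory=dict)

    def add_relation(self, a, b, weight=1.0):
        self.entities.update((a, b))
        self.edges[(a, b)] = weight
        self.edges[(b, a)] = weight

    def relatedness(self, a, b):
        return self.edges.get((a, b), 0.0)


def rescore(hypotheses, graph):
    """Re-rank ASR hypotheses, boosting transcripts built from in-domain
    entities on the theory that those are likelier to be what was said."""
    def score(text, acoustic):
        words = text.lower().split()
        entity_hits = sum(w in graph.entities for w in words)
        coherence = sum(
            graph.relatedness(a, b) for a in words for b in words if a != b
        )
        return acoustic + 0.5 * entity_hits + 0.25 * coherence

    return max(hypotheses, key=lambda h: score(*h))[0]


# Build a tiny graph from a shopping app's catalog.
graph = KnowledgeGraph()
graph.add_relation("flowers", "chocolate")
graph.add_relation("flowers", "bouquet")

# Two near-homophone transcripts; the in-domain one wins after rescoring.
candidates = [
    ("send flours to my mother", 0.62),   # acoustically slightly preferred
    ("send flowers to my mother", 0.60),  # but built from catalog entities
]
print(rescore(candidates, graph))  # -> send flowers to my mother
```

The design point is that the graph is built once from the app’s content, so the voice layer can favor interpretations that actually make sense for that app.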

A recent article in Macworld reported on Fetch’s use of the MindMeld service to power their mobile concierge app. Fetch makes it easier to buy the things you need by connecting you with specialists and personal shoppers. Users can tell Fetch what they need, from a plane ticket to an order of flowers and chocolate for a significant other, and the app will set the wheels in motion to have a specialist fill the request quickly and painlessly.

Now that Fetch has partnered with MindMeld, they’ve been able to create a voice-enabled app that’s optimized for the Apple Watch. Fetch users can issue voice commands for on-demand concierge services right from the Watch.

The Macworld article cites Expect Labs data showing that people spend 60% of their online time on mobile devices, yet only 10% of purchases are made from them. In the article, Tim Tuttle, CEO and founder of Expect Labs, attributes this gap to how cumbersome it currently is to complete a purchase on a mobile device’s form factor.

Intelligent voice-enabled interfaces, like those made possible by MindMeld, aim to simplify our interactions with mobile devices. If Fetch’s MindMeld-powered Apple Watch app is any indication, voice interfaces will transform our wearables and smartphones into the personal assistants we’ve always dreamed of. The age of voice is here, and Expect Labs is well-positioned to fuel the transition to intelligent natural language interfaces. To see more examples of MindMeld’s technology in action, check out the Expect Labs demo page.

 

MindMeld Lets Your iPad Listen As You Talk

Expect Labs issued a press release last week about the launch of their MindMeld app for the iPad. The press release describes MindMeld as an anticipatory intelligent assistant app. It can listen to you as you talk to it or to one or more friends on Facebook, and then go out and search for related content either within Facebook or on the web.

How does the app work? I watched the short demonstration of MindMeld on the Expect Labs webpage. It appears that you have to be a Facebook user to take advantage of MindMeld, as the only way to log in is through your Facebook account. The idea seems to be that you join in a conversation with friends, either one that’s currently underway or one that you initiate. MindMeld then listens to what you’re saying and starts displaying what it believes to be relevant and helpful content on the MindMeld screen.
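As described, the core loop is simple: transcribe a chunk of speech, pull out the salient terms, and search for related content. Here’s a toy Python sketch of that loop. The stopword list, topic extractor, and search stub are all hypothetical stand-ins, not anything from the actual app.

```python
# Illustrative listen -> extract -> search loop for a MindMeld-style
# assistant. All names here are made up for the example.

from collections import Counter

# Hypothetical stopword list; a real system would use a full NLP pipeline.
STOPWORDS = {
    "a", "an", "and", "are", "be", "find", "for", "i", "in", "is", "it",
    "near", "of", "on", "or", "should", "that", "the", "this", "to",
    "we", "you",
}


def extract_topics(transcript, top_n=3):
    """Pull the most frequent non-stopword terms out of a transcript chunk."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]


def search_related_content(topics):
    """Stand-in for a real search backend (Facebook feeds, the web, etc.)."""
    return [f"Result for '{topic}'" for topic in topics]


# Each time a few seconds of speech gets transcribed, refresh the display.
chunk = "We should book flights to Pasadena and find hotel deals near the stadium"
for result in search_related_content(extract_topics(chunk)):
    print(result)
```

Run on the sample sentence, this surfaces results for “book,” “flights,” and “pasadena,” which is roughly the behavior the demo shows: content appears as the conversation unfolds, without anyone typing a query.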

If you and your friends are planning a trip to the BCS Championship game to watch Auburn battle Florida State, for example, I suppose that MindMeld would show you Facebook feeds from anyone trying to sell tickets to the game or maybe even offering a place to stay in Pasadena. MindMeld would probably also show things like airline tickets, hotel specials, or driving directions.

I’m a bit confused as to how the conversations work. Are you actually carrying on a live conversation where you can hear all your friends talking, like in a Google circles call? Or are you and your friends each just talking to MindMeld separately, with the app listening to each person and pulling out whatever it hears that seems relevant or interesting? I’m guessing it’s the former, and that you can actually hear your friends speaking.

It’ll be interesting to see how MindMeld functions in reality and whether people find it helpful to be bombarded with content that an app thinks they might like. If you say something like “I drank way too much last night,” will it show hangover remedies? Or will it show you the news feeds of all your other friends who recently typed or said “I drank way too much last night”? Right now when you google that same phrase, the results are mostly in the latter category. Misery loves company, so that might be helpful. But it could just as well be annoying.

I’m confident that there are valid use cases for an app like MindMeld, though. Speech-based search will definitely be a part of our normal lives in the future. The question of how often, and under what circumstances, we want apps listening in on our conversations remains open.