Expect Labs issued a press release last week about the launch of their MindMeld app for the iPad. The press release describes MindMeld as an anticipatory intelligent assistant app. It can listen to you as you talk to it or to one or more friends on Facebook, and then go out and search for related content either within Facebook or on the web.
How does the app work? I watched the short demonstration of MindMeld on the Expect Labs webpage. It appears that you have to be a Facebook user to take advantage of MindMeld, as the only way to log in is through your Facebook account. The idea seems to be that you join a conversation with friends, either one that's currently underway or one that you initiate. MindMeld then listens to what you're saying and starts displaying what it believes to be relevant and helpful content on the MindMeld screen.
If you and your friends are planning a trip to the BCS Championship game to watch Auburn battle Florida State, for example, I suppose that MindMeld would show you Facebook feeds from anyone trying to sell tickets to the game or maybe even offering a place to stay in Pasadena. MindMeld would probably also show things like airline tickets, hotel specials, or driving directions.
I'm a bit confused as to how the conversations work. Are you actually carrying on a live conversation where you can hear all your friends talking, like in a Google+ Hangout? Or are you and your friends each just talking to MindMeld separately, with the app listening to each person and pulling out the things it hears that seem relevant or interesting? I'm guessing that it's the former, and that you can actually hear your friends speaking.
It'll be interesting to see how MindMeld functions in reality and whether people find it helpful to be bombarded with content an app thinks they might like. If you say something like "I drank way too much last night," will it show hangover remedies? Or will it show you the news feeds of all your other friends who recently typed or said "I drank way too much last night"? Right now, when you google that same phrase, the results are mostly in the latter category. Misery loves company, so that might be helpful. But it could just as well be annoying.
I'm confident that there are valid use cases for an app like MindMeld, though. Speech-based search will definitely be a part of our normal lives in the future. The question of how often, and under what circumstances, we want apps listening in on our conversations remains open.