As reported by TechCrunch, Robotbase has launched its Kickstarter campaign to presell its artificially intelligent personal robot. CEO Duy Huynh was at CES this past week pitching the company and showing off the prototype version of the robot, Maya, to TechCrunch reporters and others. You can watch the pitch and demo session from within the TechCrunch article.
Robotbase seems to be referring to the product as the “Personal Robot.” Once you purchase the robot, you can call it by whatever name you like. I see a lot of similarities between Jibo, the social robot, and the Robotbase robot. The obvious major difference is that the Personal Robot is mobile, being built atop a platform with wheels. It comes with software that enables it to scan a room and make a map that it can use to autonomously navigate around obstacles.
If you look past its ability to move around a room, the Personal Robot is set to have many features reminiscent of Jibo’s proposed capabilities list. It is supposed to connect to and control your in-home connected devices, recognize faces and other objects, understand speech input, get information from the cloud, act as a group photographer, and generally perform the activities of a personal assistant.
During the CES demo, Huynh also showed off the Personal Robot’s abilities as a retail assistant. That’s very similar to the hardware store robot OSHbot that I wrote about last month. Robotbase’s robot is able to understand a customer question, provide an answer, and lead the customer to the location of the desired shopping item if needed (just like OSHbot). It strikes me that this retail use case might be an easier one to succeed at than the broader personal assistant use case. In a retail setting, the robot knows that the vast majority of questions will be about store merchandise. A personal home robot, on the other hand, will need to anticipate and correctly react to a whole host of possible topics and conversational turns.
Both Jibo and Robotbase’s Personal Robot promise a lot and they’re both still under development. In the CES demo, Huynh talks about the company’s deep learning algorithms. In fact, it’s this software technology that Huynh lauds as Robotbase’s most significant achievement. These algorithms are intended to give the Personal Robot the ability to get smarter over time based on interactions (unsupervised learning).
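Robotbase hasn't described its deep learning algorithms, so as a much simpler stand-in, here is what "getting smarter over time based on interactions" can look like in unsupervised learning: an online clustering model that nudges its picture of the user's habits with each new observation, rather than retraining in batch. The `OnlineKMeans` class and the habit data are illustrative assumptions of mine, not anything from the company.

```python
import math

class OnlineKMeans:
    """Toy online k-means: cluster centers drift toward new
    observations one at a time, so the model refines itself
    with every interaction."""

    def __init__(self, centers):
        self.centers = [list(c) for c in centers]
        self.counts = [1] * len(centers)

    def update(self, point):
        # Find the nearest cluster center.
        idx = min(range(len(self.centers)),
                  key=lambda i: math.dist(self.centers[i], point))
        # Nudge it toward the new point; the step size shrinks
        # as the cluster accumulates observations.
        self.counts[idx] += 1
        lr = 1.0 / self.counts[idx]
        self.centers[idx] = [c + lr * (p - c)
                             for c, p in zip(self.centers[idx], point)]
        return idx

# Two rough "habit" clusters, e.g. (hour of day, minutes of activity).
model = OnlineKMeans([(7.0, 10.0), (19.0, 45.0)])
for obs in [(7.5, 12.0), (18.5, 50.0), (6.8, 9.0), (19.2, 40.0)]:
    model.update(obs)
```

A deep learning system is far more elaborate than this, but the core loop is the same: observe, assign, adjust, repeat.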
I like the concept of Robotbase’s Personal Robot and hope to see the company succeed. It does seem to me that delivering fully on the demonstrated robot capabilities will be tough. In the demo videos, the Personal Robot speaks with a human voice, using natural intonation. It even reads a child’s story, stressing the right words and giving the story emotion by using its voice. Mimicking this type of human intonation is surprisingly difficult for an automated text-to-speech program. It might work marginally well for canned responses, but it would be hard to accomplish for output that’s variable and created on the fly.
A Personal Robot that follows us around everywhere we go in our home would also need to be sensitive to the context we’re in at any given time. It would need to have a good sense for when we want to be interrupted with information and when we don’t. Even in the Kickstarter video, I get the sense that the Personal Robot could become annoying. If it’s with me in the kitchen, I don’t want it bugging me every couple of minutes asking if I need help with a recipe. I also don’t necessarily want it waking me up or making assumptions about how I slept. And I’m not sure I want it ordering lunch for me without checking to see what I want first, even if it knows what I usually get.
I’m assuming you can control these behaviors and that the robot will get to know your preferences over time. But getting all that right in the software is bound to be a challenge. I have confidence that at some point in the future, the vision of truly effective, unobtrusively helpful personal robots will be a reality. Let’s hope that future is right around the corner.