I recently read a story in Wired about the Aethon TUG robot designed for use in hospitals. Apparently the robot has been around for several years, but it’s gotten lots of media attention since the University of California, San Francisco’s Mission Bay wing deployed 25 of the robots.
The TUGs are designed to transport things like patient meals, medicines, medical waste, and linens within the hospital. Hospital staff use a touch screen to send the robot to a specific location. The TUG has built-in sensors and maps that enable it to navigate its way autonomously through corridors, into elevators, and around corners and obstacles.
What especially intrigued me was the fact that the TUG can talk. In fact, Matt Simon, the author of the Wired article, calls the robot “chatty.” I didn’t see any mention of the robot’s speech abilities on the Aethon website, though I could have overlooked it.
I was able to locate a 2011 video of a talking TUG operating in a hospital. To say that the TUG talks is a bit of an overstatement. The robot seems to have a set of pre-recorded scripts that it can broadcast when it needs to convey what it’s doing.
In the video, the TUG announces that it has called the elevator, that it's waiting for the elevator to arrive, and that it's about to board the elevator, so please step aside. The TUG appears to provide this spoken information whether or not anyone is actually there to hear it.
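That behavior suggests a simple lookup from navigation events to canned phrases. Here's a minimal sketch of how such a mapping might work; the event names and phrases are purely illustrative, not Aethon's actual software:

```python
# Hypothetical mapping of navigation events to pre-recorded announcements.
ANNOUNCEMENTS = {
    "elevator_called": "I have called the elevator.",
    "elevator_waiting": "Waiting for the elevator to arrive.",
    "elevator_boarding": "I am about to board the elevator. Please stand aside.",
}

def announce(event: str) -> str:
    """Return the canned phrase for an event, or an empty string if none exists."""
    return ANNOUNCEMENTS.get(event, "")

# The robot would hand the returned string to an audio-playback subsystem;
# here we just print each phrase as the event fires.
for event in ("elevator_called", "elevator_waiting", "elevator_boarding"):
    print(announce(event))
```

The point is that no language understanding is involved at all: the robot broadcasts a fixed script whenever a known event occurs, which matches what the video shows.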
So why does the TUG need to talk at all? Unlike the hardware store robot that I wrote about previously, the TUG is not designed to be customer-facing. It doesn’t interact with patients or answer questions. It just needs to be told where to go.
But giving the TUG a voice seems like the best way for it to communicate its intentions. The robot has to operate within a dynamic social environment. To be successful, it can't just barge blindly through the hallways, but it also can't afford to be seen as standoffish and unpredictable. Being able to say what it's up to goes a long way toward making it seem less alien. The TUG doesn't come across as any less robotic for the fact that it can talk, but it does become more of an accepted part of the social fabric.
It would be interesting to see what new capabilities the TUG might gain if it were equipped with speech recognition and NLP technology. If you were a patient, you might be able to ask it what was for lunch. In the best case, if you didn't like the hospital menu, you could send it across the street for a burger. But that's probably wishful thinking.