Last week there was news of a chatbot developed to ferret out suspected sexual predators of underage victims. The chatbot, called Negobot, was developed by Carlos Laorden and other academics from the University of Deusto in Bilbao, Spain. Negobot is an AIML-based conversational agent that poses as a child on Internet chat forums and social networks and employs various sophisticated methods to draw out those who exhibit pedophile behavior. Not only does Negobot use natural language processing and machine learning, but it also leverages aspects of game theory to achieve its goal of inferring whether someone has a high probability of being a sexual predator.
Negobot uses an AIML structure based on the Galaia Project to find appropriate responses to questions. The chatbot was primed with pedophile conversations from an existing law enforcement database. The database of conversations is stored in English, so Negobot translates all input into English before processing. Ongoing conversations are also added to the existing database.
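That pipeline — translate incoming text to English, match it against a database of stored conversations, and fold the new exchange back into the database — can be sketched roughly as follows. This is an illustrative mock-up, not Negobot's actual code: the function names, the toy translation dictionary, and the exact-match lookup (the real system uses AIML pattern matching) are all assumptions.

```python
# Hypothetical sketch of Negobot's input pipeline: translate each message
# to English, then match it against a database of prior conversations.
# All names and data here are illustrative, not taken from the paper.

def translate_to_english(message):
    # Stand-in for a real translation service; a tiny demo dictionary.
    demo = {"hola": "hello", "adios": "goodbye"}
    return " ".join(demo.get(word, word) for word in message.lower().split())

class ConversationDB:
    def __init__(self):
        # Seed response pairs primed from existing (English) conversations.
        self.pairs = {"hello": "hi! how old are you?"}

    def respond(self, english_text):
        # Naive exact-match lookup; AIML does wildcard pattern matching instead.
        reply = self.pairs.get(english_text, "what do you mean?")
        # Ongoing conversations are added back to the database.
        self.pairs.setdefault(english_text, reply)
        return reply

db = ConversationDB()
print(db.respond(translate_to_english("hola")))  # matched after translation
```

Keeping the database in a single language means only one set of response patterns has to be maintained, at the cost of a translation step on every message.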
Applying game theory concepts, the chatbot views each conversation in terms of seven potential levels. At each successive level, the conversational partner shows more interest in the bot, and at some point the conversation transitions to one where sex is discussed explicitly. Negobot analyzes the ongoing dialog, tracks the level of ‘sliminess’ of the conversation, and adjusts its response strategy accordingly, all the while continuing to play the game of drawing out more information and damning conversational evidence from the suspected predator.
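A minimal sketch of that level-tracking idea is below. The seven-level structure comes from the article above, but the trigger terms, the escalation rule, and the per-level strategies are invented for illustration; Negobot's actual scoring is far more elaborate.

```python
# Illustrative sketch of the seven-level conversation model: the level
# rises as the partner shows more explicit interest, and the bot's
# strategy is keyed to the current level. Terms and thresholds are made up.

EXPLICIT_TERMS = {"sexy", "naked", "alone"}

class LevelTracker:
    MAX_LEVEL = 7

    def __init__(self):
        self.level = 1

    def update(self, message):
        # Escalate one level when the partner's message shows more interest.
        if any(term in message.lower() for term in EXPLICIT_TERMS):
            self.level = min(self.level + 1, self.MAX_LEVEL)
        return self.level

    def strategy(self):
        # Response strategy keyed to the current level of 'sliminess'.
        if self.level <= 2:
            return "casual small talk"
        if self.level <= 5:
            return "probe intentions while staying child-like"
        return "elicit identifying details for evidence"
```

Modeling the conversation as a leveled game lets the bot commit to a long-term strategy (gathering evidence) instead of just reacting message by message.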
Negobot assigns a pedophile probability to the dialog partner based on the substance of the conversation. If the other person tries to end the conversation once it becomes clear that Negobot, or rather the child it’s posing as, is underage, then the probability of pedophile behavior decreases. If the person continues to ask questions of a sexual nature, the probability increases and Negobot poses questions to discover more personal information.
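The probability update described above might look something like the following sketch. The direction of the adjustments (down on disengagement after the bot's age is revealed, up on continued sexual questioning) matches the article; the specific increments and keyword checks are assumptions made for illustration.

```python
# Hedged sketch of Negobot's probability adjustment. The 0.3/0.2 step
# sizes and the keyword test are invented; only the up/down behavior
# is taken from the description of the system.

def update_probability(prob, message, partner_left):
    if partner_left:
        # Trying to end the conversation once the bot's age is clear
        # lowers the pedophile probability.
        return max(prob - 0.3, 0.0)
    if "?" in message and any(w in message.lower() for w in ("sex", "naked")):
        # Continued questions of a sexual nature raise it.
        return min(prob + 0.2, 1.0)
    return prob

p = 0.5
p = update_probability(p, "how old are you?", partner_left=False)  # unchanged
p = update_probability(p, "do you like sex?", partner_left=False)  # rises
```

Clamping the score to [0, 1] keeps it interpretable as a probability, and a high value would be the signal for Negobot to start probing for personal information.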
Laorden and his colleagues haven’t deployed Negobot into the real world yet, but they’re continuing to refine the chatbot. It may not be long before would-be sexual predators are being ensnared by virtual agent technologies. Negobot’s technology and capabilities are fully described in a paper about the conversational agent published by its creators.