Intelligent Robots Adapt Faster Through Natural Language
Intelligent robots can learn and adapt quickly to their environment through natural language, according to a new research paper. The study, Learning to Interpret Natural Language Commands through Human-Robot Dialog, was presented recently at the International Joint Conference on Artificial Intelligence (IJCAI) in Argentina.
Researchers from the University of Texas at Austin who conducted the study built a dialogue agent for a mobile robot that, placed in a workplace environment, quickly learned to carry out navigation tasks to assist human workers without being trained on a large corpus of annotated data. The agent automatically induces training examples from its conversations with humans, using a semantic parser to incrementally learn the meaning of previously unseen words. It can also accommodate new language variation, allowing flexibility in how users phrase their requests.
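The idea of inducing training examples from conversation can be illustrated with a loose sketch: when a parse fails on an unseen word, the agent asks the user to rephrase, then aligns the unknown word with the meaning of the successful paraphrase. The class and method names below are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of dialog-driven lexicon induction; names and
# structure are illustrative, not taken from the paper.

class DialogAgent:
    def __init__(self):
        # Seed lexicon: surface word -> semantic predicate.
        self.lexicon = {"go": "navigate", "bring": "deliver"}

    def parse(self, utterance):
        """Return (predicate, unknown_words) for a one-verb command."""
        words = utterance.lower().split()
        known = [self.lexicon[w] for w in words if w in self.lexicon]
        unknown = [w for w in words if w not in self.lexicon]
        return (known[0] if known else None), unknown

    def learn_from_paraphrase(self, failed_utterance, paraphrase):
        """When a parse fails, ask the user to rephrase, then align each
        unseen word with the predicate the paraphrase resolved to."""
        predicate, _ = self.parse(paraphrase)
        _, unknown = self.parse(failed_utterance)
        paraphrase_words = set(paraphrase.lower().split())
        if predicate:
            for word in unknown:
                if word not in paraphrase_words:
                    # Induce a training example from the conversation.
                    self.lexicon[word] = predicate
        return predicate

agent = DialogAgent()
agent.parse("fetch coffee")   # 'fetch' is unseen, so no predicate yet
agent.learn_from_paraphrase("fetch coffee", "bring coffee")
agent.parse("fetch tea")      # 'fetch' now resolves to 'deliver'
```

After a single clarification exchange, the word 'fetch' is grounded to the same predicate as 'bring', which is the kind of vocabulary growth the paper describes happening over many dialogs.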
“This approach is more robust than keyword search and requires little initial data,” state the researchers in their paper and mentions that it could be deployed in any context where robots are given high-level goals in natural language.“To the best of our knowledge, our agent is the first to employ incremental learning of a semantic parser from conversations on a mobile robot.”
The agent is also capable of multi-entity reasoning during navigation tasks, such as finding a person's office by noting that it is next to another person's office. More than 300 users interacted with the agent via an Amazon Mechanical Turk web interface, and 20 users interacted with a simple wheeled Segbot robot in an office.
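That kind of relational reference can be grounded by chaining lookups over a simple building map. The map data and function name below are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of multi-entity grounding: resolving a request
# like "the office next to Robert's office" against a toy map.
# Data and names are assumptions, not the paper's implementation.

offices = {"robert": "3508", "frances": "3502"}    # person -> office
adjacent = {"3508": ["3510"], "3502": ["3504"]}    # office -> neighbors

def office_next_to(person):
    """Ground a relational reference with two chained lookups:
    person -> their office -> an adjacent office."""
    office = offices.get(person.lower())
    neighbors = adjacent.get(office, [])
    return neighbors[0] if neighbors else None

office_next_to("Robert")  # resolves to office "3510"
```

The point is that the second entity ("Robert") is resolved first, and the spatial relation ("next to") is then applied to its grounding.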
The researchers note that the agent quickly learned and adapted to carry out tasks through self-learning on the go, without extensive training beforehand.
The researchers give an example in which the agent learns to associate words such as 'bring' and 'deliver', and 'Java' and 'coffee', when performing a delivery task. It also asks the user whether he or she is satisfied with the request, so that it learns to handle requests correctly. The agent can likewise learn the connection between someone's nickname and their actual name, such as 'Frannie' for Frances and 'Bob' for Robert, so that users do not always have to refer to a person by their full name when talking to the agent. Users were asked to fill in a survey rating their interaction from 0 to 4, indicating how strongly they agreed or disagreed with statements such as 'robot understood me', 'robot frustrated me', and 'I would use the robot to get items for myself and others'.
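The nickname behavior can be sketched as a small alias table that grows when a user confirms what a nickname meant. The data and function names here are assumptions for illustration, not the paper's implementation.

```python
# Illustrative alias learning; the people, nicknames, and function
# names are assumptions for this sketch, not taken from the paper.

known_people = {"frances", "robert"}
aliases = {}  # learned nickname -> canonical name

def resolve(name):
    """Map a (possibly nicknamed) referent to a known person."""
    name = name.lower()
    if name in known_people:
        return name
    return aliases.get(name)  # None if the nickname has not been learned

def confirm_alias(nickname, full_name):
    """After the user confirms that 'Frannie' meant Frances, store the
    alias so future requests need no clarification dialog."""
    full_name = full_name.lower()
    if full_name in known_people:
        aliases[nickname.lower()] = full_name

confirm_alias("Frannie", "Frances")
resolve("Frannie")  # now resolves to "frances"
```

A single clarification dialog is enough: once the alias is stored, later requests mentioning 'Frannie' resolve directly without asking the user again.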
The percentage of navigation tasks completed stayed close to 90 percent over the four days, but the percentage of delivery tasks completed improved from 20 percent to 60 percent. The agent also corrects users when they make grammatical errors, as one user pointed out in the survey.
According to the researchers, future work on the agent includes applying it to speech recognition software, exploring whether it can automatically learn to correct consistent speech recognition errors. At the moment, users type requests through the mobile robot and web-based interface. "As the robot platform gains access to more tasks, such as manipulation of items, doors, and light-switches via an arm attachment, we will scale the agent to learn the language users employ in that larger goal space," they say. "We also plan to add agent perception, so that some predicates can be associated with perceptual classifiers, and new predicates can be discovered for new words."
Tech giant Google, too, has been experimenting with artificial neural networks (software consisting of interlinked nodes, modeled on the structure of biological brains) to, for example, improve search results. In one research effort, Google hopes to create an artificial intelligence that can interact with, and help, humans using conversational modelling.
Oriol Vinyals and Quoc Le at Google have been developing an artificial intelligence that is better at adapting to conversational twists and turns. Rather than being programmed by a human operator, the artificial intelligence has been teaching itself by analyzing movie subtitles and IT helpdesk transcripts.
More such research and experiments, putting intelligent robots to useful tasks, are likely to emerge in the coming months.