Robots are increasingly common in everyday life, but their communication skills still lag far behind. One key capability that could improve human-robot interaction is the ability to read and respond to human emotional cues.
With that ability, robots could intervene when they are genuinely needed and stay out of the way the rest of the time. Now, researchers at Franklin & Marshall College have been working on enabling socially assistive robots to process social cues given by humans and respond to them accordingly, as reported by TechXplore.
“I am interested in designing robots that help people with everyday tasks, such as cooking dinner, learning math, or assembling Ikea furniture,” Jason R. Wilson, one of the researchers who carried out the study, told TechXplore. “I’m not looking to replace people that help with these tasks. Instead, I want robots to be able to supplement human assistance, especially in cases where we do not have enough people to help.”
Wilson’s work was published in a paper on arXiv and presented at the AI-HRI (Artificial Intelligence for Human-Robot Interaction) 2021 symposium last week. It showcases a new method that lets robots autonomously detect when it is appropriate for them to step in and help their human counterparts. This, Wilson argues, allows the humans being helped to maintain their dignity.
The new method relies on humans conveying that they need help in both verbal and non-verbal ways. A verbal cue could be as simple as the person saying “I am not sure,” while a non-verbal cue could be the direction of the person’s gaze. Wilson and his team devised a way for robots to automatically process such eye-gaze cues in useful ways, roughly along the lines of the sketch below.
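To make the idea concrete, here is a minimal, hypothetical sketch of how verbal and gaze cues might be fused into a single “should I help?” decision. This is not the method from Wilson’s paper: the phrase list, the gaze labels, the weights, and the threshold are all invented for illustration, and a real system would sit downstream of speech recognition and eye tracking.

```python
from dataclasses import dataclass

# Hypothetical cue-fusion sketch; all names and numbers are illustrative,
# not taken from the paper.

UNCERTAIN_PHRASES = ("i am not sure", "i'm not sure", "i don't know")

@dataclass
class CueReading:
    transcript: str       # most recent utterance from the person
    gaze_target: str      # e.g. "task", "robot", "away"
    gaze_dwell_s: float   # how long the current gaze target has been held

def verbal_uncertainty(transcript: str) -> float:
    """Crude keyword score in [0, 1] for spoken uncertainty."""
    text = transcript.lower()
    return 1.0 if any(p in text for p in UNCERTAIN_PHRASES) else 0.0

def gaze_uncertainty(reading: CueReading) -> float:
    """Looking at the robot (a common help-seeking cue) scores high;
    sustained focus on the task scores low."""
    if reading.gaze_target == "robot":
        return min(1.0, reading.gaze_dwell_s / 2.0)
    if reading.gaze_target == "away":
        return 0.5
    return 0.0

def should_offer_help(reading: CueReading, threshold: float = 0.6) -> bool:
    """Fuse verbal and gaze cues; intervene only above the threshold,
    so the robot stays quiet while the person works confidently."""
    score = (0.6 * verbal_uncertainty(reading.transcript)
             + 0.4 * gaze_uncertainty(reading))
    return score >= threshold

# Example: the person voices doubt while looking at the robot.
reading = CueReading("Hmm, I am not sure this leg goes here", "robot", 1.5)
print(should_offer_help(reading))  # True -> the robot steps in
```

The threshold is what encodes the “intervene only when really needed” behavior: below it, the robot deliberately does nothing, preserving the person’s autonomy.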
Initial testing of the new technique has proved promising, which means the ability to detect both verbal and non-verbal cues may soon be coming to a bot near you.