Business Standard

Soon, robots could read your mood and actions through your body language

Researchers develop computer code to help robots decode your actions through your gestures

IANS, New York

Researchers have developed computer code that could help robots understand body poses and movements, allowing them to perceive what people around them are doing, what moods they are in and whether they can be interrupted.

"We communicate almost as much with the movement of our bodies as we do with our voice," said Yaser Sheikh, Associate Professor of Robotics at Carnegie Mellon University in Pittsburgh, Pennsylvania, US.

"But computers are more or less blind to it," Sheikh said, adding that the new methods for tracking 2D human form and motion open up new ways for people and machines to interact with each other, and for people to use machines to better understand the world around them.

The computer code was developed with the help of the university's Panoptic Studio, a two-storey dome embedded with 500 video cameras.

The insights gained from experiments in that facility now make it possible to detect the pose of a group of people using a single camera and a laptop computer, the researchers said.

To encourage more research and applications, the researchers have released their computer code for both multi-person and hand-pose estimation.

The researchers will present reports on their methods at the Conference on Computer Vision and Pattern Recognition (CVPR 2017), to be held in Honolulu, Hawaii, from July 21-26.

Tracking multiple people in real time, particularly in social situations where they may be in contact with each other, presents a number of challenges.

Simply using programmes that track the pose of an individual does not work well when applied to each individual in a group, particularly when that group gets large.

Sheikh and his colleagues took a bottom-up approach, which first localises all the body parts in a scene -- arms, legs, faces, etc. -- and then associates those parts with particular individuals.
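As a rough, hypothetical sketch of this bottom-up idea (illustrative only, not the researchers' released code), the Python below first gathers detected parts and then greedily pairs them using a toy proximity-based affinity score; the names Part, affinity and group_parts are invented for the example.

    # Hypothetical sketch of bottom-up grouping: localise parts first,
    # then associate them with individuals. Not the released implementation.
    from dataclasses import dataclass

    @dataclass
    class Part:
        kind: str   # e.g. "elbow", "wrist"
        x: float
        y: float

    def affinity(a, b):
        # Toy score: nearer parts are likelier to share an owner. Real
        # systems score limb candidates with learned fields instead.
        return -((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

    def group_parts(elbows, wrists):
        # Greedily pair each elbow with its best unclaimed wrist.
        pairs = sorted(((affinity(e, w), i, j)
                        for i, e in enumerate(elbows)
                        for j, w in enumerate(wrists)), reverse=True)
        used_e, used_w, people = set(), set(), []
        for score, i, j in pairs:
            if i not in used_e and j not in used_w:
                used_e.add(i)
                used_w.add(j)
                people.append((elbows[i], wrists[j]))
        return people

    # Two people with raised arms; grouping recovers who owns which wrist.
    elbows = [Part("elbow", 0.0, 0.0), Part("elbow", 5.0, 0.0)]
    wrists = [Part("wrist", 0.5, 1.0), Part("wrist", 5.5, 1.0)]
    print(group_parts(elbows, wrists))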

This bottom-up method helped them build the computer programme, which the researchers believe may have several other applications.

The ability to recognise hand poses, for instance, will make it possible for people to interact with computers in new and more natural ways, such as communicating with computers simply by pointing at things.
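To make the pointing example concrete, here is a small hypothetical sketch: given 2D image keypoints for a wrist and an index fingertip (coordinates and names invented for illustration), the pointing direction is simply the normalised vector from one to the other.

    # Hypothetical sketch: a 2D pointing direction from two hand keypoints.
    import math

    def pointing_direction(wrist, index_tip):
        # Unit vector from the wrist toward the index fingertip.
        dx, dy = index_tip[0] - wrist[0], index_tip[1] - wrist[1]
        length = math.hypot(dx, dy)
        if length == 0.0:
            raise ValueError("wrist and fingertip coincide")
        return dx / length, dy / length

    # A hand pointing up and to the right in image coordinates
    # (y grows downward), so the direction is roughly (0.6, -0.8).
    print(pointing_direction((100.0, 200.0), (130.0, 160.0)))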

The technology could also help a self-driving car get an early warning that a pedestrian is about to step into the street by monitoring body language.
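One simple and entirely hypothetical form such a warning could take: track the pedestrian's hip midpoint across frames and raise a flag when the estimated motion points toward the kerb line faster than some threshold. Every name and number below is invented for illustration.

    # Hypothetical early-warning heuristic: flag a pedestrian whose tracked
    # hip midpoint is moving toward the kerb. Thresholds are invented.
    def approaching_kerb(hip_track, kerb_x, speed_threshold=2.0):
        # True if the last two observations show motion toward kerb_x
        # faster than speed_threshold (pixels per frame).
        if len(hip_track) < 2:
            return False
        (x0, _), (x1, _) = hip_track[-2], hip_track[-1]
        velocity = x1 - x0                          # pixels per frame
        toward_kerb = (kerb_x - x1) * velocity > 0  # signs agree
        return toward_kerb and abs(velocity) > speed_threshold

    # A pedestrian drifting toward a kerb at x = 400 trips the warning.
    track = [(300.0, 240.0), (305.0, 240.0), (312.0, 241.0)]
    print(approaching_kerb(track, kerb_x=400.0))  # True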

--IANS

First Published: Jul 07 2017 | 7:36 PM IST
