Understanding animal 'languages'

The commercial potential of AI-based communication with animals is huge

Devangshu Datta
Last Updated : Jan 18 2018 | 10:50 PM IST
Is it possible that one day, in the not-too-distant future, we will be able to simply point an app at a dog, a cat or some other animal, and read a translation of the noises it is making, or receive a real-time analysis of its body language? At least one animal behaviourist is working on an artificial intelligence program designed to do exactly this.

Even now, people who work with animals or simply keep pets can “translate” animal moods with uncanny accuracy. Dog owners, cat lovers and elephant whisperers can all, to some degree, understand the vocalisations and body language of the animals they associate with. People working with chimpanzees and gorillas have even taught sign language to the apes. This is not magic; it is an understanding and empathy developed through long observation and association. Since it is learned behaviour based on physical signals, it should be possible to teach it to an AI.

Constantine Slobodchikoff, a biologist and animal behaviour expert based at Northern Arizona University, specialises in prairie dogs. He has discovered that the rodents have a fairly complicated system of communication: they use different alarm calls to identify different predators. In association with computer scientists, Slobodchikoff has developed a method of translating such calls into English.
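Slobodchikoff’s actual method has not been published, but the general shape of such a system is easy to sketch: treat each recorded call as a data point, label it with the predator that was observed to trigger it, and train a classifier on acoustic features. The Python below is a minimal, hypothetical illustration; the choice of MFCC features, the predator labels and the file names are all assumptions, not details from his research.

```python
# Hypothetical sketch: classifying prairie-dog alarm calls by predator type.
# The labels, file names and feature choice are invented for illustration.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

LABELS = ["hawk", "coyote", "human", "dog"]  # assumed predator classes

def call_features(wav_path):
    """Summarise one recorded call as a fixed-length MFCC feature vector."""
    audio, sr = librosa.load(wav_path, sr=22050)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    # Mean and std over time give a crude, length-independent summary.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def train(training_set):
    """training_set: list of (wav_path, predator_label) pairs, labelled by
    field observation of which predator actually triggered each call."""
    X = np.array([path_features for path_features in
                  (call_features(p) for p, _ in training_set)])
    y = [label for _, label in training_set]
    clf = RandomForestClassifier(n_estimators=200)
    clf.fit(X, y)
    return clf

# "Translation" is then just prediction on a new recording:
# clf = train(training_set)
# print(clf.predict([call_features("unknown_call.wav")]))  # e.g. ["hawk"]
```

“Translation” here is simply prediction: point the trained model at a new recording and it returns the most likely predator label.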

He has now begun studying video footage of dogs, using AI to understand how canines communicate, and hopes to translate their body language, growls and barks into English. He has founded a company, Zoolingua, which is trying to create an algorithm to translate canine behaviour. He hopes this will lead to more effective AI-based communication (could it be two-way?) with pets, which would obviously have huge commercial potential.

There are other researchers working in similar areas, and advances in AI techniques and more powerful hardware could lead to breakthroughs in communication. Many animals have communication systems. Some, such as whales and dolphins, may even have more sophisticated systems than human beings.

Sperm whales, for example, are highly social, as are dolphins and orcas. Cetacean pods communicate in bursts of high-speed clicks, which seem to carry huge amounts of information. Different pods have different conversational styles, which again indicates a high degree of sophistication and of culture, that is, socially learned behaviour.

We are hampered by our poor understanding of the marine environment and hence by a lack of context for deciphering these “languages”. Whale researchers have been desperately looking for breakthroughs in linguistic understanding, because proof that whales have language and culture would be a powerful lever in pressing for bans on whaling.

When it comes to human beings, AI-based programs have already made important advances in understanding body language, facial expressions and vocal tonality. Non-verbal communication is an important component of human interaction, and its absence is one reason why emails and instant messages can be atonal: we often cannot tell whether a message is joking or serious.

Carnegie Mellon University’s (CMU) Robotics Institute uses a two-storey dome fitted with 500 video cameras to decipher the body language and hand movements of people in groups. This is part of an experiment that could eventually make it easier for robots to co-exist usefully in the middle of large groups, and also enable humans to communicate with computers through hand gestures.

It may also yield insights into autism and depression, and perhaps even enable early diagnosis of such conditions. One practical application that may be enormously useful straight away is training autonomous cars to understand pedestrian behaviour. Another commercially useful application could be sports analytics: identifying predictive behaviour during play and deciphering “tells”, as poker players call them. CMU has released the computer code for multi-person and hand pose estimation, and it is now being used by many research groups and in commercial R&D.
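To make the pedestrian example concrete: CMU’s released pose code can write the keypoints it detects in each video frame to a JSON file, and downstream logic then works on those coordinates. The sketch below reads that JSON output and applies a deliberately crude, invented heuristic for whether a pedestrian is stepping towards the road; the thresholds and the rule itself are illustrative assumptions, not a validated model.

```python
# Hedged sketch: consuming per-frame keypoint JSON (the format produced by
# OpenPose's --write_json flag) as a crude pedestrian-intent signal.
import json

# BODY_25 model keypoint indices (OpenPose convention): 8 = mid-hip.
MID_HIP = 8

def mid_hip(frame_json_path):
    """Return (x, y, confidence) of the first person's mid-hip keypoint."""
    with open(frame_json_path) as f:
        data = json.load(f)
    if not data["people"]:
        return None
    k = data["people"][0]["pose_keypoints_2d"]  # flat [x0, y0, c0, x1, ...]
    i = 3 * MID_HIP
    return k[i], k[i + 1], k[i + 2]

def stepping_toward_road(prev_frame, curr_frame, road_is_right=True,
                         min_conf=0.3, min_dx=5.0):
    """Invented heuristic: hip moving horizontally towards the kerb line."""
    a, b = mid_hip(prev_frame), mid_hip(curr_frame)
    if a is None or b is None or a[2] < min_conf or b[2] < min_conf:
        return False
    dx = b[0] - a[0]
    return dx > min_dx if road_is_right else dx < -min_dx
```

A production system would, of course, track many people over many frames and feed richer pose features into a learned model rather than a hand-written rule.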

There are commercially available programs that detect emotion. AI algorithms identify key facial landmarks such as the eyebrows, the tip of the nose and the corners of the mouth; deep learning then analyses those areas and maps them to emotions. If this were combined with programs that analyse speech patterns (tone, speed, loudness and so on) to decipher emotional content, the result could be a powerful way to accurately detect and diagnose emotional states. Among other things, such programs could be used to improve the quality of public speeches and presentations, and to analyse political statements for sincerity. They could also enable a range of unpleasant actions.
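The landmark-based pipeline can be sketched in a few lines. The version below uses dlib’s freely available 68-point face landmark model; the model file name, the normalisation step and the choice of a simple SVM classifier (standing in for the deep-learning stage) are assumptions made for brevity, not how any particular commercial product works.

```python
# Illustrative sketch of landmark-based emotion detection. The model file,
# normalisation and SVM stand-in for deep learning are all assumptions.
import numpy as np
import dlib
from sklearn.svm import SVC

detector = dlib.get_frontal_face_detector()
# Pretrained 68-point landmark model distributed by dlib; path assumed.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmark_vector(gray_image):
    """Eyebrows, nose tip, mouth corners etc. as a flat coordinate vector."""
    faces = detector(gray_image)
    if not faces:
        return None
    shape = predictor(gray_image, faces[0])
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)],
                   dtype=float)
    # Normalise for position and scale so only facial geometry remains.
    pts -= pts.mean(axis=0)
    pts /= np.linalg.norm(pts)
    return pts.ravel()

def train_emotion_classifier(X, y):
    """X: landmark vectors; y: emotion labels ("happy", "angry", ...)
    from a labelled dataset. A neural network would replace the SVM in a
    deep-learning version of this pipeline."""
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(X, y)
    return clf
```

The same feature-vector idea extends to the audio side: statistics of pitch, speaking rate and loudness could be concatenated with the landmark vector to build the combined face-and-voice detector the paragraph above imagines.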

It sounds counter-intuitive, but the emotionless computer could actually prove more accurate at deciphering emotion and linguistic content than even the most empathetic of human beings. Applied to other species, this might eventually lead to a better understanding of consciousness.

