The system could improve the way people interact with computers and perhaps allow disabled people to use computer-based communications devices, such as voice synthesisers, more effectively.
Karthigayan Muthukaruppan from Manipal International University in Selangor, Malaysia, and colleagues have developed a system that uses a genetic algorithm, an optimisation technique that improves with each iteration, to fit irregular ellipse equations to the shape of the human mouth as it displays different emotions.
They have used photos of individuals from South-East Asia and Japan to train a computer to recognise the six commonly accepted human emotions - happiness, sadness, fear, anger, disgust and surprise - as well as a neutral expression.
The algorithm analyses the upper and lower lips as two separate ellipses.
"In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers especially in the area of human emotion recognition by observing facial expression," the team said in a statement.
The researchers suggested that initial applications of such an emotion detector might be helping disabled patients lacking speech to interact more effectively with computer-based communication devices.
The study was published in the International Journal of Artificial Intelligence and Soft Computing.