With the help of facial recognition technology, Japanese scientists have developed an automated system that can predict when patients are at increased risk of unsafe behaviour in the ICU with an accuracy of 75 per cent.
The study, presented at the Euroanaesthesia congress in Vienna, Austria (1-3 June), suggests that the automated risk-detection tool has potential as a continuous monitor of patient safety and could ease some of the staffing constraints that make it difficult to observe critically ill patients at the bedside around the clock.
"Using images we had taken of a patient's face and eyes we were able to train computer systems to recognise high-risk arm movement", said Dr Akane Sato from Yokohama City University Hospital, Japan, who led the research.
"We were surprised about the high degree of accuracy that we achieved, which shows that this new technology has the potential to be a useful tool for improving patient safety, and is the first step for a smart ICU which is planned in our hospital," added Sato.
The study included 24 postoperative patients (average age 67 years) who were admitted to the ICU at Yokohama City University Hospital between June and October 2018.
The proof-of-concept model was created using pictures taken by a camera mounted on the ceiling above patients' beds.
Around 300 hours of data were analysed to find daytime images of patients facing the camera in a good body position that showed their face and eyes clearly.
In total, 99 images were used for machine learning, in which an algorithm learns to analyse images from labelled input data, in a process loosely resembling the way a human brain learns new information.
Ultimately, the model was able to flag high-risk behaviour, particularly movements around the subject's face, with high accuracy.
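The study's actual model and data are not public, but the general workflow it describes can be sketched as a small supervised image-classification exercise. The sketch below is purely illustrative: the 8x8 synthetic "frames", the labels, and the nearest-centroid classifier are all assumptions standing in for the real camera images and learning algorithm.

```python
import numpy as np

# Synthetic stand-in for the study's data: 99 labelled frames,
# where label 1 marks high-risk behaviour near the "face" region.
rng = np.random.default_rng(0)
n_images, h, w = 99, 8, 8
labels = rng.integers(0, 2, size=n_images)
images = rng.normal(size=(n_images, h, w))
images[labels == 1, :2, :] += 1.5  # risky frames differ in the top rows

# Flatten each frame into a feature vector and hold out a test split.
X = images.reshape(n_images, -1)
split = 70
X_train, y_train = X[:split], labels[:split]
X_test, y_test = X[split:], labels[split:]

# Nearest-centroid classifier: the simplest possible stand-in model.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y_test).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

A real system would replace the synthetic frames with camera images and the centroid rule with a trained image model, but the train/hold-out accuracy loop is the same shape as the 75 per cent figure reported above.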
"Our end goal is to combine various sensing data such as vital signs with our images to develop a fully automated risk prediction system," said Dr Sato.