Artificial intelligence has far outpaced human intelligence in certain tasks.
Researchers from University Hospital Freiburg in Germany led by neuroscientist Tonio Ball showed how a self-learning algorithm decodes human brain signals that were measured by an electroencephalogram (EEG).
The decoded signals included movements that were actually performed, but also hand and foot movements that were merely imagined, and an imaginary rotation of objects.
The system could be used for early detection of epileptic seizures, for communicating with severely paralysed patients, or for making automatic neurological diagnoses.
"The great thing about the program is that we needn't predetermine any characteristics. The information is processed layer by layer, that is, in multiple steps with the help of a non-linear function," said researcher Schirrmeister.
"The system learns to recognise and differentiate between certain behavioural patterns from various movements as it goes along," he said.
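The layer-by-layer idea can be illustrated with a minimal sketch: raw signal values pass through a stack of linear maps, each followed by a non-linear function, with no hand-engineered features in between. The shapes, the ReLU non-linearity, and the random weights here are illustrative assumptions, not the actual network from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # A common non-linear function applied after each linear layer.
    return np.maximum(x, 0.0)

def forward(eeg_window, weights):
    """Pass a raw EEG window through the layers: each layer applies
    a linear map followed by a non-linearity, step by step."""
    activation = eeg_window
    for W in weights:
        activation = relu(activation @ W)
    return activation

# Hypothetical sizes: a 128-sample EEG window, three layers,
# ending in scores for 4 movement classes.
layers = [rng.standard_normal((128, 64)) * 0.1,
          rng.standard_normal((64, 32)) * 0.1,
          rng.standard_normal((32, 4)) * 0.1]

window = rng.standard_normal(128)   # raw signal, no predetermined features
scores = forward(window, layers)
print(scores.shape)  # (4,)
```

In a trained network the weights would be learned from recorded EEG examples rather than drawn at random, which is what lets the system "recognise and differentiate" movement patterns as it goes along.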
The model is inspired by the connections between nerve cells in the human body, in which electrical signals arriving at synapses travel along a cell's branching extensions to the cell body and onward again.
Until now, it has been difficult to interpret the network's circuitry once learning is complete: all of the algorithmic processing takes place in the background and is invisible.
That is why the researchers developed software that creates maps from which they can understand the network's decoding decisions. The researchers can insert new datasets into the system at any time.
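One simple way such a map can be produced, sketched here as an assumption rather than the study's actual method, is occlusion: blank out short segments of the input one at a time and record how much the classifier's score changes. Segments that change the score most are the ones the decision relied on. The toy classifier below stands in for a trained network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in classifier: a single class score from a raw window.
W = rng.standard_normal(64) * 0.1
def class_score(window):
    return float(np.tanh(window) @ W)

def occlusion_map(window, width=4):
    """Zero out one short segment at a time and record how much the
    class score changes; large changes mark influential segments."""
    base = class_score(window)
    relevance = np.zeros_like(window)
    for start in range(0, len(window), width):
        occluded = window.copy()
        occluded[start:start + width] = 0.0
        relevance[start:start + width] = abs(base - class_score(occluded))
    return relevance

window = rng.standard_normal(64)
rmap = occlusion_map(window)
print(rmap.shape)  # (64,)
```

The resulting relevance values can be plotted over the EEG trace, giving researchers a visual map of which parts of the signal drove each decoding decision.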
"Our vision for the future includes self-learning algorithms that can reliably and quickly recognise the user's various intentions based on their brain signals. In addition, such algorithms could assist neurological diagnoses," said Ball, head investigator of the study published in the journal Human Brain Mapping.