According to a new study, the human brain's memory capacity is ten times greater than previously believed.
Researchers at the Salk Institute and their collaborators have gained critical insight into the size of neural connections, placing the brain's memory capacity far higher than common estimates.
The new work also answers a longstanding question of how the brain manages to be so energy-efficient, and could help engineers build computers that are incredibly powerful yet still conserve energy.
Co-senior author Terry Sejnowski said that this is a real bombshell in the field of neuroscience, adding that they discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power.
Sejnowski noted that the new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.
Researcher Tom Bartol said the data suggest there are 10 times more discrete sizes of synapses than previously thought. In computer terms, 26 sizes of synapses correspond to about 4.7 "bits" of information. Previously, it was thought that each synapse in the hippocampus could store just one to two bits for short- and long-term memory.
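The 4.7-bit figure follows from basic information theory: a synapse that can reliably take one of n distinguishable sizes can encode log2(n) bits. A minimal sketch of that arithmetic (the function name here is illustrative, not from the study):

```python
import math

def bits_per_synapse(n_sizes: int) -> float:
    """Information capacity, in bits, of a synapse that can take
    one of n_sizes reliably distinguishable sizes."""
    return math.log2(n_sizes)

# 26 distinguishable sizes, as reported in the study:
print(f"{bits_per_synapse(26):.2f} bits")  # ≈ 4.70 bits
# The older assumption of only a couple of distinguishable states:
print(f"{bits_per_synapse(2):.2f} bits")   # 1.00 bit
```

This is why ten times as many distinguishable sizes does not mean ten times as many bits per synapse: capacity grows only logarithmically with the number of states.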
The findings also offer a valuable explanation for the brain's surprising efficiency. The waking adult brain consumes only about 20 watts of continuous power, about as much as a very dim light bulb.
The Salk discovery could help computer scientists build ultraprecise yet energy-efficient computers, particularly ones that employ "deep learning" and artificial neural networks, techniques capable of sophisticated learning and analysis such as speech recognition, object recognition, and translation.
The study is published in eLife.