By measuring the capacity of synapses, the connections between neurons in the brain, American scientists have concluded that human memory has been underestimated by at least a factor of 10. A single synapse can hold about 4.7 bits of information, which across the entire brain scales to roughly 1 petabyte. The "paper" equivalent is even more striking: it corresponds to about 20.4 million boxes filled with sheets of printed text.
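As a rough sanity check of those figures, the arithmetic can be sketched as follows. The per-synapse capacity (4.7 bits) is from the study; the total synapse count used here is an assumption for illustration, chosen to match the article's petabyte-scale estimate, not a number stated in the source.

```python
# Back-of-envelope check of the article's memory-capacity figures.
# BITS_PER_SYNAPSE comes from the study; SYNAPSE_COUNT is an assumed
# whole-brain total (hypothetical), in the commonly cited 10**14-10**15 range.
BITS_PER_SYNAPSE = 4.7
SYNAPSE_COUNT = 1.7e15        # assumption, not from the article
PETABYTE_IN_BITS = 8e15       # 1 PB = 10**15 bytes = 8 * 10**15 bits

total_bits = BITS_PER_SYNAPSE * SYNAPSE_COUNT
print(f"estimated capacity: {total_bits / PETABYTE_IN_BITS:.2f} PB")
```

Under these assumptions the product lands at about 1 petabyte, consistent with the scale the article reports.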
The study also answers a long-standing question about the remarkable energy efficiency of the human brain, and its findings could help engineers build powerful next-generation computers with minimal power consumption. According to Terrence Sejnowski, a professor at the Salk Institute, the result is a real bombshell for neuroscience:
"We have discovered the key to how neurons in the hippocampus, the brain region involved in forming emotions and consolidating memory, operate with minimal energy and maximum efficiency. Our latest measurements of the brain's memory capacity raise our previous conservative estimates tenfold, to about 1 petabyte, which is comparable to the scale of the World Wide Web."