According to a new study, the human memory is capable of storing 10 times more information than previously thought. That puts it "in the same ballpark" as the capabilities of the entire internet, according to the authors.
Researchers from the Salk Institute in California found that synapses, which connect the nerves in our brain, can be divided into 26 different size categories.
Up until now, scientists have considered three sizes for synapses: small, medium, and large. The more specific measurements mean the researchers have a better idea of how synapses work, and therefore how our memories function. They also help explain how the seemingly sloppy synapses can power our brain's sophisticated abilities.
Researchers used advanced microscopy and computational algorithms they had developed to image rat brains and reconstruct the connectivity, shapes, volumes and surface area of the brain tissue down to a nanomolecular level.
The scientists expected that pairs of synapses sharing the same input would be roughly similar in size, but were surprised to discover they were nearly identical.
The difference in size between the synapse pairs was just 8%. The researchers note why that matters:
Because the memory capacity of neurons is dependent upon synapse size, this eight percent difference turned out to be a key number the team could then plug into their algorithmic models of the brain to measure how much information could potentially be stored in synaptic connections… the team determined there could be about 26 categories of sizes of synapses, rather than just a few.
…and how that translates to our memory's capacity:
In computer terms, 26 sizes of synapses correspond to about 4.7 “bits” of information. Previously, it was thought that the brain was capable of just one to two bits for short- and long-term memory storage in the hippocampus.
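The 4.7-bit figure follows from basic information theory: if a synapse can occupy any one of 26 distinguishable size states, the information it can carry is the base-2 logarithm of 26. A quick sketch of that arithmetic:

```python
import math

# A synapse that can take one of 26 distinguishable sizes carries
# log2(26) bits of information, the same way a single letter drawn
# from a 26-character alphabet does.
bits_per_synapse = math.log2(26)
print(f"{bits_per_synapse:.1f} bits")  # about 4.7 bits
```

By the same logic, the earlier "one to two bits" estimate corresponds to assuming only two to four distinguishable synapse sizes.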
The researchers' paper was published in eLife. Co-senior author Terry Sejnowski called his team's discovery "a real bombshell in the field of neuroscience,” adding, “Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.”
Some, however, are calling that characterization into question. Over at Gizmodo, George Dvorsky points out that one petabyte is still far below the full breadth of the internet, writing:
The Salk researchers appear to be overstating it a bit. If we consider the Big Four—Google, Amazon, Microsoft, and Facebook—their servers alone store at least 1,200 petabytes between them. That excludes the rest of the Web, including storage providers like Dropbox, Barracuda, and SugarSync.
Dvorsky says that a more accurate yardstick would be the Library of Congress which, per the researchers' new calculation, would take up just a quarter of our memory's capacity. Still pretty amazing.
Danielle Wiener-Bronner is a news reporter.