
22 January 2016

Memory 10 Times More Massive Than Thought

Time for a free memory upgrade. It turns out that our brains have about 10 times more storage capacity than previously thought, according to new neuroscience research.

A team from the Salk Institute and UT-Austin set out to create a 3-D model of every dendrite, axon, glial process, and synapse in a piece of a rat's hippocampus (the brain's memory center) about the size of a single red blood cell. In the process, they made an unexpected discovery.

In some cases, a single neuron appeared to have two synapses branching out to connect with another neuron. It seemed that the first neuron was sending a duplicate message.

This was odd but not totally out of left field. Nonetheless, the scientists decided to measure the size differences between these paired, duplicating synapses and found that they differed by only about eight percent.

At the time it seemed like a small difference. But the memory capacity of neurons depends on synapse size, and when the scientists plugged this number into their calculations, they were astounded.

The math showed that a single brain synapse could hold 4.7 bits of information, and the whole brain could actually hold one petabyte.
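The 4.7-bit figure follows from how many distinct synapse sizes the measurements could tell apart: the study's reporting describes roughly 26 distinguishable size categories, and storing one choice among 26 equally likely options carries log2(26) ≈ 4.7 bits. A minimal sketch of that calculation (the 26-category count is taken from the Salk coverage of the eLife paper):

```python
import math

# Number of distinguishable synapse-size categories reported
# for the reconstructed hippocampal synapses.
size_categories = 26

# Information carried by one choice among N equally likely categories.
bits_per_synapse = math.log2(size_categories)

print(round(bits_per_synapse, 1))  # 4.7
```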

“This is roughly an order of magnitude of precision more than anyone has ever imagined,” Salk professor Terry Sejnowski said in a press release about the discovery, which was just published in the journal eLife.

What is a petabyte exactly? For context, the people talking about petabytes are often describing big data, supercomputers, or the entire Web. Technically, a petabyte is one million gigabytes. That’s roughly four times all the data in the Library of Congress. Another tangible comparison: Imagine a row of 8.5-inch-wide printed smartphone photos. A petabyte’s worth would stretch for 48,000 miles.
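Both comparisons are easy to sanity-check. A quick sketch of the unit conversion and the photo-row estimate, assuming a typical smartphone photo of roughly 2.8 MB (the photo size is an assumption chosen for illustration, not a figure from the article):

```python
# Decimal (SI) definitions: 1 PB = 10^15 bytes, 1 GB = 10^9 bytes.
PETABYTE_BYTES = 10**15
GIGABYTE_BYTES = 10**9

gigabytes_per_petabyte = PETABYTE_BYTES // GIGABYTE_BYTES
print(gigabytes_per_petabyte)  # 1000000 (one million gigabytes)

# Photo-row comparison, assuming ~2.8 MB per smartphone photo.
photo_bytes = 2.8e6
photo_width_inches = 8.5
inches_per_mile = 63360

photos = PETABYTE_BYTES / photo_bytes
miles = photos * photo_width_inches / inches_per_mile
print(round(miles))  # roughly 48,000 miles
```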

The Salk Institute team thinks that having a better handle on the human brain’s capacity could help us design better artificial intelligence. Applying this new knowledge to building ultraprecise energy-efficient computers that have artificial neural networks might pave the way for huge leaps in speech and object recognition.

And don’t worry if you’re still trying to wrap your head around petabytes. Your brain gets it.