Brain Research Bombshell Could Kickstart New Era of Computing

Kristopher Sturgis

January 22, 2016

4 Min Read

It turns out that your brain's capacity to remember is 10 times higher than previously thought, according to Salk Institute researchers. The discovery could ultimately help lead to the development of super-efficient computer chips.

Qmed Staff


Terry Sejnowski foresees a new era of computing based on neuroscience breakthroughs. Pictured above: Sejnowski (left) with colleagues Cailey Bromer (center) and Tom Bartol (right).

Imagine being able to remember half of the data held by all U.S. academic research libraries. As it turns out, that may be possible, according to Salk scientists, who estimate that the brain's memory capacity is at least a petabyte -- equivalent to one million gigabytes.
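For scale, the petabyte-to-gigabyte equivalence quoted above is just a unit conversion (decimal SI units assumed, as in the article):

```python
# One petabyte (10^15 bytes) expressed in gigabytes (10^9 bytes each).
petabyte_in_gb = 10**15 // 10**9
print(petabyte_in_gb)   # prints 1000000 -- "one million gigabytes"
```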

Terry Sejnowski, Salk professor and co-senior author of a research paper on the subject published in eLife, called the discovery a neuroscience "bombshell." By shedding light on how hippocampal neurons can support high computational power using comparatively little energy, the research could ultimately usher in advances in neuro-inspired computing.

"We are on the threshold of a new era in computer architectures and cognitive computing that is based on the computing style of the brain," Sejnowski told Qmed. "These computers will be able to perceive images, understand speech, and use natural language. Even cell phones will become much smarter."

Sejnowski already sees significant advances happening in the design of computer chips inspired by neuroscience discoveries. He expects further energy savings when very-large-scale integration (VLSI) chips are built on principles discovered in neuroscience.

"As line widths on wires in VLSI chips get thinner, the noise increases and failures occur. The next generation of computer architectures will have to live with this noise. Many synapses in the brain have high levels of noise -- most of the time they fail completely to activate -- but overall performance remains at a high level," Sejnowski says. "We discovered that despite this noise, the strength of a synapse can be set with high precision. Because signals converge on a neuron from many thousands of other neurons, the noise cancels and the failures save energy."
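The convergence Sejnowski describes can be sketched with a toy simulation (the input count, release probability, and uniform weights below are illustrative assumptions, not figures from the study): each individual synapse fails to activate most of the time, yet the summed input across thousands of synapses stays close to its expected value.

```python
import random

random.seed(0)

def total_input(n_synapses, p_release, weight=1.0):
    """Sum of synaptic contributions; each synapse fires only probabilistically."""
    return sum(weight for _ in range(n_synapses) if random.random() < p_release)

# Hypothetical numbers: 5,000 converging inputs, each releasing only 20% of the time.
n, p = 5000, 0.2
expected = n * p          # 1,000 on average

trials = [total_input(n, p) for _ in range(100)]
mean = sum(trials) / len(trials)
spread = max(abs(t - expected) / expected for t in trials)

# Despite ~80% of synapses "failing" on any given trial, the aggregate
# signal deviates from its expected value by only a few percent.
print(f"expected {expected:.0f}, observed mean {mean:.1f}")
print(f"worst relative deviation across trials: {spread:.3f}")
```

The design point mirrors the quote: individual unreliability averages out when thousands of noisy inputs converge on one neuron.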

Scientists continue to make progress in the field of neural-inspired computer chips, and this latest study from Salk could provide a deeper understanding of the neural networks and synapses that drive these technologies. Recent research indicates that our memories and thoughts are the result of different patterns of electrical and chemical activity in the brain. A key to understanding that activity is the synapse.

While synapses largely remain a mystery, the team of scientists at Salk built a 3D reconstruction of rat hippocampus tissue -- the memory center of the brain -- and measured the differences between similar synapses using advanced microscopy and computational algorithms.

Eventually, the team was able to image the rat brain tissue and reconstruct its connectivity, shapes, volumes, and surface area at the nanometer scale. What they found was that synapses connecting the same pair of neurons were nearly identical in size, which allowed the team to use algorithmic models to measure how much information could potentially be stored in synaptic connections.

"The implications of what we found are far-reaching," Sejnowski told Salk news. "Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us."

The group's findings also suggest a plausible explanation for the human brain's remarkable efficiency. A waking adult brain generates only about 20 watts of continuous power -- roughly the output of a dim light bulb. Salk scientists believe their findings could help computer scientists develop advanced computer systems that combine high precision with remarkable efficiency -- technologies that employ "deep learning" techniques capable of speech recognition, language translation, and object recognition.

Sejnowski says that these tricks of the brain can point the way toward better computer designs. In the end, probabilistic transmission requires much less energy and turns out to be just as accurate for both computers and brains, he says.
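One way to picture that tradeoff is a toy comparison (the numbers are made up for illustration, not taken from the study): a deterministic scheme in which every synapse releases versus a probabilistic one in which each releases only 20% of the time, with weight scaled up so the expected total signal is the same. The probabilistic scheme triggers far fewer release events -- the energy proxy here -- while its converged signal lands within a few percent of the deterministic one.

```python
import random

random.seed(1)

# Hypothetical setup: 5,000 inputs; probabilistic release with p = 0.2,
# weight 1/p so the expected total matches the deterministic case.
n, p = 5000, 0.2

def probabilistic_trial():
    events = sum(1 for _ in range(n) if random.random() < p)
    return events, events * (1.0 / p)   # (energy proxy, total signal)

det_energy, det_signal = n, float(n)    # deterministic: every synapse releases
events, signal = probabilistic_trial()

# ~20% of the release events, yet the signal differs by only a few percent.
print(f"energy: {events} events vs {det_energy} ({events / det_energy:.0%})")
print(f"signal: {signal:.0f} vs {det_signal:.0f} "
      f"(relative error {abs(signal - det_signal) / det_signal:.3f})")
```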

"There will be even more energy savings when VLSI (Very Large Scale Integration) chips are built based on these principles," Sejnowski says -- a direction he and his colleagues hope to pursue as their research progresses.

Learn more about cutting-edge medical devices at MD&M West, February 9-11 at the Anaheim Convention Center in Anaheim, CA.

About the Author(s)

Kristopher Sturgis

Kristopher Sturgis is a freelance contributor to MD+DI.

