Bridging the neuron-to-network gap

Using a unique architecture, researchers are working to make computers more efficient and less specialized

Traditionally, researchers have taken two approaches to understanding the brain. Some poke and prod to see what neurons do and to map out their connectivity, watching their activity with instruments small and large, such as fMRI machines.


“The other approach towards understanding the brain is to emulate it in silicon,” said Shantanu Chakrabartty, the Clifford Murphy Professor in the Preston M. Green Department of Electrical & Systems Engineering in the McKelvey School of Engineering at Washington University in St. Louis. “Can you recreate some of the low-level dynamics or system level properties of the brain?”

Chakrabartty has recently been awarded $380,000 from the National Science Foundation to address a persistent problem when it comes to recreating these neuronal networks in silicon: energy efficiency.

A single action potential generated by a biological neuron uses more energy than a spike generated by a neuron implemented in silicon. Yet a population of neurons in the brain can seamlessly perform diverse recognition and learning tasks while consuming significantly less energy than its silicon counterpart, which can usually do only one thing well. This “neuron-to-network energy gap” is one focus of Chakrabartty’s project.

Dynamics of a voxel comprising 10,000 Growth Transform Network neurons after an artificial stimulus.

The project will also address the fact that biological brains are general-purpose while computers are specialty machines. “The state-of-the-art machine learning chips are good at doing one thing at a time,” Chakrabartty said, “but they can’t do anything else.” To solve this problem, Chakrabartty’s team is considering designing a silicon brain not as a collection of independent, interacting neurons, but as one giant dynamical system.

“It’s going to be a system with billions of neurons, spiking, communicating, inhibiting, exciting, oscillating in a wide variety of patterns,” he said. “The scientific challenge will be to ensure that the dynamics of this billion-variable system are stable and interpretable. Hopefully, we would then be able to control it to perform biologically realistic tasks.”
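As a loose illustration of what it means to treat a spiking network as one coupled dynamical system rather than a collection of independent units, the toy Python sketch below simulates a handful of leaky integrate-and-fire neurons whose excitatory and inhibitory couplings all enter a single shared state-update equation. It is a generic textbook model with invented parameter values, not the architecture or code being developed in this project.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps, dt = 8, 1000, 1e-3          # neurons, time steps, step size (s)
tau, v_th, v_reset = 20e-3, 1.0, 0.0  # membrane time constant, threshold, reset

# One coupling matrix drives the whole network: positive entries excite,
# negative entries inhibit. The full state vector v evolves together.
W = rng.normal(0.0, 0.5, size=(n, n))
np.fill_diagonal(W, 0.0)

v = np.zeros(n)                        # membrane potentials (shared state vector)
i_ext = rng.uniform(0.8, 1.6, size=n)  # constant external drive per neuron
spike_counts = np.zeros(n, dtype=int)

for t in range(steps):
    spikes = (v >= v_th).astype(float)    # which neurons fired this step
    spike_counts += spikes.astype(int)
    v = np.where(spikes > 0, v_reset, v)  # reset the neurons that fired
    # Leaky integration plus recurrent input from everyone else's spikes:
    dv = (-v + i_ext + W @ spikes) / tau
    v = v + dt * dv

print("spikes per neuron over 1 s:", spike_counts)
```

Even in this tiny example, the interesting behavior (oscillation, mutual inhibition, runaway excitation) comes from the coupling matrix rather than from any single neuron, which is why stability and interpretability become the central questions as the state vector grows to billions of variables.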

Chakrabartty believes an architecture called the growth transform neural network (GTNN), which his research group previously developed, could be the key to implementing this large-scale dynamical system. An open-source software implementation of the GTNN simulator is publicly available, and users can already use the tool to build dynamical networks comprising more than 10 million neurons, about 10 times the number of neurons in the brain of a honeybee.
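For readers curious about the name: a “growth transform” is a multiplicative fixed-point update, in the spirit of the Baum-Eagon inequality, that nudges a set of bounded variables so that a network-level objective improves at every step. The Python sketch below is a minimal, generic illustration of such an update on the probability simplex; the objective and all values are invented for illustration, and this is neither the neuron model nor the API of the group’s GTNN simulator.

```python
import numpy as np

# Toy objective: a polynomial with non-negative coefficients over the simplex,
# e.g. P(p) = sum_ij Q_ij * p_i * p_j with Q >= 0 (hypothetical couplings).
rng = np.random.default_rng(0)
n = 5
Q = rng.uniform(0.0, 1.0, size=(n, n))
Q = (Q + Q.T) / 2  # symmetric, non-negative couplings

def objective(p):
    return p @ Q @ p

def growth_transform(p):
    """One Baum-Eagon-style multiplicative update on the simplex.

    Each coordinate is rescaled by its partial derivative and the result is
    renormalized; for polynomials with non-negative coefficients this step
    never decreases the objective.
    """
    grad = 2 * Q @ p             # dP/dp_i
    p_new = p * grad
    return p_new / p_new.sum()   # renormalize to stay on the simplex

p = np.full(n, 1.0 / n)          # start at the uniform distribution
for step in range(50):
    p = growth_transform(p)

print("final p:", np.round(p, 3))
print("objective:", objective(p))
```

Roughly speaking, the GTNN framework uses updates of this family to drive the neurons’ membrane variables toward the optimum of a shared network-level energy function, which is what allows a whole population of spiking neurons to be analyzed as a single dynamical system.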

The simulation above can be accessed through GitHub.

The McKelvey School of Engineering at Washington University in St. Louis promotes independent inquiry and education with an emphasis on scientific excellence, innovation and collaboration without boundaries. McKelvey Engineering has top-ranked research and graduate programs across departments, particularly in biomedical engineering, environmental engineering and computing, and has one of the most selective undergraduate programs in the country. With 140 full-time faculty, 1,387 undergraduate students, 1,448 graduate students and 21,000 living alumni, we are working to solve some of society’s greatest challenges; to prepare students to become leaders and innovate throughout their careers; and to be a catalyst of economic development for the St. Louis region and beyond.