On-device memory consolidation using FN-synapses. Credit: Frontiers in Neuroscience (2023). DOI: 10.3389/fnins.2022.1050585

Artificial intelligence and machine learning have made tremendous progress in the past few years, including the recent launches of ChatGPT and art generators, but one capability still missing is an energy-efficient way to generate and store long- and short-term memories in a form factor comparable to the human brain. A team of researchers in the McKelvey School of Engineering at Washington University in St. Louis has developed an energy-efficient way to consolidate long-term memories on a tiny chip.

Shantanu Chakrabartty, the Clifford W. Murphy Professor in the Preston M. Green Department of Electrical & Systems Engineering, and members of his lab developed a relatively simple device that mimics the dynamics of the brain's synapses, the connections between neurons that allow signals to pass from one cell to another. The artificial synapses used in many modern AI systems are relatively simple, whereas biological synapses can potentially store complex memories thanks to an exquisite interplay between different chemical pathways.

Chakrabartty's group showed that their artificial synapse could also mimic some of these dynamics that can allow AI systems to continuously learn new tasks without forgetting how to perform old tasks. Results of the research were published Jan. 13 in Frontiers in Neuroscience.

To do this, Chakrabartty's team built a device that operates like two coupled reservoirs of electrons, with electrons flowing between the two chambers via a junction, or artificial synapse. To create that junction, they used quantum tunneling, a phenomenon that permits an electron to pass through an energy barrier it could never cross under classical physics. Specifically, they used Fowler-Nordheim (FN) quantum tunneling, in which electrons jump through a triangular barrier and, in the process, change the shape of the barrier. FN tunneling provides a much simpler and more energy-efficient junction than existing methods, whose dynamics are too complex to model efficiently on a computer.
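To make those dynamics concrete, here is a minimal numerical sketch of a floating-gate capacitor discharging through an FN junction. It uses the standard Fowler-Nordheim form I(V) ≈ k1·V²·exp(−k2/V); the constants k1, k2, C and V0 are illustrative placeholders, not values from the paper.

```python
# Minimal sketch (not the paper's model): a floating-gate capacitor
# discharging through a Fowler-Nordheim tunneling junction.

import math

K1 = 1e-9   # A/V^2, assumed FN pre-factor (illustrative)
K2 = 50.0   # V, assumed barrier constant (illustrative)
C  = 1e-12  # F, assumed floating-gate capacitance (illustrative)
V0 = 8.0    # V, assumed initial gate voltage (illustrative)

def fn_current(v):
    """Fowler-Nordheim tunneling current through a triangular barrier."""
    return K1 * v * v * math.exp(-K2 / v)

def v_exact(t):
    """Closed-form solution of C*dV/dt = -I(V):
    V(t) = K2 / ln(exp(K2/V0) + (K1*K2/C)*t)."""
    return K2 / math.log(math.exp(K2 / V0) + (K1 * K2 / C) * t)

# Forward-Euler integration as a sanity check against the closed form.
v, t, dt = V0, 0.0, 1e-4
for _ in range(100_000):          # integrate out to t = 10 s
    v -= dt * fn_current(v) / C
    t += dt

print(f"numerical V({t:.0f} s) = {v:.3f} V, exact = {v_exact(t):.3f} V")
```

In this sketch the gate voltage decays only logarithmically in time, changing quickly at first and then ever more slowly, which is the kind of long-lived, self-decelerating state change one would want from a memory element.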

"The beauty of this is that we can control this device up to a single electron because we precisely designed this quantum mechanical barrier," Chakrabartty said.

Chakrabartty and doctoral students Mustafizur Rahman and Subhankar Bose designed a prototype array of 128 of these hourglass-like devices on a chip less than a millimeter in size.

"Our work shows that the operation of the FN synapse is near-optimal in terms of the synaptic lifetime and specific consolidation properties," Chakrabartty said. "This artificial synapse device can solve or implement some of these continual learning tasks where the device doesn't forget what it has learned before. Now, it allows for long-term and short-term memory on the same device."

Chakrabartty said that because the device moves only a few electrons at a time, it consumes very little energy overall.

"Most of these computers used for machine learning tasks shuttle a lot of electrons from the battery, store it on a capacitor, then dump it out and don't recycle it," Chakrabartty said. "In our model, we fix the total amount of electrons beforehand and don't need to inject additional energy because the electrons flow out by the physics itself. By making sure that only a few flow at a time, we can make this device work for long periods of time."

The work is part of research Chakrabartty and his lab members are doing to make AI more sustainable. The energy required for current AI computations is growing exponentially, with the next generation of models expected to require close to 200 terajoules to train a single system. And these systems are not even close to reaching the capacity of the human brain, which has close to 1,000 trillion synapses.
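For scale, converting the 200-terajoule figure into more familiar units (the ~20 W figure for the brain's power draw is a commonly cited estimate, not from the article):

```python
# Unit conversion for the training-energy figure cited above.

TRAIN_J   = 200e12        # 200 terajoules
J_PER_KWH = 3.6e6         # joules per kilowatt-hour

print(f"200 TJ = {TRAIN_J / J_PER_KWH:.1e} kWh")    # ~5.6e7 kWh

BRAIN_W = 20.0            # W, commonly cited brain power draw (assumption)
years = TRAIN_J / BRAIN_W / (3600 * 24 * 365)
print(f"= a ~20 W brain running for ~{years:,.0f} years")
```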

"Right now, we are not sure about training systems with even half a trillion parameters, and current approaches are not energy-sustainable," he said. "If we stay on the trajectory that we are on, either something new has to happen to provide enough energy or we have to figure out how to train these large models using these energy-efficient, dynamic-memory devices."

More information: Mustafizur Rahman et al, On-device synaptic memory consolidation using Fowler-Nordheim quantum-tunneling, Frontiers in Neuroscience (2023). DOI: 10.3389/fnins.2022.1050585
