This research summary article is based on the paper 'Phase-change memtransistive synapses for mixed-plasticity neural computations'.
The human brain is exceptionally good at remembering and learning new information. This capacity is attributed to the way data is stored and processed by its building blocks, synapses and neurons. Artificial intelligence uses neural network elements that mimic the biophysical characteristics of these components.
AI requires extensive training to distinguish the many properties of dynamically changing objects. Even so, a variety of activities remain simple for humans yet demand a great deal of computational power from AI. Examples include sensory perception, motor learning, and iterative mathematical problem solving with continuous and sequential input streams.
Researchers at IBM’s Zurich lab have argued that such tasks could be handled better with improved AI hardware. Their goal is to use the well-established phase-change memory (PCM) technology to develop a new type of artificial synapse: a PCM memtransistive synapse, which combines memristors (non-volatile electronic memory elements) and transistors into a single low-power device. The result is a non-von Neumann, in-memory computing architecture that supports powerful cognitive frameworks for machine-learning applications, such as short-term spike-timing-dependent plasticity and stochastic Hopfield neural networks.
The main goal behind today’s AI hardware is to produce smaller synapses, or synaptic junctions, so that more of them can fit in a given space. These devices are designed to mimic the long-term plasticity of biological synapses: synaptic weights remain constant over time, changing only during training updates.
For scalable AI processors, current hardware that can represent these complex plasticity rules and the dynamics of biological synapses relies on elaborate transistor circuits, which are bulky and energy-inefficient. Memristive devices, on the other hand, are more efficient, but they lack the properties required to capture the various synaptic processes.
The researchers set out to build a memristive synapse that could express various nanoscale plasticity rules, such as long- and short-term plasticity and their combinations, using commercial PCM. To do so, they combined non-volatility (from amorphous-crystalline phase transitions) and volatility (from electrostatic changes) in PCM devices.
Although phase-change materials have been studied separately for memory and transistor applications, they had never been combined for neuromorphic computing. The team’s research demonstrates how the devices’ non-volatility enables long-term plasticity while their volatility enables short-term plasticity. Combining the two permits additional mixed-plasticity computations, similar to those in the mammalian brain.
To verify device functionality, the researchers used an adapted version of a sequential learning algorithm designed to let state-of-the-art networks learn highly dynamic parameters. They extended this idea to show how their devices emulate different biological processes for meaningful computations. They also demonstrated how homeostatic processes in the brain can inspire efficient artificial hardware for complex combinatorial optimization tasks. This was accomplished using stochastic Hopfield neural networks, in which noise processes at the synaptic junction give computational algorithms an efficiency advantage.
The team’s findings are more exploratory than system-level demonstrations. Although the researchers intend to develop the method further, they believe the current proof-of-concept results already hold substantial scientific interest for the broader field of neuromorphic engineering, both in computer science and in understanding the brain through more faithful emulations. The devices are simple to build and operate and are based on well-documented technology. The main challenge ahead is large-scale implementation: linking all the computing primitives and other hardware elements together. The current results demonstrate the value of mixed-plasticity neural computations in neuromorphic engineering, which could not only make tasks such as visual cognition more human-like but also reduce the cost of expensive training methods.