A team of researchers at the Massachusetts Institute of Technology (MIT) has been working to push the speed limits of a previously developed artificial analog synapse that is cheaper to build and more energy-efficient, promising faster computation.
The multidisciplinary team used programmable resistors, the central building blocks of analog deep learning, just as transistors are the core elements of digital processors.
The resistors are built into repeating arrays to create a complex, layered network of artificial “neurons” and “synapses” that perform calculations much like a digital neural network. Such a network can then be trained to realize complex AI tasks such as image recognition and natural language processing.
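The idea of an array of resistors performing a neural network's calculations can be sketched with a toy model. In a crossbar array, each resistor's conductance encodes a weight, input voltages encode activations, and by Ohm's and Kirchhoff's laws the output currents sum to a matrix-vector product in a single analog step. The sizes and values below are illustrative assumptions, not figures from the study:

```python
import numpy as np

# Toy sketch of how a resistor crossbar computes a layer of a neural
# network: conductances G (programmable resistors) hold the weights,
# input voltages V hold the activations, and the currents collected on
# each output line are I = G @ V, computed in parallel by the physics.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(3, 4))  # conductances in siemens (illustrative)
V = rng.uniform(0.0, 0.5, size=4)       # input voltages in volts (illustrative)
I = G @ V                               # output currents, one per row line
print(I)
```

A digital processor would compute this product with a loop of multiply-accumulate operations; the appeal of the analog approach is that the entire product is read out at once as currents.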
The researchers used a practical inorganic material in the manufacturing process that allows their devices to run 1 million times faster than previous versions. According to the study, that is also about 1 million times faster than the synapses in the human brain.
In addition, this inorganic material makes the resistor extremely energy-efficient. Unlike the materials used in the previous version of their device, it is compatible with silicon fabrication techniques, which could pave the way for integration into commercial computer hardware for deep learning applications.
“With that important insight and the very powerful nanofabrication techniques that we have at MIT.nano, we have been able to put these pieces together and show that these devices are intrinsically very fast and operate at reasonable voltages. This work has really taken these devices to a point where they now look really promising for future applications,” said senior author Jesús A. del Alamo, the Donner Professor in MIT’s Department of Electrical Engineering and Computer Science (EECS).
“The device’s mechanism of action is electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity. Since we are working with very thin devices, we could accelerate the movement of this ion by using a strong electric field, pushing these ionic devices into the nanosecond operating regime,” explained senior author Bilge Yildiz, the Breene M. Kerr Professor in the departments of Nuclear Science and Engineering and Materials Science and Engineering.
“The action potential in biological cells rises and falls on a time scale of milliseconds, because the voltage difference of about 0.1 volts is limited by the stability of water,” said senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering. “Here, we apply up to 10 volts to a special nanoscale-thickness solid glass film that conducts protons without permanently damaging it. And the stronger the field, the faster the ionic devices,” he added.
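A quick back-of-envelope calculation shows why a nanoscale film matters here: the electric field is voltage divided by thickness, so a modest voltage across a very thin layer yields an enormous field. The 10 nm thickness below is an assumed illustrative value, not a figure from the study:

```python
# Illustrative field-strength estimate, assuming a 10 nm film thickness.
voltage = 10.0        # volts, as quoted in the article
thickness = 10e-9     # meters (10 nm, an assumption for illustration)
field = voltage / thickness  # V/m
print(f"{field:.1e} V/m")    # on the order of 1e9 V/m
```

Fields of this magnitude are what drive the protons through the film fast enough to reach nanosecond switching.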
These programmable resistors considerably increase the speed at which a neural network is trained, while substantially reducing the cost and energy required for training.
The latest development could help scientists develop deep learning models much faster, which could then be applied to tasks such as self-driving cars, fraud detection and medical image analysis.
“Once you have an analog processor, you will no longer be training the networks everyone else is working on. You will be training networks of unprecedented complexity that no one else can afford, and as a result, outperform them all. In other words, this is not a faster car, this is a spacecraft,” adds lead author and MIT postdoc Murat Onen.
The findings of the study have been published in the journal Science.