Networks of nanoscale resistors that function similarly to nerve cells in the body could outperform digital machine learning.

Node artwork in a network. Kiyoshi Takahase/Alamy

A resistor that functions in the same way as nerve cells in the body could be used to construct neural networks for machine learning.

Many large machine learning models rely on ever-increasing amounts of processing power to achieve their results, which consumes a great deal of energy and generates substantial heat.

One proposed solution is analog machine learning, which works like a brain by using neuron-like electronic devices as the components of the model.

However, these devices have not yet been fast, small, or efficient enough to outperform digital machine learning.

Murat Onen and colleagues at the Massachusetts Institute of Technology developed a nanoscale resistor that transmits protons from one terminal to another.

This works similarly to a synapse, a connection between two neurons in which ions flow in only one direction to transmit information.

However, these “artificial synapses” are 1000 times smaller and 10,000 times faster than biological synapses.

Machine learning models could run on networks of these nanoresistors, just as the human brain learns by remodeling the connections between millions of interconnected neurons.
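The idea of running a model on a network of resistors can be sketched in code. The following is an illustrative example, not a description of the MIT device: in an idealized resistor crossbar, each programmable conductance plays the role of a synaptic weight, and applying input voltages to the rows while summing currents down the columns computes a matrix-vector product in a single analog step, via Ohm's and Kirchhoff's laws (I_j = Σ_i V_i · G[i][j]). All names and values here are hypothetical.

```python
def crossbar_output(voltages, conductances):
    """Column currents of an ideal resistor crossbar.

    Each column current is the Kirchhoff sum of the currents
    flowing through that column's resistors: I_j = sum_i V_i * G[i][j].
    """
    n_cols = len(conductances[0])
    return [sum(v * row[j] for v, row in zip(voltages, conductances))
            for j in range(n_cols)]

# Hypothetical 3x2 crossbar: conductances in siemens, inputs in volts.
G = [[0.1, 0.2],
     [0.3, 0.1],
     [0.0, 0.4]]
V = [1.0, 0.5, 2.0]

print(crossbar_output(V, G))  # column currents in amperes
```

Training such a network would mean adjusting the conductances, which is exactly the role the proton-transporting resistors are designed to fill.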

“We’re doing things that biology can’t,” says Onen, whose device is a million times faster than previous proton-transporting devices.

The resistor employs powerful electric fields to transport protons at extremely high speeds without damaging or breaking the resistor itself, which was a problem with previous solid-state proton resistors.

Systems containing millions of resistors will be required for practical analog machine learning. Onen admits that this is a technical challenge, but the fact that the materials are all silicon-compatible should make integration with existing computing architectures easier.

“This looks really impressive for what they achieve in terms of technology – very high speed, low energy, and efficiency,” says Sergey Saveliev of Loughborough University in the United Kingdom. However, the device uses three terminals rather than the two found in a biological synapse, which may make running certain neural networks more difficult, he adds.

Pavel Borisov, also of Loughborough University, agrees that the technology is impressive, but he points out that the protons come from hydrogen gas, which could be difficult to contain safely within the device as the technology scales up.

Reference: Science, DOI: 10.1126/science.abp8064
