Artificial Brain with a Single Neuron

Researchers at Technische Universität Berlin (TU Berlin) have succeeded in simulating a neural network of thousands of nerve cells on a computer with only a single neuron programmed into the software code.


The neuron is activated and read out with a time delay and can thus take on the roles of all virtual neurons within fractions of a second. This creates a completely new class of neural networks distributed through time rather than space.

In the future, this approach could enable entirely new ways of integrating artificial neurons directly into hardware, for example as optoelectronic components. AI hardware built on this method would also require less power and thus be more climate-friendly. These results were recently published as an “Editors’ Highlight” in Nature Communications.

About 86 billion nerve cells in our brains ensure that we can perform extraordinary mental feats compared to the rest of the animal kingdom. These neurons are connected by synapses and communicate with each other in a vast network via electric and chemical signal transmission.

For decades researchers have been developing artificial neural networks that can “learn” like their human counterparts by establishing connections between individual neurons to varying degrees. Which neurons communicate especially well with each other depends on the respective task the neural network was trained for. Following its training, the network is able, for instance, to recognize objects, categorize images, or independently write texts.

One training run as harmful as driving 700,000 kilometers by car

“Our research addresses two limitations current neural networks face,” says PD Dr. Serhiy Yanchuk, head of the Applied Dynamical Systems Research Area at TU Berlin. The first is the high energy consumption of neural networks simulated by supercomputers.

According to a study from the University of Copenhagen, a single training run of one of the best AI programs for language production consumes electricity with a carbon footprint equivalent to driving 700,000 kilometers by car [1]. If such an AI program is then used in industry after training – which may happen very frequently – the total power consumption can be far higher [2].

“On the other hand, there are also neural networks where the neurons are built as real physical systems,” explains Yanchuk. “These can be realized as purely electronic semiconductor devices or with the aid of optical technologies based on lasers. And, of course, there are limits to the number of these hardware neurons and the connections between them.”

While it is already possible to program billions of neurons on supercomputers, the newest hardware implementations have only achieved a few thousand artificial nerve cells thus far.

A single guest at the dinner table

With the approach developed by the TU Berlin researchers, the solution to both of these challenges could lie in a single neuron that assumes the role of every neuron in the network via a time-delayed input/output feedback loop.

This would be akin to a single guest simulating the conversation at a large dinner table by switching seats rapidly and speaking each part. Special delay circuits record the state of the neuron, modulate it appropriately and send back the resulting delayed signal.
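To make the "dinner table" picture concrete, a small Python sketch of this time multiplexing might look as follows. This is purely illustrative and not the researchers' code; the number of virtual neurons, the feedback gains, and the choice of tanh as the nonlinearity are all invented for the example.

```python
import numpy as np

# Illustrative sketch only: one nonlinear "neuron" is evaluated repeatedly
# over time. Its earlier output is recorded, modulated, and fed back with a
# delay, so each point in time plays the part of one virtual neuron.

rng = np.random.default_rng(0)

num_virtual_neurons = 50                                      # "seats at the dinner table"
feedback_gain = rng.uniform(-0.5, 0.5, num_virtual_neurons)   # assumed modulation in the delay circuit
external_input = rng.normal(size=num_virtual_neurons)         # signal to be processed

state = np.zeros(num_virtual_neurons)
for t in range(num_virtual_neurons):
    delayed_state = state[t - 1] if t > 0 else 0.0   # output recorded one step earlier
    # the single real neuron: a fixed nonlinear function, here tanh
    state[t] = np.tanh(external_input[t] + feedback_gain[t] * delayed_state)

print(state[:5])   # first five virtual neurons, all produced by the same neuron
```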

“We have now demonstrated with the computer that this is in principle possible. Laser-based circuits would be particularly suitable for implementation in hardware, because they are so fast that the time delays are particularly short,” explains Florian Stelzer, lead author of the study.

In this concept, the spatial separation between two neurons in the network is replaced by a time delay. “The ability to adjust the strength of individual neural connections, which is essential for training, is achieved here by further manipulating the time delays,” continues Stelzer.
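As a rough illustration of how a connection across space becomes a connection across time, the following hypothetical sketch (not the method from the paper; the delays and modulation values are made up) lets the virtual neuron at step t receive the states produced at earlier steps t - d for several delays d, with the modulation of each delayed signal playing the role of a connection weight.

```python
import numpy as np

# Hypothetical sketch: connections "across space" become connections "across
# time". The virtual neuron at step t receives the states from steps t - d
# for a few assumed delays d; the time-varying modulation of each delayed
# signal takes over the role of a trainable connection weight.

rng = np.random.default_rng(1)

num_steps = 60
delays = [1, 3, 7]                                    # assumed delay-loop lengths
modulations = {d: rng.normal(0.0, 0.3, num_steps)     # "weights", one value per step
               for d in delays}
external_input = rng.normal(size=num_steps)

state = np.zeros(num_steps)
for t in range(num_steps):
    total_input = external_input[t]
    for d in delays:
        if t - d >= 0:                                # only the past can be fed back
            total_input += modulations[d][t] * state[t - d]
    state[t] = np.tanh(total_input)                   # again, a single neuron
```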

Less effort and lower energy consumption

It is clear that hardware implementation will benefit if only one neuron really needs to be manufactured. However, based on other results in the field of so-called time-delay reservoir computing, the researchers also suspect that their method could reduce the energy consumption of artificial neural networks.

Here, an input signal is fed into a physical system known as the reservoir. Its output signal is then fed back into the system as a new input via a feedback loop. Calculations performed in this way can achieve good performance with lower energy consumption than conventional software solutions.
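A minimal sketch of such a time-delay reservoir computing scheme is shown below. It assumes a simple tanh node, a fixed random input mask, and a readout trained by least squares; memory across input samples is omitted for brevity, and all names and the toy task are invented rather than taken from the study.

```python
import numpy as np

# Minimal, illustrative time-delay reservoir computing sketch: a single fixed
# nonlinear node with delayed feedback acts as the reservoir; only a linear
# readout layer is trained.

rng = np.random.default_rng(2)

num_nodes = 40                               # virtual nodes per input sample
mask = rng.uniform(-1.0, 1.0, num_nodes)     # fixed random input mask
feedback = 0.8                               # strength of the delayed feedback

def reservoir_states(u):
    """Expand one scalar input u into num_nodes virtual-node states."""
    x = np.zeros(num_nodes)
    for i in range(num_nodes):
        delayed = x[i - 1] if i > 0 else 0.0
        x[i] = np.tanh(mask[i] * u + feedback * delayed)
    return x

# Toy task: learn y = sin(u) from scalar inputs u.
inputs = rng.uniform(-2.0, 2.0, 300)
targets = np.sin(inputs)
X = np.array([reservoir_states(u) for u in inputs])

# Train only the readout weights (least squares); the reservoir itself stays fixed.
readout, *_ = np.linalg.lstsq(X, targets, rcond=None)
predictions = X @ readout
print("training error:", np.mean((predictions - targets) ** 2))
```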

Task: recognizing items of clothing

“Our system functions as a sort of expansion to reservoir computing with an artificial nerve cell at the core,” says Stelzer. “This artificial nerve cell is nothing more than a mathematical, non-linear function that we can precisely define.”

What is new is that it can learn thanks to the adjustable time delays. “Additionally, we are able for the first time to simulate the different layers of a deep neural network.” The researchers refer to this as a “Folded-in-time Deep Neural Network” (Fit-DNN). In computer simulations, their approach has already mastered a task typical of neural networks: it reconstructed images of clothing items that had been rendered unrecognizable by overlaid noise.
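A compact, purely illustrative sketch of the folding idea follows. The layer sizes and explicit weight matrices are assumptions made for the example; in the hardware picture described above, those matrices would arise from the modulated delay loops rather than from separate wires.

```python
import numpy as np

# Illustrative sketch of a network "folded in time": each layer of a deep
# network corresponds to one further pass of the same single neuron over the
# delay line; the previous pass arrives as a delayed, weight-modulated signal.

rng = np.random.default_rng(3)

nodes_per_layer = 30
num_layers = 3
# One weight matrix per layer transition; physically these would be realized
# as time-varying modulations of the delayed signal.
layer_weights = [rng.normal(0.0, 0.3, (nodes_per_layer, nodes_per_layer))
                 for _ in range(num_layers)]

def folded_forward(x):
    """Apply the same single neuron (tanh) once per layer, per virtual node."""
    signal = x
    for W in layer_weights:
        signal = np.tanh(W @ signal)
    return signal

noisy_input = rng.normal(size=nodes_per_layer)   # stand-in for a noisy image patch
print(folded_forward(noisy_input)[:5])
```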

New insights also possible

The researchers attribute their success to the interdisciplinary make-up of their team of mathematicians, physicists, and computer scientists. The team included researchers from the University of Tartu in Estonia as well as the University of the Balearic Islands in Palma, Majorca.

In addition to clear practical improvements to AI systems, further investigation of the folded-in-time neural network could offer additional insights. If the time delay between two neurons “located” directly next to each other in time were shortened further, it would theoretically be possible to create a limitless number of neurons, explains Serhiy Yanchuk.

“This is a completely new feature of folded-in-time networks and one which achieves the transition from discrete, individual nodes in the network to a sort of continuum. What this could lead to is still open.” Several international physics and computer science working groups have already announced their plans to implement the new neural network as quickly as possible, including as hardware circuits.

Stelzer et al., “Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops,” Nature Communications.

Sources

[1] “Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models”:  https://arxiv.org/abs/2007.03051

[2] “Accounting for the Neglected Dimensions of AI Progress”: https://arxiv.org/abs/1806.00610

Source: TU Berlin