
Inspired by microscopic nematodes, MIT introduces a "liquid" neural network whose parameters vary over time, delivering accurate predictions at a small size and saving substantial computational cost

Author: New Zhiyuan

Recently, a team at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) introduced a "liquid" neural network that keeps learning beyond the training process: it continually updates its model equations as new data arrives, adapting well to the variability of real life. Interestingly, the inspiration came from observing the neurons of nematodes under a microscope.

"Liquid" neural networks?

What exactly is it?

There's a good chance this is the first time you've heard the term, and you may be curious what this eye-catching neural network is all about.

Recently, researchers at the Massachusetts Institute of Technology developed a neural network that continues to learn on the job, not just during the training phase.

These flexible algorithms, dubbed "liquid" networks, continually change their underlying equations to accommodate new data inputs.


Some data streams, such as those involved in medical diagnosis and autonomous driving, change over time. Because these new networks adapt as the data shifts, they could aid decision-making in exactly such settings.

Ramin Hasani, the study's lead author, said:

"This is a big step toward processing any form of time-series data in the future, such as robot control, natural language processing, and video processing, and it has great potential."

The study will be presented at the AAAI Conference on Artificial Intelligence in February.

In addition to Hasani, a postdoctoral fellow at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), MIT co-authors include Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science, and doctoral student Alexander Amini.

Other co-authors include Mathias Lechner of the Institute of Science and Technology Austria and Radu Grosu of TU Wien (Vienna University of Technology).


Parameters vary over time, and microscopic nematodes are the source of inspiration

Hasani said that time-series data is ubiquitous and a vital reference for understanding the world.

"The real world is made up of sequences. That's true even of our perception: you're not perceiving images, you're perceiving sequences of images."

"So, in effect, time-series data creates our reality."

He noted that video processing, financial data, and medical diagnostic applications are all examples of time series that are critical to society, and that these evolving data streams can be unpredictable. Analyzing such data in real time and using it to predict future behavior can drive emerging technologies such as self-driving cars.

So Hasani set out to create an algorithm suited to this type of task: a neural network that could adapt to the variability of the real world.

As is well known, a neural network is an algorithm that learns patterns by analyzing training data, and it is often said to loosely mimic how the brain processes information.

Hasani drew his inspiration directly from observing the microscopic nematode C. elegans:

"Its nervous system has only 302 neurons, but it can produce unexpectedly complex dynamics."


By closely studying how the nematode's neurons activate and communicate with one another through electrical impulses, Hasani encoded those dynamics into the neural network he created.

In the equations he used to build the network, he allowed the parameters to vary over time according to the results of a set of differential equations.
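The idea of a neuron whose state evolves under a differential equation that itself depends on the current input can be sketched in a few lines. The following is a minimal, hypothetical illustration only; class and parameter names are invented, not the authors' actual implementation, and a simple explicit Euler step stands in for a proper ODE solver:

```python
import numpy as np

rng = np.random.default_rng(0)

class LiquidCell:
    """Toy 'liquid' neuron layer: the state x follows an ODE whose
    effective time constant changes with the input, so behavior keeps
    adapting as new data streams in (hypothetical sketch)."""

    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(0, 0.1, (n_hidden, n_in))      # input weights
        self.U = rng.normal(0, 0.1, (n_hidden, n_hidden))  # recurrent weights
        self.b = np.zeros(n_hidden)                        # bias
        self.tau = np.ones(n_hidden)                       # base time constants
        self.A = np.ones(n_hidden)                         # equilibrium targets

    def step(self, x, u, dt=0.1):
        # Nonlinear gate f depends on both the state and the input
        f = np.tanh(self.W @ u + self.U @ x + self.b)
        # ODE: dx/dt = -x/tau + f * (A - x)
        # The f * x coupling makes the effective time constant input-dependent
        dxdt = -x / self.tau + f * (self.A - x)
        return x + dt * dxdt  # one explicit Euler integration step

# Usage: integrate the cell over a short toy input stream
cell = LiquidCell(n_in=3, n_hidden=5)
x = np.zeros(5)
for t in range(20):
    u = np.sin(0.3 * t) * np.ones(3)  # toy time-series input
    x = cell.step(x, u)
print(x.shape)  # (5,)
```

Because the dynamics are continuous in time, the same cell can in principle keep integrating new inputs after training, which is the adaptability described above.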

This flexibility is the key: most neural networks' behavior is fixed after the training phase, which means they are poor at adapting to shifts in the incoming data stream.

Hasani said the fluidity of his "liquid" network makes it more resilient to unexpected or noisy data, such as heavy rain obscuring the view of a camera on a self-driving car.


"So it's more dynamic," he said.

He added that the network's flexibility has another advantage: "It's easier to understand."

Hasani says his "liquid" network sidesteps the inscrutability common to other neural networks:

"Just by changing the representation of a neuron with differential equations, you can explore degrees of complexity that you could never reach otherwise."

Thanks to the network's small number of highly expressive neurons, it becomes easier to peer into how it makes decisions and to diagnose why it produced a given classification.

"This model has richer expressive power," Hasani said, a property that can help engineers better understand and improve the performance of liquid networks.

Accurate predictions at a small size, saving substantial computational cost

Liquid networks performed well across a range of tests: in applications spanning atmospheric chemistry to traffic patterns, the model beat other state-of-the-art time-series algorithms by several percentage points at accurately predicting future values in datasets.


Hasani said: "In many applications, we see reliably high performance."

In addition, because the network is small, it completes these tasks without incurring high computational cost.

"Everyone talks about scaling up their networks," Hasani said. "What we want is to scale down, to have fewer but richer nodes."

The research was funded in part by Boeing, the National Science Foundation, the Austrian Science Fund, and Electronic Components and Systems for European Leadership (ECSEL).

Hasani plans to keep improving the system to ready it for industrial applications:

"Inspired by natural phenomena, we have a more expressive neural network, but that's just the beginning."

"Next, we face an obvious challenge: how do we develop it further? We believe this kind of network could become a key element of future intelligent systems."

Reference Links:

https://news.mit.edu/2021/machine-learning-adapts-0128