Technology

AI models can't learn as they go along like humans do

After their initial training phase, AI algorithms can't update and learn from new data, meaning tech companies have to keep training new models from scratch

By Alex Wilkins

21 August 2024

AI programs quickly lose the ability to learn anything new

Jiefeng Jiang/iStockphoto/Getty Images

The algorithms that underpin artificial intelligence systems like ChatGPT can't learn as they go along, forcing tech companies to spend billions of dollars to train new models from scratch. While this has been a concern in the industry for some time, a new study suggests there is an inherent problem with the way models are designed, but there may be a way to solve it.

Most AIs today are so-called neural networks inspired by how brains work, with processing units known as artificial neurons. They typically go through distinct phases in their development. First, the AI is trained, which sees its artificial neurons fine-tuned by an algorithm to better reflect a given dataset. Then, the AI can be used to respond to new data, such as text inputs like those put into ChatGPT. However, once the model's neurons have been set in the training phase, they can't update and learn from new data.

This means that most large AI models must be retrained if new data becomes available, which can be prohibitively expensive, especially when those new datasets consist of large portions of the entire internet.

Researchers have wondered whether these models can incorporate new knowledge after the initial training, which would reduce costs, but it has been unclear whether they are capable of it.

Now, Shibhansh Dohare at the University of Alberta in Canada and his colleagues have tested whether the most common AI models can be adapted to continually learn. The team found that they quickly lose the ability to learn anything new, with vast numbers of artificial neurons getting stuck on a value of zero after they are exposed to new data.

"If you think of it like your brain, then it'll be like 90 per cent of the neurons are dead," says Dohare. "There's just not enough left for you to learn."

Dohare and his team first trained AI systems on the ImageNet database, which consists of 14 million labelled images of simple objects like houses or cats. But rather than train the AI once and then test it by trying to distinguish between two images multiple times, as is standard, they retrained the model after each pair of images.

They tested a range of different learning algorithms in this way and found that after a couple of thousand retraining cycles, the networks appeared unable to learn and performed poorly, with many neurons appearing "dead", or with a value of zero.
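The "dead" neurons the study describes can be pictured with a standard ReLU layer: a unit that outputs zero for every input in a batch passes no gradient and so stops learning. The sketch below is an illustration only, not the study's actual measurement, and the helper name `dead_unit_fraction` is hypothetical.

```python
import numpy as np

def dead_unit_fraction(weights, biases, inputs):
    """Fraction of hidden ReLU units that output zero for every input.

    A unit that stays at zero across the whole batch receives no
    gradient signal and can no longer learn.
    """
    activations = np.maximum(0.0, inputs @ weights + biases)  # ReLU layer
    dead = np.all(activations == 0.0, axis=0)                 # dead per unit
    return dead.mean()

# Toy example: two units, the second forced dead by a large negative bias.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))
w = rng.normal(size=(3, 2))
b = np.array([0.0, -100.0])   # second unit can never fire
print(dead_unit_fraction(w, b, x))  # -> 0.5
```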

The team also trained AIs to simulate an ant learning to walk through reinforcement learning, a common method where an AI is taught what success looks like and figures out the rules using trial and error. When they tried to adapt this technique to enable continual learning by retraining the algorithm after walking on different surfaces, they found that it also led to a significant inability to learn.

This problem seems inherent to the way these systems learn, says Dohare, but there is a possible way around it. The researchers developed an algorithm that randomly turns some neurons on after each training round, and it appeared to reduce the poor performance. "If a [neuron] has died, then we just revive it," says Dohare. "Now it's able to learn again."
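The revival idea can be sketched as reinitialising units that have gone dead. This is a simplified illustration, not the published algorithm (which selects which units to reinitialise more carefully); the helper `revive_dead_units` and its parameters are assumptions for the example.

```python
import numpy as np

def revive_dead_units(w_in, b, w_out, inputs, rng):
    """Reinitialise ReLU units that output zero for every input in the batch.

    A rough sketch of the 'revive dead neurons' idea: give a dead unit
    fresh incoming weights so it can fire and learn again.
    """
    acts = np.maximum(0.0, inputs @ w_in + b)
    dead = np.all(acts == 0.0, axis=0)
    n = int(dead.sum())
    if n:
        # Fresh small random incoming weights and a zero bias let the unit
        # fire again; zeroed outgoing weights mean the revival does not
        # disturb what the network currently computes.
        w_in[:, dead] = rng.normal(scale=0.1, size=(w_in.shape[0], n))
        b[dead] = 0.0
        w_out[dead, :] = 0.0
    return dead  # which units were revived
```

Zeroing the outgoing weights is a common trick when inserting or resetting units: the network's outputs are unchanged at the moment of revival, and the revived unit is then free to grow back into a useful role during subsequent training.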

The algorithm looks promising, but it will need to be tested on much larger systems before we can be sure that it will help, says one researcher at the University of Oxford.

"A solution to continual learning is literally a billion-dollar question," he says. "A real, comprehensive solution that would allow you to continuously update a model would reduce the cost of training these models significantly."

Journal reference: Nature
