Continual Learning Could Cure AI’s “Dementia”


Ohio State University researchers showcased their progress on “continual learning” at the 40th annual International Conference on Machine Learning in Honolulu.

  • It’s a process enabling AI to continuously learn new skills without forgetting previously acquired knowledge.
  • “Catastrophic forgetting” in AI systems is when artificial neural networks lose information from earlier training as they take on new tasks.

Ohio State University researchers took the stage at the 40th annual International Conference on Machine Learning (ICML) in Honolulu to discuss their work on creating AI that can mimic human learning.

The researchers showcased their progress on “continual learning,” a process that allows a computer to continuously learn new skills without forgetting what it was previously taught, much as humans do.

The research team emphasized the importance of overcoming AI’s “catastrophic forgetting.”

This phenomenon occurs when artificial neural networks forget information from earlier training as they take on new tasks. The stakes are especially high for applications such as self-driving cars, where retaining previous knowledge is crucial for safety.
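
For readers who want to see the failure mode concretely, below is a minimal sketch in Python. It is a toy invented for illustration, not the Ohio State team’s setup: a single linear classifier learns a synthetic Task A, is then trained only on a Task B with a different rule, and its Task A accuracy is checked before and after.

```python
# Minimal, hedged sketch of "catastrophic forgetting" (a toy for illustration,
# not the researchers' setup). One linear classifier is trained on Task A,
# then only on Task B, and its Task A accuracy is checked before and after.
import numpy as np

rng = np.random.default_rng(0)

def make_task(direction, n=2000):
    """Synthetic binary task: label = 1 when the input lies on `direction`'s side."""
    X = rng.normal(size=(n, 2))
    return X, (X @ direction > 0).astype(float)

def train(w, X, y, lr=0.1, steps=200):
    """Full-batch gradient descent on the logistic-regression log loss."""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)     # gradient step on the log loss
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0).astype(float) == y).mean())

Xa, ya = make_task(np.array([1.0, 0.0]))     # Task A: boundary along one axis
Xb, yb = make_task(np.array([0.0, 1.0]))     # Task B: boundary along the other

w = train(np.zeros(2), Xa, ya)
print("Task A accuracy after learning A:", accuracy(w, Xa, ya))   # near 1.0

w = train(w, Xb, yb)                         # keep training, on Task B only
print("Task A accuracy after learning B:", accuracy(w, Xa, ya))   # much lower
```

In this toy, the second training phase overwrites the weights that encoded Task A, so the first print typically shows near-perfect accuracy and the second a clear drop, which is exactly the behavior continual-learning methods aim to prevent.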

Professor Ness Shroff, an Ohio Eminent Scholar and leader of the study, stressed that AI systems must retain earlier lessons as they encounter new scenarios. “As automated driving applications or other robotic systems are taught new things, it’s important that they don’t forget the lessons they’ve already learned for our safety and theirs,” he explained. He added that the team’s research delves “into the complexities of continuous learning in these artificial neural networks,” and that “what we found are insights that begin to bridge the gap between how a machine learns and how a human learns.”

The team discovered that artificial neural networks retain information better when trained on diverse, dissimilar tasks than on tasks that share similar features. Much as humans recall contrasting facts about different scenarios more easily than closely related ones, AI networks benefit from variety in their training tasks, which enables them to absorb new information more effectively.
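
One simple way to see the value of diverse training data in miniature is to compare purely sequential training with training in which earlier data stays in the mix, a bare-bones form of “experience replay.” Replay is a standard continual-learning idea from the broader literature, not necessarily the approach analyzed in this study, and the sketch below is the same invented toy as above rather than the team’s method.

```python
# Hedged sketch (not the researchers' method): the same toy as above, but now
# comparing purely sequential training with "experience replay", a standard
# continual-learning idea in which earlier data stays in the training mix.
import numpy as np

rng = np.random.default_rng(0)

def make_task(direction, n=2000):
    X = rng.normal(size=(n, 2))              # 2-D inputs
    return X, (X @ direction > 0).astype(float)

def train(w, X, y, lr=0.1, steps=200):
    for _ in range(steps):                   # full-batch logistic-regression GD
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0).astype(float) == y).mean())

Xa, ya = make_task(np.array([1.0, 0.0]))     # Task A
Xb, yb = make_task(np.array([0.0, 1.0]))     # Task B, a very different rule

w0 = train(np.zeros(2), Xa, ya)              # both regimes start after Task A

w_seq = train(w0.copy(), Xb, yb)             # sequential: Task B data only
X_mix = np.vstack([Xa, Xb])                  # replay: keep Task A data in the mix
y_mix = np.concatenate([ya, yb])
w_rep = train(w0.copy(), X_mix, y_mix)

# The two toy tasks partly conflict, so no single linear model aces both;
# the point is the relative retention of Task A, which is higher with replay.
print("Task A accuracy, sequential :", accuracy(w_seq, Xa, ya))
print("Task A accuracy, with replay:", accuracy(w_rep, Xa, ya))
```

Keeping older examples in the training stream is only one crude way to preserve variety; the study itself is theoretical, examining how task diversity and ordering affect forgetting.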

The implications of this research are immense, as it brings scientists closer to achieving AI with lifelong, human-like learning capabilities. Machines exhibiting such dynamic learning could be scaled up faster and adapt seamlessly to evolving environments and unexpected situations.

The study conducted by the Ohio State University research team received support from the National Science Foundation and the Army Research Office. As understanding of the similarities between machines and the human brain continues to grow, it could pave the way for deeper insight into AI.

