Researchers at Purdue University and the Georgia Institute of Technology report that rethinking computer architecture around brain-inspired algorithms and neuromorphic chips could cut AI's worldwide energy use and overcome a growing hardware bottleneck, according to a study in Frontiers in Science.
AI systems are expanding at a pace that conventional computers are struggling to sustain, prompting researchers to explore brain-inspired and neuromorphic AI as pathways toward more resilient, energy-efficient hardware. As AI models consume ever-larger datasets, the constant shuttling of information between a computer's processor and its memory has become a costly choke point, draining time, power, and efficiency.
The new study frames this challenge as both technical and urgent. “Language processing models have grown 5,000-fold in size over the last four years,” said Kaushik Roy, a Purdue University computer engineering professor and the study’s lead author.
“This alarmingly rapid expansion makes it crucial that AI is as efficient as possible. That means fundamentally rethinking how computers are designed,” said Roy.
Breaking the Memory Wall
Most modern computers still rely on the von Neumann architecture, first proposed in 1945, which separates memory from processing. As AI workloads intensify, this division has produced what engineers call the “memory wall,” where memory cannot keep pace with processing speeds.
Researchers argue that new neuromorphic hardware approaches could ease this strain by bringing computation closer to data. The study highlights “compute-in-memory” (CIM) systems, which dramatically reduce data movement.
“CIM offers a promising solution to the memory wall problem by integrating computing capabilities directly into the memory system,” according to the paper.
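To make the data-movement argument concrete, here is a toy cost model contrasting a von Neumann matrix-vector multiply, where every weight crosses the processor-memory bus, with a compute-in-memory design, where the weights stay resident in the memory array and only inputs and outputs move. The counting rules and sizes are illustrative assumptions, not figures from the study.

```python
# Toy cost model: operand transfers for one matrix-vector multiply.
# Assumption: in the von Neumann case every weight is fetched once;
# in the CIM case weights never leave the memory array.

def von_neumann_transfers(rows: int, cols: int) -> int:
    """Fetch every weight plus the input vector, write the output back."""
    weight_fetches = rows * cols
    input_fetches = cols
    output_writes = rows
    return weight_fetches + input_fetches + output_writes

def cim_transfers(rows: int, cols: int) -> int:
    """Weights stay in memory: only the input in and the output out."""
    return cols + rows

rows, cols = 1024, 1024
vn = von_neumann_transfers(rows, cols)
cim = cim_transfers(rows, cols)
print(f"von Neumann: {vn:,} transfers")
print(f"CIM:         {cim:,} transfers")
print(f"reduction:   {vn / cim:.0f}x")
```

Under these simplified assumptions, keeping a 1024x1024 weight matrix in place cuts transfers by roughly the matrix dimension, which is why CIM is attractive for AI workloads dominated by large, reusable weights.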
IBM researchers have similarly warned that the von Neumann bottleneck dominates energy use in AI systems and limits performance gains, especially as the industry pushes toward neuromorphic computing at scale.
Inspired by the Brain
To move beyond this limitation, scientists are increasingly turning to biology. The human brain stores and processes information in the same place, communicating only when something changes. This principle underpins spiking neural networks and other neuromorphic algorithms designed to mimic neural efficiency.
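The "communicate only when something changes" idea can be sketched with a leaky integrate-and-fire (LIF) neuron, a common building block of spiking neural networks. This is a minimal illustration, not the study's model; the leak and threshold values are arbitrary assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: it integrates input,
# leaks over time, and emits a spike only when its membrane potential
# crosses a threshold. Parameter values here are illustrative.

def lif_run(inputs, leak=0.9, threshold=1.0):
    """Return a spike train (0s and 1s) for a stream of input currents."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = leak * v + current   # leaky integration of input
        if v >= threshold:
            spikes.append(1)     # fire a spike
            v = 0.0              # reset after firing
        else:
            spikes.append(0)     # stay silent: no communication, no cost
    return spikes

# The neuron fires only twice despite six input steps.
print(lif_run([0.3, 0.3, 0.6, 0.0, 0.0, 1.2]))  # → [0, 0, 1, 0, 0, 1]
```

Because silent steps carry no information, an event-driven chip can skip them entirely, which is where the energy savings of spiking hardware come from.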
“The capabilities of the human brain have long been an inspiration for AI systems,” said Adarsh Kosta, a Purdue University researcher and co-author. “Now we want to take this to the next level and recreate the brain’s efficient processing mechanisms.”
These neuromorphic systems extend beyond computation alone. Neuromorphic sensing allows machines to respond only to meaningful changes in their environment, using sensors that mirror how biological senses work. Such sensors could help autonomous vehicles, drones, and medical devices interpret the world in real time while reducing data overload at the source, opening broad opportunities for neuromorphic algorithms and applications.
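The sensing principle can be illustrated with a toy event-based sensor: instead of streaming full frames, it reports only the pixels whose brightness changed beyond a threshold, the way event cameras suppress redundant data. The frame values and threshold below are illustrative assumptions.

```python
# Toy event-based sensor: compare two frames (flat lists of pixel
# intensities) and emit (index, delta) events only for changes large
# enough to matter. Unchanged pixels generate no data at all.

def frame_to_events(prev, curr, threshold=10):
    """Emit events only where |change| meets the threshold."""
    return [(i, c - p)
            for i, (p, c) in enumerate(zip(prev, curr))
            if abs(c - p) >= threshold]

prev = [100, 100, 100, 100]
curr = [100, 130, 95, 100]   # one large change, one sub-threshold change
print(frame_to_events(prev, curr))  # → [(1, 30)]
```

A conventional camera would transmit all four pixels; the event-based version transmits one, which is the data-overload reduction the article describes, applied at the source.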
In particular, researchers want neuromorphic computing to work at scale, beyond power-hungry data centers. “AI is one of the most transformative technologies of the 21st century,” said Tanvi Sharma, co-author and Purdue researcher. “However, to move it out of data centers and into the real world, we need to dramatically reduce its energy use.”
Demand for AI is accelerating, and incremental fixes are no longer enough. Neuromorphic approaches, the researchers argue, may be essential to making powerful AI practical, affordable, and sustainable far beyond the cloud.