Europe Is Teaching Robots to See Like Humans

European researchers created a machine-vision system inspired by human eyesight, using edge-computing hardware and neuromorphic sensors.

A European team led by Finland’s VTT research organization has created a new kind of machine-vision system inspired by human eyesight, using edge-computing hardware and neuromorphic sensors to help drones and robots operate without constant network access during critical missions.

The Multispectral Intelligent Vision System with Embedded Low-Power Neural Computing (MISEL) project started in 2021 and gathers experts from several countries to build devices that can think for themselves. By processing data directly at its source, neuromorphic sensing opens the door to systems that can react quickly and safely to ever-changing environments.

Smart Neuromorphic Sensors

The project brings together advanced semiconductor design and neuromorphic computing at scale to develop a better way for machines to process visual information.

“Our goal is to build truly smart devices that can make observations and decisions on their own, without sending data to supercomputers or the cloud. Neuromorphic computing can be hundreds or even thousands of times more energy-efficient than conventional digital processing,” explains Jacek Flak, Research Team Leader at VTT and coordinator of the project.

Processing data on the device eases the burden of constant connectivity and makes devices more secure, advantages that matter most in battery-powered machines.
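To illustrate where such energy savings come from, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic building block of many spiking neuromorphic systems. This is a generic textbook model, not the MISEL hardware: the neuron only does work when an input spike arrives, so a quiet scene costs almost nothing to process.

```python
import numpy as np

def lif_neuron(spike_times, weight=1.0, tau=20.0, threshold=1.5):
    """Leaky integrate-and-fire neuron driven by input spikes.

    The membrane potential is updated only when an input spike arrives;
    between spikes it simply decays, so an idle input costs (almost)
    no computation -- the basis of the claimed energy savings.
    """
    v = 0.0                 # membrane potential
    last_t = 0.0
    output_spikes = []

    for t in sorted(spike_times):
        # Analytic decay since the last event; no per-timestep loop needed.
        v *= np.exp(-(t - last_t) / tau)
        v += weight         # integrate the incoming spike
        last_t = t
        if v >= threshold:  # fire and reset
            output_spikes.append(t)
            v = 0.0
    return output_spikes

# A quiet input stream produces no work and no output spikes.
print(lif_neuron([]))                  # []

# A burst of closely spaced spikes crosses the threshold and fires.
print(lif_neuron([10.0, 12.0, 14.0]))  # fires around t = 12.0
```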

“Imagine a drone searching for survivors after an earthquake through smoke, dust, and debris. It needs to interpret its surroundings and make decisions instantly. There may be no network connectivity, and battery life is limited,” says Flak.

Nature-Inspired Machine Vision System

Neuromorphic sensors are a central part of the project, used to build vision sensors that mimic the way the human eye and brain work together.

The team also explored neuromorphic sensing to help devices detect motion and patterns even in fog, dust, or low light. Project partner Kovilta developed a system-on-chip that combines image sensing and processing, pushing the field toward fully neuromorphic computing.

“Unlike a conventional video camera that actually captures static frames, this sensor detects motion and changes in time and space—just like a biological eye,” Flak says.
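A rough way to picture this, in a minimal sketch (not the Kovilta chip itself, just the general event-camera principle): instead of returning whole frames, the sensor emits an event only at pixels where brightness changes by more than a threshold.

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.15):
    """Emit (row, col, polarity) events where log-intensity changes
    exceed a threshold, instead of returning a full frame."""
    # Work in log-intensity, as biological photoreceptors respond
    # roughly logarithmically to brightness.
    eps = 1e-6
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)

    rows, cols = np.where(np.abs(delta) > threshold)
    polarities = np.sign(delta[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

# Static scene: no events at all, so no data to transmit or process.
prev = np.full((4, 4), 0.5)
curr = prev.copy()
print(generate_events(prev, curr))   # []

# A single pixel brightens: only that change produces an event.
curr[2, 1] = 0.9
print(generate_events(prev, curr))   # [(2, 1, 1)]
```

Because a static scene produces no events at all, the downstream processor has far less data to handle, which is part of what makes this approach attractive for battery-powered drones and robots.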

The effort also uses neuromorphic chips that support fast pattern recognition at low power consumption. Research on quantum-dot sensors extends vision into the infrared range, broadening the applications of neuromorphic computing in robotics and safety tools.

The partners also developed new memory units and hardware AI accelerators, created with the help of a neuromorphic engineer on the team.

These ideas, Kovilta says, will feed into neuromorphic computing chips for next-generation autonomous robots and vehicles.

“A superior ability to observe the surroundings and accurately interpret observations is a must for robots and vehicles,” Kovilta’s Mika Laiho says.

Another emerging focus is embodied neuromorphic intelligence, which will let devices interact with the world in a much more natural way. The project also touches on neuromorphic engineering for the development of future industrial tools.

Ultimately, the team hopes neuromorphic sensors will yield machines that see, think, and act with great efficiency, pointing to broader applications of neuromorphic computing ahead.

