ELEANOR

Industrial robots see with neuromorphic eyes

Following on from the INRC3 project, in which a robotic arm was taught to insert an object using only force feedback, the ELEANOR project (Energy and Latency Efficient Object Insertion Using a Robotic Arm Equipped with an Event-Based Camera and Neuromorphic Hardware) uses an event-based camera to guide the arm's approach to the slot. Optical flow and 3D reconstruction computed on Intel's Loihi research chip are used for this purpose.

Project description

We propose to solve the complex and widely researched problem of robotic insertion of objects (e.g., plugs) by using low-latency event-based image processing and neuromorphic hardware to position the object precisely in a fast control loop, establishing a new state of the art in vision-driven robot control. Spiking neural network-based robot control meets the key requirements: low-latency, precise visual analysis of the insertion process; adaptability to different object shapes and insertion dynamics; and energy-efficient AI processing, which also opens the door to mobile robotics applications.

Our approach is based on the use of optical flow and event-based 3D reconstruction in a spiking neural network running on a neuromorphic research chip. This enables real-time, precise centering of the object relative to the slot in preparation for a force-based insertion process. The relevance for Bavarian companies in robotics and image processing is high, as it can give them a significant edge in AI technologies.
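To make the control principle concrete, the following is a minimal sketch of a proportional visual-servoing step that centers an object over a slot using event data. It is an illustrative assumption, not the project's actual controller; the helpers get_event_batch() and send_velocity_command() and all constants are hypothetical.

    import numpy as np

    # Illustrative sketch only: a proportional visual-servoing step that
    # centers the held object over the slot. get_event_batch() and
    # send_velocity_command() are hypothetical placeholders.
    GAIN = 0.5                            # proportional gain in 1/s (assumed)
    SLOT_PX = np.array([320.0, 240.0])    # slot center in pixels (assumed)
    PX_TO_M = 1e-4                        # pixel-to-meter scale (assumed)

    def centering_step(events):
        """events: (N, 2) array of (x, y) pixel coordinates accumulated
        over one control period (e.g., 1 ms of the event stream)."""
        if len(events) == 0:
            return np.zeros(2)            # no events: hold position
        centroid = events.mean(axis=0)    # where the object's edges fire
        error_m = (SLOT_PX - centroid) * PX_TO_M
        return GAIN * error_m             # lateral velocity command (m/s)

    # Control loop (pseudo-usage):
    # while not centered:
    #     send_velocity_command(centering_step(get_event_batch()))

Because event cameras report changes with microsecond timestamps rather than fixed frames, a loop of this shape can run at kilohertz rates, which is what makes the fast, precise centering feasible.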

Research contribution

The innovation of our project lies, first, in the use of an event-based camera for precise vision-guided arm control. Such cameras have been used extensively in research on mobile robots such as drones (see the work of Prof. D. Scaramuzza at the University of Zurich), where they enable fast maneuvers and 3D reconstruction. The extraction of geometric and kinematic measurements of a manipulated object from event streams, however, has not yet been demonstrated.
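As a generic illustration of such a kinematic measurement (not the project's algorithm), the sketch below estimates local image velocity from an event stream by fitting a plane t = a*x + b*y + c to event timestamps in a small spatial window, a standard event-based optical-flow technique; for a moving edge, (a, b)/(a^2 + b^2) recovers the normal flow.

    import numpy as np

    # Generic event-based normal-flow estimate via local plane fitting:
    # events from a moving edge lie on a plane t = a*x + b*y + c in
    # (x, y, t) space, and (a, b)/(a^2 + b^2) is the edge's normal
    # velocity in pixels per second. Illustrative only.

    def local_normal_flow(x, y, t):
        """x, y, t: 1-D arrays of event coordinates and timestamps (s)
        from a small spatial neighborhood."""
        A = np.column_stack([x, y, np.ones_like(x)])
        (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
        n2 = a * a + b * b
        if n2 < 1e-12:
            return np.zeros(2)        # degenerate: no measurable motion
        return np.array([a, b]) / n2  # (vx, vy), normal component only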

Second, we will develop the event-based image processing algorithms and adaptive controller as a spiking neural network on a neuromorphic chip, further reducing latency and power consumption and enabling adaptivity of the resulting solution. This approach will contribute to the growing field of neuromorphic computing in robotics.
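To give a flavor of the spiking substrate (the on-chip programming model for Loihi itself differs), here is a toy leaky integrate-and-fire layer in plain NumPy; all constants are illustrative assumptions, not project parameters.

    import numpy as np

    # Toy leaky integrate-and-fire (LIF) layer, the basic building block
    # of spiking networks like those deployed on neuromorphic chips.
    # Plain-NumPy simulation for illustration; constants are assumed.

    class LIFLayer:
        def __init__(self, n, decay=0.9, threshold=1.0):
            self.v = np.zeros(n)        # membrane potentials
            self.decay = decay          # per-timestep leak factor
            self.threshold = threshold  # firing threshold

        def step(self, current):
            """Advance one timestep with input current; return spikes."""
            self.v = self.decay * self.v + current
            spikes = self.v >= self.threshold
            self.v[spikes] = 0.0        # reset neurons that fired
            return spikes.astype(float)

    # Usage sketch: weights is an assumed (128, n_in) synaptic matrix.
    # layer = LIFLayer(128)
    # out = layer.step(weights @ in_spikes)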

Finally, the integration of event-based image processing and spike-based control will open up the possibility of using this fast vision sensor for other robotic tasks that require precise positioning of the robot relative to the object, such as grasping, manipulating, or placing. This will take event-based vision beyond the realm of mobile robotics, into industrial robotics and arm control.

Project duration

01.10.2021 - 30.09.2024

Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and other countries.

Your contact

Dr. Evan Eames

+49 89 3603522 161
eames@fortiss.org

Publications

  • Camilo Amaya, Gintautas Palinauskas, Evan Eames, Michael Neumeier and Axel von Arnim. "Generating Event-Based Datasets for Robotic Applications Using MuJoCo-ESIM." In Proceedings of the 2023 International Conference on Neuromorphic Systems (ICONS '23), page 7. Association for Computing Machinery, New York, NY, USA, 2023.