FAMOUS2

Asset identification with neuromorphic vision on a drone

We are exploiting the properties of neuromorphic computing to develop optical asset identification on a drone. This project ports to hardware a proof of concept developed in simulation in a previous project, combining classical clustering and tracking algorithms with biologically inspired ones from IBM Research.

Project description

Figure: Drone in the fortiss Labs. The drone with an embedded camera is used for the use case.

This project is the port to hardware of a proof of concept realized in simulation, in which IBM Spiking Neural Units (SNUs) and neuromorphic vision were exploited in an asset monitoring use case. An existing vision algorithm for object identification was transferred to the neuromorphic paradigm by integrating optical flow computed with SNUs. This method will be implemented in hardware and optimized to highlight the benefits of neuromorphic hardware: high sparsity, low latency and low power consumption. With these properties, the approach offers an alternative to existing object identification methods, such as WLAN or Bluetooth.
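
The SNU is the basic building block of such a spiking pipeline. As a rough illustration, the following minimal NumPy sketch follows the SNU state update published by IBM Research (leaky integration with a reset of the membrane state after a spike); the function name, parameter names and the decay value are illustrative and do not reflect the project's actual implementation.

```python
import numpy as np

def snu_step(x, s_prev, y_prev, W, b, decay=0.8):
    """One time step of a spiking neural unit (SNU).

    x: input vector at time t, s_prev: membrane state, y_prev: previous
    output spikes, W: input weight matrix, b: threshold bias,
    decay: leak factor in [0, 1] (illustrative value).
    """
    # Integrate the input, leak the previous state, and reset the state
    # of units that spiked at the last step.
    s = np.maximum(0.0, W @ x + decay * s_prev * (1.0 - y_prev))
    # Emit a spike where the membrane potential exceeds the threshold.
    y = (s + b > 0.0).astype(float)
    return s, y
```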

The simulated use case is asset monitoring with a drone equipped with a neuromorphic vision sensor. Light beacons attached to assets, such as workers or packages, blink with a visual pattern, similar to Morse code, to transmit useful information. The aim of this project is to recreate this use case in hardware and use it as a demonstrator showcased in the fortiss Labs, with an event camera embedded on the drone.
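
How the blinking pattern is encoded and decoded is not detailed here. As a purely hypothetical sketch, assuming a simple on/off keying with a fixed bit period, the payload of one tracked beacon could be recovered from its event timestamps as follows (function and parameter names are illustrative, not the project's protocol).

```python
import numpy as np

def decode_blink_pattern(event_ts, t_start, bit_period, n_bits, min_events=5):
    """Recover an on/off bit sequence from the event timestamps of one
    tracked beacon (hypothetical scheme).

    event_ts: event timestamps in seconds, t_start: start of the
    transmission, bit_period: symbol duration in seconds, n_bits:
    payload length, min_events: activity threshold separating an 'on'
    bin from an 'off' bin.
    """
    # Assign each event to a symbol slot and discard out-of-range ones.
    slots = np.floor((np.asarray(event_ts) - t_start) / bit_period).astype(int)
    slots = slots[(slots >= 0) & (slots < n_bits)]
    # A slot with many events is read as 1, a quiet slot as 0.
    counts = np.bincount(slots, minlength=n_bits)
    return (counts >= min_events).astype(int)
```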

Research contribution

After a successful implementation in simulation, this project aims to apply SNU-based optical flow estimation to hardware event camera input and to achieve real-time object tracking and identification.

To the best of our knowledge, the low latency that characterizes neuromorphic hardware has not yet been exploited for asset monitoring. This project will deliver the first approach that uses this property to decode time-varying visual patterns emitted by moving beacons, combining spiking optical flow estimation with an object identification method.
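
The exact coupling between the spiking flow estimation and the identification stage is not described here. One illustrative option, sketched below under that assumption, is to propagate a tracked beacon's position between observations using the locally estimated motion; all names are hypothetical.

```python
import numpy as np

def predict_beacon_position(center, flow_field, dt):
    """Propagate a tracked beacon with the local optical flow
    (illustrative glue code, not the project's tracker).

    center: (x, y) position in pixels, flow_field: H x W x 2 array of
    per-pixel flow in px/s (e.g. the output of a spiking flow network),
    dt: time elapsed since the last update in seconds.
    """
    h, w = flow_field.shape[:2]
    # Sample the flow at the (clipped) integer pixel nearest the track.
    x = int(np.clip(round(center[0]), 0, w - 1))
    y = int(np.clip(round(center[1]), 0, h - 1))
    vx, vy = flow_field[y, x]
    # Constant-velocity prediction over the elapsed interval.
    return (center[0] + vx * dt, center[1] + vy * dt)
```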

In this nine-month project, the asset monitoring solution is to be applied to different real-world use cases, in particular ones involving a variable transmitted payload.

Project duration

01.08.2022 – 30.05.2023

Your contact

Jules Lecomte

+49 89 3603522 188
lecomte@fortiss.org

Publications

  • Yannick Schnider, Stanislaw Wozniak, Mathias Gehrig, Jules Lecomte, Axel von Arnim, Luca Benini, Davide Scaramuzza and Angeliki Pantazi: Neuromorphic Optical Flow and Real-time Implementation with Event Cameras. IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2023.