FAMOUS

Asset detection and identification with event-based vision

We use an event-based camera, mounted on a flying drone, to detect and track light-emitting beacons on the ground. The scope of this project is to implement this sensor in simulation and to develop a real-world use case in asset monitoring.

Project description

The general goal of the project FAMOUS (Field service and Asset Monitoring with On-board SNU and event-based vision in Simulated drones) is to prove the applicability of IBM's Spiking Neural Units (SNU) in real-world vision applications based on event cameras. The project applies SNU to a drone use case in which camera-equipped drones will, in simulation, detect, identify and localize assets on the ground and build a map of them. The objects are detected, identified and localized thanks to a previously developed "active optical identification" (AOI) sensor that fits naturally with event-based cameras. All of this will be developed in simulation within this project.
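To make the active optical identification idea concrete, here is a minimal Python sketch of how a beacon ID could be decoded from an event stream. It is not taken from the project code: it assumes, purely for illustration, that each beacon blinks at a distinct frequency and that the events from a small region of interest around the detected light source are already grouped together; the ID_TABLE, the chosen frequencies and the function decode_beacon_id are hypothetical.

```python
# Hypothetical sketch: frequency-coded beacon identification from event timestamps.
import numpy as np

# Hypothetical lookup table: blink frequency in Hz -> asset label
ID_TABLE = {200.0: "pack 0", 250.0: "worker 1", 300.0: "pack 2"}


def decode_beacon_id(event_timestamps_us, window_us=200_000, tol_hz=10.0):
    """Estimate the dominant blink frequency from event timestamps (microseconds)
    of one region of interest and look up the corresponding asset ID."""
    bin_us = 1_000  # 1 ms bins -> 1 kHz activity signal, resolves blinks below 500 Hz
    t = np.asarray(event_timestamps_us, dtype=np.float64)
    if t.size < 10:
        return None  # not enough events to decode reliably
    t = t[t >= t.max() - window_us]  # keep only the most recent window
    if t.size < 10 or t.max() - t.min() < 10 * bin_us:
        return None  # window too short for a meaningful spectrum

    # Bin the events into an activity signal and find its dominant frequency.
    edges = np.arange(t.min(), t.max() + bin_us, bin_us)
    activity, _ = np.histogram(t, bins=edges)
    spectrum = np.abs(np.fft.rfft(activity - activity.mean()))
    freqs = np.fft.rfftfreq(activity.size, d=bin_us * 1e-6)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component

    # Match against the known beacon frequencies within a small tolerance.
    best = min(ID_TABLE, key=lambda f: abs(f - dominant))
    return ID_TABLE[best] if abs(best - dominant) <= tol_hz else None
```

With 200 Hz, 250 Hz and 300 Hz beacons and a 200 ms window, the 1 kHz activity signal gives roughly 5 Hz of frequency resolution, which is enough to separate the IDs. The project's actual modulation and decoding approach is documented in the Frontiers in Neurorobotics publication listed under Publications.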

The innovation lies in the use of spiking AI and event-based cameras, both of which enable extreme energy efficiency compared to traditional AI and cameras. Indeed, spiking AI can run on neuromorphic hardware, which has been shown to beat the energy consumption of conventional AI hardware by orders of magnitude. This proof-of-concept project, carried out entirely in simulation, should then lead to a more ambitious project in which a hardware implementation can be investigated. IBM proposes to complement the SNU toolset to support event-based vision sensors and algorithms, while fortiss will take responsibility for the simulated experiment setup, the implementation of the active optical identification virtual sensor and the integration of SNU into the application. We expect a simulated demonstrator at the end of the project, which will be showcased in the Munich Highlight Towers.

This video shows four 3D views of the FAMOUS simulation scene. The use case is a simulated construction site, with objects (packs on the ground) and persons equipped with an "active identification sensor", to be identified and tracked by the FAMOUS drone. The drone first rises steadily into the air, then flies south-east and slightly downwards, then rotates on the spot and finally flies vertically down. Upon trajectory changes, the drone jerks noticeably, a behaviour that is faithfully simulated and also occurs on hardware drones. The top left view is the "end-user" view, in which objects and persons equipped with the active identification sensor are identified (pack 0, worker 1, etc.) and tracked. The top right view shows tracked objects surrounded by tracking rectangles, together with camera events that are potential tracks.

The number of events explodes upon trajectory changes, which is expected. Tracks are then lost for a very short time and quickly recaptured, which demonstrates the low latency of the system. The bottom left view shows sparse optical flow, computed from the events with IBM's Spiking Neural Units, which is used to track the identified objects. The bottom right view is an outside view of the scene, in which the drone can be seen flying, alongside logging data.
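As background on the Spiking Neural Units mentioned above: an SNU couples a conventional artificial-neuron layer with leaky integrate-and-fire dynamics, in which a membrane state integrates weighted input, leaks over time and is reset after a spike. The sketch below is a minimal NumPy rendition of that recurrence for intuition only; it is not the FAMOUS optical-flow implementation, and the layer sizes, decay factor and threshold are placeholder values.

```python
import numpy as np


class SNUCell:
    """Minimal sketch of one Spiking Neural Unit layer (placeholder parameters).

    Per time step:
        s_t = relu(W @ x_t + decay * s_{t-1} * (1 - y_{t-1}))   # membrane state
        y_t = step(s_t + bias)                                  # spike output
    The state integrates input, leaks via the decay factor and is reset
    whenever the unit spiked at the previous step.
    """

    def __init__(self, n_in, n_out, decay=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(n_out, n_in))
        self.bias = np.full(n_out, -threshold)  # negative bias acts as a firing threshold
        self.decay = decay
        self.s = np.zeros(n_out)  # membrane state
        self.y = np.zeros(n_out)  # previous spike output

    def step(self, x):
        """Advance one time step with a binary event frame x of length n_in."""
        self.s = np.maximum(0.0, self.W @ x + self.decay * self.s * (1.0 - self.y))
        self.y = (self.s + self.bias > 0.0).astype(float)
        return self.y


# Usage: feed sparse binary event frames, one per simulation time step.
rng = np.random.default_rng(42)
cell = SNUCell(n_in=64, n_out=16)
for _ in range(5):
    event_frame = (rng.random(64) < 0.05).astype(float)  # ~5% of pixels active
    spikes = cell.step(event_frame)
```

In the demonstrator, the sparse optical flow shown in the bottom left view is computed from the event stream by networks built from such units.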

Research contribution

The major breakthrough here is the use of spiking, event-based real-time sensing. It will prove that SNU are able to cover this field of technology. The low-latency and low-energy capabilities of neuromorphic hardware and event cameras will enable AI-based field service and asset monitoring applications. Technical risks are that the SNU toolset may be hard to extend to event-based sensing, and that the spiking implementation of the AOI sensor might itself take considerable time and delay project results if not monitored closely enough.

In this six-month project, carried out entirely in simulation, we should at least reach the complete 3D experiment, with the camera capturing AOI signals and localizing them, so that part of the asset map would be available. If the object identification can also be achieved within the project timeframe, that will be the optimal result.

Project duration

01.10.2021 - 31.03.2022

Your contact

Dr. Axel von Arnim

+49 89 3603522 538
vonarnim@fortiss.org

Project partner

IBM

Publications

  • Axel von Arnim, Jules Lecomte, Stanislaw Wozniak, Naima Elosegui and Angeliki Pantazi. Dynamic Event-based Optical Identification and Communication. Frontiers in Neurorobotics, 18, 2024.
  • Axel von Arnim and Angeliki Pantazi. Spiking Neural Units ermöglichen effiziente ereignisgesteuerte Kameras. Blog post, 2022.