Frontiers in Neurorobotics, 18
February 2024 · doi: https://doi.org/10.3389/fnbot.2024.1290965
Optical identification is commonly performed using spatial or temporal visual pattern recognition and localization. Temporal pattern recognition, depending on the technology, involves a trade-off between communication frequency, range, and accurate tracking. We propose a solution with light-emitting beacons that improves this trade-off by exploiting fast event-based cameras and, for tracking, sparse neuromorphic optical flow computed with spiking neurons. In an asset monitoring use case, we demonstrate that the system, embedded in a simulated drone, is robust to relative movements and enables simultaneous communication with, and tracking of, multiple moving beacons. Finally, in a hardware lab prototype, we achieve state-of-the-art optical camera communication frequencies in the kilohertz range.
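To make the beacon-decoding idea concrete, the following is a minimal, illustrative Python sketch (not the authors' implementation, which uses spiking neurons and neuromorphic optical flow): it assumes an event camera reports timestamped ON/OFF events at a beacon's pixel and estimates the beacon's blink frequency from the intervals between ON events.

```python
# Illustrative sketch only: estimates a beacon's modulation frequency from an
# event stream under simplified assumptions (single pixel, clean ON/OFF events).
import numpy as np

def estimate_blink_frequency(timestamps_us: np.ndarray, polarities: np.ndarray) -> float:
    """Estimate a beacon's blink frequency (Hz) from event timestamps at one pixel.

    timestamps_us : event times in microseconds (sorted ascending)
    polarities    : +1 for ON (brightness increase), -1 for OFF events
    """
    on_times = timestamps_us[polarities > 0]
    if len(on_times) < 2:
        return 0.0
    # Median interval between consecutive ON events approximates one blink period.
    period_us = np.median(np.diff(on_times))
    return 1e6 / period_us

# Example: a 1 kHz beacon produces ON events roughly every 1000 microseconds.
t = np.arange(0, 50_000, 500, dtype=np.float64)   # one event every 0.5 ms
p = np.tile([1, -1], len(t) // 2)                  # alternating ON/OFF polarities
print(f"Estimated frequency: {estimate_blink_frequency(t, p):.0f} Hz")  # ~1000 Hz
```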
subject terms: Neuromorphic Computing, Event-Based Sensing, Optical Camera Communication, Optical Flow
url: https://www.frontiersin.org/articles/10.3389/fnbot.2024.1290965