Energy-efficient human-machine interaction sensor via edge AI for learning AR/VR
The aim of this project is to research and develop a new type of human-machine interaction sensor for augmented and virtual reality. The sensor will be tested on the challenging use case of motion detection from an egocentric perspective, including sensor fusion. Processing sensor data directly on the extended reality device, without device-to-server communication, is essential to ensure low latency and sensor responsiveness. Neuromorphic hardware and event-based cameras enable low-power, low-latency processing on the device (edge AI), in contrast to energy-intensive classical AI-based image analysis. Building on the latest theoretical findings in neuromorphic gesture and motion recognition, an integrated sensor will be demonstrated on AR glasses. The goal is to explore the performance, potential, and limits of neuromorphic technologies for future applications in the metaverse.
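To make the event-based approach concrete: unlike a frame camera, an event camera emits an asynchronous stream of (x, y, timestamp, polarity) tuples only where brightness changes. The sketch below, a minimal illustration and not the project's actual pipeline, shows one common preprocessing step for gesture and motion recognition: binning the event stream into sparse, fixed-duration frames. The sensor resolution, the synthetic event stream, and the window length are assumptions made for the example.

```python
# Minimal sketch (assumptions, not the project's pipeline): accumulate an
# asynchronous event-camera stream into fixed-duration frames, a common
# preprocessing step before a spiking or conventional recognition network.
import numpy as np

WIDTH, HEIGHT = 346, 260  # hypothetical sensor resolution (assumption)

# Synthetic stand-in for a real event stream of (x, y, t, polarity) tuples.
rng = np.random.default_rng(0)
n_events = 10_000
events = np.zeros(n_events, dtype=[("x", "u2"), ("y", "u2"),
                                   ("t", "u8"), ("p", "i1")])
events["x"] = rng.integers(0, WIDTH, n_events)
events["y"] = rng.integers(0, HEIGHT, n_events)
events["t"] = np.sort(rng.integers(0, 100_000, n_events))  # microseconds
events["p"] = rng.choice([-1, 1], n_events)                # ON/OFF polarity

def accumulate(events: np.ndarray, window_us: int = 10_000) -> np.ndarray:
    """Bin events into per-window 2D histograms weighted by polarity.

    Returns an array of shape (n_windows, HEIGHT, WIDTH); each slice is a
    sparse "event frame" usable as network input.
    """
    t0, t1 = int(events["t"][0]), int(events["t"][-1])
    n_windows = (t1 - t0) // window_us + 1
    frames = np.zeros((n_windows, HEIGHT, WIDTH), dtype=np.float32)
    idx = (events["t"] - t0) // window_us  # window index per event
    np.add.at(frames, (idx, events["y"], events["x"]), events["p"])
    return frames

frames = accumulate(events)
print(frames.shape)  # e.g. (10, 260, 346) for this synthetic stream
```

Because computation scales with the number of events rather than with a fixed frame rate, this representation is what makes the low-power, low-latency on-device processing described above feasible.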
fortiss contributes directly to generating event-based datasets for action recognition in AR/VR and researches neuromorphic algorithms for this task, which are implemented and benchmarked on neuromorphic hardware. In addition, fortiss is involved in building and testing the prototype.
Project duration: 01.11.2024 – 31.10.2026