Proceedings of the Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), pp. 1–10
December 2014 · Siem Reap, Cambodia · doi: 10.1109/APSIPA.2014.7041588
Human activity recognition is crucial for intuitive cooperation between humans and robots. We present an approach to activity recognition for human-robot interaction in industrial settings. The approach is based on spatial and temporal features derived from skeletal data of human workers performing assembly tasks. These features are used to train a machine learning framework that classifies discrete time frames with Random Forests and subsequently models the temporal dependencies between the resulting states with a Hidden Markov Model. We consider three groups of activities: movement, gestures, and object handling. We collected a dataset comprising 24 recordings of several human workers performing such activities in a human-robot interaction environment, as typically found at small and medium-sized enterprises. The evaluation shows that the approach achieves a recognition accuracy of up to 88% for some activities and an average accuracy of 73%.
Subject terms: robotics, JAMES, SMErobotics
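To illustrate the two-stage pipeline described in the abstract (per-frame classification with a Random Forest, followed by an HMM-style decoding over the resulting states), the following is a minimal sketch. The feature dimensions, number of activity classes, transition matrix, and all variable names are illustrative assumptions, not values taken from the paper; the HMM stage is simplified to a Viterbi decoding over the Random Forest's class posteriors.

```python
# Sketch of the abstract's pipeline: Random Forest per-frame classification,
# then Viterbi decoding to model temporal dependencies between states.
# Dimensions, class count, and transition probabilities are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_frame_classifier(X_train, y_train):
    """X_train: (n_frames, n_skeletal_features); y_train: activity label per frame."""
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X_train, y_train)
    return rf

def viterbi_smooth(frame_probs, transitions, priors):
    """Decode the most likely activity sequence from per-frame class
    probabilities (n_frames, n_states) using log-domain Viterbi."""
    n_frames, n_states = frame_probs.shape
    log_p = np.log(frame_probs + 1e-12)
    log_a = np.log(transitions + 1e-12)
    delta = np.log(priors + 1e-12) + log_p[0]
    backptr = np.zeros((n_frames, n_states), dtype=int)
    for t in range(1, n_frames):
        scores = delta[:, None] + log_a      # rows: previous state, cols: current state
        backptr[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_p[t]
    path = np.zeros(n_frames, dtype=int)
    path[-1] = delta.argmax()
    for t in range(n_frames - 2, -1, -1):
        path[t] = backptr[t + 1, path[t + 1]]
    return path

# Example run with random data standing in for skeletal features.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(500, 60)), rng.integers(0, 3, size=500)
X_test = rng.normal(size=(100, 60))

rf = train_frame_classifier(X_train, y_train)
probs = rf.predict_proba(X_test)                       # per-frame state posteriors
transitions = np.full((3, 3), 0.1) + np.eye(3) * 0.7   # assumed "sticky" transitions
states = viterbi_smooth(probs, transitions, priors=np.full(3, 1 / 3))
```

The strongly diagonal transition matrix reflects the intuition that a worker stays in the same activity over consecutive frames; in the paper the HMM parameters would instead be estimated from the training recordings.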