Unsupervised Classification of Dance Motion Patterns from Fused Skeletal Data Using Exemplar-Based HMMs
In this paper, we propose a method for partitioning dance sequences into multiple periods and motion patterns. The method uses features in the form of a skeletal representation of the dancer observed over time by multiple depth sensors; the skeletal features captured by the individual sensors are fused into a single, more robust skeletal representation. Using this representation, we first partition the dance sequence into periods and then partition each period into motion patterns. Period boundaries are detected by observing the horizontal displacement of the dancer, while each period is partitioned into motion patterns using an exemplar-based Hidden Markov Model (HMM) that assigns each frame to an exemplar representing a hidden state of the HMM. The proposed method was tested on dance sequences comprising multiple periods and motion patterns, yielding promising results.