UMONS-TAICHI: A multimodal motion capture dataset of expertise in Taijiquan gestures.

Authors: Tits M1, Laraba S1, Caulier E2, Tilmanne J1, Dutoit T1
Affiliations: 1 Numediart Institute, University of Mons, Belgium. 2 University of Nice Sophia Antipolis, Nice, France.
Conference/Journal: Data in Brief.
Date published: 2018 May 23
Other: Volume: 19, Pages: 1214-1221, Special Notes: doi: 10.1016/j.dib.2018.05.088, eCollection 2018 Aug., Word Count: 243


In this article, we present a large 3D motion capture dataset of Taijiquan martial art gestures (n = 2200 samples) covering 13 classes (corresponding to Taijiquan techniques) executed by 12 participants of various skill levels. Participants' skill levels were rated by three experts on a scale of 0 to 10. The dataset was captured using two motion capture systems simultaneously: 1) Qualisys, a sophisticated optical motion capture system with 11 cameras that tracks 68 retroreflective markers at 179 Hz, and 2) Microsoft Kinect V2, a low-cost markerless time-of-flight depth sensor that tracks 25 locations of a person's skeleton at 30 Hz. Data from both systems were synchronized manually. Qualisys data were manually corrected and then processed to fill in any missing data. Data were also manually annotated for segmentation, and both segmented and unsegmented data are provided in the dataset. This article details the recording protocol as well as the processing and annotation procedures. The data were initially recorded for gesture recognition and skill evaluation, but they are also suited for research on motion synthesis, segmentation, multi-sensor data comparison and fusion, and sports science, as well as more general research on human motion science or motion capture. A preliminary analysis was conducted by Tits et al. (2017) [1] on part of the dataset to extract morphology-independent motion features for skill evaluation. The results of that analysis are presented in their communication, "Morphology Independent Feature Engineering in Motion Capture Database for Gesture Evaluation" (doi: 10.1145/3077981.3078037) [1]. The data are available for research purposes (license CC BY-NC-SA 4.0) at https://github.com/numediart/UMONS-TAICHI.
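As a minimal sketch of how downstream users might reconcile the two streams' different frame rates (Kinect V2 at 30 Hz vs. Qualisys at 179 Hz) after applying the manually determined synchronization offset, the Python snippet below linearly upsamples a Kinect skeleton track to the Qualisys rate. The array shapes, function name, and offset value are assumptions for illustration; they are not part of the dataset's published tooling.

```python
# Hypothetical sketch: resampling a Kinect V2 skeleton track (25 joints,
# 30 Hz) onto the Qualisys time base (179 Hz). Shapes, names, and the
# synchronization offset are assumed, not taken from the dataset tooling.
import numpy as np
from scipy.interpolate import interp1d

QUALISYS_HZ = 179.0  # 68 retroreflective markers, (x, y, z) per marker
KINECT_HZ = 30.0     # 25 skeleton joints, (x, y, z) per joint

def upsample_kinect(kinect_xyz: np.ndarray, offset_s: float = 0.0) -> np.ndarray:
    """Linearly resample a (frames, 25, 3) Kinect track to the Qualisys rate.

    offset_s is the manually determined synchronization offset in seconds.
    """
    n_frames = kinect_xyz.shape[0]
    t_src = np.arange(n_frames) / KINECT_HZ + offset_s   # Kinect timestamps
    t_dst = np.arange(t_src[0], t_src[-1], 1.0 / QUALISYS_HZ)
    interp = interp1d(t_src, kinect_xyz, axis=0, kind="linear")
    return interp(t_dst)

# Usage with synthetic data standing in for a real recording (10 s at 30 Hz):
kinect_track = np.random.rand(300, 25, 3)
aligned = upsample_kinect(kinect_track, offset_s=0.5)
print(aligned.shape)  # roughly (1784, 25, 3)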

PMID: 30225286 PMCID: PMC6139536 DOI: 10.1016/j.dib.2018.05.088