Effective collaboration in a team is a crucial skill.
When people work together to perform physical tasks, they rely on gestures
to convey instructions. This thesis explores gestures as a means to assess
understanding of physical collaborative tasks. The research proposes a framework
to represent, compare, and assess gestures’ morphology, semantics, and
pragmatics, as opposed to traditional approaches that rely mostly on the
gestures’ physical appearance. By leveraging this framework, functionally
equivalent gestures can be identified and compared. In addition, a metric that
assesses how well physical instructions are assimilated is computed from these
gesture matchings; it serves as a proxy for task understanding based on
gestural analysis. Correlations between this proposed metric and three other
proxy metrics for task understanding were then analyzed. Our framework was
evaluated through three user studies in which participants completed shared
tasks remotely: block assembly, origami, and ultrasound training. The results
indicate that the proposed metric is a good estimator of task understanding.
Moreover, it provides insights into task understanding in scenarios where other
proxy metrics show inconsistencies. The approach presented in this research
thus serves as a first step toward assessing task understanding in physical
collaborative scenarios through the analysis of gestures.
Funding
See-What-I-Do: Increasing Mentor and Trainee Sense of Co-Presence in Trauma Surgeries with the STAR Platform
Congressionally Directed Medical Research Programs