Head-gestures mirroring detection in dyadic social interactions with computer vision-based wearable devices

Juan R. Terven, Bogdan Raducanu, María Elena Meza-de-Luna, Joaquín Salas

Research output: Contribution to journal › Article › peer-review


Abstract

During face-to-face human interaction, nonverbal communication plays a fundamental role. A relevant aspect of social interaction is mirroring, in which a person tends to mimic the non-verbal behavior (head and body gestures, vocal prosody, etc.) of the counterpart. In this paper, we introduce a computer vision-based system to detect mirroring in dyadic social interactions using a wearable platform. In our context, mirroring is inferred as simultaneous head nodding displayed by the interlocutors. Our approach consists of the following steps: (1) facial feature extraction; (2) facial feature stabilization; (3) head-nodding recognition; and (4) mirroring detection. Our system achieves a mirroring detection accuracy of 72% on a custom mirroring dataset.
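The abstract's final step, mirroring detection from simultaneous head nods, can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes each wearable produces a per-frame binary "nodding" signal (the output of step 3), and the frame rate and lag tolerance are illustrative values.

from typing import List, Tuple


def nod_intervals(nod_signal: List[bool], fps: float) -> List[Tuple[float, float]]:
    """Convert a per-frame binary nod signal into (start, end) time intervals in seconds."""
    intervals = []
    start = None
    for i, nodding in enumerate(nod_signal):
        if nodding and start is None:
            start = i / fps
        elif not nodding and start is not None:
            intervals.append((start, i / fps))
            start = None
    if start is not None:
        intervals.append((start, len(nod_signal) / fps))
    return intervals


def detect_mirroring(nods_a: List[bool], nods_b: List[bool],
                     fps: float = 30.0, max_lag: float = 1.0) -> List[Tuple[float, float]]:
    """Return time spans where both interlocutors nod at (nearly) the same time."""
    events = []
    for sa, ea in nod_intervals(nods_a, fps):
        for sb, eb in nod_intervals(nods_b, fps):
            # Count two nods as mirroring if their intervals overlap,
            # or if they start within `max_lag` seconds of each other
            # (both criteria are assumptions made for this sketch).
            if min(ea, eb) - max(sa, sb) > 0 or abs(sa - sb) <= max_lag:
                events.append((min(sa, sb), max(ea, eb)))
    return events


if __name__ == "__main__":
    # Toy example: person B mirrors person A's nod a few frames later.
    a = [False] * 10 + [True] * 15 + [False] * 20
    b = [False] * 18 + [True] * 12 + [False] * 15
    print(detect_mirroring(a, b))

In this toy run the two nod intervals overlap, so a single mirroring event spanning both nods is reported; in the paper's setting these signals would come from the nod recognizers of the two wearable devices.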

Original language: English
Pages (from-to): 866-876
Number of pages: 11
Journal: Neurocomputing
Volume: 175
DOIs
State: Published - 2016

Keywords

  • Dyadic social interaction analysis
  • Head gestures recognition
  • Mirroring detection
  • Wearable devices

