The MULTISIMO multimodal corpus of collaborative interactions
File Type: PDF
Item Type: Conference Paper
Date: 2017
Access: openAccess
Citation: Koutsombogera, M. & Vogel, C., The MULTISIMO multimodal corpus of collaborative interactions, 19th ACM International Conference on Multimodal Interaction (ICMI 2017), Glasgow, UK, November 13-17, 2017, ACM, New York, NY, USA, 2017, 502-503
Download Item: icmi17demo-demo-124-p-13b3f30-34372-final.pdf (Accepted for publication (author's copy) - Peer Reviewed) 649.2Kb
Abstract:
This paper describes a recently created multimodal corpus designed to address multiparty interaction modelling, specifically collaborative aspects of task-based group interactions. A set of human-human interactions was collected with HD cameras, microphones and a Kinect sensor. The scenario involves two participants playing a game, instructed and guided by a facilitator. In addition to the recordings, survey material was collected, including personality tests of the participants and experience assessment questionnaires. The corpus will be exploited for modelling behavioural aspects of collaborative group interaction by taking into account the speakers' multimodal signals and psychological variables.
Sponsor: European Union (EU)
Grant Number: 701621
Author's Homepage:
http://people.tcd.ie/koutsomm
http://people.tcd.ie/vogel
Author: Koutsombogera, Maria; Vogel, Carl
Other Titles: 19th ACM International Conference on Multimodal Interaction (ICMI 2017)
Type of material: Conference Paper
Collections:
Availability: Full text available
Keywords: Multimodal corpus, Group interaction, Collaboration, Personality
Subject (TCD): Creative Technologies, Digital Engagement, Digital Humanities, Computational linguistics
DOI: https://doi.org/10.1145/3136755.3151018
Licences: