Introducing: The Bielefeld Multimodal Interaction Lab & the mint.tools

At Bielefeld University, we have built a lab – the multimodal interaction lab, or mintLab – that allows us to record human–human and human–computer interactions multimodally, that is, to capture not only speech and video but also, for example, information about eye gaze or body posture. Our guiding design principle was to minimise the “invasiveness” of the recording situation, that is, to keep the interaction situation as close as possible to normal interaction outside the lab. One way we try to achieve this is by relying as far as possible on remote sensors that do not need to be affixed to a participant’s body.

In this document, we describe both the hardware setup that we have constructed and the software that we have designed to let the components work together. Our hope is that other researchers can profit from what we have learned in putting the lab together, both from our experiences with the hardware and, hopefully of more general applicability, from our software.

mintLab is run jointly by the Phonetics and Phonology Group (Prof Wagner) and the Applied Computational Linguistics / Dialogue Systems Group (Prof Schlangen). Some of the software described below was built in cooperation with the AI group (Prof. Wachsmuth; Dr. Pfeiffer).

The lab is used, described, and evaluated in the following publications:

  • Kousidis, S., Pfeiffer, T., Malisz, Z., Wagner, P., & Schlangen, D. (2012). Evaluating a minimally invasive laboratory architecture for recording multimodal conversational data. Proceedings of the Interdisciplinary Workshop on Feedback Behaviors in Dialog, INTERSPEECH2012 Satellite Workshop (pp. 39–42). Stevenson, WA. .bib and .pdf
  • Kousidis, S., Malisz, Z., Wagner, P., & Schlangen, D. (2013). Exploring Annotation of Head Gesture Forms in Spontaneous Human Interaction. Proceedings of the Tilburg Gesture Meeting (TiGeR 2013). .bib and .pdf
  • Kousidis, S., Pfeiffer, T., & Schlangen, D. (2013). MINT.tools: Tools and Adaptors Supporting Acquisition, Annotation and Analysis of Multimodal Corpora. Proceedings of Interspeech 2013. .bib and .pdf
  • Kousidis, S., Kennington, C., & Schlangen, D. (2013). Investigating Speaker Gaze and Pointing Behaviour in Human–Computer Interaction with the ‘mint.tools’ Collection. Proceedings of Short Papers at SIGdial 2013. .bib and .pdf
* Last Update: 2013-08-16 *

About...

This website reports on some results of the "Multimodal Interaction Lab", which is located at Bielefeld University and is led by David Schlangen. The information contained in this website is for general information purposes only; we make no representations or warranties of any kind, express or implied, about the completeness, accuracy, reliability, suitability or availability with respect to the website or the information, products, services, or related graphics contained on the website for any purpose. Any reliance you place on such information is therefore strictly at your own risk.

Through this website you are able to link to other websites which are not under our control or that of Bielefeld University. We have no control over the nature, content and availability of those sites. The inclusion of any links does not necessarily imply a recommendation of, or an endorsement of, the views expressed within them.