A series of multimodal datasets for analyzing non-verbal social signals. Data from different experiments on string quartet, orchestra, and audience scenarios are available.
The repository also includes
- Repovizz, an online editor/browser for multimodal data
- A set of algorithms and software modules to process, analyze, and visualize multimodal data using Matlab and/or the EyesWeb XMI open software platform.


The aim is to offer the scientific community multimodal data and algorithms for studying techniques for the automated analysis of non-verbal social interaction. Examples of results obtained in SIEMPRE on these data are available in the Publications and Experiments sections.

How to use the dataset

Each experiment includes a multimodal dataset that can be used independently by accessing the raw data (see Experiments). The platform for synchronized multimodal recordings was developed in SIEMPRE and is freely available on the InfoMus website (see the latest builds of the software). The following tools were developed in SIEMPRE to process, analyze, and visualize the multimodal datasets resulting from the recordings:

  • EyesWeb XMI interactive applications
  • Matlab scripts
  • Repovizz, an online multimodal recordings repository and visualizer
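Independently of these tools, the raw data streams from an experiment can be processed with general-purpose software. As a minimal sketch of what such processing involves, the example below aligns two sensor streams recorded at different sampling rates onto a common time base by linear interpolation. The stream names, sampling rates, and the `align_streams` helper are illustrative assumptions, not part of the SIEMPRE tools.

```python
import numpy as np

def align_streams(t_a, x_a, t_b, x_b):
    """Resample stream B onto stream A's timestamps by linear
    interpolation, keeping only the overlapping time span.
    (Hypothetical helper; not part of the SIEMPRE software.)"""
    t0 = max(t_a[0], t_b[0])
    t1 = min(t_a[-1], t_b[-1])
    mask = (t_a >= t0) & (t_a <= t1)
    t = t_a[mask]
    return t, x_a[mask], np.interp(t, t_b, x_b)

# Illustrative data: a 100 Hz motion-capture-like stream and a
# 50 Hz audio-feature-like stream with a different start time.
t_mocap = np.arange(0.0, 2.0, 0.01)
mocap = np.sin(2 * np.pi * t_mocap)
t_audio = np.arange(0.5, 2.5, 0.02)
audio = np.cos(2 * np.pi * t_audio)

t, m, a = align_streams(t_mocap, mocap, t_audio, audio)
```

After alignment, `m` and `a` share the timestamps in `t`, so per-frame features from the two modalities can be compared or combined directly.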