EyesWeb is an open software research platform for the design and development of real-time multimodal systems and interfaces.

It supports a wide range of input devices, including motion capture systems, various types of professional and low-cost video cameras, game interfaces (e.g., Kinect, Wii), multichannel audio input (e.g., microphones), and analog inputs (e.g., for physiological signals). Supported outputs include multichannel audio, video, analog devices, and robotic platforms. Various standards are supported, including OSC, MIDI, FreeFrame and VST plugins, ASIO, motion capture standards and systems (e.g., Qualisys), and Matlab. EyesWeb supports real-time synchronized recording of multimodal channels, and includes a number of software libraries, such as the Non-Verbal Expressive Gesture Analysis and Non-Verbal Social Signals Analysis libraries. Users can also develop their own software libraries using the EyesWeb development environment.

The EyesWeb software includes a development environment, a distributed run-time system (supporting Windows, Linux, and mobile platforms) to create distributed or networked real-time applications, and an open set of libraries of reusable software components. The development environment supports the design process of multimodal interactive systems, enabling users to build systems by means of a visual programming language, which presents some analogies with computer music languages inspired by analog synthesizers or with software systems such as Simulink.

EyesWeb is conceived, designed, and developed by InfoMus Lab. The EyesWeb project started in 1997 as a natural evolution of the HARP Project. The current release of the open software platform is EyesWeb XMI (eXtended Multimodal Interaction). The platform has been adopted in EU projects of the 5th, 6th, and 7th Framework Programme (ICT), and by thousands of users worldwide for scientific research, education, and industry applications. For example, EyesWeb was selected by Intel in 2008 for their hardware for "independent living", and was adopted at the New York University Summer Program on "Music, dance and new technologies" (2004 - 2006).
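EyesWeb patches typically stream their real-time analysis results to external applications, and OSC is one of the standards listed above for doing so. As a rough illustration, the following minimal Python sketch receives such a stream using the python-osc package; the address "/eyesweb/feature" and UDP port 8000 are purely illustrative assumptions and must match whatever the OSC output of a given patch is configured to send.

```python
# Minimal sketch of an external receiver for data streamed over OSC from an
# EyesWeb patch. Requires the python-osc package (pip install python-osc).
# NOTE: the address "/eyesweb/feature" and the UDP port 8000 are assumptions
# for illustration; they must match the OSC output configuration of the patch.
from pythonosc import dispatcher, osc_server

def on_feature(address, *values):
    # Each OSC message carries one or more values, e.g. a movement-quality
    # measure computed in real time by the patch.
    print(f"{address}: {values}")

dsp = dispatcher.Dispatcher()
dsp.map("/eyesweb/feature", on_feature)

server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 8000), dsp)
print("Listening for OSC messages on UDP port 8000...")
server.serve_forever()
```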
Recent Productions by the EyesWeb Community
Open Closed Open
In collaboration with Liat Grayver, Yair Kira and Amir Shpilman
Awarded the first DAGESH Art Prize, Jewish Museum Berlin 2019
"Open, Closed, Open" is a multimedia installation that explores a multitude
of perceptual positions realized as a result of the continual movement of organic materials.
Sand, light and human voice are shaped by a layer
of technological interventions that impose varying degrees of control
and predetermined roles upon both themselves and the resulting response of the materials.
A thin membrane between resiliency and fragility
For the presentation of the European WhoLoDancE project, RomaEuropa Festival, Rome, Italy, October 7, 2018
A collaboration between K. Danse (Toulouse, France, k-danse.net) and Instituto Stocos (Madrid, Spain, stocos.com)
Choreographers: Muriel Romero and Jean-Marc Matos
Dancers: Muriel Romero and Marianne Masson
Music and sound synthesis: Pablo Palacio
Technologies for real-time movement analysis: Casa Paganini - InfoMus and Instituto Stocos.
Interactive Laser technology: Instituto Stocos.
From Jean-Marc Matos and Anne Holst
An interactive choreographic performance that stages a dialogue between a dancer and her multiple selves, embodied in an autonomous and unpredictable visual and aural creature that emanates from her psyche as interpreted by optical and body sensors.
Music: Robert Crouch, Ipek Gorgun, Nils Frahm, Franck Vigroux, Daniel Brandt, Hauschka, Klara Lewis & Simon Fisher Turner, Biosphere
Lights: Fabien Leprieult
Costumes: Benjamin Haegel
Collaboration on the technology: Research Center Casa Paganini - InfoMus, Dir. Antonio Camurri.
Automatic analysis of movement qualities made with the EyesWeb platform.
"Europa: Gesture of History - Dancing Science and Art not to forget EU identity"
Europa: Gesture of History is a scientific performance held on 23 March 2017 in Rome. The performance took place on the occasion of the dinner celebrating the Treaties of Rome, at "La Lanterna" in Rome, at the invitation of the European Union's DG Connect (Communications Networks, Content and Technology). The goal was to metaphorically recount the birth and evolution of Europe through a short choreographic action, enriched by an interactive sonification that amplifies the movement qualities of the dancer.
Performance/demonstration within the European Project "DANCE"
Giulia Mureddu, dance
Virgilio Sieni and Giulia Mureddu, choreography
Andrea Cera, sonic design
Stefano Piana, software for the real-time analysis of movement qualities
A project of the Casa Paganini-InfoMus Research Centre, DIBRIS-University of Genova, coordinated by Antonio Camurri
This project has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No. 645553.
Artistic/scientific productions by Andrea Cera featuring movement analysis.
Excerpts from his works, based on movement analysis and the use of EyesWeb.
weDRAW Movement Analysis Library (2019)
The library consists of a collection of EyesWeb XMI patches for the extraction of movement features from video and Kinect data.
It was developed in the framework of the EU H2020-ICT project weDRAW.
weDRAW has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No. 732391.
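The feature extractors themselves are EyesWeb XMI patches and run inside the platform, but the kind of descriptor they compute can be illustrated outside of it. The following is a hypothetical sketch, not code from the weDRAW library: it estimates a simple quantity-of-motion-like measure from a video file by frame differencing with OpenCV, only to give an idea of what a video-based movement feature looks like.

```python
# Hypothetical illustration of a video-based movement feature: a rough
# quantity-of-motion estimate obtained by frame differencing.
# This is NOT part of the weDRAW library (which ships as EyesWeb XMI patches);
# it only sketches the kind of descriptor such patches compute from video.
import cv2
import numpy as np

def quantity_of_motion(video_path, threshold=25):
    cap = cv2.VideoCapture(video_path)
    prev = None
    qom = []  # one value per frame: fraction of pixels that changed
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            diff = cv2.absdiff(gray, prev)
            qom.append(np.count_nonzero(diff > threshold) / diff.size)
        prev = gray
    cap.release()
    return qom

if __name__ == "__main__":
    values = quantity_of_motion("dance_clip.avi")  # hypothetical file name
    print(f"{len(values)} frames, mean QoM = {np.mean(values):.4f}")
```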
Our YouTube channel is available at the following link
Third Party Projects Based on EyesWeb
Research Project at CSC - University of Padova on Preservation of Interactive Multimedia Installations link
Multimedia content and EyesWeb patches link
The Associazione Carlo De Pirro link
University of Miami - Master of Science in Music Engineering Technology - M.Sc. thesis, Matan Ben-Asher: "Toward an Emotionally Intelligent Piano: Real-Time Emotion Detection and Performer Feedback via Kinesthetic Sensing in Piano Performance." link