The MEGA project is centered on the modelling and communication of expressive and emotional content in non-verbal interaction, through multi-sensory interfaces, in shared interactive Mixed Reality environments.
In particular, the project focuses on music performance and full-body movement as first-class conveyors of expressive and emotional content.
The main research issues are:
- Analysis of expressive gesture
- Synthesis of expressive gesture
A main output of the project is the MEGA System Environment, a platform for multimedia and performing-arts applications in which software modules for real-time expressive gesture analysis and synthesis are interconnected.
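The module interconnection can be sketched as a simple processing pipeline in which an analysis module's output feeds a synthesis module's input. The class names, the quantity-of-motion descriptor, and the loudness mapping below are illustrative assumptions, not the actual MEGA System Environment API:

```python
class QuantityOfMotionAnalyzer:
    """Analysis module (hypothetical): estimates overall body activity
    from successive frames, each given as a flat list of pixel values."""
    def __init__(self):
        self.prev = None

    def process(self, frame):
        if self.prev is None:
            self.prev = frame
            return 0.0
        # Quantity of motion: mean absolute difference between frames.
        qom = sum(abs(a - b) for a, b in zip(frame, self.prev)) / len(frame)
        self.prev = frame
        return qom


class ExpressiveSoundSynthesizer:
    """Synthesis module (hypothetical): maps the motion descriptor
    to a sound-control parameter, here a loudness level in [0, 1]."""
    def process(self, qom):
        return max(0.0, min(1.0, qom))


class Pipeline:
    """Interconnects modules: each module's output becomes the input
    of the next, mirroring a real-time analysis/synthesis chain."""
    def __init__(self, *modules):
        self.modules = modules

    def process(self, data):
        for module in self.modules:
            data = module.process(data)
        return data


pipeline = Pipeline(QuantityOfMotionAnalyzer(), ExpressiveSoundSynthesizer())
frames = [[0, 0, 0, 0], [1, 0, 1, 0], [1, 1, 1, 1]]
levels = [pipeline.process(f) for f in frames]  # one level per frame
```

In this sketch, larger frame-to-frame movement yields a higher loudness level, illustrating how an expressive descriptor extracted by an analysis module can directly drive a synthesis parameter.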
The research results have been applied in a number of artistic performances and multimedia events.