Abstract
This project examines how a program can detect emotions in a user’s dance movements, and how this capability can be applied in an interactive scenography. The project uses machine learning to predict which emotion is portrayed through the user’s dance moves, captured via a live video feed. Each classified emotion triggers a corresponding graphical animation. The final program establishes an interactive relationship between the user and the scenography’s visual appearance.