Peter A C Nelson, Roberto Alonso Trillo, Chen Jie.

A performance for violin, dance, and machine learning (2022).

Descendent is a performance conceived for violin and dance, augmented by a real-time motion synthesis system. It began as a weekend project among the Hong Kong dancer Sudhee Liao, Dr Roberto Alonso Trillo (HKBU Music Department & Augmented Creativity Lab member) and Dr Peter A C Nelson (HKBU Academy of Visual Arts & Augmented Creativity Lab member). It was then integrated into the Theme-Based Research project and joined by Dr Chen Jie (HKBU Computer Science).

On a technical level, the project develops several new approaches to music-to-motion synthesis. We recorded a unique motion capture dataset, which we use to synthesise movement in real time. We also created a ‘live’ stage, in which the dancer’s touch is connected to a musical synthesiser, allowing her to ‘play’ the stage back to the violinist, who acts as the director of the performance.

Together, we are striving for a performance that communicates critical philosophical concepts in art and technology in a way that is transparent and understandable to an audience. By using two identical digital avatars to compare the real-time movement of a human dancer with the synthesised movement of an algorithmic system, we encourage the audience to speculate on authorship and human agency. When a human dancer dances with a synthesised version of herself, and a violinist plays with his own sounds synthesised back to him, who is leading the performance? If such a system is then augmented with machine learning, could a performance create the illusion of artificial creativity and agency? Exploring these questions is made possible by the close collaboration within our interdisciplinary team, in which we constantly invent new tools and experiment with how best to use them to communicate with our audience.
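
As a purely illustrative sketch of the kind of mapping the ‘live’ stage implies (a touch reading driving a software synthesiser over a control protocol such as OSC), the Python snippet below forwards hypothetical sensor values as synthesis control messages. The sensor read, OSC address, port, and scaling are assumptions made for illustration, not details of the actual Descendent system.

```python
# Hypothetical sketch: map touch-sensor pressure to a synthesiser parameter via OSC.
# Sensor interface, OSC address, port, and value ranges are illustrative assumptions.
import time
import random

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc


def read_touch_sensor() -> float:
    """Placeholder for a real pressure/contact sensor read, normalised to 0.0-1.0."""
    return random.random()


def main() -> None:
    # Send control messages to a software synthesiser listening for OSC
    # (57120 is, for example, SuperCollider's default language port).
    synth = SimpleUDPClient("127.0.0.1", 57120)

    while True:
        pressure = read_touch_sensor()
        # Map touch pressure to a synthesis parameter, e.g. a filter cutoff in Hz.
        cutoff_hz = 200.0 + pressure * 4000.0
        synth.send_message("/stage/cutoff", cutoff_hz)
        time.sleep(0.02)  # roughly 50 Hz control rate


if __name__ == "__main__":
    main()
```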