Inter_agency is an artistic research project funded by the Austrian Science Fund (PEEK AR 483) and hosted at the Institute of Electronic Music and Acoustics (IEM) of the University of Music and Performing Arts Graz. The project addresses the desideratum of human-computer symbiosis in composed electro-instrumental music and aims to enhance human-machine communication in compositions for acoustic instruments and electronics by incorporating intelligent agent-based systems into them. The interaction model serving as the project's frame of reference is Rowe's player paradigm (as opposed to the instrument paradigm): a model in which the musician and the computer are co-actors in a reciprocal interaction. In the player paradigm, the software agent perceives human actions through machine listening and acts both in response to them and according to internal generative processes [1].
The player paradigm has been explored in several interactive improvisation systems, but has found comparatively few applications in composed music. Inter_agency aims to bridge this gap by exploring intelligent agent-based human-computer interaction from a compositional perspective. In particular, the type of human-computer interaction we aim to explore is:
(1) sound-based, i.e. based exclusively on (human and machine) listening,
(2) composed - as opposed to improvised, and
(3) reciprocal, that is, interaction in its literal sense: a process of mutual adaptation between the musician and the software agent.
The research question of the project is twofold:
How can the focus of the compositional process be shifted from composing sounds to composing sonic (agent) interactions? And what are the implications of this shift for composition and Music AI?
This research question will be explored in eight compositions/case studies, which will be developed by the two main researchers, Artemi-Maria Gioti and Gerhard Eckel, and three artists-in-residence, in collaboration with the musicians of Klangforum Wien and Schallfeld Ensemble.
Imitation game is an interactive composition for a human and a robotic percussionist exploring decision-making processes in the context of composed, scenario-based interaction. The composition is based on a dynamic rather than a fixed form, shaped by decisions made by the musician and the computer in real time. Using a neural network trained to recognize different instruments and playing techniques as its main auditory processing unit, the robotic percussionist makes short- and long-term decisions based on metrics of rhythmic, timbral, and dynamic contrast. The score of the piece is non-linear, enabling the musician to adapt to and interact with the robotic percussionist in real time.
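To make the decision logic concrete, the following is a minimal, purely illustrative sketch (not the project's actual implementation): it assumes the classifier has already labeled each onset with an instrument, a loudness, and an inter-onset interval, computes toy rhythmic, timbral, and dynamic contrast metrics between the human's and the machine's recent material, and maps them to a short-term choice between two hypothetical scenarios, "imitate" and "diverge". All names and thresholds here are invented for illustration.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Event:
    """One detected onset, as the machine-listening stage might describe it."""
    instrument: str   # label from the (assumed) instrument/technique classifier
    loudness: float   # normalized dynamic level in [0, 1]
    ioi: float        # inter-onset interval in seconds

def contrast(human: list[Event], machine: list[Event]) -> tuple[float, float, float]:
    """Toy rhythmic, timbral, and dynamic contrast metrics, each in [0, 1]."""
    # Rhythmic contrast: difference of mean inter-onset intervals, capped at 1.
    rhythmic = min(1.0, abs(mean(e.ioi for e in human) - mean(e.ioi for e in machine)))
    # Timbral contrast: Jaccard distance between the sets of instrument labels.
    h, m = {e.instrument for e in human}, {e.instrument for e in machine}
    timbral = len(h ^ m) / max(1, len(h | m))
    # Dynamic contrast: difference of mean loudness.
    dynamic = abs(mean(e.loudness for e in human) - mean(e.loudness for e in machine))
    return rhythmic, timbral, dynamic

def decide(human: list[Event], machine: list[Event], threshold: float = 0.5) -> str:
    """Short-term decision: imitate when overall contrast is high, diverge when low."""
    r, t, d = contrast(human, machine)
    return "imitate" if (r + t + d) / 3 > threshold else "diverge"
```

For example, a loud, dense snare/tom phrase against sparse, quiet machine output yields high contrast and the response `decide(...) == "imitate"`, whereas identical material on both sides yields zero contrast and `"diverge"`. In the actual piece, such decisions would feed the non-linear score rather than a two-way switch.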
1. ROWE, R. (1993) Interactive Music Systems: Machine Listening and Composing. Cambridge, MA: MIT Press.