RESEARCH PROJECT

Transmutable Music - Composing Recorded Music that is Fluid

For over 15 years, Dr. Tracy Redhead has explored how interactive technologies can reshape the way we experience music. In her new book Interactive Technologies and Music Making: Transmutable Music, she introduces Transmutable Music—an overarching term encompassing the various forms of adaptive, generative, algorithmic, and interactive composition being created today, where data becomes central to the compositional process. These forms of music are dynamic and fluid, changing with each listen based on factors such as chance, an algorithm, or any kind of data, including the listener's location, time, environment, and even their heartbeat. In doing so, they break away from the fixed, static nature of traditional recorded music.
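
To make this concrete, here is a minimal sketch in TypeScript, not drawn from the book: the layer names, thresholds, and context fields are invented purely to show how context data such as time of day, weather, and a listener's heart rate could decide which layers of a piece are heard on a given listen.

// Hypothetical sketch: context data, rather than a fixed timeline,
// selects which layers of the piece are heard on each listen.
interface ListeningContext {
  hour: number;          // local time of day, 0-23
  heartRateBpm: number;  // e.g. from a wearable sensor
  weather: "clear" | "rain" | "storm";
}

// One musical layer (stem) plus a rule deciding whether it plays.
interface Layer {
  name: string;
  isActive: (ctx: ListeningContext) => boolean;
}

const layers: Layer[] = [
  { name: "ambient-pad", isActive: (c) => c.hour >= 22 || c.hour < 6 },
  { name: "rain-texture", isActive: (c) => c.weather !== "clear" },
  { name: "pulse-percussion", isActive: (c) => c.heartRateBpm > 100 },
  { name: "lead-vocal", isActive: () => true }, // always present
];

// Each "listen" renders a different version of the same work.
function renderVersion(ctx: ListeningContext): string[] {
  return layers.filter((l) => l.isActive(ctx)).map((l) => l.name);
}

console.log(renderVersion({ hour: 23, heartRateBpm: 110, weather: "rain" }));
// -> [ "ambient-pad", "rain-texture", "pulse-percussion", "lead-vocal" ]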

As technology advances and better mirrors real life, music can regain the fluidity it once had before analogue recording technology locked it into static forms.

Transmutable Music isn’t just about adaptive music playback—it provides new tools to inspire songwriting and storytelling techniques. By embracing these interactive approaches, artists can push the boundaries of what a song can be. They can blend narrative, experience, and emotion in ways that traditional recorded music forms cannot.

The book provides an accessible model for composers, musicians, and technologists to create music that can change and adapt, breathing new life into the experience of composition and listening. It’s particularly suited for video games, apps, and interactive media, where music can be an integral part of expansive, evolving stories.

Dr. Redhead's book serves as both a theoretical exploration and a practical guide for musicians interested in pushing the boundaries of interactive, or Transmutable, Music. While traditional recorded music remains important, she argues that embracing technologies such as VR, AR, and data-driven systems allows us to reimagine how music is composed, performed, and experienced in the 21st century, ultimately expanding the possibilities of what songs and musical storytelling can achieve.

The Semantic Machine - An App for Contextual Listening

The Semantic Machine is a proof of concept demonstrating a new approach to songwriting and Transmutable Music. The project was developed by Dr Tracy Redhead and Dr Florian Thalmann. It uses new technologies to create a song, or song system, that changes based on the weather, time of day, and listener location; it's as if the song has a mind of its own.

Originally developed to explore music and semantic AI technologies, the mobile app offers a unique listening experience by adapting music and lyrics to the user's contextual environment. The song was composed by Dr Redhead and uses a multi-hierarchical mapping system and feature-based mixing, ensuring that no two listening sessions are the same.
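
As an illustration only, the sketch below shows one way such a feature-based mapping could work; the app's actual mapping system and parameter values are not published here, so the stems, features, and numbers are invented.

// Illustrative sketch: rules translate context features into mix parameters.
interface ContextFeatures {
  temperatureC: number; // from a weather service
  cloudCover: number;   // 0 (clear) to 1 (overcast)
  hour: number;         // local time of day, 0-23
}

// One rule: a stem's mix parameter driven by a normalised context value.
interface Mapping {
  stem: string;
  parameter: "gain" | "filterCutoffHz";
  control: (ctx: ContextFeatures) => number; // returns 0-1
  min: number;
  max: number;
}

const mappings: Mapping[] = [
  // Warmer weather pushes the bright synth layer louder.
  { stem: "synth", parameter: "gain", min: 0.2, max: 1.0,
    control: (c) => Math.min(Math.max((c.temperatureC - 5) / 30, 0), 1) },
  // Heavier cloud cover darkens the pad by lowering its filter cutoff.
  { stem: "pad", parameter: "filterCutoffHz", min: 400, max: 8000,
    control: (c) => 1 - c.cloudCover },
  // Night-time listens bring the vocal slightly forward.
  { stem: "vocal", parameter: "gain", min: 0.6, max: 0.9,
    control: (c) => (c.hour >= 20 || c.hour < 6 ? 1 : 0.4) },
];

// Resolve every rule into concrete parameter values for this listen.
function resolveMix(ctx: ContextFeatures) {
  return mappings.map((m) => ({
    stem: m.stem,
    parameter: m.parameter,
    value: m.min + (m.max - m.min) * m.control(ctx),
  }));
}

console.log(resolveMix({ temperatureC: 28, cloudCover: 0.7, hour: 22 }));

Because such rules would be re-evaluated at playback time, the same composition resolves to a different mix whenever the listener's context changes.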

The Semantic Machine was initially developed as part of the Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption (FAST) project. Funded by the EPSRC, this five-year initiative included partners like Queen Mary University of London, The University of Nottingham, and the BBC. The project was designed to enhance both the production and consumption of recorded music, making these processes more engaging, efficient, and automated.

The app's prototype was launched at Abbey Road Studios in London in October 2018, during the FAST Industry Launch, and has since been developed further through residencies at Queen Mary University of London and Kyoto University. Recently, additional funding from the Western Australian Government has helped push the app into its final stages of development, with its official release expected soon.

Read more here

Exploring the Ethical Dimensions of AI and Data

Beyond its technical innovation, The Semantic Machine explores broader ethical questions related to AI, data profiling, and social privilege. The project is a metaphor for surveillance capitalism, raising critical concerns about the monetisation of personal data and the resulting loss of autonomy. The app is both a technological and a conceptual artwork: the ever-changing music reflects the influences of location, weather, and time, and serves as a warning about how our personal data is used in today's digital economy.

Through this changing artwork, users are reminded of the importance of staying informed and engaged in discussions around the ethics of AI and data collection. By embodying these themes in the form of music, The Semantic Machine provides a thought-provoking commentary on the relationship between human autonomy and AI algorithms. It invites listeners to reflect on how much of our inner lives are freely given away to surveillance-driven business models that shape modern society.



This research project by Dr Tracy Redhead explores how adaptive technologies reshape music, introducing dynamic compositions that change with each listening experience. 

Contact Dr Tracy Redhead


Phone

08 6488 8163