Stitched – audience adaptive composition performed with the LCO soloists at Bridgepoint Rye
This piece, performed for the opening of a new installation by Tim Hopkins showing the Hastings Tapestry, was audience adaptive: as the audience moved around the hall, the music altered and changed in response. I performed it with a small ensemble of London Contemporary Orchestra soloists on viola da gamba, alto flute, violin and sackbut.
I also created a soundtrack for the wider installation which runs over the summer. Here is a video by the BBC about it.
The Eternal Golden Braid – audience adaptive composition collaborating with AI, performed by the LCO soloists at Barbican
I composed a piece which ended this performance lecture by Marcus Du Sautoy at the Barbican on March 9th 2019.
In a pre-recorded interview with Douglas Hofstadter, whose book Gödel, Escher, Bach inspired this event, Du Sautoy posits Bach as a musical coder, applying rules to musical material to grow something complex and beautiful. Bach’s works were then fed through a machine learning process created by computational artist Parag K Mital, which uses the data in Bach’s writing to ‘compose’ a piece of its own. I then adapted and developed some of the AI output into a piece which could react to the audience.
The piece was performed by a string trio of London Contemporary Orchestra soloists.
WDCH Dreams with LA Phil, Refik Anadol and Google Arts & Culture
I created this soundtrack in collaboration with Kerim Karaoglu, Parag Mital and Refik Anadol Studio for the LA Phil’s 100th anniversary gala event. The music and sound design features archival audio from the LA Phil’s huge catalogue alongside newly generated machine learning audio material, using a wide range of cutting-edge techniques, as well as new original music.
The event itself was a vast projection mapped spectacle on all surfaces of the Walt Disney Concert Hall in Los Angeles. The visuals were also generated using many elaborate machine learning and data art techniques.
Fantom x Mezzanine – Massive Attack
I created custom algorithmic remixes for this sensory remixer app release of Mezzanine, Massive Attack’s seminal album, on its 21st anniversary. This was a collaboration with Rob Del Naja, Marc Picken, Andrew Melchior, Yair Szarf, The Nation, Joel Vaiser, N2K, Hingston Studio and Blokur. The app lets users navigate custom remixes of each song based on how they interact with it through their phone’s sensors. For example, you can film an Instagram video and the music adapts in realtime to the activity in the image. It also responds to many other sensor inputs, including movement, touch, facial expression recognition and some secret hidden easter eggs. The app is an early experiment exploring the possibilities for music in augmented reality. It also tracks all playback of the core musical stem material on the blockchain.
Pixel Records Sci Fi Supermall – Soundtrack to event in Berlin composed with AI systems. With Boiler Room and Google.
I composed music using AI / machine learning systems for a soundtrack to an amazing launch event in Berlin by Boiler Room and Google for their Pixel 2 phone.
We used many machine learning processes in the music composition – including Google Magenta / TensorFlow. A website system took recordings of people’s voices and, through a TensorFlow style-transfer type algorithm, created a new snippet of music from the contour of each voice.
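The actual pipeline used a trained model, but the core idea of deriving music from a voice contour can be sketched very simply: detect the pitch of the voice over time, then snap each detected frequency to the nearest note of a scale. The scale choice and function names below are illustrative assumptions, not the project's real implementation.

```python
import math

A4 = 440.0
C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of the C major scale

def freq_to_midi(freq: float) -> int:
    """Convert a frequency in Hz to the nearest MIDI note number."""
    return round(69 + 12 * math.log2(freq / A4))

def quantize_to_scale(midi_note: int, scale=C_MAJOR) -> int:
    """Snap a MIDI note to the nearest pitch class in the given scale."""
    for offset in range(12):
        for candidate in (midi_note - offset, midi_note + offset):
            if candidate % 12 in scale:
                return candidate
    return midi_note

def contour_to_melody(contour_hz):
    """Map a voice pitch contour (a list of Hz values) to scale notes."""
    return [quantize_to_scale(freq_to_midi(f)) for f in contour_hz]
```

Feeding a rising vocal contour into `contour_to_melody` yields a rising, in-key melody, which is the basic effect the installation's style transfer produced in a far more sophisticated way.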
Rebel Queen – soundtrack for VR experience. Richard Mills & Kim-Leigh Pontin, Sky UK
This was a soundtrack to a VR experience about Queen Nefertiti and ancient Egypt. It won a ‘Commendation’ at this year’s Venice Biennale. The soundtrack is adaptive to user interaction at a number of points in the narrative.
Ecliptic – Infinite 3d generative soundscape – for L-ISA Island by L-Acoustics
I was commissioned by L-Acoustics to write an infinite generative soundscape for their incredible Island sound system. Reminiscent of the soundtrack to a strange, very long, slow-paced art sci-fi movie, it is a hypnotising piece perfect for calm contemplation and focus.
The system sounds amazing and features 23 speakers completely surrounding the listener, 2 subwoofers and 24 power amplifiers!
On Your Wavelength – Pulling Out All The Stops, by Marcus Lyall with soundtrack by Robert Thomas. For Leeds Light Night Festival.
This was a new version of Marcus Lyall’s laser artwork On Your Wavelength for Leeds Light Night. We used an EEG headset to control the algorithmic playback of a piece for the Leeds Town Hall pipe organ alongside a vast laser light display. The organ was played by an Orgamat automatic organ-playing machine, driven directly from Pure data over MIDI. The EEG controlled the composition in a very detailed, note-by-note way – moving through different modes, keys and melodies based on the brainwaves of the participant.
Above are some images of the EEG > BrainwaveOSC > Pure data > MIDI > Orgamat setup. Big thanks to Klaus Holzapfel from Organola.de for this!
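The note-by-note control described above can be sketched outside Pure data. This is an illustrative assumption of one plausible mapping, not the actual patch: an attention value from the EEG selects a mode (darker to brighter) and widens the melodic range as focus increases, producing the MIDI note numbers that would be sent on to the Orgamat.

```python
# Hypothetical mapping: EEG attention in [0, 1] -> mode + register.
MODES = {
    "aeolian":    [0, 2, 3, 5, 7, 8, 10],   # low attention: darker mode
    "dorian":     [0, 2, 3, 5, 7, 9, 10],
    "mixolydian": [0, 2, 4, 5, 7, 9, 10],
    "ionian":     [0, 2, 4, 5, 7, 9, 11],   # high attention: brighter mode
}
MODE_ORDER = ["aeolian", "dorian", "mixolydian", "ionian"]

def next_note(attention: float, step: int, root: int = 48) -> int:
    """Pick the next MIDI note number for the organ: attention selects
    the mode, and how far up the scale the melody may climb."""
    attention = min(max(attention, 0.0), 1.0)
    mode = MODES[MODE_ORDER[min(int(attention * 4), 3)]]
    span = 1 + int(attention * (len(mode) - 1))  # wider range when focused
    return root + mode[step % span]
```

An unfocused participant hears a single repeated root note; full attention unlocks all seven degrees of the brightest mode.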
On Your Wavelength – Canary Wharf Winter Lights Festival, by Marcus Lyall in collaboration with Robert Thomas and Alex Anpilogov
I created a new adaptive soundtrack for this piece, which has developed dramatically since its last incarnation over a year ago. This version uses over 30,000 LEDs controlled by your brain – the music system creates a unique composition which you ‘play’ through how much you concentrate.
BBC Click Live
I composed an adaptive soundtrack for part of the BBC Click Live show in November 2016. The score adapted to the biometric data of the audience. It became more intense as their emotions became heightened. This project used data from XO Studio’s wristband sensors which 80 members of the audience were wearing.
Pzizz, Sleep App Music System
I worked with Pzizz to create the latest version of their hybrid music system. The app features human-composed but algorithmically remixed music and sound; in this case the human composer was the very talented Ethan Cohen. The system learns over time to prioritise music which the user likes. The app has over 500,000 users and even J.K. Rowling is a fan! Try it out on iOS.
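Pzizz's actual learning system is not public, but the "prioritise music the user likes" idea can be sketched as weighted sampling with feedback: each piece keeps a weight, positive feedback boosts it, and future sessions sample in proportion. Class and method names here are invented for illustration.

```python
import random

class PreferencePlayer:
    """Toy preference learner: likes raise a piece's play probability."""

    def __init__(self, pieces):
        self.weights = {p: 1.0 for p in pieces}  # start all pieces equal

    def like(self, piece, boost=0.5):
        """Positive feedback increases this piece's future weight."""
        self.weights[piece] += boost

    def pick(self, rng=random):
        """Sample the next session's piece in proportion to its weight."""
        pieces = list(self.weights)
        return rng.choices(pieces, weights=[self.weights[p] for p in pieces])[0]
```

Over many sessions, pieces the user likes come to dominate playback while unliked ones still appear occasionally, which keeps the catalogue from collapsing to a single track.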
Hear the City App, CISCO – Adaptive Soundtrack to the city of Rio – Porto Maravilha, Rio Olympics 2016
In Aug 2016 I was commissioned by CISCO to compose an adaptive soundtrack to the city of Rio – Porto Maravilha – as part of CISCO’s connected city platform and its role as an official supporter of the Rio 2016 Olympic and Paralympic Games. The soundtrack creates music in realtime, responding to live city data about connectivity, traffic, happiness, weather and time. This project was in conjunction with yDreams Brasil. More details here.
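The data-to-music mapping can be sketched as a simple normalise-and-map step: each live stream is clamped to a known range, then translated into a musical parameter. The field names, ranges and parameter choices below are invented for illustration; the real system's mappings were far richer.

```python
def city_to_music(data: dict) -> dict:
    """Map a snapshot of (hypothetical) city data fields to music parameters."""
    traffic = min(max(data.get("traffic", 0.0), 0.0), 1.0)     # 0 = empty roads
    happiness = min(max(data.get("happiness", 0.5), 0.0), 1.0)  # sentiment score
    hour = data.get("hour", 12) % 24
    is_day = 6 <= hour < 18
    return {
        "tempo_bpm": 70 + 50 * traffic,           # busier roads, faster pulse
        "note_density": 0.2 + 0.8 * traffic,      # more events when congested
        "brightness": happiness,                   # happier city, brighter timbres
        "register": "high" if is_day else "low",  # darker register at night
    }
```

These parameters would then drive a generative engine, so a rainy midnight traffic jam and a sunny quiet morning produce audibly different music.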
Fantom Sensory Music App, with Massive Attack
I worked together with Rob Del Naja, Marc Picken, Andrew Melchior, Yair Szarf and The Nation on this sensory music experience app, which features new material from Massive Attack. The app analyses live video from the device camera, the user’s heart rate (via the Apple Watch), accelerometer activity, time of day and social media streams to create a unique remix of the music for each user on every listen.
Check out an interview I did alongside Rob Del Naja from Massive Attack in VICE about the project here.
Hear and Now, Mindful Breathing App, by Biobeats
I created the adaptive music system for this app, which helps you do breathing exercises to reduce stress. The app monitors your heart rate variability using the phone’s light and camera to find out how well you are doing the exercise. If you are doing it well, the music expands as a reward. The music system algorithmically creates a different arrangement for each session. You can read more details and get the app here.
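The "music expands in reward" mechanic can be sketched as layers unlocked by an HRV coherence score. The thresholds, layer names and function below are assumptions for illustration only; the app's real mapping is not public.

```python
# Layers are added one by one as the breathing exercise goes better.
LAYERS = ["pad", "bass", "melody", "shimmer"]

def active_layers(coherence: float) -> list:
    """Return which musical layers play for an HRV coherence score in
    [0, 1] — the better the exercise goes, the fuller the music."""
    coherence = min(max(coherence, 0.0), 1.0)
    count = 1 + int(coherence * (len(LAYERS) - 1))
    return LAYERS[:count]
```

Tying the reward to a continuous score rather than a pass/fail threshold means the arrangement thins out gradually when the user loses the breathing rhythm, instead of cutting off abruptly.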
On Your Wavelength, by Marcus Lyall in collaboration with Robert Thomas and Alex Anpilogov
An interactive laser and music composition you control with your mind. On this project I collaborated with Marcus Lyall, the artist responsible for the award-winning stage visuals for the Chemical Brothers and Metallica. It was commissioned for the MERGE 2015 arts festival in Bankside, London. Participants wear an EEG headset which monitors their attention levels; this then controls many aspects of the lasers and the soundtrack. Alex Anpilogov created a method to directly control Marcus’ laser compositions based on the EEG signals, and I composed and built the adaptive soundtrack using Pure data. More details here.
This piece was exhibited from Sept 18th to Oct 18th 2015.
Mindsong, EEG controlled meditation app with Mikey Siegel
Mindsong is an app which helps you meditate better. It uses meditation signals from an EEG headset to control a realtime adaptive music soundtrack which I composed. The project was created by Mikey Siegel, with Beau Silver as tech lead. It was built in Pure data and distributed using LibPd on iOS.
The Brain Show, National Geographic Channel
I composed adaptive music and sound design for this virtual reality exhibit for the National Geographic show Brain Games. Visitors spoke their name into the software and had their face scanned in 3D. They then wore an Oculus Rift headset incorporating an EEG headset, which allowed them to explore the power of their brain in virtual reality. As they concentrated more, the music became more intense, and their head movements and visual focus re-constructed the geometry of their face around them. The exhibit featured 3 different adaptive music compositions, each reacting to many inputs, including brain activity type, attention, head movement and body position. The system was built in Pure data.
Biobeats, Adaptive music composition and sound design, iOS, Android
I am working with Biobeats to create engaging, healthy adaptive audio experiences. Get on Up is an app which makes any music react and adapt to your running pace in realtime. Hear and Now is a mindful breathing app which helps you reduce stress levels by monitoring your heart rate. Both apps sync data to the BioBeats machine learning API, which creates realtime feedback on health. They were built in Pure data and distributed using LibPd on iOS.
“Future User” in DIS magazine by Lil’ Data – PC MUSIC
A lil collaboration with @lildata, @Atour and @enzienaudio. This was the first web deployment of an interactive piece using the Heavy procedural audio pipeline by Enzien Audio, which compiled a Pure data patch into code that runs in the browser. In addition, thanks to @Atour’s code, multiple users can change the music simultaneously by toggling words in the text on the webpage. All the audio runs procedurally in the browser.
Source : https://github.com/lil-data/futureuser
Arboreal Lightning – Imogen Heap’s Reverb Festival
Commissioned by Imogen Heap for her Reverb 2014 festival. I worked with Artists & Engineers and Sennheiser to create an adaptive, interactive 3D soundscape using ambisonics for the large LED tree structure designed by Atmos, which was the centrepiece of the festival.
Visual software design for the tree was by Adam Stark.
Projects while CCO of RjDj :
Imogen Heap – Run-Time App.
This was a prototype adaptive music running / jogging app I worked on with Imogen Heap. It was the first example of an adaptive music system specifically for running.
The Dark Knight Rises Z+ App, iOS.
The Dark Knight Rises app featured multiple cues from the movie, reimagined to work adaptively and respond to many aspects of your life. It also contained The Bat, a very dynamic sound design piece, and multiple reactive remixes, including one by Junkie XL.
Dimensions – Augmented Audio Reality game, iOS.
A pioneering audio augmented reality game by RjDj.
I created the music, sound design, directed voice acting and designed the music system. The app also featured the Ghost Dimension composed by Hans Zimmer.
Inception the App – Augmented Audio Reality App, iOS.
Official app experience extension of the film Inception.
With Christopher Nolan, Hans Zimmer and Michael Breidenbrucker.
Over 6 million downloads worldwide. No. 1 in the App Store in many countries.
RjDj iPhone App, iOS
Reactive Music Production with many artists including AIR, Carl Craig, Little Boots, Booka Shade, Jimmy Edgar, Chiddy Bang, Fabrice Lig, Kirsty Hawkshaw, Acid Pauli / Console and many more.
Love by AIR iPhone App, iOS.
Working with AIR to produce an interactive version of their song Love. Released on Valentine’s Day.
Little Boots Reactive Remixer iPhone App, iOS.
Working with Little Boots to produce interactive versions of 3 songs from her debut album Hands.
Rj Voyager iPad App, iOS.
Reactive Music Production / UX design on this touch control synthesis / music exploration app for the iPad.
Kids on DSP iPhone App, iOS.
Reactive Music Production on the first iPhone-only album-as-app release, pioneering the distribution of an album as mobile app software.
All projects at RjDj were built using RjDj’s series of custom ports of Pure data to iOS. These had some commonality with LibPd but also featured other optimisations specific to iOS and the apps in question.
Older projects :
Fluidity – Wii game early stage soundtrack, Curve Studios, eventually published by Nintendo
Explodemon! – PC / PS3 GDC announcement trailer soundtrack, Curve Studios
Czech National Radio – Radio news, weather and traffic themes
Revlon – Advertising music, with Sugar Cube Studios
Equanimity – Sound installation audio engineering for the first official holographic portrait of HM Queen Elizabeth II, with Leanda Brass
Visit Mexico Virtual 3d Soundtrack – The first soundtrack to the official virtual 3d presence of a country, commissioned by Mexico
Costa Rica virtual 3d soundtrack, commissioned by Costa Rica
PARSEC – multi user voice reactive music game, commissioned by Manning Press
The Dunes – co-writer, arranger and producer in this duo with singer Sarah-Jane Taylor, working with Dave Rowntree’s (Blur) Transistor Project label