
A dispatch from the Historio-Salvaging and Upcycling wing of 65LABS today. This is the first of some unknown number of 'Research Releases' (unless you count Procedural Piano During a Storm 300621 in which case it is the second) which are defined here at the lab as pieces of music that deserve better than to be carelessly thrown into the ephemeral daily churn of the internet, but are too niche or experimental in one regard or another to qualify as a fully fledged release into the wider world.

Conceptually, The MIDI Thief is something like an audio monograph. It explores one very specific thing, and that thing is an algorithm of our design. It is a tale of historical materialism, a lack of respect for musical authenticity, and a general scepticism of ‘A.I. music’, but a warm embrace of human-machine interaction.

So where to begin?

BACKSTORY

Google are terrible for many, many reasons. Nevertheless, their Magenta project has attracted a lot of music-oriented computer scientists to do a lot of research with neural nets, training models to teach computers ‘what music is’ so that the computer can then write music by itself.

A good example of music made using neural nets and 'artificial intelligence' is the output from Dadabots. They fed a neural net a lot of death metal, and now look: there’s an infinite stream of algorithmically-generated death metal: https://youtu.be/MwtVkPKx3RA.

Some researchers at Magenta were doing something similar with lots of classical piano music, teaching a computer how to make music that sounds like serious, long-dead white guys playing the piano. And then the computers could spit out endless iterations of serious classical piano music. On many levels this is valuable research, but is it valuable music? Here at 65LABS, we demand more from our definitions of music. It is more than just audio information dictating speaker vibrations. We feel that the actual sound moving through the air into our ears is only a tiny fraction of what music is. It also necessarily has to include intention, agency, feeling, and can never exist as some objective, material thing, but can only manifest itself through a social context, embedded in shared affect, understanding, emotions, and relations. The neural nets are not being trained on any of this though. They're just into notes and waveforms and stuff like that.

SO. While looking at some Magenta research, because we’re suckers for music making techniques we don’t really understand (see: the entire 65daysofstatic back catalogue), we saw that there was a massive zip file of all of the piano MIDI that they used to train their neural net. (Side note: apparently this data was gathered by exploiting virtuoso pianists, alienating them from their cultural productions, under the guise of an international piano competition where they recorded all the performances. Classic Google behaviour, scraping data from the unpaid labour of people just trying to exist in the world.)

For non-musicians: a MIDI file contains no actual sound, only digital data. If you play a note on a MIDI keyboard, a computer recording it can capture the note’s pitch, its velocity, and its ‘note on’ and ‘note off’ times relative to all the other notes. (There’s other information too, but that’s not important right now.) So a computer recording a virtuoso pianist can capture the human expression of the piece in a highly detailed way. Not the dry, quantised notes that you would hear if you programmed MIDI in a sequencer, but the loose timing and dynamics and those milliseconds of human fallibility that even virtuosos are subject to. And unlike recording the audio, by recording the MIDI data it is trivially easy to subsequently edit or change this data on a note-by-note basis.
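As a rough illustration (our own sketch, not anything from the Magenta archive), a captured performance can be thought of as a list of note events, each carrying the pitch, velocity, and timing described above. The `NoteEvent` class and the example values are invented for this sketch:

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int       # MIDI note number, 0-127 (60 = middle C)
    velocity: int    # how hard the key was struck, 1-127
    on_time: float   # seconds from the start of the performance
    off_time: float  # when the key was released

# Two notes of a 'human' performance: note the loose, un-quantised
# timings and uneven velocities a sequencer grid would flatten out.
performance = [
    NoteEvent(pitch=60, velocity=92, on_time=0.000, off_time=0.512),
    NoteEvent(pitch=64, velocity=71, on_time=0.498, off_time=1.020),
]

# Editing the data note-by-note is trivial compared to editing audio,
# e.g. transposing everything up an octave (+12 semitones):
transposed = [
    NoteEvent(e.pitch + 12, e.velocity, e.on_time, e.off_time)
    for e in performance
]
```

The point being: the expressive part of the performance (velocity, timing) survives any amount of this kind of surgery on the pitches.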

So. 65LABS emancipated (well, downloaded) the files from the Magenta archive and then we started scheming.

THE MIDI THIEF

We don’t have any artificial intelligence at 65LABS. Often, we barely have any actual intelligence. But what we do have is a desire to bend computers to our will. And so we took the MIDI archive and we fed it into an algorithm of our own making. Like most of 65LABS algorithms, the logic itself is deceptively simple.

1. The MIDI Thief algorithm opens up one of the MIDI files from the archive and starts to play it at a random point.

2. The Thief also has a pool of possible notes, chosen by us.

3. It creates an empty dictionary to hold data. (A data dictionary, much like a regular one, holds multiple entries. Each entry has a ‘key’ and a ‘value’. If you give the dictionary a key that matches one of the entries, it will return the associated value.)

4. Every time a MIDI note reaches The Thief, it checks its dictionary, using the note as a key, to see if there is already an entry for it. If there isn’t, it creates a new entry, sets its value to a random note picked from the pool of possible notes (transposed to the appropriate octave based on the incoming note’s octave), and returns this value. If there is already an entry for that note’s key, it returns that key’s value.

5. What this means is that the notes of the MIDI file are swapped out in such a way that they create a brand new set of phrases and melodies, but then those phrases and melodies remain consistent for the rest of the piece. And, although it swaps out all of the note pitches, it retains their velocities and timings.

6. The result of this is perhaps the most human-sounding generative music we have made so far, because The Thief stole the humanity from those poor, unwitting virtuoso pianists, and disguised it as its own.
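For the curious, the core of the logic above can be sketched in a few lines of Python. To be clear, this is our reconstruction from the description, not 65LABS's actual code; in particular, keying the dictionary on pitch class (so a stolen melody stays consistent across octaves) and the shape of the note pool are assumptions:

```python
import random

class MidiThief:
    """Swap incoming pitches for pool notes, consistently per pitch."""

    def __init__(self, pool, seed=None):
        self.pool = pool              # pitch classes (0-11), chosen by us
        self.mapping = {}             # incoming pitch class -> stolen pitch class
        self.rng = random.Random(seed)

    def steal(self, pitch):
        """Return the substitute pitch for an incoming MIDI note number."""
        octave, pitch_class = divmod(pitch, 12)
        if pitch_class not in self.mapping:
            # First time we've seen this note: pick a random replacement
            # from the pool and remember it for the rest of the piece.
            self.mapping[pitch_class] = self.rng.choice(self.pool)
        # Transpose the stolen note back into the incoming note's octave.
        # Velocity and timing would pass through untouched elsewhere.
        return octave * 12 + self.mapping[pitch_class]
```

Because the mapping is filled in lazily but never changed, the first pass through a phrase invents a new melody and every later repetition of that phrase plays the same new melody, which is why the output holds together as a piece rather than a pitch salad.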

If the 65LABS Artist Manifesto wing wasn't closed (due to cuts), we could tell you how this piece of art is a profound statement on the unpaid digital labour we are all always doing for Google, and in particular on the precarious and exploited nature of contemporary musicians, forced to pit themselves against each other in the name of competition. But the truth is we only came up with that framing just now, so we're not sure it counts.

THE RECORDING

The MIDI Thief, as a piece of music that we are presenting to you, is a realtime recording, edited lightly for clarity and to remove some evil frequencies. The Thief is doing its thing, disguising stolen ideas as new music (spoiler: THIS IS HOW ALL MUSIC IS WRITTEN), and we are tinkering with the instrumentation as it plays. The background drone is a secondary, simple generative patch, similar to the more soundscape-y Wreckage Systems you can hear on the stream.

THIEF FUTURES

As mentioned previously, getting The MIDI Thief algorithm into the stream for real was originally a goal of ours, but our research shows that it is definitely a machine that benefits from constant human curation. With the recording presented here we got lucky, but some outputs are a bit muddled, and leaving The Thief to its own devices on the stream feels a bit risky. In the meantime, there is a Thief Cuts system running on V2, which uses various bits of The MIDI Thief's recorded output as samples to feed a more generic Wreckage System algorithm.


[Note: If you would like a Bandcamp code for this Research Release, please DM us here. If a lot of you do, then we will email codes to everybody, as it'll be easier than replying to you all individually. Otherwise we'll sort through the DMs in the next couple of days - 65LABS DOWNLOAD CODE MANAGEMENT BRANCH]

*****

[UPDATE: OK, DOWNLOAD CODES WILL BE EMAILED OUT IN THE NEXT 24HRS OR SO - 65LABS DOWNLOAD CODE MANAGEMENT BRANCH]

*****

Comments

Michael Hogan

Bandcamp code would be great. Thanks!

Veselin Nikolov

I will also appreciate Bandcamp code, please. Thank you!

adrian curtis

Yes please for a bandcamp code!

Anonymous

Another +1 for BandCamp code please chaps

65daysofstatic

OK UPDATE - DOWNLOAD CODES COMING VIA EMAIL IN NEXT 24HRS OR SO

Richard Bennett

All the Bandcamp codes, always, please 🙂

Anonymous

Can't wait to listen to this in the bath.

Anonymous

Me, too - the Bandcamp codes, that is.

Peter Hollo

Always les codes de Bandcamp are appreciated.

Marshalsea

Bandcamp us up!

SolSys

Wonderful post, and wonderful approach to generative computer music!! Really appreciate the care taken against settling for sticking 2sec waveforms into a markov chain. Each wreckage system has its own flavor of 65Labs cyberhumanity, ofc, but these thieved midis are a big step forward!

Anonymous

A Bandcamp code would be great. Thanks!

Stephen Hurley

The transition from plucks to bells (20->22 minute mark) is so well composed, with a clear sense of building to something and great dynamic shifts. really human sounding. and then the computer botches it by abruptly dropping the plucks straight out of the mix 10 seconds short of the 22 minute mark, revealing the dead hand of the algorithm mindlessly making choices in the background.

Chris Adam

But what if the algorithm likes that? Who are we to impose our own restrictions on its budding young creativity?

adrian curtis

The bandcamp version is shorter than the one posted earlier in case anyone wanted to know!