[MIDI and RDF logos]

When I first heard about Albert Meroño-Peñuela and Rinke Hoekstra’s midi2rdf project, which converts back and forth between the venerable Musical Instrument Digital Interface binary format and RDF, I thought it seemed like an interesting academic exercise. Thinking about it more, I realized that it makes a great contribution both to the MIDI world and to musical RDF geeks.

MIDI has been the standard protocol for connecting synthesizers and related musical equipment since the 1980s. I only recently threw out a book of MIDI specs that I had owned for nearly that long because, as with so many other technical specifications, they’re now available online.

Meroño-Peñuela and Hoekstra’s midi2rdf lets you convert between MIDI files and Turtle RDF. I love the title of their ESWC 2016 paper on it, “The Song Remains the Same” (pdf). I was pretty young when Led Zeppelin’s Houses of the Holy album came out, but I remember it vividly. The song remains the same because the project’s midi2rdf and rdf2midi scripts provide lossless round-trip conversion between the two formats, and that makes it a very valuable tool: it gives us a text file serialization of MIDI based on a published standard, rendering MIDI downright readable. Just by looking at these RDF files, without spending any serious time with the MIDI spec, I worked out which resources and properties were doing what and used that knowledge to create my own MIDI files.

As a somewhat musical RDF geek, I found this a lot of fun. I wrote Python scripts to generate Turtle files of various kinds of random music, then converted them to MIDI so that I could listen to them. (You can find it all on GitHub.) The use of random functions means that running the same script several times creates different variations on the music. Below you will find links to MP3 versions of what I called fakeBebop and two versions of some whole-tone piano music that I generated, along with the MIDI and RDF files that go with them.

Each MIDI file (and its RDF equivalent) starts with some setup data identifying information such as the sounds it will play and the tempo. Instead of learning all those setup details so that my program could generate them, I used the excellent Linux/Mac/Windows open source MuseScore music scoring program to generate a MIDI file with just a few notes for whatever instruments I wanted and then converted that to RDF. (This ability to convert in both directions is an important part of the midi2rdf package’s value.) Then I deleted the actual notes from that RDF and had my script copy the remaining setup part and append newly generated notes to it.
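
A minimal sketch of that copy-and-append step might look like the following Python, with hypothetical file names (the real setup Turtle came from running midi2rdf on the MuseScore output):

# Minimal sketch of the copy-and-append step; file names are hypothetical.
with open("pianoSetup.ttl") as f:    # setup triples, minus notes, via midi2rdf
    setup = f.read()

new_notes = ""   # the note-generating code sketched below fills this in

with open("generated.ttl", "w") as out:
    out.write(setup)       # copy the setup part verbatim
    out.write(new_notes)   # then append the newly generated notes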

In RDF terms, generating a note meant two things: adding a pair of mid:NoteOnEvent resources (one to start playing the note and one to stop it) and then adding references to those events to a musical track that lists the events to execute. So, for example, the first mid:NoteOnEvent in the following pair defines the start of a note at pitch 69, which is A above middle C on a piano. The mid:channel of 0 had been defined in the setup part, and the mid:tick value specifies how long the note will play until the next mid:NoteOnEvent. (I was too lazy to look up how the mid:tick values relate to elapsed time and picked some through trial and error.) The mid:velocity values essentially turn the note on and off.

p2:event0104 a mid:NoteOnEvent ;
    mid:channel 0 ;
    mid:pitch 69 ;
    mid:tick 400 ;
    mid:velocity 80 .


p2:event0105 a mid:NoteOnEvent ;
    mid:channel 0 ;
    mid:pitch 69 ;
    mid:tick 500 ;
    mid:velocity 0 .
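
A small Python helper along these lines could emit such a pair. This is just a sketch: the note_pair function and its parameters are my own invention, and the p2: and mid: prefixes are assumed to be declared in the setup part of the file.

def note_pair(event_num, pitch, on_tick, off_tick, velocity=80, channel=0):
    """Return Turtle for one note: a nonzero velocity starts it, velocity 0 stops it."""
    template = (
        "p2:event{num:04d} a mid:NoteOnEvent ;\n"
        "    mid:channel {channel} ;\n"
        "    mid:pitch {pitch} ;\n"
        "    mid:tick {tick} ;\n"
        "    mid:velocity {velocity} .\n\n"
    )
    on = template.format(num=event_num, channel=channel, pitch=pitch,
                         tick=on_tick, velocity=velocity)
    off = template.format(num=event_num + 1, channel=channel, pitch=pitch,
                          tick=off_tick, velocity=0)
    return on + off

print(note_pair(104, 69, 400, 500))   # emits the pair shown above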

As my script outputs mid:NoteOnEvent resources after the setup part, it appends references to them to an in-memory string that begins like this:

mid:pianoHeadertrack01 a mid:Track ;
    mid:hasEvent p2:event0000,
        p2:event0001,
        p2:event0002,
        p2:event0003,
        # etc. until you finish with a period

After outputting all the mid:NoteOnEvent events, the script outputs this string. (While the triples in this resource are technically unordered, rdf2midi seemed to assume that the event names are “event” followed by a zero-padded number. When an early version of my first script didn’t do this, the notes got played in an odd order; maybe it was just playing them in alphabetic sort order.)
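
Something like this Python sketch would build that string with the zero-padded numbering that rdf2midi appeared to expect (the track_event_list function is hypothetical; the track name comes from my setup part):

def track_event_list(track_name, event_count):
    """Build a track's mid:hasEvent list, zero-padding the event names."""
    refs = ",\n        ".join(
        "p2:event{:04d}".format(n) for n in range(event_count))
    return "mid:{} a mid:Track ;\n    mid:hasEvent {} .\n".format(track_name, refs)

print(track_event_list("pianoHeadertrack01", 4))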

That’s all for just one track. My fakeBebop script does this for three tracks: a bass track playing fairly random quarter notes in the range of an upright bass, a muted trumpet track playing fairly random triplet-feel eighth notes (sometimes with a rest substituted), and a percussion track repeating a standard bebop ride cymbal pattern. You can see some generated Turtle RDF at fakeBebop.ttl, the MIDI file generated from the Turtle file by rdf2midi at fakeBebop.mid, and listen to what it sounds like at fakeBebop.mp3.

By “fairly random” I mean a random note within 5 half steps (a perfect fourth) of the previous note. Without any melodies beyond this random selection of notes, I think it still sounds a bit beboppy: as the early bebop pioneers added more complex scales to the simple major and minor scales played by earlier jazz musicians, the music got more chromatic.
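
In Python, that kind of random walk can be as simple as the following sketch (the pitch range boundaries here are illustrative, not the ones my scripts used):

import random

def next_pitch(previous, low=40, high=80):
    """Pick a random pitch within 5 half steps of the previous one."""
    return max(low, min(high, previous + random.randint(-5, 5)))

pitch = 69                # start on A above middle C
melody = []
for _ in range(8):        # eight fairly random notes
    pitch = next_pitch(pitch)
    melody.append(pitch)
print(melody)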

I have joked with my brother about how if you quietly play random notes on a piano with both hands using the same whole tone scale, it can sound a bit like Debussy, who was one of the early users of this scale. My wholeTonePianoQuarterNotes.py script follows logic similar to the fakeBebop script but outputs two piano tracks that correspond to a piano player’s left and right hands and use the same whole tone scale. You can see some generated Turtle RDF at wholeTonePianoQuarterNotes.ttl, the MIDI file generated from that by rdf2midi at wholeTonePianoQuarterNotes.mid, and hear what it sounds like at wholeTonePianoQuarterNotes.mp3.
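
Picking the random notes is easy because a whole tone scale is just every other half step. The following sketch splits one whole tone scale at middle C for the two hands; the register boundaries are illustrative.

import random

# All the even-numbered MIDI pitches form one whole tone scale
# (middle C is 60), so the scale is just every other pitch number.
whole_tone = list(range(36, 84, 2))
left_hand = [p for p in whole_tone if p < 60]    # below middle C
right_hand = [p for p in whole_tone if p >= 60]  # middle C and up

# Four quarter notes: one random pitch for each hand per beat.
measure = [(random.choice(left_hand), random.choice(right_hand))
           for _ in range(4)]
print(measure)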

Before writing the whole tone piano quarter notes script, I did one with random note durations, so it sounds like something from a bit later in the twentieth century. Generated Turtle RDF: wholeTonePiano.ttl; MIDI file generated by rdf2midi: wholeTonePiano.mid; MP3: wholeTonePiano.mp3.

I can think of all kinds of ideas for additional experiments, such as redoing the two piano experiments with the four voices of a string quartet or having the fakeBebop one generate common jazz chord progressions and typical licks over them. (Speaking of string quartets and Debussy, I love that Apple iPad Pro ad that NBC showed so often during the recent Olympics.) It would also be interesting to try some experiments with Black MIDI (or perhaps “Black RDF”!). If I had pursued these ideas, I wouldn’t be writing this blog entry right now, because I had to cut myself off at some point.

I recently learned about SuperCollider, an open source Windows/Mac/Linux IDE with its own programming language that several serious electronic music composers use to generate music, and I could easily picture spending all of my free time playing with it. At least midi2rdf’s RDF basis gave me the excuse of having a work-related angle as I wrote scripts to generate odd music. Although I was just slapping together some demo code for fun, I do think that midi2rdf’s ability to provide lossless round-trip conversion between a popular old binary music format and a readable standardized format has a lot of potential to help people making music with computers.