There’s an amazing, easy-to-use music production and performance capability built into your iPhone or iPad that you probably have no idea is there.
The technology is called CoreMIDI, and it was brought to iOS from Mac OS when iOS 4.2 was released, way back around 152 B.C. (aka 2010).
MIDI is an industry standard that describes music in terms of the performance events that happen in real time, instead of the actual audio waveforms. MIDI messages describe the physical act of playing the instrument: play this note, stop playing this note, change the volume, bend the pitch.
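If you’re curious what those messages actually look like on the wire, a note-on is just three bytes: a status byte, a note number (A4 is note 69), and a velocity. Here’s a purely illustrative sketch in Python — not CoreMIDI itself, just the raw bytes that any MIDI app or device ultimately exchanges:

```python
# Illustrative sketch of raw MIDI channel messages (not CoreMIDI code).
# A note-on is three bytes: status, note number (0-127), velocity (0-127).

def note_on(note, velocity=100, channel=1):
    """Status 0x90-0x9F means note-on; A4 is note number 69."""
    return bytes([0x90 | (channel - 1), note, velocity])

def note_off(note, channel=1):
    """Status 0x80-0x8F means note-off."""
    return bytes([0x80 | (channel - 1), note, 0])

print(note_on(69).hex())   # "play A4"  -> 904564
print(note_off(69).hex())  # "stop A4"  -> 804500
```

That’s the entire vocabulary of “play this note” — three bytes, no matter which instrument sent it or which synth receives it.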
Because it’s an industry standard, a huge ecosystem grew up around it: hardware synthesizers from companies like Roland and Moog, more recently pure software synthesizers for desktop computers from companies like Native Instruments, and iOS apps like SampleTank (from IK Multimedia) and ThumbJam.
For every app I write that makes sound, like Trapezoid, Irish Flute, World of Bagpipes, and the Hohner SqueezeBox accordion apps, I also release a CoreMIDI controller version that lets you use the skills you already have playing these instruments in real life to play other kinds of sounds instead of having to use a piano keyboard.
These apps are called controllers because they control something else instead of making sound on their own. They’re the equivalent of the keyboard you’re using to type on your computer. A computer keyboard does nothing on its own until it’s plugged into something else (your computer) that can take the stream of letters you type and turn them into something useful.
A MIDI controller or keyboard is exactly the same. If you want to use it to make music, you have to plug it into something else that can take the stream of event data coming from the controller and turn those note-on and note-off events into musical tones.
The “something else,” in the case of my apps, is a synthesizer app like ThumbJam or SampleTank (and there are many, many others) that takes the stream of MIDI data generated by my apps and turns the note-on and note-off events described in the stream into sound.
It is also possible, using a hardware MIDI interface (like the IK Multimedia iRig MIDI), to send the MIDI data from the iPad to another hardware device, like a rack-mounted Roland synthesizer module, or even a PC or Mac running its own synthesizer program like Kontakt, and have it convert the note-on and note-off events into music.
So, in Trapezoid MIDI, when you hit the A4 string on the treble bridge, my app sends out a command to iOS that says “He hit the note A4”.
The other synthesizer apps listen for these events and when they receive them, they play the notes described in the command.
So if ThumbJam is running, and in ThumbJam you’ve selected a harp sound, you’ll hear a harp play the note A4 when it receives the message from my app.
At the same time, if SampleTank is also running and you’ve selected a brass ensemble sound, you’ll hear the note A4 played by a brass ensemble.
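That fan-out works because every listening synth decodes the same few bytes. A hypothetical receiver — a sketch for illustration, not any real app’s code — might decode a three-byte message like this:

```python
def decode(msg):
    """Decode a 3-byte MIDI channel message into (event, channel, note, velocity)."""
    status, note, velocity = msg[0], msg[1], msg[2]
    kind = status & 0xF0             # high nibble: message type
    channel = (status & 0x0F) + 1    # low nibble: channel 1-16
    if kind == 0x90 and velocity > 0:
        return ("note_on", channel, note, velocity)
    if kind == 0x80 or kind == 0x90: # note-on with velocity 0 also means note-off
        return ("note_off", channel, note, 0)
    return ("other", channel, note, velocity)

# "A4 struck on channel 1" as raw bytes:
print(decode(bytes([0x90, 69, 100])))  # ('note_on', 1, 69, 100)
```

ThumbJam sees the event and plays a harp A4; SampleTank sees the very same event and plays a brass A4.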
MIDI messages also have a channel number (1-16) associated with each message that can be used to selectively play different notes.
So, for example, you can set up Trapezoid to put the notes on the bass bridge out on channel 1 and the notes on the treble bridge out on channel 2. In an app like SampleTank, which can play multiple sounds at once, you can tell it to play all notes received on channel 1 with a string bass sound and all notes received on channel 2 with a harp sound. Now when you play, you’ll have a bass and harp ensemble instead of just one sound.
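In byte terms, the channel lives in the low four bits of the status byte, which is how one stream can carry the bass and treble bridges separately. Again, just an illustrative sketch (the note numbers here are made-up examples):

```python
def note_on(note, velocity=100, channel=1):
    """Status byte 0x90-0x9F: note-on, low nibble = channel - 1."""
    return bytes([0x90 | (channel - 1), note, velocity])

bass   = note_on(45, channel=1)  # A2 from the bass bridge   -> string bass sound
treble = note_on(69, channel=2)  # A4 from the treble bridge -> harp sound

print(bass.hex())    # 902d64 (status 0x90 = note-on, channel 1)
print(treble.hex())  # 914564 (status 0x91 = note-on, channel 2)
```

The receiving app simply routes each message to a different sound based on that channel nibble.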
The same general principles apply to my MIDI bagpipes and other MIDI controller apps: you play them just like the real instruments, and they generate the note-on/note-off MIDI data streams and send them to whatever you want to use as the sound generator.
Linda says I should write a more extensive article about using CoreMIDI on the iPhone and iPad, but until then, here’s a great tutorial explaining what the heck this is all about and what it makes possible: