In many ways, the world of MIDI 1.0 has changed dramatically since the specification made its public debut in 1983. You can measure the progress by how MIDI-compatible instruments connect, the types of devices and software programs that employ it, and how much easier the technology has become for musicians to use.
Connecting your MIDI controller through USB, drawing MIDI hits directly into a DAW's drum pattern grid, or instantly mapping something like an Ableton Push to Ableton Live's functions and parameters—such things were far outside the realm of possibility in the early '80s.
But in other fundamental ways the logic and limitations of MIDI 1.0 have remained unchanged: for example, MIDI's maximum of 16 channels of information, its low resolution, its myriad timing issues, or its unidirectional flow of data (see: your synth's separate MIDI input and MIDI output).
For more than a decade, the MIDI Manufacturers Association (MMA), Japan's Association of Music Electronics Industry (AMEI), and companies from Roland and Yamaha to Ableton and Native Instruments have been working on MIDI 2.0, the first true overhaul of the MIDI specification in more than three decades.
Earlier this week, Roland announced its first MIDI 2.0-ready controller, the A-88MKII, and many other brands are sure to be showing off their MIDI 2.0 goods soon. However, because the new specification hasn't been fully ratified by the members of the MMA and the AMEI, many details and implications of the new spec are still unknown.
In an effort to understand what lies ahead for musicians in the very near future of the ascendant MIDI 2.0, we spoke with Art + Logic's Brett Porter—a software engineer who's been working on the spec for the past 18 months—and the Roland Technical Development Division's Koichi Mizumoto, who leads an AMEI FME-CI (Future MIDI Expansion with Capability Inquiry) working group.
"There are two constituencies that come at this with really different end-goals," Porter says. "On the one hand, manufacturers, obviously, want to make and sell new stuff, which is one set of priorities. But on the other hand, there are the people who are going to buy the gear who are like, Is this important to me? Is this something that's worth potentially getting rid of something I love for new capabilities that may or may not be compelling?"
But don't fear that your current drum machines and keyboards are about to become obsolete. MIDI 2.0 will be backwards compatible, meaning all new MIDI 2.0 devices will be able to use and send MIDI 1.0 data. And if, in time, you create a hybrid setup of old 1.0- and new 2.0-equipped machines, your rig's MIDI 2.0 models will interact using the fresh capabilities of the new spec, while the MIDI 1.0 models will continue to work as they always have.
So why will you, as a musician, likely want to use MIDI 2.0-capable instruments and software? Because the new standard goes well beyond the limitations of MIDI 1.0—greatly improving resolution and responsiveness—and will solve persistent issues around timing.
"The two noisiest [problems] that people really would like to see go away are the fact that MIDI 1.0 doesn't know anything about time," Porter says. "It's very common in MIDI systems to have very sloppy timing, and obviously that's something that's really important. If you hit a V-drum, you want the sound to come out as fast and as precisely as it can."
The second major problem is a result of the hardware available in the early '80s, when the first specification was created: 5-pin DIN cables could only handle slow bit rates (MIDI 1.0's DIN transport runs at just 31.25 kbit/s), and processors couldn't handle high-resolution data even if you could send it. So now, even though processors are vastly faster and the MIDI 1.0 protocol can and does work beyond 5-pin connections, the protocol itself is still shaped by the legacy of that outdated technology.
"A controller on a MIDI 1.0 device can send out values ranging from a value of 0 to 127," Porter says, citing numbers anyone who tweaks their velocity levels in a DAW's piano roll will recognize. "The difference in data between the softest thing that could happen and the loudest thing that could happen is only 127. Whereas in MIDI 2.0, the resolution is just blown off the charts. We have some events that are now from zero to 65,535 and others that are full 32-bits, so, it's zero to 4-billion-something."
But say you don't really care about bit rates, protocol logic, or modes of transport. What do these raw numbers add up to?
"What that really means is you're dividing that minimum to maximum up into much smaller pieces, so you start being able to build controllers that are expressive in the same way that maybe analog controllers are, where you're able to make changes that are smaller than what humans perceive," Porter says. "That's the biggest change: The performances will get more fluidity, more expressivity, at that level."
The MMA makes the distinction between the MIDI 2.0 protocol and what it calls the "expanded environment" of MIDI 2.0, which includes a new Universal MIDI Packet, MIDI Capability Inquiry, Profiles, and Property Exchange. What this means is that the new protocol—or the language of MIDI 2.0—is being paired with better ways of sending and receiving messages as well.
MIDI 2.0's new Universal MIDI Packet, for instance, expands MIDI 1.0's 16 channels to 256 (16 groups of 16 channels each), allowing for functions like per-note pitch bend or articulation data, with plenty of room for high-resolution messages of all kinds.
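For the technically curious, here's a rough Python sketch of how a 64-bit MIDI 2.0 note-on is packed, based on the published draft's Universal MIDI Packet layout. Treat it as illustrative rather than authoritative, since the spec isn't fully ratified yet.

```python
# A rough sketch of how the Universal MIDI Packet reaches 256 channels:
# 16 groups x 16 channels per group, packed into two 32-bit words.
def midi2_note_on(group: int, channel: int, note: int, velocity: int) -> bytes:
    """Pack a 64-bit MIDI 2.0 note-on (message type 0x4, opcode 0x9)."""
    assert 0 <= group < 16 and 0 <= channel < 16      # 16 x 16 = 256 channels
    assert 0 <= note < 128 and 0 <= velocity < 2**16  # 16-bit velocity
    word1 = (0x4 << 28) | (group << 24) | (0x9 << 20) | (channel << 16) | (note << 8)
    word2 = velocity << 16        # low 16 bits are reserved for attribute data
    return word1.to_bytes(4, "big") + word2.to_bytes(4, "big")

# Group 2, channel 5: one of 256 distinct group/channel slots.
print(midi2_note_on(group=2, channel=5, note=60, velocity=50000).hex())
```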
When you connect devices together, MIDI Capability Inquiry will immediately determine what each instrument is able to do: Your new MIDI 2.0 controller will automatically know which pieces of your rig are equipped with MIDI 1.0 and which are capable of 2.0, and it will tailor its messages accordingly.
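Here's a minimal Python sketch of that fallback logic. The helper callables and reply shape are invented for illustration; actual MIDI-CI exchanges are carried inside System Exclusive messages.

```python
# A hypothetical sketch of the negotiation Capability Inquiry enables:
# ask the other device what it can do, and fall back to MIDI 1.0 if it
# stays silent. All names here are invented for illustration.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CIReply:
    supports_midi2: bool

def negotiate(send_discovery: Callable[[], None],
              wait_for_reply: Callable[[float], Optional[CIReply]]) -> str:
    send_discovery()              # ask: "what can you do?"
    reply = wait_for_reply(0.3)   # give the device a moment to answer
    if reply is None:
        return "midi1"            # silence: assume a plain MIDI 1.0 device
    return "midi2" if reply.supports_midi2 else "midi1"

# A device that never answers is simply treated as MIDI 1.0:
print(negotiate(lambda: None, lambda timeout: None))   # -> midi1
```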
Perhaps the biggest benefit that will come from this Capability Inquiry and the related work in Profiles and Property Exchange is that MIDI 2.0 controllers will automatically map to other devices' functions and parameters.
Think about how the knobs, pads, and buttons of an Ableton Push 2 are intuitive as soon as they're connected to Ableton Live, but if you're using a generic controller, like an Akai MPK, you'll need to map its knobs and faders to whatever specific DAW functions you want them to control.
This type of instant matching is, as of now, still based on proprietary messages between Ableton hardware and software (or similar systems from other companies), but it will instead be universally available through MIDI 2.0. It will also work the other way, with DAWs able to pull patches from compatible hardware synths.
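To make the idea concrete, here's a hypothetical sketch of what Property Exchange enables. The device name, fields, and control list below are invented; the real spec exchanges JSON data inside SysEx messages, but the effect is the same: the DAW learns the controls without a hand-built template.

```python
# A hypothetical sketch of the Property Exchange idea: the device
# describes its own controls as structured data, so a DAW can map a
# generic controller automatically. All names here are invented.
import json

device_reply = json.dumps({
    "device": "ExampleSynth",
    "controls": [
        {"name": "Filter Cutoff", "index": 74},
        {"name": "Resonance", "index": 71},
    ],
})

# The DAW parses the reply and assigns knobs without user mapping.
for ctl in json.loads(device_reply)["controls"]:
    print(f'auto-mapping knob {ctl["index"]} -> {ctl["name"]}')
```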
As Roland's Koichi Mizumoto tells Reverb, this is "a significant workflow improvement." Rather than spending time matching user interfaces, musicians will be able to focus on making music.
MIDI controllers, even before MIDI 2.0, have come a long way in this new millennium. They're a dominant fixture of live performances and are often the first instrument contemporary producers grab when beginning to work on a track.
We've touched on the performance benefits of MIDI 2.0 above—the greater fluidity, the more natural pitch-shifting—but what if you're a beatmaker or a producer working with a simple rig, maybe just one controller and a DAW with soft-synths and samples? What will MIDI 2.0 allow you to do that you can't do with MIDI 1.0?
Mizumoto says that, beyond the great aid of automatic mapping, MIDI 2.0 will heighten the musical qualities of such a setup.
"While it is true that this kind of producer may not use thousands of tones and many parts, they frequently make use of precise control over level, pan, pitch, and filter for their loops and samples," Mizumoto says. Instead of relying on drawing automation within their DAW, these producers can use a MIDI 2.0 controller's high-resolution knobs to shape their sounds with "a more musical and inspiring method."
For professional composers who use orchestral libraries like those from EastWest to create scores and arrangements, MIDI 2.0 will make crafting note articulation far easier and more precise than what's possible with MIDI 1.0.
Currently, Porter says, you may use the keyswitch function in string libraries—"where in a bottom octave, the C means pizzicato and a C# means col legno"—to control the articulation of audible notes while that lower key is held down. "Under MIDI 2.0, we can send that articulation data on every note event." By using a controller's ability to sense where a finger lies on a key, you could conceivably play notes pizzicato, col legno, or some other articulation just by where you're compressing the keys.
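As a hypothetical illustration of the difference (the articulation codes below are invented, not taken from the spec), each note event can carry its own articulation tag instead of depending on a held keyswitch:

```python
# A hypothetical sketch of per-note articulation. MIDI 2.0 note-on
# packets reserve an attribute-type byte plus 16 bits of attribute
# data; the articulation IDs below are made up for this illustration.
PIZZICATO, COL_LEGNO = 1, 2

def note_with_articulation(note: int, velocity: int, articulation: int) -> dict:
    """Attach an articulation to one note instead of holding a keyswitch."""
    return {"note": note, "velocity": velocity,
            "attribute_type": 1,            # e.g. a profile-defined attribute
            "attribute_data": articulation}

# Two adjacent notes, each with its own articulation, no keyswitch held:
melody = [note_with_articulation(60, 40000, PIZZICATO),
          note_with_articulation(62, 40000, COL_LEGNO)]
print(melody)
```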
Mizumoto seconds this point, and adds, "While it is true that many DAWs provide support for per-note control, having this be part of the MIDI standard means that this level of articulation can now be offered within future hardware instruments and controllers once implemented." And again, the benefits of workflow enabled by MIDI 2.0 devices easily discovering and assigning such features will be enormous.
One last scenario I posed to Mizumoto and Porter concerns the type of electronic music artist who has a substantial number of hardware instruments, whether in a hardware-only rig or in a hybrid hardware-software setup. As of now, you need to assign a master device to command the others and navigate a sometimes confusing mix of daisy-chained devices.
"We're not really going to understand the implications of what's usable until people start buying things and plugging them together."
Will some future rig of only MIDI 2.0 instruments be able to communicate freely all at once, each perhaps commanding different parameters of another? It may be a vague idea, but could MIDI 2.0 go beyond the unfortunately named master-slave conventions of MIDI 1.0 and allow multiple devices to behave as a network?
"MIDI 2.0 is basically assuming a one-by-one connection with an initiator and responder. Maybe future MIDI interfaces will act as responders to the DAW and as initiators to connected instruments," Mizumoto says, though there would be other technical kinks to work out.
With Roland's A-88MKII being the first MIDI 2.0-capable device unveiled publicly, we're still a long way from seeing what's possible when multiple MIDI 2.0 devices are used in tandem.
When asked about a potential MIDI 2.0 network rig, Porter says, "Unfortunately I think a lot of things like that, we're not really going to understand the implications of what's usable until people start buying things and plugging them together. Think about from the very first date that MIDI 1.0 gear was shown at NAMM to even 10 years later, it was people saying, 'You know, it would be cool if we could do' whatever and then building the layer to make that work."
Brett Porter is presenting on the new MIDI 2.0 standard as part of the A3E (Advanced Audio + Applications Exchange) program at the NAMM convention on Saturday, January 18, in a session titled "MIDI 2.0 for Developers, a Technical Deep Dive." Another MIDI 2.0 session, featuring other members of the MMA working group, will take place Friday, January 17.