By Austin Weimer
Digital audio workstations can edit sound in a variety of powerful ways. Audio waveforms such as .mp3s and .wavs, covered previously in Audio Experimentation: Easy Editing and Basic Creative Tools, are fixed recordings of sound. The waveform itself can be played slower or faster, stretched, chopped, filtered, and tuned up or down. While the range of possibilities for waveforms is extensive, another type of sound data takes writing music to a new level.
Musical Instrument Digital Interface (MIDI) is a form of data used to communicate performance information, such as which notes are played and how, from a controller to a computer. MIDI controllers began in the early 1980s as normal keyboards with an additional set of wires for MIDI notes. Later, they evolved into grids of pressure-sensitive push pads used in conjunction with computers and audio interfaces.
Composers of the past used a pencil or pen to write music. Now, mouse tools like the pencil and brush let composers write much faster than handwritten notation allows. A MIDI note is essentially a virtual placeholder for the pitch, rhythm, and volume of a sound. The sounds produced by an arrangement of MIDI notes are determined by the user with the help of a computer. For example, if a composer records a song as MIDI data, they can go back afterward and change the playback sound of any note, making any section of a piano arrangement play a trumpet sound.
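The idea that a MIDI note stores only pitch, timing, and velocity, while the instrument sound is chosen later, can be sketched in a few lines of Python. The class and function names here are hypothetical, not part of any real DAW's API:

```python
from dataclasses import dataclass

# Hypothetical sketch: a MIDI note is a placeholder for pitch, rhythm,
# and volume -- it contains no audio of its own.
@dataclass
class MidiNote:
    pitch: int       # MIDI note number, 0-127 (60 = middle C)
    velocity: int    # how hard the key was struck, 0-127
    start: float     # start time in beats
    duration: float  # length in beats

def render(notes, instrument):
    # The instrument is chosen at playback time, not baked into the data,
    # so the same notes can be reassigned to any sound after recording.
    return [(instrument, n.pitch, n.velocity) for n in notes]

melody = [MidiNote(60, 100, 0.0, 1.0), MidiNote(64, 90, 1.0, 1.0)]
piano_version = render(melody, "piano")
trumpet_version = render(melody, "trumpet")  # same notes, new sound
```

Swapping the piano for a trumpet changes nothing about the recorded notes themselves, which is exactly why MIDI editing is so flexible compared with editing a fixed waveform.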
Individual sounds for each note, called samples, are saved in groups for each instrument. Each group or track of MIDI data has its own set of editing tools that surpass the limits of waveform editing tools. MIDI notes are velocity-sensitive, which means the computer assigns each note a value between 0 and 127 based on how hard or softly it is pressed on the controller. A note's timbre and effects change depending on the velocity value.
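One common way an instrument can use that 0-to-127 value is to pick a sample layer and a playback gain from it. This is a minimal sketch of the idea; the thresholds and function names are illustrative assumptions, not a standard:

```python
def velocity_to_layer(velocity):
    """Pick a sample layer from a 0-127 velocity (assumed 3-layer split)."""
    if velocity < 43:
        return "soft"
    elif velocity < 86:
        return "medium"
    return "hard"

def velocity_to_gain(velocity):
    """Simple linear gain: 0 maps to silence, 127 to full volume."""
    return velocity / 127.0
```

A gently pressed key would trigger the "soft" sample quietly, while a hard strike would trigger the "hard" sample at full volume, which is how velocity changes both timbre and loudness.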
Sequencers handle the playback of MIDI notes after they’ve been recorded. Quantization is the process of aligning MIDI notes to the set tempo of a song. If a musician plays a song out of time, quantization will move the start point of a rhythmically incorrect note into the right place. Quantization can also be applied selectively to specific areas of MIDI data. In the case of drums, quantizing cymbals and auxiliary percussion with variable amounts of swing can create complex rhythmic patterns.
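Quantization with swing can be sketched as snapping a note's start time to the nearest grid line, then pushing every second grid line late. The parameter names below are hypothetical, but the math mirrors what DAW quantize tools do:

```python
def quantize(start, grid=0.25, strength=1.0, swing=0.0):
    """Snap a note's start time (in beats) toward the nearest grid line.

    grid: grid spacing in beats (0.25 = sixteenth notes at 4/4)
    strength: 0.0 leaves timing untouched, 1.0 snaps fully
    swing: delays every second grid line by this fraction of the grid
    """
    index = round(start / grid)
    target = index * grid
    if swing and index % 2 == 1:  # offbeat grid lines get pushed late
        target += swing * grid
    return start + (target - start) * strength
```

A note played slightly late at beat 1.07 snaps to beat 1.0, while with swing enabled the offbeat sixteenths land a little behind the grid, producing the shuffled feel described above.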
Arpeggiators repeat and cycle through notes when a piece of MIDI data is triggered. Each time a note is triggered, the individual parameters of that note are visible in windows called steps. Everything from the order of notes to the velocity can be edited in equal step intervals. The rate of repetition is adjustable and relative to the tempo of the song. Chords can be played with an additional interface called a chord trigger: pressing a single note plays a full chord in conjunction with the arpeggiator.
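The step-based behavior of an arpeggiator and a chord trigger can be sketched together. The function, the per-step velocity list, and the chord table are all illustrative assumptions rather than any particular plugin's interface:

```python
# Hypothetical arpeggiator sketch: held notes are cycled through in
# equal step intervals, at a rate relative to the song's tempo.
def arpeggiate(held_notes, steps, rate=0.25):
    """Return (start_beat, pitch, velocity) events for one pass.

    held_notes: pitches currently held (or supplied by a chord trigger)
    steps: one editable velocity value per step
    rate: step length in beats (0.25 = sixteenth notes)
    """
    events = []
    for i, velocity in enumerate(steps):
        pitch = held_notes[i % len(held_notes)]  # cycle through the notes
        events.append((i * rate, pitch, velocity))
    return events

# A chord trigger expands one pressed key into a full chord.
CHORDS = {60: [60, 64, 67]}  # middle C triggers a C major triad
pattern = arpeggiate(CHORDS[60], steps=[100, 80, 90, 70])
```

Editing the `steps` list changes the velocity of each repetition independently, which is the kind of per-step control the windows described above expose.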
Sequencers and MIDI data have evolved into complex instruments in their own right. Writing and expanding on musical phrases is now an interactive experience. As programs continue to develop, depth in editing follows. The most powerful instruments of the twenty-first century are the computer and the MIDI sequencer.