Stereo Reconsidered: Mid/Side Thinking


Introduction

In recorded music, there has been an informal, generally held notion that the stereo image is a re-creation of a stage, an imaginary space stretching between the two speakers, on which the recorded artists appear, reproducing the reality of their live performances. In classical music recording, this notion has some validity, because the recording techniques used to create the recording usually include the use of a concert hall and a stage, and the producers make the evocation of that concert hall ambience a central part of their vision.

However, such an approach is only one of many ways to use stereo to evoke the sense of space and presence that makes it such an effective entertainment medium. In popular music (i.e. music created via multi-track recordings, with overdubs, effects and extended production and remix processes), the "stage" is a much more artificial and highly stylized idea. In fact, it is fair to think of popular multitrack recording as a fairly well-developed musical style itself, a style that is characterized by use of the Center phantom image which interacts with sounds panned Left and Right in a musical counterpoint that is unique to stereo loudspeaker playback. Such a style has no direct analog in the world of live or acoustical musical performance.

However, it does have roots in a particular derivation of that Left/Right pair of stereo signals, a derivation called (among other things) a "Mid/Side" signal pair. That Mid/Side signal pair consists of a mono sum of Left and Right (the Mid signal) and a mono difference (subtraction) of Left minus Right (the Side signal). This is simple to do and undo if you have a console or mixer with polarity-reversal switches.

Physical and Stylistic Reasons to Think About Mid/Side

Many of the elements of this style are driven by both the limits and the predisposition of the recording technology in use during the decades in which this style has been developing. Placement of the sound of the kick drum and electric bass in the Center (Mid) of the stereo image was originally necessitated by analog disc mastering requirements. In fact, all significant low-frequency energy was placed in the center (Mid) of the stereo image for the same reason. Similarly, placement of the lead vocal (often the loudest element in such recordings) in the Center (Mid) was also desirable. Electric guitar solos, horn leads, and other primary musical highlights followed suit. The reason for this was that the stylus motion on an analog disc that represents any signal other than Mid has a vertical motion component, and if it's too loud the stylus bounces out of the groove, which in those days was called a "skip." Consumers not willing to stack dimes (quarters for particularly loud Side moments) on their phono cartridges would return the records that "skipped." Such events were not lost on the mastering engineers of the day, whose livelihood depended on happy customers with "skipless" records. As a result, mastering engineers developed a passionate interest in scrutinizing Left, Right and (particularly) Side signals, in addition to the normal stereo recording.

At the same time, to enhance the sense of space and to prevent lead parts from being psychologically "cluttered" or obscured, supporting parts came to be panned to Left and Right. A spatial interplay between lead and supporting parts, between the solo vocal and the harmony vocals, etc., has developed, becoming a stylistic signature of stereo pop/rock recordings. The musical relevance and force of each of these elements is reinforced by this counterpoint. The ubiquity of inexpensive amplitude-based pan-pots has effectively limited the effect of panning to generalized zones called Left, Center and Right, except in particular cases where the engineer or producer was willing and able to devote the resources necessary to create something a little fancier. As you may know, conventional pan-pots cannot be used to reliably position a phantom image at any specific point between Left and Center or Right and Center, thanks to our hearing localization system's basic insensitivity to small amplitude differences.
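
For reference, the amplitude-based pan-pot mentioned above can be sketched as a constant-power pan law; this is a generic illustration, not any particular console's circuit, and the function name and "position" parameter are mine:

```python
import numpy as np

def pan(position):
    """Constant-power pan law.

    position in [0, 1]: 0 = hard left, 0.5 = center, 1 = hard right.
    Returns (left_gain, right_gain); left^2 + right^2 is always 1,
    so total acoustic power stays constant as the source moves.
    """
    angle = position * np.pi / 2
    return np.cos(angle), np.sin(angle)
```

At center, each channel gets a gain of about 0.707 (down 3 dB), which is exactly the "identical levels in both channels" condition that puts a sound in Mid.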

Listening In Side

I started thinking about this some years ago after I had the good fortune to serve as host for mastering engineer Bob Ludwig when he came to talk to my students at SUNY/Fredonia. In the course of some fairly extended conversation, he told me that he always listened to the Side version of recordings that he mastered and that he'd come to find them very interesting from a production standpoint as well as a technical one. He found various producers' characteristic practices and tricks were much more audible in Side, and that many elements of the original tracks were more clearly revealed in Side. He suggested that my students might find it useful to listen to recordings this way in order to quickly identify and comprehend various producers' styles.

I spent some time after Bob's visit listening in Side and found that he was, as usual, right. I found something else, too. Once I began to study recordings in this way, I began to recognize stereo multitrack recording as a style in its own right - a style in which the recording can be thought of as having two primary elements: Mid and Side, as opposed to the more traditional view in which stereo is thought of as that recreation of a concert or club stage. This Mid/Side point of view, it turns out, leads to powerful and effective stereo recordings, recordings that also work really well in mono and that are particularly well-suited to a wide variety of playback situations.

Further, it became obvious to me that many producers have picked up this same sensibility, and are intuitively utilizing these elements very effectively as a central organizing approach to their multitrack recording production efforts. This has been a very useful insight for me, and it has led me to rethink my own approach to recorded music. I pass it on to you here, for your own use and edification.

There are three parts to this. First, you can use the Side component of the signal as a tool for the study of recordings, including your own. Such study will reveal many interesting, useful and sometimes startling things about recordings that are simply not audible in stereo or mono (Mid). Second, Mid/Side can be treated as an aesthetic approach to recording, and it is being used with great success by many producers today. We'll spend some time talking about that aspect of it as well. Third, you can use Mid/Side processing in mastering to enhance your mixes in some wonderful ways that just aren't possible in conventional stereo.

As I mentioned earlier, Mid is the audio sum, or (mono) mix, of the left and right channels, and Side is the audio difference (also a mix, but with the polarity of the right channel reversed). To generate the Side signal, you mix Left and Right together, but first you reverse the polarity of the right channel (it doesn't have to be the right channel, but traditionally it is). Many consoles make this easy by including polarity reversal as a feature on the input modules. Ozone makes it effortlessly available in four of its six processing modules: select Mid/Side instead of Stereo, then use the Solo control next to the Side button to "listen in Side" with Ozone.
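
The whole Do/Undo round trip fits in a few lines. Here is a minimal sketch (the function names are mine, for illustration only):

```python
import numpy as np

def encode_ms(left, right):
    # Mid is the mono sum; Side is Left minus Right (right polarity reversed)
    return left + right, left - right

def decode_ms(mid, side):
    # The factor of 1/2 restores the original channel levels after the round trip
    return (mid + side) / 2.0, (mid - side) / 2.0

# Anything panned dead center (identical in both channels) nulls out of Side,
# which is why Side listening acts as a "vocal eliminator."
left = np.array([0.5, 0.5, 0.2])    # center content, plus...
right = np.array([0.5, 0.5, -0.2])  # ...some Side-only content on the last sample
mid, side = encode_ms(left, right)
```

Note that decoding the Mid/Side pair returns the original Left/Right channels exactly, which is why the derivation is lossless and freely reversible.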

What To Listen For

Once you have Side listening available, there is a lot to listen for. To begin with, everything in Mid is nulled out, so you aren't going to hear it (which is also to say that you have just created a "vocal eliminator"!).

What are the Mid components? The three primary elements in the recording: Lead Vocal, Bass and Kick Drum, are generally placed dead center in the mix (i.e. they have identical levels in both Left and Right channels). When these are gone, you are left with both some pretty obvious stuff (everything that was panned left or right) and some less obvious stuff (the reverb trails and delays of the lead vocal, effects, stereo filler, etc.). Some of this is pretty interesting. Numerous production effects and practices will be revealed, including various "tricks" with occasionally doubled words, rhythmic bounces between channels, special out-of-polarity effects and other elements of multi-track stereophony.


Figure 1: The multitrack stereo field reconsidered. A (Left) and B (Right) can be thought of as special cases of Side that come strictly from the Left and Right speakers respectively. The Side (A-B) stereo information and filler goes in the zones between Left & Center and Right & Center, as well as outside the speakers. Mid (A+B) contains the core elements: lead parts, bass and kick.

Listening in Side, you will hear a lot more of the stereo ingredients in a sound. A mono phantom image will be nulled out and a stereo phantom will have components remaining. Early delays added in mixing and panned left and right will be clearly revealed. Phasey stereo effects will show up as flanging. If they are keyed to the rhythm of the music, it will be obvious.

You also can have a lot of fun picking out things that the producers never intended you to hear: production flaws such as sloppy edits (often the Mid component masks the edit), poor azimuth alignment on analog recordings (high frequencies won't be nulled and so you'll hear wispy traces of Mid elements "leaking" into the Side signal), artificial stereo reverb added to mono reissues (the mono will be gone, leaving you to listen to just the stereo reverb returns), and so on. Old records can be a lot of fun in this regard.

As you get familiar with listening this way, you will begin to more easily pick out the separate strands of the Side element and the musical tension and interest that their interaction with Mid creates. You will often find there is an approximately equal balance in interest between Mid and Side, even though the most central musical elements are usually in Mid.

You will notice the high-frequency rhythmic "framing" (often pure Left and Right) that surrounds and supports the Mid music. It is usually equally balanced left and right, and often consists of doubled rhythm guitars, keyboards, and/or high-frequency percussion.

Side parts often "answer" Mid parts, as when harmony vocals answer the lead vocal. Sometimes, the line of musical interest "bounces" in and out, back and forth from Mid to Side on alternating notes, beats or measures, giving an elastic spaciousness to the recording.

Finally, uncorrelated unisons panned Left and Right (so-called "doubling") have proved to be one of the most beautiful and powerful effects in multitrack recording, while out-of-polarity high-frequency signals panned Left and Right yield a powerful spatial ambiguity that contrasts strongly and effectively with the monaural Center image.

Recreating the Stage vs. Loudspeaker Music

It will become apparent as you spend time listening to Side that multi-track production does not often evoke the image of a concert stage (except for special effect), but rather has evolved into a particular and quite identifiable "loudspeaker music" style. This has happened in response to the realities of the various ways people listen to recordings: normal living-room stereos, mono radios and TVs, car systems, boom-boxes, and headphones. Music that sounds good over this whole array of different systems must be "mono-compatible" while also being interesting and entertaining stereo that is easily audible off the median plane. Music produced with a Mid/Side approach, with its strong phantom image, strong A and B side components, and active spatial interplay, satisfies these requirements pretty thoroughly. The fact that it isn't a realistic reproduction of a live performance is comparatively unimportant in pop/rock recording, because (a) pop music is created primarily for playback over loudspeakers and (b) the vast majority of artists and listeners listen primarily or exclusively to loudspeakers for their music. So, I recommend that you experiment aggressively with Mid/Side to better take advantage of our medium. Consider using it as a primary production technique, right from the beginning, including in the way you compose, preproduce, score and record your music.

Mastering with Mid/Side Processing in Ozone

In mastering, with a tool such as Ozone, there are a wide variety of techniques you can try. Let me briefly touch on them.

In the EQ module, you can add spaciousness, airiness and envelopment by gently boosting the bottom and top two octaves in Side, while EQing the Mid to enhance the sound of the lead vocals and the overall mix.
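
As a rough sketch of the idea (this is the general technique, not Ozone's actual implementation), here is a first-order high shelf applied to the Side signal only; the shelf design is a standard bilinear-transform filter, and the corner frequency and gain are illustrative defaults I chose:

```python
import numpy as np
from scipy.signal import lfilter

def high_shelf(x, fs, fc, gain_db):
    """First-order high shelf: unity gain at DC, gain_db at Nyquist, corner near fc."""
    G = 10.0 ** (gain_db / 20.0)
    K = np.tan(np.pi * fc / fs)       # prewarped corner frequency
    b = np.array([G + K, K - G]) / (1.0 + K)
    a = np.array([1.0, (K - 1.0) / (1.0 + K)])
    return lfilter(b, a, x)

def ms_air(left, right, fs, fc=10000.0, gain_db=2.0):
    """Add 'air' by shelving up only the Side signal, then decode back to L/R."""
    mid, side = left + right, left - right
    side = high_shelf(side, fs, fc, gain_db)
    return (mid + side) / 2.0, (mid - side) / 2.0
```

Because only Side is filtered, anything panned dead center (lead vocal, bass, kick) passes through completely untouched; only the stereo content gains the high-frequency lift.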

In the Reverb module, you can "dry out" the Mid signal (no reverb at all, or maybe a very short reverb) while adding envelopment and spaciousness with a "wet" Side signal.

In the Exciter and Multiband Dynamics modules, you can accomplish a lot by enhancing the Side elements of the lowest and highest bands while enhancing the Mid elements in the two mid-range bands. 

Interestingly, the Stereo Imager module simply allows you to adjust the relative gain (compared to Mid) of the Side signal in each band. That's all it does!
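
Setting the band-splitting aside, the core operation really is just a Side gain. A broadband sketch (names and behavior are my illustration of the principle, not Ozone's code):

```python
def widen(left, right, width):
    """Scale the Side signal: width > 1 widens, width < 1 narrows, 0 collapses to mono."""
    mid, side = left + right, left - right
    side = side * width
    return (mid + side) / 2.0, (mid - side) / 2.0
```

With width = 0, a hard-left sample (1.0, 0.0) comes back as equal mono (0.5, 0.5); with width = 1, the signal passes through unchanged.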

Listening to True Stereo Recordings

It's also possible to listen productively to classical or other acoustic recordings in Side. In true stereo recordings, listening in Side can give you insights into the stereo miking techniques used. True stereo recordings using coincident microphone pairs will usually show an increase in reverberance in Side, while recordings made with widely spaced omnidirectional microphones will change little, if at all, when the Mid and Side signals are compared. If the spaced pair is relatively close together, a loss of low-frequency information will be observed in Side; the closer the pair are to each other, the less low-frequency information will be present.

You can even calculate, approximately, how far apart the two mikes were by figuring out the frequency at which the low end begins to roll off. Frequencies with wavelengths much longer than the distance between the two mics (about twice the distance or more) will be essentially in phase at both mics and will therefore cancel when one is subtracted from the other for Side listening. At the same time, a signal arriving from the side (the worst case for phase shift) will, at some frequency, be 180° out of phase between the two mics; the Side subtraction puts that frequency back in phase, and consequently boosts it by 6 dB. To apply this insight: if the level is clearly down at 100 Hz but slightly increased around 200 Hz, then the mics are almost certainly about half a wavelength apart at 200 Hz (roughly 2.5 to 3 feet), and well under half a wavelength apart at 100 Hz (where half a wavelength is more than 5 feet), so you can reasonably infer a spacing of roughly 2.5 to 3 feet. Clever, eh?
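
The arithmetic above is just wavelength bookkeeping. A small sketch (using an approximate speed of sound; the function name is mine):

```python
SPEED_OF_SOUND_FT_PER_S = 1130.0  # approximate, at room temperature

def spacing_from_side_boost(f_boost_hz):
    """Infer spaced-pair mic distance from the frequency boosted ~6 dB in Side.

    For a source directly to the side, the two mics are 180 degrees out of
    phase when their spacing equals half a wavelength; the Side subtraction
    then puts that frequency back in phase, producing the boost.
    """
    return SPEED_OF_SOUND_FT_PER_S / (2.0 * f_boost_hz)
```

For a boost around 200 Hz, this gives a spacing of about 2.8 feet, consistent with the worked example in the text.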

Orchestral recordings that involve the use of numerous spot microphones ("highlight mics") will reveal the presence of those mics in two different ways: highlights that are panned to the center will disappear, while those that are panned Left or Right will be revealed more strongly. Gain-riding of highlight microphones panned Left or Right becomes quite apparent as well, sometimes embarrassingly so.

Often, in such multi-microphone concert-hall recordings, reverberance will be reduced when listening in Side, because the reverberance is being primarily picked up by the main stereo pair, and that will be reduced in level in Side, leaving the highlight mics to stand out with lower reverb levels.

 

This article was adapted from an article for Recording Magazine in 1993 and updated for Dave Moulton's website in 2005.

Mid/Side Thinking, February 2, 2009
© 2009 by David Moulton, Moulton Laboratories
