Consistency Across an Album
You also have to consider how the individual tracks work together when played one after another in an album sequence. Is there a consistent sound? Are the levels matched? Does the collection have a common “character” and play back evenly, so that the listener doesn’t have to adjust the volume?
This process is generally included in the previous step, with the additional evaluation of how individual tracks sound in sequence and in relation to each other. This doesn’t mean that you simply make one preset and use it on all your tracks so that they have a consistent sound. Instead, the goal is to reconcile the differences between tracks while maintaining (or even enhancing) the character of each of them, which will most likely mean different settings for different tracks.
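The level-matching part of this evaluation can be roughed out numerically. Below is a minimal sketch (the synthetic "tracks", function names, and the choice of one track as the reference are all illustrative assumptions, not a standard tool): it compares each track's RMS level in dBFS and computes the gain offset that would bring one track to the level of another.

```python
import math

def rms_dbfs(samples):
    """RMS level of a list of float samples (full scale = 1.0), in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def gain_to_match(track, reference):
    """Gain in dB that would bring `track` to the same RMS level as `reference`."""
    return rms_dbfs(reference) - rms_dbfs(track)

# Two made-up "tracks": one second of a 440 Hz sine at different amplitudes
track_a = [0.50 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
track_b = [0.25 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]

offset = gain_to_match(track_b, track_a)  # track_b is half the amplitude, so about +6 dB
```

In practice, mastering engineers lean on perceptual loudness measures (such as LUFS, per ITU-R BS.1770) rather than plain RMS, and matching by ear still has the final word, but the comparison logic is the same.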
Preparation for Distribution
The final step usually involves preparing the song or sequence of songs for download, manufacturing, and/or duplication/replication, and it varies depending on the intended delivery format. In the case of a CD, it can mean converting to 16-bit/44.1 kHz audio through resampling and/or dithering, and setting track indexes, track gaps, PQ codes, and other CD-specific markings. For web-centered distribution, you might need to adjust the levels in preparation for conversion to AAC, MP3, or high-resolution files and include the required metadata.
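The bit-depth reduction mentioned above can be sketched in code. The following is a minimal illustration (a toy, not a production converter; the function name and sample values are assumptions for the example) of quantizing float samples to 16-bit integers with TPDF dither, the common way to decorrelate quantization error from the signal. Sample-rate conversion is a separate step that would normally be handled by a dedicated resampler.

```python
import random

def to_16bit_tpdf(samples):
    """Quantize float samples (full scale = 1.0) to 16-bit ints with TPDF dither."""
    out = []
    for s in samples:
        # TPDF dither: the difference of two uniform random values has a
        # triangular probability density spanning roughly -1 to +1 LSB.
        dither = random.random() - random.random()
        q = round(s * 32767 + dither)
        out.append(max(-32768, min(32767, q)))  # clamp to the 16-bit range
    return out

samples = [0.0, 0.5, -0.5, 1.0, -1.0]
converted = to_16bit_tpdf(samples)
print(converted)
```

Because the dither is random, repeated runs produce slightly different integer values around each target; that small randomness is exactly what masks the otherwise signal-correlated quantization distortion.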
The History of Mastering
The earliest forms of mainstream recording technology did not require the recording, mixing, and mastering processes to be separate disciplines.
Rather, the recording was cut directly to a wax disc via a stylus connected to a diaphragm, which was in turn driven by an acoustic horn through which the sound was captured. These wax discs were then used to make stampers, which themselves were used to press shellac-composite 78 rpm discs.
The introduction of the 33⅓ rpm long-play (LP) vinyl record in 1948 and the 45 rpm single in 1949 contributed to a gradual change in the record-making process. Recordings were now being made to tape, and engineers were tasked with preparing a master disc from the tape recording. When cutting master discs, these engineers had to watch for and reduce the loud transient peaks present in the tape recording: the energy of these peaks could burn out the disc cutter head or cause the stylus to pop out of the groove when the record was playing.
In order to detect and reduce these peaks, dynamics processing tools such as compressors and limiters were introduced. This was the first time sonic adjustments began to impact the audio after the recording and mixing processes. The need to monitor these tools and adjust the settings for an optimum playback experience without compromising the sound quality was the earliest form of mastering.
The introduction of the standardized RIAA curve meant that equalization (EQ) also became part of the mastering discussion. The curve was intended to allow records to be cut with narrower, tighter grooves (and thus a longer playing time). One side effect was that the pre-emphasis applied to the recording could exaggerate high-frequency transient peaks, while the de-emphasis applied on playback boosted low-frequency energy, which could likewise cause the stylus to pop out of the groove.
Slowly but surely, as these tools became essential to a good consumer experience, the skills of those who could use them effectively became highly prized. Some engineers (notably Doug Sax, Bob Ludwig, Bob Katz, Bernie Grundman, and others) began to focus not just on the practical application of these tools, but also on the ways they could be used to further enhance the listening experience.
Thus, the art form was born. To this day, mastering remains a combination of practical and aesthetic processes. Though there isn’t any one ‘correct’ way to master, there are many recommended practices that mastering engineers follow.