Interview: Mastering Engineer Alex Saltz on the Final Stage of a Record's Production

Alex Saltz has been taking songs from final mixes to final products since 1997 at his APS Mastering studio. Using a combination of analog hardware and state-of-the-art digital gear, he has mastered releases for a diverse group of artists, including Vampire Weekend, Deer Tick, Steven Van Zandt, Stryper, and Janis Siegel of The Manhattan Transfer.

Because mastering continues to be a bit of a mystery to musicians—even those who know how to record and mix themselves—we took the opportunity to ask Alex some questions to help demystify this final stage of the record-making process.

What are some of the most common mixing mistakes you encounter as a mastering engineer?

Frequency issues—most commonly in the lower octaves.

Over compression. Dynamics control can be done tastefully and artistically (as part of the sound), but when it's done in excess, it reduces the breadth of the music, making it sound congested and less immersive. Too much compression can also affect the rhythmic pulse of the track, making the groove soft and vague.

Limiting and loudness maximization. It's common practice for mixing engineers to send their clients loud mixes for reference, but for the mastering engineer—9 times out of 10—it's more effective to send non-limited versions. Just today, I received two mixes that were heavily limited, with extreme distortion and fragmentation. I asked the mixing engineer to send me non-limited versions, and the difference is stunning.

Overuse of an effect like delay or reverb.

Panning and imaging problems. Creative use of the stereo field can contribute to a spacious mix, but an element that leans too far to one side might distract the listener from the song as a whole. Reference your mixes on headphones to make sure the stereo field is balanced.
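As a rough complement to that headphone check, left/right balance can also be estimated numerically. This is only an illustrative sketch in plain Python on raw sample lists (the synthetic tone and gain values are invented for the example; real checks would run on decoded audio from a DAW or file):

```python
import math

def rms_db(samples):
    """RMS level of one channel, expressed in dB."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def stereo_balance_db(left, right):
    """Positive result = left channel louder; negative = right louder."""
    return rms_db(left) - rms_db(right)

# Synthetic example: the same tone panned slightly left (0.8 vs 0.6 gain).
n = 48000
tone = [math.sin(2 * math.pi * 1000 * i / 48000) for i in range(n)]
left = [0.8 * s for s in tone]
right = [0.6 * s for s in tone]
print(round(stereo_balance_db(left, right), 2))  # 2.5 (dB toward the left)
```

A sustained imbalance of a couple of dB like this is the kind of lean that can pull a listener's attention off the song as a whole.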

What can be fixed with a mix in the mastering process and what cannot?

EQ imbalances can generally be corrected, but when the components of a mix require conflicting EQ moves, the mastering engineer's scalpel can be limited. For instance, if a vocal is too bright and shrill and the rest of the mix is dull, it can become a spectral wrestling match. In these instances, it's often best to remix.

Clicks, pops, and sibilance issues can often be addressed using restoration tools, but if (for example) a kick drum is over-saturated with transient, folding-type distortion, it would be time-consuming and costly to fix it in mastering, because each kick drum instance would have to be manually repaired at the sample level. Longer-duration events like distortion, fragmentation, or crackling might or might not be fixable.

Are the loudness wars a thing of the past?

It's no longer a war, but still a convention. Streaming platforms like Spotify, Apple Music, Tidal, YouTube, and Pandora use volume normalization, which brings a track up or down in level to meet a target loudness. When a client's main channel of distribution is online streaming, I encourage them to consider a more reasonable loudness objective. No need to crush a master with limiting if it's going to come down in level in streaming.
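To illustrate, platform normalization amounts to a simple gain offset toward the service's target loudness. This is a hedged sketch, not any platform's actual algorithm; the -14 LUFS target and -8 LUFS master below are illustrative numbers only:

```python
def normalization_gain_db(measured_lufs, target_lufs=-14.0):
    """Gain (in dB) a normalizing player would apply to reach the target."""
    return target_lufs - measured_lufs

def apply_gain(samples, gain_db):
    """Scale linear sample values by a dB gain."""
    factor = 10 ** (gain_db / 20.0)
    return [s * factor for s in samples]

# A master crushed to -8 LUFS simply gets turned DOWN 6 dB on a -14 LUFS
# platform, so the limiting distortion remains but the loudness advantage
# disappears.
print(normalization_gain_db(-8.0))   # -6.0
print(normalization_gain_db(-16.0))  # 2.0 (a quieter master is turned up)
```

This is why a heavily limited master gains nothing on streaming: the playback level ends up the same either way.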

People still listen on non-streaming platforms where competitive level is relevant. But even then, the immediacy and impact of extreme loudness comes at a cost. The listener will likely experience ear fatigue and might be less inclined to revisit the music over the long term.

It's also important to consider the loudness limitations of the material on hand. Mastering engineers have many methods to attain loudness, but often one track on an album will have a certain tonal hollowness—or at the other extreme, a density—that will prevent it from going as loud as the other tracks.

Alex Saltz (All photos by Scott Rudd)

What’s the biggest challenge facing high-fidelity recording today?

Data compressed delivery formats and streaming services are a compromise, but an even bigger barrier is the consumption of music in noisy environments. Cars, streets, subways, airplanes, cafes, bars, restaurants, parties, or loud households interfere with the full listening experience.

What is the most essential component in your signal chain?

I think of my signal chain as three parts: conversion, processing, and monitoring.

For digital to analog conversion into my analog chain, I most often use a Forssell MDAC-2a. Fred Forssell built this unit with a modification to the analog output stage. It renders the complex densities and timbres of instruments in a strikingly natural way.

For processing, one of the centerpieces of my analog chain is a Sontec MEP 250-EX equalizer that was modified by tech Dan Zellman. I find this EQ indispensable for drawing out the most musical elements of a mix.

At the end of the signal path, the amplifier and monitors are the vital link between the electronics and our ears. I spent over 10 years going through many speaker configurations before finding the optimal system: Lipinski L-707 monitors powered by a Cello-Mark Levinson HTA-2 power amplifier and a B&W ASW 2500 subwoofer. It's important to know that every room interacts with speakers in its own unique way, so a setup that works in one room might not work in another. I encourage engineers to spend ample time testing different monitors to find the right fit.

How do you choose a component for your signal chain?

I usually go through a long process of evaluation with different types of music and production aesthetics. Each piece of gear has a personality that reveals itself over time.

How have you introduced software over the years into your mastering process?

Plugins offer indispensable levels of control and precision. I often use them in conjunction with analog processing. For clients with lower budgets, I sometimes offer in-the-box mastering.

What are some essential software products you use for mastering?

Samplitude Pro X3 Suite with restoration tools, Izotope Ozone, and plugins from Newfangled Audio, DMG, FabFilter, PSP, Sonnox, Goodhertz, Slate Digital, and Waves. I recently started using the wonderful Lindell Audio TE-100 plugin that beautifully reproduces the complex tonal and transient characteristics of the vintage Klein & Hummel UE-100 equalizer.

What’s the one thing everybody should be doing when recording?

Capturing the very best performances. The best sounding productions start with musicians and composers who have a natural sense of arrangement, tone, and feel, who can create parts and sounds that serve the artistic vision.

What does the future of audio mastering look like?

Because there is so much music being produced these days, the industry is more competitive than ever. Musicians are realizing the importance of high-quality work and recognizing the skill and artistry that goes into mastering. It's the final stage to get a production to sound as good as possible and to prepare it for a vast sphere of delivery channels. If proper time and sensitivity are put into mastering, it can make a world of difference.


If you're interested in having your recording mastered or have a question about the process, you can email Alex at alex@apsmastering.com.
