Since starting iZotope in 2001, I’ve observed countless musicians and engineers as they mix sessions, and I’m constantly surprised by how much time is spent opening and closing plug-ins, scrolling through a mix, and switching back and forth between individual tracks. As a composer and musician, I’ve imagined a workflow that allows the visualization and manipulation of an entire mix in one integrated view—with a goal of staying in the creative space.
We don’t hear music as tracks, so why should we work with music as tracks? I think this is one of the places where recording technology has gotten stuck. We adopted a workflow of separate tracks based on the paradigm and constraints of analog recording. Traditionally, control and visualization happened in hardware, routed physically through cables in real time, so we ended up with the standard in which each plug-in processes a single track in isolation, without any context about the rest of the mix. As we moved to a digital paradigm, we didn’t rethink that approach, or reimagine what’s possible with the power of new technology.
With the introduction of the Visual Mixer in Neutron 2, we’re trying to change the paradigm of mixing. We’re imagining a future with creative possibilities driven by new classes of processing and control that work across multiple tracks—not to mention faster workflows.
We started down this path with Insight in 2012, which allowed visualization of multiple tracks in one window.