Collaborating with Your Computer Using Ableton Live, Part 2
What will the two of you come up with next?

With non-linear DAWs such as Ableton Live, Bitwig, and Tracktion Waveform, your computer can play an active role in your workflow. Whether it’s a creative task like writing bass lines, or something as specific as choosing where to boost or attenuate frequencies, your software and hardware can step up to the plate and start making suggestions. 

If you’re new to non-linear platforms, check out part one of this article to get yourself acquainted. In this second piece, we’ll continue to look at creative and technical methods to spice things up in the studio. 

What is an envelope in music production?

If you’re reading this blog, you’ve probably run across an envelope or two in your day. Envelopes might not seem like the first place you’d head if you’re looking to get a little crazy, but today that changes.

While envelope has a stricter scientific meaning, for our purposes an envelope in music production means change over time: some parameter, be that volume, cutoff frequency, or a send amount, changes over a set span of time like milliseconds or bars. There are envelopes on most of the devices you insert on a track, and automation lanes themselves are envelopes linked to the timeline. Since we’re living that non-linear life and working in a non-linear DAW, there are also envelopes at the region/clip level, and those envelopes can be unlinked from the length of the region/clip.
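
To make the idea concrete, here’s a minimal sketch in plain Python with NumPy (nothing Live-specific; the lengths and break points are placeholders): an envelope is a value plotted against time, and applying a volume envelope simply means multiplying the audio by that value as it changes.

```python
import numpy as np

SR = 44100  # sample rate in Hz

# One second of a 220 Hz sine wave standing in for "some audio."
t = np.arange(SR) / SR
audio = np.sin(2 * np.pi * 220 * t)

# An envelope is a parameter value changing over time.
# Here: fade in over 100 ms, hold for 400 ms, fade out over 500 ms.
envelope = np.concatenate([
    np.linspace(0.0, 1.0, int(0.1 * SR)),  # attack
    np.ones(int(0.4 * SR)),                # hold
    np.linspace(1.0, 0.0, int(0.5 * SR)),  # release
])

# Applying a volume envelope = multiplying the signal by it, sample by sample.
shaped = audio * envelope
```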

Have I said envelope enough times for it to stop making sense? Let’s see this in a musical context in order to clarify. 

Envelopes in practice

Continuing on from the examples in part one of this series, I added a piano loop, transposed it to match the key, and opened the volume envelope at the clip level. Clip envelopes live inside the regions/clips, so whenever a clip is played, whether from the timeline or otherwise, you will hear its envelopes. I unlinked the length of the envelope from the length of the clip and made it three beats long. Then I went in and carved out most of the sample with that volume envelope, so now you just catch pieces and parts of it as it continues and repeats every three beats.

What we now have is a three-beat volume envelope superimposed over an eight-bar piano loop in 4/4, creating a polyrhythmic relationship between the two. For added texture, I added a ping-pong delay, and voilà!
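
If you want to see why the three-beat envelope drifts against the 4/4 bars, a quick back-of-the-napkin calculation (plain Python again, nothing Ableton-specific) shows where each envelope cycle starts within the bar. The start position walks through beats 1, 4, 3, 2, and the two grids only realign every 12 beats, or three bars.

```python
BEATS_PER_BAR = 4    # the piano loop is in 4/4
ENVELOPE_BEATS = 3   # the unlinked clip envelope repeats every 3 beats

# Where does each envelope cycle land within the bar over the 8-bar loop?
for cycle in range(8 * BEATS_PER_BAR // ENVELOPE_BEATS + 1):
    start_beat = cycle * ENVELOPE_BEATS
    bar = start_beat // BEATS_PER_BAR + 1
    beat_in_bar = start_beat % BEATS_PER_BAR + 1
    print(f"envelope cycle {cycle + 1} starts at bar {bar}, beat {beat_in_bar}")

# Output walks through beats 1, 4, 3, 2, 1, ... so the envelope and the
# bar line only line up again every 12 beats (three bars of 4/4).
```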

What is modulation in music production?

Modulation means “change.” An LFO is a very common modulator: it uses the waveform it outputs to control, or modulate, parameters elsewhere. LFOs are often tied to specific points in a device’s signal flow, which determines which parameters they’re capable of modulating. Common destinations are volume, pitch, cutoff frequency on a filter, and panning. There are Max for Live devices that let you use their output to modulate parameters on pretty much any device or track, so you can “tap in” at any point in the signal flow to create change.
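
As a rough illustration of that signal flow (a Python/NumPy sketch with made-up numbers, not any particular device’s ranges): the LFO is just a slow waveform, and modulation means reading its value and using it to push another parameter, here a filter cutoff, around a center point.

```python
import numpy as np

SR = 44100          # audio sample rate in Hz
LFO_RATE = 0.5      # LFO frequency in Hz; "low" means well below audible pitch
BASE_CUTOFF = 800   # center cutoff frequency in Hz
DEPTH = 600         # how far the LFO pushes the cutoff in either direction

# Two seconds of LFO output: a sine wave moving between -1 and +1.
t = np.arange(2 * SR) / SR
lfo = np.sin(2 * np.pi * LFO_RATE * t)

# Modulation = mapping the LFO's output onto another parameter.
# Here the cutoff sweeps smoothly between 200 Hz and 1,400 Hz.
cutoff = BASE_CUTOFF + DEPTH * lfo
```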

Modulation in practice

I’ve inserted a Max for Live device called “LFO” (here is a great resource for discovering new Max for Live devices) onto the track with the piano loop and unlinked envelope. I set the LFO’s output to a random waveform and its rate to 3/16, so every 3/16 its output jumps to a new random value. I then assigned the LFO to modulate the frequency parameter on a device called “Frequency Shifter,” which takes incoming audio and shifts its frequency content up or down. Have we said “frequency” enough times for it to stop making sense yet? Watch the video below to see this modulation in action.

If you let something like this run for long enough, you will eventually stumble across musical results you likely never would have created on your own. It’s worth noting that you can also set the LFO’s output to something as simple as a sine or saw wave, synced to the global tempo, for less abstract results.
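
For the curious, here is a rough sketch of those moving parts outside of Live, assuming a tempo of 120 BPM and using SciPy’s Hilbert transform for the frequency shifting (single-sideband modulation is one common way to build a frequency shifter, and not necessarily how the Frequency Shifter device does it). The “random LFO” is a sample-and-hold value that jumps to a new shift amount every 3/16; the shift range and the stand-in audio are invented for the example.

```python
import numpy as np
from scipy.signal import hilbert

SR = 44100
BPM = 120                            # assumed tempo
STEP_SECONDS = 3 * (60.0 / BPM) / 4  # a 3/16 note at 120 BPM = 0.375 s
DURATION = 4.5                       # seconds of audio to process

# Stand-in for the piano clip: a few seconds of a simple chord.
t = np.arange(int(DURATION * SR)) / SR
audio = sum(np.sin(2 * np.pi * f * t) for f in (220.0, 277.2, 329.6)) / 3

# "Random" LFO: sample-and-hold, picking a new shift amount every 3/16.
rng = np.random.default_rng(0)
step_len = int(STEP_SECONDS * SR)
n_steps = int(np.ceil(len(t) / step_len))
shift_per_step = rng.uniform(-200.0, 200.0, n_steps)  # shift amounts in Hz
shift = np.repeat(shift_per_step, step_len)[: len(t)]

# Frequency shifting via single-sideband modulation: multiply the analytic
# signal by a complex oscillator and keep the real part. Integrating the
# time-varying shift keeps the oscillator's phase continuous between jumps.
phase = 2 * np.pi * np.cumsum(shift) / SR
shifted = np.real(hilbert(audio) * np.exp(1j * phase))
```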

What is inter-plugin communication in music production?

Thanks to intelligent software, and an iZotope product feature referred to as inter-plugin communication, we can also let our machines suggest where we should be doing things, whether that’s boosting or attenuating frequencies on a digital EQ or something else entirely.

Multiple instances of Neutron in one session communicate with each other, giving you real-time visual feedback on where frequencies from one track might be at odds with the frequencies of another, a phenomenon known as masking.

A battle as old as time, kick drum vs. bass. Who will win in the battle over frequency ranges?

Inter-plugin communication in practice

A tale as old as music production itself is the one about the kick drum and the bass fighting for the same part of the frequency spectrum. To keep that from happening here, I’ve inserted Neutron on both the drum track and the bass track, and can visually A/B the two side by side. You’ll see in the video that the areas in red are where masking is occurring at different levels. Now it’s on me to go in there and make cuts based on Neutron’s suggestions.
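
Neutron’s analysis is its own, but the underlying idea can be sketched very loosely: look at where both tracks are putting out significant energy at the same time and flag those regions. Here’s a simplified illustration in Python (emphatically not how Neutron works; the bands, threshold, and stand-in signals are all invented for the example).

```python
import numpy as np

SR = 44100

def band_energy(signal, bands):
    """Average spectral magnitude of `signal` inside each (low, high) Hz band."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / SR)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

# Stand-ins for the two tracks: a thumpy kick and a sustained bass note.
t = np.arange(SR) / SR
kick = np.sin(2 * np.pi * 60 * t) * np.exp(-20 * t)
bass = 0.5 * np.sin(2 * np.pi * 55 * t)

bands = [(20, 80), (80, 200), (200, 800), (800, 5000)]
kick_e = band_energy(kick, bands)
bass_e = band_energy(bass, bands)

# Crude masking flag: both tracks carry a meaningful share of their own
# energy in the same band.
for (lo, hi), k, b in zip(bands, kick_e / kick_e.max(), bass_e / bass_e.max()):
    if k > 0.25 and b > 0.25:
        print(f"possible masking between {lo} and {hi} Hz")
```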

Where does this leave us? 

And so concludes our look at collaborating with your computer. We explored ways software can play a creative role in your process, generate musical material, and create variations inside of repeated patterns. We also explored how it can help with technical tasks like EQing and mixing your tracks. Hopefully these concepts spark ideas that are useful to you in your own productions, and result in musical effects that you otherwise wouldn’t have been able to achieve on your own. 

Learn more creative tips and tricks for music production: