The Making of Moogfest 2017: How Bronto Transformed Sound Into Music

Eric Hirsh, Senior Sales Engineer

This is the second article in a four-part series about how Bronto employees collaborated to transform raw, anonymized data into an interactive audio and visual installation for Moogfest 2017 called The Sounds of Commerce.

In our first article, software engineer Clif Jackson explained how he converted raw Bronto data from the 2016 Black Friday and Cyber Monday data set into a music-data format called MIDI and applied various time compressions to it, resulting in a file library of musically diverse options.

My role in The Sounds of Commerce was to make the MIDI data come alive and enter the physical world, tying together sounds emanating from speakers, sights emanating from monitor screens and tactile experiences with our custom-built controllers.

One of the original tenets of the project was that the sonification and visualization of Bronto data would not be a passive experience, but an interactive one, where any group of people could walk up, directly manipulate the sights and sounds, and experience the results of their creative contributions in real time.

While we never formally wrote up a creative manifesto, our goals for The Sounds of Commerce would have looked something like this:

  • Create art where the source of the music comes from a cloud-based platform’s event stream instead of traditional human performers.
  • Enable any participant to intuitively and creatively relate to big data.
  • Foster an environment where music is a social experience.
  • Spark conversations and ideas.

The Nuts and Bolts

To use development lingo, if Clif provided the data library, I architected the run-time environment for the installation. I have a long-running passion for music technology, even more so than was hinted at in my employee highlight from earlier this year. To put user experience and interaction design at the fore of this project, I chose to use an operating system and software with which I was already familiar so as not to waste cycles on learning new technology. In future iterations, Clif and I would really like to explore recreating The Sounds of Commerce with a Raspberry Pi and open-source virtual instrument software.

The brain of our exhibit is a platform called Ableton Live, running on a MacBook Pro. Besides the fact that I am already well-versed in Live, it was the obvious choice for an interactive exhibit because it was designed to support non-linear live performance instead of linear, fixed compositions.

What do I mean by that? Your favorite song was written by a human. It tells a story. It has a beginning, a middle and an end. The software used to record such music mimics a tape recorder. You can see the linear sequence of events that make up the song:

But The Sounds of Commerce isn’t a traditional song. It is neither finite in length nor fixed in composition. It’s an audio experience that is always on (well, at least during the hours of Moogfest). Anyone can walk up or walk away at any moment, so the music is both ever-present and ever-changing.

Ableton looks more like a matrix of music clips than a traditional linear roll of music. What I built in Ableton was a set of clips of commerce data, all of them different lengths and speeds, using the various time compressions in Clif’s library. Ableton made sure they were always playing, endlessly looping, phasing in and out with each other. Any clip could start and stop at any time, independent of any other clip. In this screenshot, you can see that the columns represent different kinds of data (email sends, opens, clicks, conversions), and the rows contain different corresponding files from the library.

It took me longer than I anticipated to curate a set of clips that could sound good no matter how they were combined (all together, just a few at a time, each playing from different locations within the clip). To make an analogy to visual arts, painters are usually able to precisely define their color palettes. With The Sounds of Commerce, I had no such advance control. This was not my piece to compose, but I at least needed to find a way to make sure all the colors, or sounds in our case, didn’t combine to make an ugly, discordant brown.

Finding the Right Sound

Ultimately, mine was a process of limiting parameters so that the end user down the line could safely modulate them and still enjoy the results. Clif talked about time compression in his article, and I’d like to add the equally important pitch quantization we applied to the data. In a way, we “downsampled the data” for the sake of music.

Technically, sound waves ride on a continuous frequency spectrum. Western music resolves pitch into one of 12 possible tones that repeat every octave. What’s even more common in Western music is to focus on a subset of just seven of those 12 notes – this is called a diatonic scale. Without getting deep into music theory and cognition, combinations of notes limited to the pitches of a diatonic scale tend to sound more pleasant than combinations drawn from all 12 (chromatic) notes. Clif used MIDITime to constrain the data to the notes of the D Dorian scale. (Fun fact: This is the same scale on which “So What,” the opening track of the Miles Davis album Kind of Blue, is built.)
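Clif’s article covers the actual MIDITime conversion, but purely as an illustration of the quantization idea – the function name, value ranges and note range below are hypothetical, not his code – here’s a short Python sketch that scales a data value into a MIDI note range and then snaps it to the nearest D Dorian pitch:

```python
# Illustrative sketch of the pitch quantization idea, not the project's
# actual MIDITime code: scale a data value into a MIDI note range, then
# snap it to the nearest note of the D Dorian scale.

D_DORIAN_PITCH_CLASSES = {2, 4, 5, 7, 9, 11, 0}  # D E F G A B C

def quantize_to_d_dorian(value, lo, hi, low_note=50, high_note=86):
    """Map `value` (between lo and hi) into [low_note, high_note],
    then snap the result to the closest D Dorian pitch."""
    raw_note = low_note + (value - lo) / (hi - lo) * (high_note - low_note)
    nearest = round(raw_note)
    # Walk outward from the nearest semitone until we land in the scale.
    for offset in range(12):
        for candidate in (nearest - offset, nearest + offset):
            if candidate % 12 in D_DORIAN_PITCH_CLASSES:
                return candidate
    return nearest  # unreachable; every octave contains scale tones

print(quantize_to_d_dorian(4200, lo=0, hi=10000))  # -> 65 (F above middle C)
```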

Even with pleasant-sounding notes, it took me a while to find the right sound, or timbre, for each type of data. There are actually 12 tracks in The Sounds of Commerce run-time, not four, because I chose a slow, a medium and a fast file for each type of email data we were using: sends, opens, clicks and conversions. That way, participants had more variety of surface rhythm to choose from. I can’t claim much artistic significance to the sounds chosen. Sends, opens and clicks got synthetic pads, bells and leads, and conversions got percussion sounds. Slow files worked well with long, low pad notes; fast files got short sounds with quick decays, or else we’d have gotten that discordant sound.
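To make that layout concrete, here’s a rough sketch of the resulting library structure. The file names and timbre labels are illustrative placeholders (the pad/bell/lead assignment among sends, opens and clicks is a guess), not the actual Ableton set:

```python
# Illustrative layout of the 12 run-time tracks: four event types, each
# with a slow, medium and fast rendering from Clif's MIDI library.
# File names and timbre labels are placeholders, not the real assets.
CLIP_LIBRARY = {
    "sends":       {"timbre": "synth pad",  "files": ["sends_slow.mid", "sends_medium.mid", "sends_fast.mid"]},
    "opens":       {"timbre": "bells",      "files": ["opens_slow.mid", "opens_medium.mid", "opens_fast.mid"]},
    "clicks":      {"timbre": "synth lead", "files": ["clicks_slow.mid", "clicks_medium.mid", "clicks_fast.mid"]},
    "conversions": {"timbre": "percussion", "files": ["conversions_slow.mid", "conversions_medium.mid", "conversions_fast.mid"]},
}

track_count = sum(len(entry["files"]) for entry in CLIP_LIBRARY.values())
print(track_count)  # 12
```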

Adding the Human Element

The final step in my process was to make the playback of all 12 files interactive. The concept of letting a keyboard player or DJ press buttons, spin knobs and move sliders to affect the sounds is native to Ableton. I am proud that the Bronto team built five of its own custom MIDI controllers instead of using off-the-shelf professional gear. We will detail the design and construction of those controllers later in this series.

As it turns out, the MIDI specification doesn’t just cover musical events such as notes and rhythms. It also defines a family of control signals called continuous controllers (CCs). Each of our five controllers transmitted on a designated MIDI channel (1-5), and each individual knob or slider transmitted on a unique CC number. Ableton makes it easy to map those control signals to modulation parameters in the software. Here’s a screenshot:
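Those control-change messages are simple events on the wire. As a hedged illustration (not our controller firmware, which we’ll cover later in the series), here’s how one might send the same kind of message from Python with the mido library; the port, channel and CC number are placeholders:

```python
import mido

# Hypothetical sketch of a single knob turn as it appears on the wire.
# Port selection, channel and CC number are placeholders, not our real assignments.
outport = mido.open_output()  # default MIDI output port; names vary by system

# A knob on "controller 3" transmits on MIDI channel 3 (mido channels are
# 0-indexed, hence channel=2) using a made-up CC number of 21.
outport.send(mido.Message('control_change', channel=2, control=21, value=96))
```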

I wanted any given interaction to produce an immediate and perceptible result. We designed our controllers to be whimsical, retro-arcade game boxes. Each knob and slider was purposely left unlabeled so the user would have to discover and explore its function. Some of the modulations acted upon the music/MIDI data – shifting the pitches of the notes wildly up or down, changing the length of the notes, even a few switches that would turn on chaos mode, which loosened the seven-note diatonic quantization and reverted to the full 12 chromatic notes. Other modulations acted on the resulting audio, more like what a DJ might do – adding effects like reverb, delay, sweeping filters and bit crushers. Still other buttons and knobs were reserved for the visual portion of the installation, which we will detail in a future article.
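In our exhibit those mappings lived inside Ableton rather than in code, but to make the chaos-mode idea concrete – and to hint at how the Raspberry Pi version we mentioned earlier might handle it – here is a hedged sketch of reacting to a mapped switch by loosening the quantization from D Dorian to the full chromatic set. The CC number and threshold are invented for the example:

```python
D_DORIAN = {0, 2, 4, 5, 7, 9, 11}   # pitch classes of D Dorian
CHROMATIC = set(range(12))          # all 12 pitch classes

CHAOS_CC = 64        # invented CC number for the knife switch
chaos_mode = False   # current state of the switch

def handle_control_change(control, value):
    """Flip chaos mode when the mapped switch is thrown (value past the midpoint)."""
    global chaos_mode
    if control == CHAOS_CC:
        chaos_mode = value >= 64

def allowed_pitch_classes():
    """Quantization target: diatonic by default, fully chromatic in chaos mode."""
    return CHROMATIC if chaos_mode else D_DORIAN
```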

Even though there were no clear labels on the physical controls themselves, I did put thought into what sort of biomechanical action related most appropriately to each sort of sound. Circular knobs lend themselves to continuous controls such as volume and amount of reverb, while knife switches trigger a binary state, such as turning chaos mode on or off.

The mapping process was extremely iterative, right up to the last possible minute before Moogfest began, and I even made revisions on the second and third days based on anecdotal feedback from festival-goers and the Bronto employees who staffed our Moogfest booth.

I was honored to have had the opportunity to work on this project, and I’m grateful to everyone who shared their ideas, offered creative feedback, devoted time outside of work to build things and participated in the booth to interact with festival-goers.