How to Enjoy Music: Listening for Experience

by Langdon Crawford

A student recently pointed me towards this article on mic.com. It discusses why we’ve lost touch with music. On the one hand, streaming and skipping on digital platforms are shrinking our attention spans for music. On the other, compressing music to fit bandwidth and storage constraints is limiting our experience of it.

I don’t for a second believe it’s the quality of the audio that is neutering our connection with music. It’s the ease with which we experience it.

Take a look back at the thought and attention required just to make ye olde mix tape.
Then when you listened to that mix tape, you actually listened. You didn’t edit, you didn’t skim, skip, or flip the tape over to get to another song.
And the quality was total crap. It was a cassette recorded either from the radio or from another cassette, which may itself have been another mix tape. The noise floor on these tapes was crazy.
But the tapes were awesome.
You listened to every song, because you knew what it took to make that tape. And when you listened, you listened not only to the tracks but also to the intention of the mix tape maker who chose to sequence one track after another.
You can make a playlist in iTunes or burn a CD in seconds. It’s the ease of rendering which has rendered the playlist almost trivial. It’s just a collection of songs.
When you receive a playlist from someone, you just sample it.  You listen and go, Hey, thats cool, then probably skip to the next song after the second chorus or bass drop.
All that said, you have to ask why people are listening to music in the first place. When I’m inhabiting my role as a music maker and educator, I listen to music very differently than I do when I’m driving to the airport or painting the house.
Each context involves different levels of attention (background noise vs. focused analysis) and different motivations (inspiration for my next piece or lecture vs. inspiration to move furniture while vacuuming). If I’m looking for something new, I might skip more tracks on Pandora. If I’m looking for the right vibe to keep me going while organizing photos on my computer, I might jump straight to the heavy section of “Bohemian Rhapsody”…
But if I’m looking to feel the feelings, I’m probably going to queue up the songs as presented on an artist’s album and let them play through, preferably in a chill space without any noise from Fox News or Buzzfeed. Vinyl records and cassettes are good for that, but it’s not because of the sound quality. It’s the fact that it’s not a computer; it’s the album and only that album. I don’t have a remote or an itchy trigger finger when playing a tape or record. It’s a time to listen and relax: no need to stress or spazz out each time there’s a series of four notes I don’t absolutely love, because the next good part is probably less than a minute away…
Take a break and enjoy music every once in a while.  It might just be good for you.

In the Studio: Signal Flow

We’ve got a new video! And more are in the works.

Understanding signal flow, the direction and path a signal takes through your system, is hugely important. In a recording studio, that path runs from the sound source to its storage destination. You can compare it to a production line: a product travels from the factory to the store shelf, and if it gets derailed anywhere along the way, it never arrives. A recording chain is no different; if the signal drops out anywhere between the microphone and the hard drive, nothing gets recorded. Sometimes I like thinking about sound as a similarly concrete thing.


Ear to the Sky: NASA’s New SoundCloud

NASA has a SoundCloud page, and it’s every bit as awesome as you’d imagine.

Part historical archive and part sound effects library, the account includes an entire set dedicated to JFK’s quotes about Apollo 11, as well as snippets of sound from space. My top three favorites are:

This is one of a series of light curves from stars observed by the Kepler mission, converted to sound. Note its rhythmic, periodic components.

Radio waves or wind in the tundra?

And finally, Voyager captures a sine wave sweep along with Jupiter’s lightning.

There are many more sounds to explore on the full SoundCloud page, including mission talk, rocket launches, and the strangely adorable sound of Juno saying hi in Morse code. Not only is this further proof that NASA knows what it’s doing when it comes to social media (who else follows the Curiosity rover on Twitter?), it’s also a fantastic new resource for composers and sound designers. And beyond that, NASA has given us a new way to experience our forays into the unknown. At a time when everyone could use a little inspiration, this is a wellspring. I hope it captures the public’s imagination.

Voltage Regulators: An Introduction

We finish off our Introduction to Electronic Components series with the voltage regulator, a useful little thing. Sometimes you have more voltage than you need, say when you’re working with a 9-volt battery but your circuit needs 5 volts. And if your project is especially finicky, a regulator can take a fluctuating input voltage and put out a perfectly constant one.
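
To get a feel for what that convenience costs, here’s a quick back-of-the-envelope example (assuming a common 7805-style linear regulator, which wants roughly 2 volts of headroom above its 5-volt output). Dropping a 9-volt battery down to 5 volts at 100 mA turns the difference into heat inside the regulator:

P = (Vin - Vout) x I = (9 V - 5 V) x 0.1 A = 0.4 W

That’s why regulators get warm, and why bigger projects bolt them to heat sinks.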

If you’re looking for more about other components (say, diodes, buttons and switches, resistors, and so on), check out the whole playlist!


Visual Microphones: The Future is Now

The minutest movements of plant leaves. A glass of water: deceptively still. A bag of chips, lying discarded on the table. One of these things may be slightly less poetic than the others, but they do have one thing in common: scientists from MIT can recover sound from all three.

Calling it “the Visual Microphone,” a team of researchers is using visual data to recover sound from videos of everyday objects. The objects appear perfectly still to the naked eye, but by analyzing high-speed video, the researchers were able to pinpoint their modes of vibration.

From their abstract:

“When sound hits an object, it causes small vibrations of the object’s surface. We show how, using only high-speed video of the object, we can extract those minute vibrations and partially recover the sound that produced them, allowing us to turn everyday objects—a glass of water, a potted plant, a box of tissues, or a bag of chips—into visual microphones.”

The team, made up of Abe Davis, Michael Rubinstein, Neal Wadhwa, Gautham Mysore, Fredo Durand, and William T. Freeman, says on its website that it’s working on releasing code and data. For now, they’ve posted sound samples of their work. Check it out here.

Home Foley Experiment Part 2: Revenge of the Foley Experiment!

Wait…there was a part 1?

That’s right, we’re finally posting the second part of our at-home Foley extravaganza! Last Thanksgiving we put together a Very Special Episode about a Foley fight scene. Later, we returned to the subject and created dinosaur chomps out of that same sound! Of course, Ronny crunching on some celery doesn’t have quite the menacing quality that an on-screen, Cretaceous bringer of doom and death properly deserves, so we’ll show you how to make your dino crunches sound big and scary.


Oh, and since this video was shot, our friend has seen the error of his innocent-finger-attacking ways and now lives peacefully in the NYU Studios.

A T. rex named Foley

Draw MIDI: A DIY Paper Circuit Project

Update: Alex won first place in NYU Music Technology’s 2014 product design competition! Congrats!


Today’s Project: Draw MIDI

(Production: Ronny Mraz, Adam November, and Kathleen “Ying-Ying” Zhang; full credits on YouTube)

Alex Haff’s Draw MIDI is a digital project: it uses capacitance sensing to pick up touches on a pencil-and-paper keyboard, converts them to MIDI data with an Arduino, and then passes that data to your computer, where a Max patch takes over. That may sound complicated, but it’s quite simple once you understand the function of each part of the project.

System Requirements and Code

While this project can technically be done on Windows, it takes a bit of finagling. We recommend Mac OS X.

Because Draw MIDI is a digital project, you’ll also need some code. For this project, Alex used both an Arduino sketch and a Max patch. You can find each of them here and here.

The Arduino sketch gets uploaded to the board through the Arduino software on your computer. The Max patch will need to be run in Max. If you don’t own Max, never fear: you can open the patch in Max’s free runtime application.
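
If you’re wondering what the capacitance-to-MIDI step actually looks like, here’s a stripped-down sketch of the general idea. To be clear, this is not Alex’s code (that’s at the links above): it assumes the widely used CapacitiveSensor Arduino library, a single graphite “key” on sense pin 2 with a roughly 1-megohm resistor to send pin 4, and placeholder values for the touch threshold and note number.

// Minimal capacitive-touch-to-MIDI sketch (illustrative only).
// One pencil-drawn "key": a resistor between send pin 4 and sense pin 2,
// with the graphite pad wired to pin 2.
#include <CapacitiveSensor.h>

CapacitiveSensor key = CapacitiveSensor(4, 2);   // (send pin, sense pin)

const long THRESHOLD = 200;    // tune this by watching the raw readings
const byte NOTE = 60;          // middle C, as a placeholder
bool noteIsOn = false;

void setup() {
  Serial.begin(115200);        // the Max patch reads these bytes off the serial port
}

void loop() {
  long reading = key.capacitiveSensor(30);       // average of 30 samples

  if (reading > THRESHOLD && !noteIsOn) {
    // Finger on the pad: send a MIDI note-on (status, note, velocity)
    Serial.write(0x90); Serial.write(NOTE); Serial.write((byte)100);
    noteIsOn = true;
  } else if (reading <= THRESHOLD && noteIsOn) {
    // Finger lifted: send the matching note-off
    Serial.write(0x80); Serial.write(NOTE); Serial.write((byte)0);
    noteIsOn = false;
  }
}

The real project does this for a whole keyboard’s worth of pads, with the Max patch on the computer side turning the incoming serial bytes into notes your software can use.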

Step-by-Step

NYU Music Technology is an active member of the Instructables.com community. As such, you can find a step-by-step guide to building the project here.

Edit: Thanks for the feature, Instructables!

We had a lot of fun with this week’s project and hope you will, too! A special thanks to Langdon Crawford for cleaning up and hosting the code.

Paper Circuits

Paper circuits have been making their way around the tech world because their low-cost components give them great potential for cheap mass production. For DIYers, they offer the same affordability plus easy-to-find materials, and they’re just plain fun. There’s a novelty in creating something interactive from ink and graphite. Usually our words and drawings can’t fly off the page, but with the addition of electronics they can light up or be heard.

For an industry application of paper circuits, watch our talk with educator Alex Ruthmann.

For some inspiration, check out this Ted Talk, “DJ Decks Made of Paper” by Kate Stone.

And for some great, simple projects, the Exploratorium’s Tinkering Studio has you covered.


The Science of Bass Drops!


Edit April 3, 2014: Hey everyone! Hope no one minded our little April Fool’s joke. That said, we’d love to actually make this video. Who wants to do some research? (We do.)

In this video, we explore the science behind bass drops. Why are they so popular in electronic music? Is it the tension and release? Do we respond to pitch the same way, cognitively, as we do to dynamics? Are we, in fact, evolutionarily inclined to love the bass, or at least to respond to it? This video is best listened to on Beats headphones or the phattest sub you own. Please don’t use your MacBook speakers.


A shout out to producer Evan Scott for providing the music for this week’s episode. https://soundcloud.com/evanscottproducer

The Science of Wikidrummer


We’re back from spring break with a new video!

Drummer Julien Audigier and Audio Zéro put together the Wikidrummer video, an exploration of different environments and the effect these spaces had on drum sounds.

In this video, we break down the science behind why all of these spaces sound so different. We’ll also show you our own reverb experiment and how you can incorporate physical spaces into your music, even if your music was recorded somewhere else.
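
The standard trick for borrowing a real space, and presumably part of what our experiment touches on, is convolution reverb: capture a short impulse response of the room (a clap, a balloon pop, a starter pistol), then convolve your dry recording with it. As a rough illustration of the math only (real tools use much faster FFT-based convolution), a naive version looks like this:

// Naive convolution reverb: apply a recorded room impulse response (IR)
// to a dry signal. Deliberately simple and slow (O(N*M)), just to show
// how every dry sample excites a copy of the room's response.
#include <vector>

std::vector<float> convolve(const std::vector<float>& dry,
                            const std::vector<float>& ir) {
  std::vector<float> wet(dry.size() + ir.size() - 1, 0.0f);
  for (size_t n = 0; n < dry.size(); ++n) {
    for (size_t k = 0; k < ir.size(); ++k) {
      wet[n + k] += dry[n] * ir[k];
    }
  }
  return wet;   // blend this back with the dry signal to taste
}

Swap in impulse responses from different rooms and the same dry drum take will sound like it was recorded in each of those spaces.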

Chroma from Harmonix: Why I’m Excited as a Gamer and Musician

Harmonix (the studio behind Rock Band) is prototyping Chroma, a video game that combines the mechanics of first-person shooters (FPS) with those of music games like Rock Band and Audiosurf. At its core it functions as an FPS, but here the soundtrack takes center stage, literally altering the course of battle. Musical gameplay mechanics include rhythmically placed speed pads, beat-matching combos for increased damage, a map that constantly changes based on the selected song, and, last but certainly not least, a gun that literally shoots music.

From a technology standpoint, this game is really something to look at. There is a ton happening under the hood to make each feature run flawlessly in every match: continuous spectrum analysis, calculations, and data manipulation have to happen every second, on top of all the nitty-gritty programming that makes this a game in the first place. It’s not only an example of how powerful musical analysis can be, but of how music translates into signals that can be analyzed and put to work in real time.
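
We obviously have no idea how Harmonix implements any of this, but some of the music-driven pieces are conceptually simple. As a toy illustration (the function name, fixed tempo, and timing window are all made-up placeholders, not anything from the game), a beat-matching combo check really just asks how far a shot landed from the nearest beat:

// Toy beat-matching check, not Harmonix's code: did a shot land close
// enough to the nearest beat of the current song to count as "on beat"?
#include <cmath>

bool onBeat(double shotTimeSec, double bpm, double windowSec = 0.07) {
  double beatLength = 60.0 / bpm;                               // seconds per beat at this tempo
  double nearestBeat = std::round(shotTimeSec / beatLength) * beatLength;
  return std::fabs(shotTimeSec - nearestBeat) <= windowSec;
}

The real system would presumably track beats from the audio itself rather than a fixed BPM, which is where all that continuous analysis comes in.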

Looking at this purely from a gamer’s standpoint, an ever-changing map adds a cool dynamic to a match. Perhaps you, as a player, learn the exact moment in a song when the map will change, and even know where the optimal cover will be once the shift happens. Perhaps the other team knows it too. As the song approaches that transition, both teams will rush to the same part of the map and an intense fight will ensue. Essentially, the flow of gameplay has improved as a direct result of the music.

Another example: perhaps your team has a sniper who just so happens to be a drummer with perfect rhythm. You’ll get combos for days. I mean, can you say, “BOOM, HEADSHOT”? If features like these are executed properly, the FPS community could have another arena-style shooter akin to Quake and Unreal Tournament on its hands, except with more music, which we could totally get behind.

But it’s important to note (ha) that the musical elements introduce plenty of pitfalls. Guns that produce musical tones could get annoying very quickly if they aren’t automated. Not only that, if players are penalized for shooting out of rhythm or in the wrong melodic context, griefing your own team becomes far too easy. Think feeders in League of Legends, except these heartless demons ruin the song (the primary driver of match dynamics) while they’re at it. That would literally ruin a match. And how would song choice play into a match? The team that picks the song would have a clear advantage, especially if they know all of that song’s map transitions (see the example above). Either the song choice or the way the terrain modulates needs to be randomized to maintain some balance from match to match.

An interesting decision by Harmonix is the use of original music produced in-house. Think back to the days of Guitar Hero: you played along with music by mainstream artists. We’re not trying to knock the folks producing the music over at Harmonix, but the tracks have to be really killer not to get stale. Obtaining song licenses is complicated, but licensed music would be a huge draw for the game, especially if players could do battle to their favorite songs. Think about it. If you thought DragonForce’s “Through the Fire and Flames” on expert was crazy, can you imagine that level of chaos, except with guns and modular terrain? That. I want that to be a thing.

Chroma is shaping up to be a very interesting player in the video game scene. With games like Rocksmith and Audiosurf gaining popularity, it would appear that the current generation of gamers is intrigued by game mechanics driven by music. We’re going to keep an eye on this as development continues, and we’ll certainly give it a shot when it’s released. I’m already excited to fry a screaming 12-year-old with my dubstep raygun of death.