The Science of Bass Drops!

Video

Edit April 3, 2014: Hey everyone! Hope no one minded our little April Fool’s joke. That said, we’d love to actually make this video. Who wants to do some research? (We do.)

In this video, we explore the science behind bass drops. Why are they such a staple of electronic music? Is it the tension and release? Does it have to do with the way the pitch drops? Do we respond to pitch the same way, cognitively, as we do to dynamics? Are we, in fact, evolutionarily inclined to love the bass, or at least to respond to it? This video is best listened to on Beats headphones or the phattest sub you own. Please don’t use your MacBook speakers.

A shout out to producer Evan Scott for providing the music for this week’s episode. https://soundcloud.com/evanscottproducer

The Science of Wikidrummer

Video

We’re back from spring break with a new video!

Drummer Julien Audigier and Audio Zéro put together the Wikidrummer video, an exploration of different recording environments and the effect those spaces have on drum sounds.

In this video, we break down the science of why all of these spaces sound so different. We’ll also show you our own reverb experiment and explain how you can fold the sound of a physical space into your music, even if that music was recorded somewhere else.
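If you want to try the idea at home, the basic trick is convolution reverb: capture an impulse response (a clap, a balloon pop, or a sine sweep) in the space you like, then convolve your dry track with it. Here’s a minimal sketch of that idea in Python; the filenames and the scipy-based approach are just our assumptions for illustration, not what we used in the video.

```python
# Minimal convolution-reverb sketch. "dry.wav" and "ir.wav" are
# hypothetical filenames: your dry recording and an impulse response
# captured in the space whose sound you want to borrow.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

rate, dry = wavfile.read("dry.wav")
ir_rate, ir = wavfile.read("ir.wav")
assert rate == ir_rate, "resample so both files share one sample rate"

# Work in mono floats for simplicity.
dry = dry.astype(np.float64)
ir = ir.astype(np.float64)
if dry.ndim > 1:
    dry = dry.mean(axis=1)
if ir.ndim > 1:
    ir = ir.mean(axis=1)

# Convolving the dry signal with the room's impulse response effectively
# "plays" the recording through that room.
wet = fftconvolve(dry, ir)
wet /= np.max(np.abs(wet))  # normalize to avoid clipping

wavfile.write("wet.wav", rate, (wet * 32767).astype(np.int16))
```

Because the impulse response captures every reflection the space adds to a single burst of sound, the convolution spreads your dry signal through those same reflections.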

Chroma from Harmonix: Why I’m Excited as a Gamer and Musician

Harmonix (the creators of Rock Band) are prototyping Chroma, a video game that combines the mechanics of first-person shooters (FPS) with those of music games such as Rock Band and Audiosurf. At its core it functions as an FPS, but here the soundtrack takes center stage, literally altering the course of battle. Musical gameplay mechanics include rhythmically placed speed pads, beat-matching combos for increased damage, a constantly changing map based on the selected song, and, last but certainly not least, a gun that literally shoots music.

From a technology standpoint, this game is really something to look at. There is a ton happening under the hood to make each feature run smoothly in every match: continuous spectrum analysis, calculation, and data manipulation have to happen every second, on top of all the nitty-gritty programming that makes the game a game in the first place. It’s an example not only of how powerful musical analysis can be, but of how signal processing theory can translate into gameplay.
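We have no idea how Harmonix actually implements any of this, but to give a flavor of what “continuous spectrum analysis” means in practice, here’s a hypothetical sketch of one common building block: computing spectral flux frame by frame and flagging likely beat onsets. Every name and number in it is our own illustration, not Harmonix’s code.

```python
# Hypothetical sketch of the kind of per-frame analysis a music-driven
# game might run; our own illustration, not Harmonix's actual code.
import numpy as np

FRAME = 1024  # samples per analysis frame (~23 ms at 44.1 kHz)
HOP = 512     # hop between frames

def spectral_flux(audio, sample_rate):
    """Per-frame spectral flux: how much the spectrum grew since the
    previous frame. Peaks in this signal tend to line up with beats."""
    window = np.hanning(FRAME)
    prev_mag = np.zeros(FRAME // 2 + 1)
    flux = []
    for start in range(0, len(audio) - FRAME, HOP):
        mag = np.abs(np.fft.rfft(audio[start:start + FRAME] * window))
        # Count only energy increases so note decays don't register as onsets.
        flux.append(np.maximum(mag - prev_mag, 0.0).sum())
        prev_mag = mag
    return np.asarray(flux), sample_rate / HOP  # flux and its frame rate

def onset_frames(flux, sensitivity=1.5):
    """Indices of frames whose flux jumps well above a local average."""
    local_avg = np.convolve(flux, np.ones(16) / 16, mode="same")
    return np.where(flux > sensitivity * local_avg)[0]
```

A game loop could then hook map transitions or damage multipliers to those onset frames and score each shot by how close it lands to one.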

Looking at this purely from a gamer’s standpoint, an ever-changing map adds a cool dynamic to a match. Perhaps you, as a player, learn the exact moment in a song when the map will change, and even where the optimal cover will be once the shift happens. Perhaps the other team does as well. As the song approaches that transition, both teams rush to the same part of the map and an intense fight ensues. Essentially, the flow of gameplay improves as a direct result of the music.

Another example: perhaps your team has a sniper who just so happens to be a drummer with perfect rhythm. You’ll get combos for days. I mean, can you say “BOOM, HEADSHOT”? If features like these are executed properly, the FPS community could have another arena-style shooter akin to Quake and Unreal Tournament on its hands, except with more music, which we could totally get behind.

But it’s important to note (ha) that the musical elements also add pitfalls that need to be avoided. Guns that produce musical tones could get annoying very quickly if they aren’t automated. Not only that, if players are penalized for shooting out of rhythm or in the wrong melodic context, griefing your own team becomes far too easy: think feeders in League of Legends, except these heartless demons ruin the song, the primary driver of match dynamics, while they’re at it, which would quite literally ruin the match. And how would song choice play into a match? The team that picks the song would have a clear advantage, especially if they know all of that song’s map transitions (see the example above). Either the song choice or the way the terrain modulates needs to be randomized to maintain some level of balance from match to match.

An interesting decision by Harmonix is the use of original music produced by the studio. Think back to the days of Guitar Hero for a second: you played along with music by mainstream artists. We’re not trying to knock the folks producing the music over at Harmonix, but in-house tracks have to be really killer not to get stale. Obtaining song licenses is complicated, but licensed music would be a huge draw for the game, especially if players were able to do battle to their favorite songs. Think about it: if you thought DragonForce’s “Through the Fire and Flames” on Expert was crazy, can you imagine that level of chaos with guns and modular terrain added? That. I want that to be a thing.

Chroma is shaping up to be a very interesting player in the video game scene. With games like Rocksmith and Audiosurf gaining popularity, it would appear that the current generation of gamers is intrigued by game mechanics driven by music. We’re going to keep an eye on this as development continues, and we’ll certainly give it a shot when it releases. I’m already excited to fry a screaming 12-year-old with my dubstep raygun of death.

Where Music, Education, and Technology Collide

We at the Science of Music are firmly perched at the intersection of music and technology, but we’re standing at a few other crossroads as well.

With technology changing the way we look at art (and vice versa), it’s only to be expected that it’s also changing the way we look at education. We’re especially interested in how technology is reshaping music education.

A new addition to NYU’s faculty in 2013, Alex Ruthmann also serves as the President of the Association of Technology in Music Education. We’re happy to have him at NYU, and especially happy to talk with him about the future of music education, why people should learn music, and what it means to be both an innovator and an educator. For more from Alex, you can find him on Twitter as @alexruthmann or visit him at his website.

Streaming is Doomed, Slash Heads a Hackathon, and Why Not a Musical First Person Shooter?: February 17-23

Generator Research has an alarming study for Pandora, Spotify, and other music streaming services: under their current business models, they are doomed to be unprofitable. The Generator report offers information on the top thirteen music streaming services, and one gloomy outlook. Meanwhile, last week saw the launch of yet another social streaming app (musx) and a potential partnership between Steam and Spotify. Pandora, for its part, is trying to take fate into its own hands by taking music publishers to court over a century-old royalty agreement. Since Generator’s study found that the biggest drain on resources was the royalties subscription streaming services have to pay publishers, rulings against the music publishing giants could upset the entire industry.

Our official response:

Colbert eats popcorn

Colbert don’t care

Our weekly nerdout is a little more metal than usual, because Slash is holding a hackathon at the SXSW music festival! An ’80s hair-metal icon holding a hackathon makes hackathons cool, right? Like, not just nerdy cool but cool-kid cool? Right? Okay, we might be reaching a little. But we do love our hackathons, and we just wish other people loved them too.

Speaking of love, how deep doth our love for music flow? Well, we love music a whole bunch, but the “our” in that sentence actually refers to humanity. National Geographic notes that every human civilization has engaged in some form of musical development, but a sense of rhythm doesn’t just start with humans…

…and doesn’t stop in just this reality. Video game developer Harmonix (of Guitar Hero fame) is creating a first-person shooter. Based around music. Chroma, a collaboration with Hidden Path Entertainment, will allow players to use their musical prowess to jump higher, reload faster, and generally enjoy all the advantages bestowed on musicians in real life.

Finally this week, Slate has printed the words of one of NYU Music Tech’s own. Grad student Ethan Hein took on the issue of pop music pedagogy vs. more traditional techniques in response to a Quora question. Is he right? Shoot us an email or let us know in the comments!

Got a tip? Send us a message at scienceofmusicnyu@gmail.com, tweet at @NYUSciOfMusic, or post it to our Facebook page.

Analog vs. Digital: The Age-Old Question

Video

Can you really hear a difference between analog and digital recording? What about in synthesis? In a personified rap battle between analog and digital, who would win?

NYU students, having nothing better to do in our ivory tower of academia (I mean, it’s not like we live in one of America’s biggest commercial and cultural centers or anything), love contemplating age-old questions such as these. You may remember some freshmen who previously pondered whether it was better to auto-tune or not to auto-tune. And while these questions may prove inconclusive, the best part is, as always, in the debate.

So our freshmen rose to the occasion once again to bring forth this rousing performance: a call to arms for their fellow rap battlers. And a cry heard throughout the land did ring: Who won? Who lost? You decide!*

*No seriously, leave a comment below.

Wah Pedals: A Look “Under the Hood”

Video


Since their invention in 1966, wah pedals have had a prolific history in modern rock music, from Jimi Hendrix’s “Voodoo Child” to David Gilmour on “Echoes” (where one was used backwards). But what goes into a wah pedal? How does it get that distinctive “cry baby” sound? Let’s get under the hood and find out.
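For the tinkerers who can’t wait for the circuit walkthrough: at its heart, a wah behaves like a resonant filter whose peak frequency sweeps up and down as you rock the pedal. Here’s a rough Python sketch of that behavior; the sweep range, bandwidth, and block size are our own illustrative guesses, not measurements of a real Cry Baby circuit.

```python
# Rough sketch of a wah effect as a swept resonant band-pass filter.
# Sweep range, bandwidth, and block size are illustrative guesses,
# not measurements of a real Cry Baby circuit.
import numpy as np
from scipy.signal import butter, sosfilt

def wah(audio, sample_rate, rock_hz=1.5, low=400.0, high=2200.0, width=0.3):
    """Sweep a narrow band-pass filter up and down over the signal.

    rock_hz : how fast the virtual foot rocks back and forth
    low/high: sweep range for the filter's center frequency (Hz)
    width   : filter bandwidth as a fraction of the center frequency
    """
    audio = np.asarray(audio, dtype=np.float64)
    out = np.zeros_like(audio)
    block = 256  # re-tune the filter every few milliseconds
    for start in range(0, len(audio), block):
        t = start / sample_rate
        pos = 0.5 * (1 + np.sin(2 * np.pi * rock_hz * t))  # pedal position, 0..1
        center = low + pos * (high - low)
        band = (center * (1 - width / 2), center * (1 + width / 2))
        sos = butter(2, band, btype="bandpass", fs=sample_rate, output="sos")
        # A real implementation would carry filter state across blocks to
        # avoid clicks at block boundaries; this keeps the sketch simple.
        out[start:start + block] = sosfilt(sos, audio[start:start + block])
    return out
```

Run it on a funky guitar loop and you’ll hear the familiar vowel-like sweep, even though the pedal on your board does it with an inductor and a potentiometer rather than a digital filter.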