Visual Microphones: The Future is Now

The minutest movements of plant leaves. A glass of water: deceptively still. A bag of chips, lying discarded on the table. One of these things may be slightly less poetic than the others, but they do have one thing in common: scientists from MIT can recover sound from all three.

Calling it “the Visual Microphone,” a team of researchers is using visual data to recover sound from videos of everyday objects. The objects appear perfectly still to the naked eye, but by analyzing the video, the researchers were able to pinpoint their modes of vibration.

From their abstract:

“When sound hits an object, it causes small vibrations of the object’s surface. We show how, using only high-speed video of the object, we can extract those minute vibrations and partially recover the sound that produced them, allowing us to turn everyday objects—a glass of water, a potted plant, a box of tissues, or a bag of chips—into visual microphones.”
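To get an intuition for how a video could carry sound, here is a toy sketch in Python. To be clear, this is not the MIT team’s method (they use phase variations in a complex steerable pyramid); it just shows the core idea that tiny per-frame changes in a filmed surface form a time series you can analyze like audio. All names and values here are illustrative.

```python
import numpy as np

def recover_signal(frames):
    """Toy 'sound' recovery: track the mean brightness of each frame.

    The real Visual Microphone tracks sub-pixel motion; averaging
    intensity is only a crude stand-in that works when the whole
    patch brightens and darkens with the vibration.
    """
    sig = np.array([f.mean() for f in frames], dtype=float)
    return sig - sig.mean()  # remove the DC offset

def dominant_frequency(signal, fps):
    """Return the strongest frequency component of the recovered signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[spectrum.argmax()]

# Synthetic demo: a 'surface' whose brightness wobbles at 440 Hz,
# filmed at 4000 frames per second (hence the need for high-speed video).
fps, tone = 4000, 440.0
t = np.arange(2000) / fps
frames = [np.full((8, 8), 128.0 + 0.5 * np.sin(2 * np.pi * tone * ti))
          for ti in t]
print(dominant_frequency(recover_signal(frames), fps))  # → 440.0
```

The demo also hints at why the researchers needed high-speed cameras: by the sampling theorem, a 4000 fps recording can only capture vibrations below 2000 Hz.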

The team, made up of Abe Davis, Michael Rubinstein, Neal Wadhwa, Gautham Mysore, Fredo Durand, and William T. Freeman, says on its website that it’s working on releasing code and data. For now, they’ve posted sound samples of their work. Check it out here.

The Science of Wikidrummer


We’re back from spring break with a new video!

Drummer Julien Audigier and Audio Zéro put together the Wikidrummer video, an exploration of different environments and the effect these spaces had on drum sounds.

In this video, we break down the science behind why all of these spaces sound different. We’ll also show you our own reverb experiment, and how you can incorporate physical spaces into your music, even if it was recorded somewhere else.
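One standard studio way to “incorporate a physical space” into a take recorded elsewhere is convolution reverb: you record the room’s impulse response (its reaction to a clap or other impulsive sound) and convolve your dry signal with it. We don’t know Wikidrummer’s exact production chain, so this is a generic sketch with made-up values.

```python
import numpy as np

def convolution_reverb(dry, impulse_response):
    """'Place' a dry recording in a room by convolving it with that
    room's sampled impulse response."""
    wet = np.convolve(dry, impulse_response)
    return wet / np.max(np.abs(wet))  # normalize to avoid clipping

# Toy demo: a single click through a crude, exponentially decaying 'room'
# (real impulse responses are recorded, not synthesized like this).
sr = 44100
dry = np.zeros(sr // 10)
dry[0] = 1.0                                  # one click
t = np.arange(sr // 2) / sr
ir = np.random.default_rng(0).standard_normal(t.size) * np.exp(-6.0 * t)
wet = convolution_reverb(dry, ir)
print(wet.size)  # length is len(dry) + len(ir) - 1
```

The convolved output is longer than the input because the room’s decay tail rings on after the dry signal ends, which is exactly the reverb you hear.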

Wah Pedals: A Look “Under the Hood”


Since their invention in 1966, wah pedals have had a prolific history in modern rock music, from Jimi Hendrix’s “Voodoo Child” to David Gilmour’s playing in “Echoes” (where the pedal was used backwards). But what goes into a wah pedal? How does it get that distinctive “cry baby” sound? Let’s get under the hood and find out.
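At its core, a wah is a resonant bandpass filter whose center frequency the player sweeps by rocking the pedal. As a rough digital sketch (not a model of any particular pedal’s circuit; the frequency range, Q, and rocking rate below are all made-up values), here is a swept biquad bandpass using the standard RBJ cookbook coefficients:

```python
import numpy as np

def bandpass_coeffs(fc, fs, q):
    """RBJ-cookbook bandpass biquad (constant 0 dB peak gain)."""
    w0 = 2 * np.pi * fc / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([alpha, 0.0, -alpha])
    a = np.array([1 + alpha, -2 * np.cos(w0), 1 - alpha])
    return b / a[0], a / a[0]

def wah(signal, fs, f_lo=400.0, f_hi=2000.0, q=5.0, rate_hz=2.0):
    """Sweep a resonant bandpass over the signal, recomputing the
    filter each sample as the 'pedal' rocks heel-to-toe and back."""
    out = np.zeros_like(signal)
    x1 = x2 = y1 = y2 = 0.0
    t = np.arange(signal.size) / fs
    # Pedal position: sinusoidal rock between heel (f_lo) and toe (f_hi).
    fc = f_lo + (f_hi - f_lo) * 0.5 * (1 + np.sin(2 * np.pi * rate_hz * t))
    for n, x in enumerate(signal):
        b, a = bandpass_coeffs(fc[n], fs, q)
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1, y2, y1 = x1, x, y1, y
        out[n] = y
    return out

# Demo: run one second of noise through the 'pedal'.
fs = 8000
noise = np.random.default_rng(1).standard_normal(fs)
swept = noise_out = wah(noise, fs)
```

The moving resonant peak is what produces the vowel-like “wah” vocal quality; an actual pedal achieves the sweep with an inductor-capacitor circuit and a potentiometer rather than recomputed coefficients.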

Hearing and the Ear: An Introduction

We’re going to be frank: it troubles us when musicians don’t take care of their ears. Hearing is the basis of everything we do. But as important as hearing is, how many of us actually know how it works? Physically, mechanically, acoustically?

10.1371 journal.pbio.0030137.g001-L-A (Photo credit: Wikipedia)

Let’s talk about how sound enters your ear. We have, of course, the external part of our ears. Without getting too deep into it, this part channels sound vibrations into the ear canal. The ear canal, also known as the external auditory canal, leads from the outer ear to the middle ear. Incidentally, the ear canal itself has a resonant bias in the 2 kHz to 7 kHz range, which means our ears are especially sensitive to frequencies important for understanding human speech.
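That resonance comes from simple acoustics: the canal behaves roughly like a tube closed at one end (the eardrum), i.e. a quarter-wave resonator. A back-of-the-envelope check, using a typical (not measured) adult canal length:

```python
# Quarter-wave resonator: fundamental frequency f = c / (4 * L).
speed_of_sound = 343.0   # m/s in air at ~20 °C
canal_length = 0.025     # m, a typical adult ear canal length

resonance = speed_of_sound / (4 * canal_length)
print(f"{resonance:.0f} Hz")  # → ~3430 Hz
```

The result lands squarely inside the 2–7 kHz boost region mentioned above, which is also where much of the consonant detail that makes speech intelligible lives.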

English: View-normal-tympanic-membrane (Photo credit: Wikipedia)

As the ear canal channels this air fluctuation, it causes the tympanic membrane (illustrated right) to move. The membrane vibrates with the compression and rarefaction of the sound wave: moving inward with the compression phase, and outward with rarefaction.

Auditory Ossicles in the Middle Ear (Photo credit: Wikimedia Commons)

Diagrammatic longitudinal section of the cochlea. (Photo credit: Wikipedia)

This in-and-out motion in turn causes the ossicles (three tiny bones in the middle ear: the malleus, incus, and stapes) to move. These bones act as a lever system that concentrates the force applied over the relatively large surface area of the tympanic membrane onto a much smaller opening, the oval window, which leads into the inner ear. Specifically, the oval window opens into the cochlea.
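How much does that concentration buy you? Using common textbook approximations (the exact figures vary between sources), the area ratio between the eardrum and the oval window, multiplied by the small mechanical advantage of the ossicular lever, gives the pressure gain:

```python
import math

# Textbook approximations; actual values vary between sources.
eardrum_area = 55.0      # mm^2, effective area of the tympanic membrane
oval_window_area = 3.2   # mm^2
lever_ratio = 1.3        # malleus-to-incus mechanical advantage

pressure_gain = (eardrum_area / oval_window_area) * lever_ratio
gain_db = 20 * math.log10(pressure_gain)
print(f"{pressure_gain:.1f}x ≈ {gain_db:.1f} dB")  # → 22.3x ≈ 27.0 dB
```

That roughly 27 dB boost is what lets airborne sound drive the much denser fluid of the cochlea instead of simply reflecting off it.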

The cochlea, an organ shaped a bit like a snail shell, is where the mechanical energy of the sound’s vibration is converted into a neural signal. The cochlea is hollow and filled with fluid, along with plenty of anatomically fascinating structures we won’t cover in this post. One structure we will talk about, however, is the basilar membrane, which is suspended inside the cochlea.

Sinusoidal drive through the oval window (top)... (Photo credit: Wikipedia)

When sound waves enter the cochlea through the oval window, they set the fluid inside into vibration, producing standing waves. This process decomposes complex sounds into their simplest, sinusoidal components. These standing waves peak at distinct locations along the basilar membrane, locations determined by the waves’ frequencies. As seen at right, the lower the frequency, the farther along the membrane the wave travels, and vice versa. These sinusoidal standing waves move the basilar membrane, which in turn causes the hair-like structures on the organ of Corti to vibrate as well.
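This frequency-to-place mapping is regular enough that it has a standard empirical formula, Greenwood’s function. For the human cochlea the commonly cited constants are A = 165.4, a = 2.1, k = 0.88, with position expressed as a fraction of the membrane’s length:

```python
def greenwood(x, A=165.4, a=2.1, k=0.88):
    """Greenwood's frequency-position map for the human cochlea.

    x is the fractional distance along the basilar membrane, from the
    apex (x = 0, low frequencies) to the base (x = 1, high frequencies).
    """
    return A * (10 ** (a * x) - k)

for x in (0.0, 0.5, 1.0):
    print(f"x = {x:.1f}: {greenwood(x):.0f} Hz")
```

Plugging in the endpoints recovers roughly 20 Hz at the apex and roughly 20 kHz at the base, i.e. the familiar range of human hearing laid out along a few centimeters of membrane.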

English: Organ of Corti (Photo credit: Wikipedia)

The organ of Corti contains several layers of hair cells, and the nerve endings on these hair-like structures are where the actual transduction from mechanical energy to nervous impulse takes place. We talk about transduction possibly too much, but this is an important topic! Make a note here, because this is how a sound vibration is translated into a signal the brain can understand. The neural signal travels to the brainstem and cerebral cortex, where it is interpreted as sound.

The anatomy of hearing, like the study of the ear itself, is a science of its own, and we’ve barely scratched the surface. Check out the links and some related articles for more!

Frequency Domain and EQ Basics

You see the frequency domain all the time when you use audio equalizers, but how clear are you on what it actually is? Learn how to master any EQ or spectral-analysis tool by watching this video on exactly what the frequency domain is, why it’s important in the music and audio field, and how, if you do any mixing whatsoever, you come across it all the time.

Once you’ve got this part down, you may be interested in learning about the actual method used to get a sound representation from the time domain to the frequency domain. If so, check out this link for more:….
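If you’d like to poke at the time-to-frequency conversion yourself, NumPy’s FFT makes it a few lines. This little sketch (tone choices ours, purely illustrative) builds a three-note “chord” in the time domain and reads its notes back out of the frequency domain:

```python
import numpy as np

sr = 8000                      # sample rate, Hz
t = np.arange(sr) / sr         # one second of audio
# An A-major-ish 'chord': three pure tones mixed together.
chord = (np.sin(2 * np.pi * 220 * t)
         + np.sin(2 * np.pi * 277 * t)
         + np.sin(2 * np.pi * 330 * t))

spectrum = np.abs(np.fft.rfft(chord))            # to the frequency domain
freqs = np.fft.rfftfreq(chord.size, d=1 / sr)    # bin -> Hz
# The three strongest bins fall exactly at the three tone frequencies.
peaks = sorted(freqs[np.argsort(spectrum)[-3:]])
print(peaks)  # → [220.0, 277.0, 330.0]
```

An EQ does this analysis (in short overlapping windows) behind the scenes; the sliders you move are just gains applied to bands of those frequency bins.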


  • Written and Directed by Travis Kaufman and Nick Dooley
  • Produced with support from The National Science Foundation

How to Use an SPL Meter

This video explains how to use a Sound Pressure Level (SPL) meter, an essential tool for measuring the intensity (think amplitude or volume) of a sound. Intensity is different from our perception of loudness, which is why a specialized instrument (the SPL meter) is needed.
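The number an SPL meter displays comes from a simple formula: the measured sound pressure expressed in decibels relative to 20 µPa, the standard reference near the threshold of hearing. A quick sanity check in Python:

```python
import math

P_REF = 20e-6  # Pa, standard reference pressure (≈ threshold of hearing)

def spl_db(pressure_pa):
    """Sound pressure level in dB re 20 µPa."""
    return 20 * math.log10(pressure_pa / P_REF)

print(f"{spl_db(1.0):.1f} dB SPL")  # → 94.0 dB SPL
```

The 1 Pa → 94 dB figure is worth remembering: it’s the level most calibrators produce when you’re checking a meter or a measurement mic.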


  • Written and Directed by Nick Dooley and Travis Kaufman
  • Produced with support from The National Science Foundation

Audio Interconnections: XLR, TRS and More

*Singing* The quarter inch tip is connected to the…positive! The sleeve is connected to the…ground wire!

Okay, we at the Science of Music are nerdy, but we’re not quite that nerdy, we swear. Still, understanding cables and interconnections will help every musician who has to deal with gear. From XLR to RCA to the immortal MIDI cable (which has two unused pins…seriously, check it out), understanding gear, mic, line, and insert cables will make your life infinitely easier.


  • Written and Directed by Nick Dooley and Travis Kaufman
  • Produced with support from The National Science Foundation