Researchers at Carnegie Mellon University seem to have solved the age-old problem of microphone leakage using just a few inexpensive video cameras. Instead of recording sound with microphones, the cameras see the vibrations created by individual instruments and translate them into sound waves.
No matter how good the microphones are, there will always be drawbacks when recording multiple instruments at once. To edit a single instrument's sound in post-production, you must first isolate it. This is why recording studios are usually set up with more than one room, or why instrument tracks are recorded separately and layered on top of each other.
But recording music this way takes a lot of time, equipment and money, and it's not the most efficient way to get the best performance from musicians who are more accustomed to playing live together in a grunge bar than separately in a sterile studio environment.
What if you could record all the musicians playing together but also retain the ability to mix each instrumental track separately without bleed?
Well, that's why researchers at the Carnegie Mellon University School of Computer Science's Robotics Institute turned to video cameras instead. If you pluck the string of a guitar or a violin, you are not only creating sound waves, you can actually see the vibration. With the right equipment, these vibrations can be viewed and analyzed to recreate the sounds produced, even if no sound is recorded.
Optical microphones are not a new idea, but what CMU researchers have proposed and shared in a recently published paper, ‘Dual Shutter Optical Vibration Detection,’ is a way to make them work using low-end, more affordable camera gear.
The new system shines a bright laser light source onto a vibrating surface, such as the body of a guitar, and captures the movements of the resulting speckle pattern. Human hearing extends to sounds oscillating as quickly as 20,000 times per second, so optical microphones have typically relied on expensive high-speed cameras to capture physical vibrations oscillating just as rapidly.
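As a rough back-of-the-envelope check (the function name and calculation below are illustrative, not from the paper): by the Nyquist criterion, faithfully capturing a vibration requires sampling it at least twice per cycle, which is why a conventional camera would need an enormous frame rate to work as an optical microphone.

```python
# Illustrative Nyquist calculation: the minimum sampling (frame) rate
# needed to capture a vibration of a given frequency without aliasing.

def min_sampling_rate_hz(max_frequency_hz: float) -> float:
    """Nyquist criterion: sample at least twice the highest frequency."""
    return 2.0 * max_frequency_hz

# The upper limit of human hearing is about 20,000 Hz, so a camera
# sampling whole frames would need roughly 40,000 frames per second.
print(min_sampling_rate_hz(20_000))  # 40000.0
```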
However, the CMU team got around this problem by using two cameras at the same time: one with a global shutter, which captures entire frames at once and produces distinct speckle patterns, and one with a rolling shutter, which captures each frame line by line from the top of the sensor to the bottom, producing distorted speckle patterns that actually contain more information about how the surface moves through time.
The images from the two cameras are then compared by a special algorithm that recovers the vibration at up to 63,000 samples per second. That is as fast as a far more expensive high-speed optical microphone, at a fraction of the cost.
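The arithmetic behind that speedup can be sketched as follows. Because a rolling shutter exposes the sensor one row at a time, each row acts as its own snapshot of the speckle pattern, so the effective sampling rate is roughly the frame rate multiplied by the number of rows. The specific frame rate and row count below are assumed values chosen only to illustrate how a modest camera could reach the ~63,000 samples-per-second figure; they are not taken from the paper.

```python
# Illustrative sketch: with a rolling shutter, every sensor row captured
# per frame contributes one sample of the vibrating speckle pattern, so
# the effective rate is frames-per-second times rows-per-frame rather
# than just frames-per-second.

def effective_sample_rate_hz(fps: float, rows_per_frame: int) -> float:
    """Each sensor row read out per frame acts as one audio sample."""
    return fps * rows_per_frame

# Assumed example values: a 60 fps camera with 1,050 usable sensor rows.
print(effective_sample_rate_hz(60, 1050))  # 63000.0
```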
The audio quality is still not as high as a traditional microphone, but it could give sound engineers and producers more control when recording live music or very large ensembles such as symphony orchestras.
The system also has other potentially interesting applications aside from music. A video camera monitoring all the machinery in a factory, or pointed at the engine of a running car, could determine when an individual part or component is making an abnormal sound, signaling that maintenance may be needed before a small issue becomes a serious failure.