Researchers have found a remarkable way to listen in on other people's conversations: by analyzing video of objects, such as potato-chip bags and plant leaves, sitting near the source of the sound.
Researchers at MIT, Microsoft, and Adobe have developed an algorithm that can reconstruct an audio signal by analyzing minute vibrations of objects depicted in video. In one set of experiments, they were able to recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass.
In other experiments, they extracted useful audio signals from videos of aluminum foil, the surface of a glass of water, and even the leaves of a potted plant.
Abe Davis, a graduate student in electrical engineering and computer science at MIT, said that when sound hits an object, it causes the object to vibrate, and the motion of that vibration creates a very subtle visual signal that is usually invisible to the naked eye.
He further explained that recovering sound from an object reveals a great deal about the sound occurring around it, but it also reveals a great deal about the object itself, because different objects respond to sound in different ways.
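In essence, the technique turns a camera into a microphone: the tiny frame-to-frame motion of the object becomes the samples of an audio waveform. The researchers' actual algorithm relies on sophisticated sub-pixel motion analysis, but a deliberately crude sketch of the core idea might look like the following, which merely tracks mean frame brightness over synthetic frames (the frames_to_waveform helper, the frame rate, and all parameters here are illustrative assumptions, not the researchers' code):

```python
import numpy as np

def frames_to_waveform(frames):
    """Track mean frame brightness over time and treat its fluctuation
    as a vibration signal (a crude proxy for sub-pixel motion analysis)."""
    signal = np.array([frame.mean() for frame in frames], dtype=np.float64)
    signal -= signal.mean()            # drop the constant (average brightness) offset
    peak = np.abs(signal).max()
    if peak > 0:
        signal /= peak                 # normalize to [-1, 1]
    return signal                      # sample rate equals the video frame rate

# Synthetic demo: a "surface" whose brightness wobbles with a 440 Hz tone,
# captured at an assumed high-speed rate of 6,000 frames per second.
fps = 6000
t = np.arange(2400) / fps
frames = [np.full((8, 8), 128.0) + 2.0 * np.sin(2 * np.pi * 440.0 * ti)
          for ti in t]

waveform = frames_to_waveform(frames)
print(len(waveform), waveform[:4])
```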
The researchers have now begun trying to determine material and structural properties of objects from their visible response to short bursts of sound.
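A recovered vibration signal also carries an object's resonant frequencies, which is what makes inferring material and structural properties plausible. As a rough, hypothetical illustration (not the researchers' method), one could pick out the dominant resonance with a Fourier transform:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency in a vibration signal; an object's
    resonant peaks hint at its material and structural properties."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0                  # ignore the zero-frequency component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Demo: a synthetic "recovered" signal with a 440 Hz resonance plus noise.
rate = 6000
t = np.arange(2400) / rate
signal = np.sin(2 * np.pi * 440.0 * t) + 0.1 * np.random.randn(t.size)
print(dominant_frequency(signal, rate))  # prints roughly 440.0
```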
The technique has obvious applications in law enforcement and forensics, but Davis is more enthusiastic about the possibility of what he described as a "new kind of imaging."