If you are holding a packet of potato chips and talking to someone in a very low voice, confident that no one else can hear or is listening to you, you might be wrong. Recently, MIT researchers, together with researchers at Microsoft and Adobe, developed an algorithm that can listen in on your conversation by watching your potato chip bag.
According to the researchers, the algorithm can “reconstruct an audio signal by analyzing minute vibrations of objects”. In this case, the object is a potato chip bag. In one experiment, the researchers placed a potato chip bag 15 feet away, behind a pane of soundproof glass, and pointed a video camera at it. With the camera, they captured the tiny vibrations of the bag while someone nearby was speaking, and the algorithm reconstructed those tiny vibrations into intelligible speech. See the video.
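The core idea can be sketched in a toy form: treat tiny brightness changes across video frames as samples of a sound waveform. The actual MIT algorithm is far more sophisticated (it analyzes local motion at many scales and orientations, not overall brightness); the simplified version below, with made-up frame data and rates, only illustrates the frames-to-waveform concept.

```python
# Toy sketch of the "visual microphone" idea: recover a 1-D, sound-like
# signal from tiny changes across video frames. NOTE: the real MIT method
# analyzes sub-pixel motion across scales and orientations; tracking the
# mean brightness of each frame, as done here, is a deliberate
# oversimplification for illustration only.
import math

def frames_to_signal(frames):
    """Collapse each frame (a 2-D list of pixel intensities) to one sample:
    the frame's mean brightness, with the overall average removed so the
    result oscillates around zero like an audio waveform."""
    means = [sum(map(sum, f)) / (len(f) * len(f[0])) for f in frames]
    dc = sum(means) / len(means)           # steady-state brightness
    return [m - dc for m in means]         # keep only the fluctuations

# Synthesize a fake one-second video: a flat 4x4 patch whose brightness
# wobbles with a 10 Hz tone, captured at a 60 fps frame rate.
fps, tone_hz = 60, 10
frames = []
for i in range(fps):
    level = 128 + 0.5 * math.sin(2 * math.pi * tone_hz * i / fps)
    frames.append([[level] * 4 for _ in range(4)])

signal = frames_to_signal(frames)          # 60 samples oscillating at 10 Hz
```

One sample per frame also shows why the real system benefits from high-speed cameras: the frame rate caps the highest sound frequency that can be recovered, just as a sample rate does in ordinary audio.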
Alexei Efros, a University of California at Berkeley researcher, said in a statement, “This is totally out of some Hollywood thriller. You know that the killer has admitted his guilt because there’s surveillance footage of his potato chip bag vibrating.”
That’s not all. Using this technology, the MIT researchers claim they can reconstruct music, speech, or seemingly any other sound. While a bag of chips is one example of where the method can be put to work, the team has found success with it elsewhere, including when watching plant leaves and the surface of a glass of water. The researchers will present their study at Siggraph, the computer graphics conference.