At this summer’s Siggraph — the premier computer-graphics conference — researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present new software that amplifies variations in successive frames of video that are imperceptible to the naked eye. So, for instance, the software makes it possible to actually “see” someone’s pulse, as the skin reddens and pales with the flow of blood, and it can exaggerate tiny motions, making visible the vibrations of individual guitar strings or the breathing of a swaddled infant in a neonatal intensive care unit.
The system is somewhat akin to the equalizer in a stereo sound system, which boosts some frequencies and cuts others, except that the pertinent frequency is the frequency of color changes in a sequence of video frames, not the frequency of an audio signal. The prototype of the software allows the user to specify the frequency range of interest and the degree of amplification. The software works in real time, displaying both the original video and the altered version with the changes magnified.
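The equalizer analogy can be made concrete with a minimal sketch: treat each pixel's value over time as a signal, keep only a user-chosen temporal frequency band (say, the 0.5–2 Hz range where a resting pulse lives), amplify that band, and add it back to the original frames. Note this is an illustrative simplification, not the researchers' actual implementation — the CSAIL system also decomposes each frame spatially and supports different temporal filters — and the function name and parameters below are hypothetical.

```python
import numpy as np

def magnify_color(frames, fps, f_lo, f_hi, alpha):
    """Toy color magnification: temporally bandpass each pixel
    and add the amplified band back to the original video.

    frames : (T, H, W) array of pixel intensities over time
    f_lo, f_hi : passband edges in Hz (the "equalizer" band)
    alpha : amplification factor for the selected band
    """
    T = frames.shape[0]
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)      # temporal frequencies (Hz)
    spectrum = np.fft.rfft(frames, axis=0)       # per-pixel FFT along time
    keep = (freqs >= f_lo) & (freqs <= f_hi)     # ideal bandpass mask
    band = np.fft.irfft(spectrum * keep[:, None, None], n=T, axis=0)
    return frames + alpha * band                 # boost only the chosen band

# Synthetic "pulse": a faint 1 Hz flicker on a gray 4x4 patch,
# far too small to see, amplified 50x within the 0.5-2 Hz band.
fps, T = 30, 90
t = np.arange(T) / fps
flicker = 0.002 * np.sin(2 * np.pi * 1.0 * t)
frames = 0.5 + flicker[:, None, None] * np.ones((T, 4, 4))
out = magnify_color(frames, fps, f_lo=0.5, f_hi=2.0, alpha=50)
```

Because the bandpass excludes 0 Hz, the average brightness of the video is untouched; only the subtle periodic variation is exaggerated, which is what makes the reddening and paling of skin visible.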