What if what you saw with your eyes could be reconstructed from a brain scan? Well, that just happened. Check it out:
Gallant’s coauthors acted as study subjects, watching YouTube videos inside a magnetic resonance imaging machine for several hours at a time. The team then used the brain imaging data to develop a computer model that matched features of the videos — like colors, shapes and movements — with patterns of brain activity.
“Once we had this model built, we could read brain activity for that subject and run it backwards through the model to try to uncover what the viewer saw,” said Gallant.
Subtle changes in blood flow to visual areas of the brain, measured by functional MRI, predicted what was on the screen at the time — whether it was Steve Martin as Inspector Clouseau or an airplane. The reconstructed videos are blurry because they layer all the YouTube clips that matched the subject’s brain activity pattern. The result is a haunting, almost dream-like version of the video as seen by the mind’s eye.
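The reconstruction idea described above can be sketched in a few lines: score a library of candidate clips by how well each clip's model-predicted brain activity correlates with the activity actually observed, then average ("layer") the best matches. Everything here is a toy stand-in, not the authors' actual pipeline: the array sizes, the noise level, the number of clips kept, and the tiny 8x8 "frames" are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 1000 library clips, each summarized as a
# model-predicted brain-activity pattern (100 voxels) plus an 8x8 frame.
n_clips, n_voxels = 1000, 100
predicted_activity = rng.standard_normal((n_clips, n_voxels))
clip_frames = rng.random((n_clips, 8, 8))

# Activity "measured" while the subject watched a video: here, clip 42's
# predicted pattern plus a little noise.
observed = predicted_activity[42] + 0.1 * rng.standard_normal(n_voxels)

def correlations(library, target):
    """Pearson correlation of each library row with the target pattern."""
    lib = library - library.mean(axis=1, keepdims=True)
    tgt = target - target.mean()
    return (lib @ tgt) / (np.linalg.norm(lib, axis=1) * np.linalg.norm(tgt))

scores = correlations(predicted_activity, observed)
top = np.argsort(scores)[-30:]  # indices of the 30 best-matching clips

# "Layering" the matches: averaging their frames is what makes the
# reconstruction blurry and dream-like.
reconstruction = clip_frames[top].mean(axis=0)
```

Averaging many near-matches rather than trusting a single best match is what trades sharpness for robustness, which fits the hazy reconstructions the article describes.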