In what is being called a “fundamental leap forward in our understanding of how brains work,” Japanese researchers have successfully caught on film a thought being formed in the brain. And while the brain in this study belongs to a zebrafish, not a human, the footage is captivating, and sheds light on how researchers could use a similar technique to see how our brains work.
To observe the zebrafish brain’s neurons in real time, researchers used a fluorescent probe that makes neurons light up when they’re active. What was the zebrafish thinking about? Something we humans obsess about all the time: Food. Researchers showed the fish a squirming piece of prey, and watched as the fish’s brain perceived it and considered consuming it. “In other words, you’re seeing what the fish thinks when it sees its lunch,” explains Jamie Condliffe at Gizmodo. In the video, parts of the fish’s brain light up like lightning in a storm before the light ripples through the neurons. It is, for lack of a better phrase, so cool.
Using diagnostic imaging to monitor blood flow in subjects’ heads, researchers had 13 healthy participants sip ice water (sadly, not ice cream) through a straw pressed against the upper palate. Subjects were told to raise their hands when the headache hit, and then raise them again when the pain went away.
Researchers discovered that consuming something cold causes “an abrupt increase in blood flow to a major artery in the brain,” which is subsequently followed “by the familiar headache-like pain.” When the artery constricts again after the sudden rush of blood, the pain stops.
What if what you saw with your eyes could be interpreted by a brain scanner? Well, that just happened. Check it out:
Gallant’s coauthors acted as study subjects, watching YouTube videos inside a magnetic resonance imaging machine for several hours at a time. The team then used the brain imaging data to develop a computer model that matched features of the videos — like colors, shapes and movements — with patterns of brain activity.
“Once we had this model built, we could read brain activity for that subject and run it backwards through the model to try to uncover what the viewer saw,” said Gallant.
Subtle changes in blood flow to visual areas of the brain, measured by functional MRI, predicted what was on the screen at the time — whether it was Steve Martin as Inspector Clouseau or an airplane. The reconstructed videos are blurry because they layer all the YouTube clips that matched the subject’s brain activity pattern. The result is a haunting, almost dream-like version of the video as seen by the mind’s eye.
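The two-step recipe described above — fit a model that predicts brain activity from video features, then run it backwards by finding the clips whose predicted activity best matches a new brain recording and blending them — can be illustrated with a toy sketch. This is only an assumption-laden stand-in, not Gallant’s actual pipeline: a plain least-squares fit plays the role of the encoding model, random vectors stand in for the video features and voxel responses, and all the names and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each clip is summarized by a feature vector (colors,
# shapes, motion), and brain activity is a voxel-response vector recorded
# while the subject watches that clip.
n_train, n_features, n_voxels = 200, 10, 50
train_features = rng.normal(size=(n_train, n_features))
true_mapping = rng.normal(size=(n_features, n_voxels))  # simulated brain
train_activity = (train_features @ true_mapping
                  + 0.1 * rng.normal(size=(n_train, n_voxels)))

# Step 1: fit an encoding model that predicts brain activity from video
# features (ordinary least squares stands in for the real model).
W, *_ = np.linalg.lstsq(train_features, train_activity, rcond=None)

# Step 2: "run it backwards" -- given new brain activity, score a library of
# candidate clips by how well their predicted activity matches the recording,
# then average the best matches (layering matches is why reconstructions blur).
library = rng.normal(size=(500, n_features))
target = library[42]  # the clip the subject "actually saw"
observed = target @ true_mapping + 0.1 * rng.normal(size=n_voxels)

predicted = library @ W  # predicted activity for every candidate clip
scores = (predicted @ observed
          / (np.linalg.norm(predicted, axis=1) * np.linalg.norm(observed)))
top = np.argsort(scores)[-10:]            # ten best-matching clips
reconstruction = library[top].mean(axis=0)  # blurry blend of the matches
```

In this toy version the true clip scores highest, so it dominates the blend; in the real study the blend of many near-matching YouTube clips is what produces the dream-like, blurry footage.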