A recent experiment in Japan shows that some of our most private thoughts may be more accessible than previously assumed. By hooking volunteers up to brain scanners and correlating accounts of their dreams with visual images, researchers have developed a brain decoding technique that lets them predict broad categories of people's dreams with up to 60% accuracy.
For the experiment, three volunteers were asked to take naps inside an fMRI scanner. The researchers, a team led by Yuki Kamitani from the ATR Computational Neuroscience Lab in Kyoto, then monitored their brain activity, looking for signs that they had entered a dream state. Once it appeared that the participants were dreaming, they were woken up and asked to describe their visual experiences.
The Telegraph explains more:
After gathering around 200 dream reports from each subject, repeated elements such as "tree" or "man" were grouped into roughly 20 broad categories. These were tailored to each participant. In one case, "ice pick", "key" and "plunger" were all placed in the category "implement".
Recordings from the fMRI brain scans were examined for activity patterns that coincided with the dream categories. Volunteers were also asked to look at photos from the internet corresponding to their dreams while their brain activity was monitored.
The data were used to train a computer programme to recognise the brain activity "signatures" associated with different types of dream image, while weeding out non-visual brain activity during sleep.
In a second round of dreaming, the programme successfully predicted what kind of images each volunteer was dreaming about with 60 per cent accuracy.
The system is far from perfect, and as noted, it can only predict visualizations from a broad set of categories; decoding the finer content of dreams is still not possible. That said, it did a remarkably good job, performing well above chance, at predicting, say, whether a volunteer was dreaming about a person or an apartment building.
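For the technically curious, here's a rough sketch of what that kind of decoder training might look like: a simple multiclass classifier that maps activity patterns to broad dream-image categories. The data, category names, and classifier choice below are illustrative assumptions, not the actual pipeline from the study.

```python
# Minimal sketch of the general idea: train a classifier to map
# fMRI-like activity patterns to broad dream-image categories.
# The data are synthetic and the category labels are made up;
# this is NOT the Kamitani lab's actual analysis pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

categories = ["person", "building", "implement"]  # illustrative categories
n_trials_per_category = 60
n_voxels = 500  # stand-in for visual-cortex voxel activations

# Fabricate activity patterns: each category gets its own mean pattern
# plus noise, mimicking the "signature" the decoder has to learn.
X_parts, y = [], []
for label, category in enumerate(categories):
    signature = rng.normal(size=n_voxels)
    trials = signature + rng.normal(scale=2.0, size=(n_trials_per_category, n_voxels))
    X_parts.append(trials)
    y.extend([label] * n_trials_per_category)
X = np.vstack(X_parts)
y = np.array(y)

# A simple linear decoder; accuracy well above chance (1/3 here) would
# indicate that category information is present in the activity patterns.
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```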
Read the entire study at Science.
Image: Shutterstock/vgstudio.