Common brain activity patterns during perception, imagery, and dreaming
細川 友紀奈 (1251092)
Dreams are internally generated conscious experiences that occur during sleep without external stimuli. Machine learning-based decoding analysis provides a means to investigate similarities in neural activity patterns across different experiences, and previous studies using human fMRI signals in higher visual cortex have demonstrated decoding both between perception and imagery and between perception and dreaming. However, whether imagery and dreaming are represented by similar activity patterns has never been examined, and whether dreaming is more closely related to imagery or to perception remains unknown. In this study, we performed decoding analyses of visual object categories among the three modalities to characterize the neural state of dreaming in relation to perception and imagery: decoders were trained on fMRI activity induced by either perception or imagery and tested on activity measured during dreaming. The decoder trained on higher visual cortical activity during imagery predicted the objects experienced in dreams as accurately as the percept-trained decoder. Furthermore, the same decoding analyses applied to fMRI activity from multiple brain areas revealed that decoding performance depended on brain area differently for the percept- and imagery-trained decoders. These results suggest that dreaming shares common neural activity patterns with both perception and imagery across broad brain areas, including the higher visual areas, while the degree of commonality differs across areas, indicating a distinctive neural state during dreaming.
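The cross-modality decoding procedure described above (train a category classifier on activity patterns from one modality, test it on patterns from another) can be sketched as follows. This is a minimal illustration on synthetic data, not the study's pipeline: the classifier choice (logistic regression via scikit-learn), the variable names (`percept_X`, `dream_X`), and the shared-template data model are all assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_voxels, n_trials, n_categories = 100, 60, 3

# Synthetic setting: each object category has a voxel-pattern "template"
# shared across modalities; perception and dreaming add different noise.
templates = rng.normal(size=(n_categories, n_voxels))
labels = rng.integers(0, n_categories, size=n_trials)
percept_X = templates[labels] + rng.normal(scale=1.0, size=(n_trials, n_voxels))
dream_X = templates[labels] + rng.normal(scale=2.0, size=(n_trials, n_voxels))

# Cross-modality decoding: fit on perception-like patterns,
# evaluate on dream-like patterns of the same categories.
clf = LogisticRegression(max_iter=1000).fit(percept_X, labels)
cross_acc = clf.score(dream_X, labels)
print(f"cross-modality decoding accuracy: {cross_acc:.2f}")
```

Above-chance accuracy (chance here is 1/3) in this setup indicates that the two modalities share category-discriminative pattern structure, which is the logic behind comparing percept-trained and imagery-trained decoders on dream data.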