Researchers Identify New Coding Mechanism That Transfers Information from Perception to Memory



Our memories are rich in detail: Many of us can vividly recall the colour of our home, the layout of our kitchen, or the front of our favourite café. How the brain encodes this information has long puzzled neuroscientists.

In a new Dartmouth-led study, researchers identified a neural coding mechanism that allows information to be transferred back and forth between the perceptual regions and the memory areas of the brain. The results are published in Nature Neuroscience.

Prior to this work, the classic understanding of brain organisation was that perceptual regions of the brain represent the world “as it is”: the brain’s visual cortex represents the external world based on how light falls on the retina, or “retinotopically.” In contrast, the brain’s memory areas were thought to represent information in an abstract format, stripped of details about its physical nature. According to the co-authors, however, this account overlooks the possibility that, as information is encoded or recalled, these regions in fact share a common code.

“We found that memory-related brain areas encode the world like a ‘photographic negative’ in space,” said co-lead author Adam Steel, a postdoctoral researcher in the Department of Psychological and Brain Sciences and fellow in the Neukom Institute for Computational Science at Dartmouth. “And that ‘negative’ is part of the mechanics that move information in and out of memory, and between perceptual and memory systems.”

In a series of experiments, participants were tested on perception and memory while their brain activity was recorded using a functional magnetic resonance imaging (fMRI) scanner. The team identified an opposing “push-pull-like” coding mechanism, which governs the interaction between perceptual and memory areas in the brain.

The results showed that when light hits the retina, visual areas of the brain respond by increasing their activity to represent the pattern of light. Memory areas of the brain also respond to visual stimulation, but, unlike visual areas, their neural activity decreases when processing the same visual pattern.

The co-authors report that the study has three unusual findings. The first is their discovery that a visual coding principle is preserved in memory systems.

The second is that this visual code is upside-down in memory systems. “When you see something in your visual field, neurons in the visual cortex are driving while those in the memory system are quieted,” said senior author Caroline Robertson, an assistant professor of psychological and brain sciences at Dartmouth.

Third, this relationship flips during memory recall. “If you close your eyes and remember that visual stimulus in the same space, you’ll flip the relationship: Your memory system will be driving, suppressing the neurons in perceptual regions,” said Robertson.

“Our results provide a clear example of how shared visual information is used by memory systems to bring recalled memories in and out of focus,” added co-lead author Ed Silson, a lecturer of human cognitive neuroscience at the University of Edinburgh.

Moving forward, the team plans to explore how this push-and-pull dynamic between perception and memory may contribute to the challenges seen in clinical conditions, including Alzheimer’s disease.
