If you’ve ever woken up in the middle of the night to go to the bathroom and stumbled around in the dark, banging into walls or dressers in a room you’ve walked through countless times, you’ve experienced the effects of inadequately calibrated neurons.
In many animals, humans included, an accurate sense of direction is generated with the help of brain cells known as head direction neurons, which integrate two main streams of information: visual landmarks and estimates of position based on self-movement.
Without the former, our ability to navigate even familiar locations degrades. But given a visual landmark—like the glow of an alarm clock or the shadow of a door—our internal map of the environment refreshes, and we can make our way with ease once again.
A similar process occurs in fruit flies, which use so-called compass neurons to keep track of the orientation of their head and body. In a new study, published in Nature on Nov. 20, Harvard Medical School neuroscientists have decoded how visual cues can rapidly reorganize the activity of these compass neurons to maintain an accurate sense of direction.
By tracking individual neurons in fruit flies as they navigated a virtual reality environment, the researchers shed light on the neural mechanisms that allow organisms to build a spatial map of their world, as well as on processes involved in short-term memory.
“When we look at the pattern of connections between compass neurons and the visual system, we see that they are remodeled by visual experiences,” said senior study author Rachel Wilson, the Martin Family Professor of Basic Research in the Field of Neurobiology in the Blavatnik Institute at HMS.
“These changes are happening over minutes and correspond with the timescale that we experience subjectively when we enter a new environment and explore it,” Wilson said. “To me, it’s remarkable that we can get insight into something as complicated as spatial navigation by studying a brain that’s smaller than a poppy seed.”
Virtual sun
Composed of only around 100,000 neurons, the Drosophila fruit fly brain is capable of highly complex behaviors. Previous studies have shown that during navigation, compass neurons, also known as E-PG neurons, are critical for the fly’s ability to sense direction.
These neurons are arranged into a ring, like the dial of a compass. As the fly moves, a corresponding “bump” of neural activity moves around the ring like a compass needle—if the fly turns 90 degrees, the bump of activity also rotates 90 degrees.
In the dark, the accuracy of this “needle” diminishes due to the absence of visual cues, as the fly has only estimates of its own movements to navigate by. But given a visual prompt, the needle snaps back into place, accurately reflecting the fly’s heading.
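To make the ring-and-bump picture more concrete, here is a small toy sketch in Python. It is illustrative only, not the study’s model: a ring of units carries a bump of activity that is moved by the fly’s turns, drifts when only noisy self-movement estimates are available, and is pulled back toward the true heading when a landmark is visible. All numbers, names, and the update rule are assumptions chosen for illustration.

```python
import numpy as np

N = 16                     # number of compass "wedges" around the ring
angles = np.linspace(0, 2 * np.pi, N, endpoint=False)

def bump(heading, width=0.5):
    """Gaussian-like bump of activity centered on the current heading."""
    d = np.angle(np.exp(1j * (angles - heading)))   # wrapped angular distance
    return np.exp(-(d ** 2) / (2 * width ** 2))

def decode(activity):
    """Population-vector readout of the bump's position on the ring."""
    return np.angle(np.sum(activity * np.exp(1j * angles)))

rng = np.random.default_rng(0)
true_heading, est_heading = 0.0, 0.0

for step in range(200):
    turn = rng.normal(0.0, 0.05)          # the fly's actual turn this step
    true_heading += turn
    # Self-movement estimates alone: the internal estimate accumulates noise,
    # which is the "drift in the dark" described above.
    est_heading += turn + rng.normal(0.0, 0.02)
    lights_on = step >= 150
    if lights_on:
        # A visible landmark pulls the estimate back toward the true heading.
        err = np.angle(np.exp(1j * (true_heading - est_heading)))
        est_heading += 0.3 * err

activity = bump(est_heading)
print("heading error (rad):",
      np.angle(np.exp(1j * (true_heading - decode(activity)))))
```

Running the sketch shows the estimate drifting while the "lights" are off and then converging back once the landmark correction is applied.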
To investigate how visual inputs alter this process, Wilson and team—including lead study author Yvette Fisher, research fellow in neurobiology; Jenny Lu, an MD/PhD student; and Isabel D’Alessandro, a research assistant—carried out a series of experiments that combined virtual reality with high-powered microscopy.
They fixed a fly to a pin with glue and lowered it onto a Styrofoam ball floating, nearly friction-free, on a column of air. Surrounded by a visual panorama, the fly moved its legs to walk and turn, rotating the ball and providing precise measurements of its movements. An imaging technique known as two-photon microscopy allowed the researchers to visualize the activity of individual neurons in the fly’s brain as it navigated in virtual reality.
Flies were presented with a visual cue—an unapproachable bright point of light that served to represent the sun, which insects use for long-distance navigation.
Initially, flies would try to approach the virtual sun. After some time, they would walk in a straight line at a fixed angle to the sun, and if the light moved, they made a compensatory turn to return to that fixed angle, demonstrating that they were paying attention to the virtual object and using it for course control.
When the team looked in the fly brain, they found that the activity of compass neurons was being influenced by visual system-associated neurons, known as R neurons. Specifically, R neurons were inhibiting compass neuron activity in a spatially specific manner, thereby reorienting the compass.
“Basically, visual system inputs seem to push the compass needle, so to speak, to the part of the compass that isn’t being inhibited,” Wilson said. “This will push the compass away from the wrong direction toward the right direction.”
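As a rough illustration of that push, here is a sketch under assumed numbers rather than the measured circuit: the visual input is modeled as inhibition that is weakest at the wedge of the ring corresponding to the cue’s direction, so the surviving peak of activity, the needle, ends up there.

```python
import numpy as np

N = 16
angles = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Hypothetical inhibitory weights from a visual (R-type) input onto each wedge.
# Inhibition is weakest at the wedge that should represent the cue's direction.
cue_direction = np.pi / 2
dist = np.angle(np.exp(1j * (angles - cue_direction)))      # wrapped distance
inhibition = 1.0 - np.exp(-(dist ** 2) / 0.5)

# Uniform excitatory drive minus the patterned inhibition: the surviving peak
# of activity, the "compass needle", sits where inhibition is weakest.
drive = np.ones(N)
activity = np.clip(drive - 2.0 * inhibition, 0, None)
needle = angles[np.argmax(activity)]
print(f"needle points to {np.degrees(needle):.0f} deg "
      f"(cue at {np.degrees(cue_direction):.0f} deg)")
```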
Plastic memory
After the flies were acclimated, the researchers presented them with a second virtual sun, directly opposite the first. This caused the activity of compass neurons to occasionally flip around 180 degrees.
When the second sun was removed, compass activity was variable: sometimes it settled back into its original heading, sometimes into the opposite one, and sometimes it continued to swing back and forth between the two.
“It’s as if the fly became confused or was changing its mind about what direction it was pointed in,” Wilson said.
The researchers found that this process depended on the interaction of compass neurons and R neurons, specifically the strength of inhibitory activity at the synapses, or points of connection, between them. Inputs from the visual system can reshape the function of those connections over the timespan of a few minutes.
Thus, a visual cue can interact with the representation of direction contained within compass neurons and alter their activity to remodel the compass, ultimately changing the fly’s sense of direction.
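One way to picture that remodeling, purely as an illustrative sketch and not the study’s exact plasticity rule, is an inhibitory weight matrix from visual inputs onto compass wedges in which co-active pairs weaken their connection over many repetitions, on the minutes-long timescale described above. The channel and wedge indices below are arbitrary.

```python
import numpy as np

N_visual, N_compass = 8, 16

# Hypothetical inhibitory weight matrix from visual inputs to compass wedges.
W = np.ones((N_visual, N_compass))

def update(W, visual_activity, compass_activity, rate=0.05):
    """Illustrative plasticity rule (an assumption, not the study's exact rule):
    inhibition weakens where visual input and compass activity coincide,
    and slowly recovers elsewhere, so the mapping reorganizes over many steps."""
    coincidence = np.outer(visual_activity, compass_activity)
    W = W - rate * coincidence            # weaken co-active inhibitory connections
    W = W + 0.001 * (1.0 - W)             # slow drift back toward baseline
    return np.clip(W, 0.0, 1.0)

# Pair visual channel 3 with compass wedge 10 repeatedly, as if the fly kept
# seeing one landmark while holding one heading.
v = np.zeros(N_visual);  v[3] = 1.0
c = np.zeros(N_compass); c[10] = 1.0
for _ in range(300):                      # many paired experiences over "minutes"
    W = update(W, v, c)

print("inhibition from channel 3 onto wedge 10:", round(W[3, 10], 3))
print("inhibition from channel 3 onto wedge 0 :", round(W[3, 0], 3))
```

After the loop, the paired connection is nearly silent while the others remain strong, which is the kind of reorganized inhibitory pattern the quote below refers to.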
“The exciting thing for us is that the pattern of inhibitory inputs from visual neurons onto compass neurons is plastic,” Wilson said. “We can reorganize that functional pattern by just giving the fly an altered experience in virtual reality.”
This likely applies to mammals and other organisms as well, she added. “When navigating in a new environment, it often feels like it takes a few minutes to build up a mental map of the neighborhood or park or office you walked into. That’s the timescale on which these changes in synapse strength are occurring.”
Their findings provide a mechanistic explanation for how visual experiences can directly alter the activity of direction-sensing neurons and reshape the brain’s internal representation of the world.
A better understanding of this process also sheds light on a form of short-term learning known as unsupervised learning, in which the brain aims to be as consistent with itself and its environment as possible, without the influence of reward or punishment.
“Short-term memory is encoded in the ring. If you turn off the lights, it retains a memory of the direction it’s headed,” Wilson said. “You can watch that memory evolve as the fly tracks the turns it makes and integrates those movements over time to update the compass. You can also watch that memory slowly become more inaccurate over time.”
“When you turn the lights back on, the compass clicks back into the right answer. We’ve all had that experience, I think, where you can catch sight of a visual landmark and feel the compass in your brain sort of rotate, and then you just see the world differently,” she continued. “We can watch those dynamics here in the fly brain, in real time.”
The work was supported by the HMS Neurobiology Imaging Facility (grant P30 NS072030), the National Institutes of Health (grants U19NS104655, F30DC017698, T32GM007753), and a Howard Hughes Medical Institute Hanna H. Gray Fellowship. Rachel Wilson is an HHMI Investigator.