Brain tissue structure could explain link between fitness and memory

Studies have suggested a link between fitness and memory, but researchers have struggled to find the mechanism that links them. A new study by University of Illinois researchers found that the key may lie in the microstructure of the hippocampus, a region in the middle of the brain involved in memory processes.

Aron Barbey, a professor of psychology, led a group of researchers at the Beckman Institute for Advanced Science and Technology at Illinois that used a specialized MRI technique to measure the structural integrity of the hippocampus in healthy young adults and correlated it with their performance on fitness and memory tests. They found that viscoelasticity, a measure of structural integrity in brain tissue, was correlated with fitness and memory performance — much more strongly than hippocampal size alone.

“Using a new tool to examine the integrity of the hippocampus in healthy young adults could tell us more about how this region functions and how to predict decline for early intervention,” Barbey said. “By the time we look at disease states, it’s often too late.”

Prior research led by Illinois psychology professor Neal Cohen, who is also a co-author on the new paper, demonstrated that the hippocampus is critical for relational memory and that the integrity of this region predicts a host of neurodegenerative diseases. To date, much research on the hippocampus’ structure has focused on its size.

Studies in developing children and declining older adults have found strong correlations between hippocampal size and memory. However, size does not seem to matter as much in healthy young adults, said postdoctoral researcher Hillary Schwarb. The Illinois group looked instead at the microstructure of the tissue, using an emerging neuroimaging tool called magnetic resonance elastography. The method involves an MRI scan, but with a pillow under the subject’s head vibrating at a very low amplitude — as gentle as driving on the interstate, Schwarb said. The vibration is the key to measuring the structural integrity of the hippocampus.

“It’s a lot like sending ripples through a still pond — if there’s some large thing like a boulder under the surface, the ripples are going to displace around it,” Schwarb said. “We are sending waves through the brain and reconstructing the displacements into a map we can look at and measure.”
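
In broad terms, elastography reconstructs how those waves propagate and yields maps of the tissue’s storage (elastic) and loss (viscous) moduli; one common summary of viscoelasticity is the damping ratio, the balance of viscous to elastic behavior. The sketch below shows how such a summary might be computed once the moduli maps exist; the numbers are hypothetical and this is not the study’s processing pipeline.

```python
import numpy as np

def damping_ratio(storage_modulus, loss_modulus):
    """Viscoelasticity summary: xi = G'' / (2 * G'). G' (storage modulus) captures
    elastic, spring-like behavior; G'' (loss modulus) captures viscous behavior.
    Lower values mean the tissue behaves more elastically."""
    return np.asarray(loss_modulus) / (2.0 * np.asarray(storage_modulus))

# Hypothetical per-voxel moduli (kPa) inside a hippocampal region of interest
g_storage = np.array([2.1, 2.3, 1.9, 2.0])
g_loss = np.array([0.9, 1.0, 0.8, 0.95])

# One summary value per subject: the mean damping ratio over the region
print(f"mean damping ratio: {damping_ratio(g_storage, g_loss).mean():.3f}")
```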

The study, published in the journal NeuroImage, found that those who performed better on the fitness test tended to also perform better on the memory task, confirming a correlation the group had noticed before. But by adding the information on the structure of the hippocampus, the researchers were able to find a possible pathway for the link. They found that the subjects with higher fitness levels also had more elastic tissue in the hippocampus. The tissue structure, in turn, was associated with memory.

“We found that when the hippocampus is more elastic, memory is better. An elastic hippocampus is like a firm foam mattress pad that pops right back up after you get up,” said study co-author Curtis Johnson, a former graduate researcher at the Beckman Institute who is now a professor at the University of Delaware. “When the hippocampus is more viscous, memory is worse. A viscous hippocampus is like a memory-foam mattress that holds its shape even after you get up.”

The results suggest that the viscoelasticity of the hippocampus may be the mediating factor in the relationship between fitness and memory in healthy young adults.
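
The statistical idea behind a “mediating factor” can be illustrated with a simple product-of-coefficients sketch: the fitness-to-elasticity path multiplied by the elasticity-to-memory path (controlling for fitness) gives the indirect effect. The code below uses simulated, made-up scores and ordinary least squares; it is a generic illustration, not the analysis reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Simulated, standardized subject-level scores (made up for illustration)
fitness = rng.normal(size=n)
elasticity = 0.5 * fitness + rng.normal(scale=0.8, size=n)  # fitness -> tissue structure
memory = 0.6 * elasticity + rng.normal(scale=0.8, size=n)   # tissue structure -> memory

def ols(X, y):
    """Least-squares coefficients, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(fitness, elasticity)[1]                              # path a: fitness -> mediator
b = ols(np.column_stack([fitness, elasticity]), memory)[2]   # path b: mediator -> memory, controlling for fitness
c = ols(fitness, memory)[1]                                  # total effect of fitness on memory
print(f"indirect effect a*b = {a * b:.2f}, total = {c:.2f}, direct = {c - a * b:.2f}")
```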

“It also shows us that magnetic resonance elastography is a useful tool for understanding tissue microstructure, and that microstructure is important to cognition,” Schwarb said. “This provides us a new level of analysis for studying the human brain.”

Working human forebrain circuits assembled in a lab dish

Peering into laboratory glassware, Stanford University School of Medicine researchers have watched stem-cell-derived nerve cells arising in a specific region of the human brain migrate into another brain region. This process recapitulates what’s been believed to occur in a developing fetus, but has never previously been viewed in real time.

The investigators saw the migrating nerve cells, or neurons, hook up with other neurons in the target region to form functioning circuits characteristic of the cerebral cortex.

These observations showcase neuroscientists’ newfound ability to monitor, assemble and manipulate so-called neural spheroids, generated from human induced pluripotent stem cells, to study the normal development of the human forebrain during late pregnancy.

“We’ve never been able to recapitulate these human-brain developmental events in a dish before,” said the study’s senior author, Sergiu Pasca, MD, assistant professor of psychiatry and behavioral sciences. “The process happens in the second half of pregnancy, so viewing it live is challenging. Our method lets us see the entire movie, not just snapshots.”

The findings, and the techniques used to obtain them, carry potential for the personalized study of individuals’ psychiatric disorders. In the study, to be published online April 26 in Nature, the scientists were able to attribute, for the first time, defects in neuronal migration to Timothy syndrome, a rare condition that predisposes people to autism, epilepsy and cardiac malfunction. Postdoctoral scholars Fikri Birey, PhD, Jimena Andersen, PhD, and Chris Makinson, PhD, share lead authorship.

The need for 3-D models

Culturing neurons in a lab dish is old hat. But the two-dimensional character of life lived atop a flat glass coverslip doesn’t sit well with cells designed for three-dimensional existence. Neurons cultured in monolayers mature only partially, tend to die fairly quickly and interact suboptimally.

In a 2015 study, Pasca and his colleagues described their method for producing neural spheroids. Neural precursor cells generated from iPS cells are placed in culture dishes whose bottoms are coated to make it impossible for neurons to attach. The cells float freely in a nutrient-rich broth, ultimately developing into hundreds of almost perfectly round balls approaching 1/16 of an inch in diameter and consisting of over 1 million cells each. These neurons can live for up to two years, and they mature fully.

The spheroids created in the 2015 study recapitulated the human cerebral cortex’s six-layer-thick architecture, and the neurons they contained were of the type that arise in and dominate the cerebral cortex. They’re called glutamatergic neurons because they secrete the excitatory chemical glutamate.

But the cerebral cortex’s glutamatergic neurons don’t remain alone for long. During fetal development, they are eventually joined by neurons of another type that originate in a slightly deeper region of the developing forebrain. These neurons secrete a neuromodulatory — and usually inhibitory — substance called GABA, so they’re deemed GABAergic. It’s known that GABAergic cells migrate from their region of origin to the cortex, where they interlace with its resident glutamatergic cells and with one another to form the circuitry responsible for the brain’s most advanced cognitive activities. But no one had been able to watch this happen with human cells in a dish.

In the new study, the researchers separated their spheroids into two batches and coaxed them to become different regions of the human brain. They cultured one batch in a medium that induces cortexlike spheroids containing glutamatergic neurons. They placed the second batch in dishes whose broth steers the spheroids toward resembling the underlying brain region where GABAergic neurons originate.

Then, the investigators juxtaposed the two distinct types of spheroids. Within three days, the two spheroids fused, and GABAergic neurons from one spheroid began migrating into the glutamatergic-neuron-rich spheroid. Their migration pattern, the scientists noted, was halting: They would move toward the target spheroid for a little while, then stop for an extended period, then start up again in stuttering jumps.

On reaching their destination, the GABAergic travelers underwent a transformation, sprouting dendrites — branching, foliage-like “tails” that receive inputs from other neurons — and forming working connections with the glutamatergic neurons. Electrophysiological tests revealed that GABAergic and glutamatergic neurons were successfully forming circuits and signaling to one another.

Insight into Timothy syndrome

The scientists had access to tissue samples from patients with Timothy syndrome, an extremely rare and historically lethal condition caused by a mutation in the gene coding for a type of calcium channel — a protein containing a pore that responds to different voltage levels by opening or closing, respectively permitting or blocking the flow of calcium across otherwise impermeable membranes. Such calcium channels are essential to many cellular processes. Timothy syndrome patients’ severe cardiac abnormalities once spelled ultra-short life expectancies, but now can be ameliorated with pacemakers. However, survivors usually have autism and frequently have epilepsy.
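
As a rough picture of the voltage dependence described above, the fraction of such channels that are open is often modeled with a Boltzmann curve. The sketch below is a generic textbook-style illustration with invented parameters, not a model of the specific channel mutated in Timothy syndrome.

```python
import numpy as np

def open_probability(v_mv, v_half=-20.0, slope=6.0):
    """Boltzmann activation curve: fraction of channels open at membrane voltage v_mv (mV).
    v_half is the voltage of half-maximal activation; slope sets how steeply the channel
    switches between closed and open. All parameter values here are illustrative."""
    return 1.0 / (1.0 + np.exp((v_half - v_mv) / slope))

for v in (-60, -40, -20, 0, 20):
    print(f"{v:>4} mV -> P(open) = {open_probability(v):.2f}")
```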

The investigators generated both types of neural spheroids from their Timothy-syndrome tissue samples, fused them and watched to see what would happen. What they saw was this: The GABAergic neurons, which seemed to develop normally, exhibited aberrant start-and-stop migration patterns. Their forward movements were more frequent, but far less efficient, than those of normal neurons.

The mutation behind Timothy syndrome increases the likelihood that the calcium channel for which it codes will let calcium ions flow through it. So, the researchers reasoned, a drug impeding the channel’s activity might reverse the aberration. Indeed, two different drugs that block this type of calcium channel restored normal migratory activity to the Timothy-syndrome-derived GABAergic neurons.

Diverse variants in the same gene responsible for Timothy syndrome are associated with schizophrenia, other forms of autism spectrum disorder and bipolar disorder. Pasca said he suspects these variants may affect GABAergic neurons’ integration with cortical glutamatergic neurons, resulting in a cognition-altering imbalance between excitation and inhibition in the cortex and laying the groundwork for these disorders.

“Our method of assembling and carefully characterizing neuronal circuits in a dish is opening up new windows through which we can view the normal development of the fetal human brain,” said Pasca. “More importantly, it will help us see how this goes awry in individual patients.”

Stanford’s Office of Technology Licensing has filed for a patent on the intellectual property involving the generation of brain-region-specific neural spheroids and their assembly for studying development and disease.


Other Stanford study co-authors are postdoctoral scholars Saiful Islam, PhD, and Nina Huber, PhD; senior research scientists Nancy O’Rourke, PhD, and Wu Wei, PhD; high school lab intern Nicholas Thom; Lars Steinmetz, PhD, professor of genetics; Jonathan Bernstein, MD, PhD, associate professor of pediatrics; Joachim Hallmayer, MD, professor of psychiatry and behavioral sciences; and John Huguenard, PhD, professor of neurology and of neurosurgery.

The study was funded by the National Institutes of Health (grants R01MH100900, R01MH107800 and P01HG00020526), the California Institute for Regenerative Medicine, the MQ Foundation, the Donald E. and Delia B. Baxter Foundation, the Kwan Research Fund, the Stanford Child Research Health Institute, the Wishes for Elliot Foundation, a Walter V. and Idun Berry Postdoctoral Fellowship, the Stanford School of Medicine Dean’s Office, the UC-San Francisco Program for Breakthrough Biomedical Research and the Sandler Foundation.

Stanford’s Department of Psychiatry and Behavioral Sciences and the Stanford Center for Sleep Sciences and Medicine also supported the work.


 

Motor neurons adjust to control tasks, new brain research reveals

New research from Carnegie Mellon University’s College of Engineering and the University of Pittsburgh reveals that motor cortical neurons optimally adjust how they encode movements in a task-specific manner. The findings enhance our understanding of how the brain controls movement and have the potential to improve the performance and reliability of brain-machine interfaces, or neural prosthetics, that assist paralyzed patients and amputees.

“Our brain has an amazing ability to optimize its own information processing by changing how individual neurons represent the world. If we can understand this process as it applies to movements, we can design more precise neural prostheses,” says Steven Chase, assistant professor in the Department of Biomedical Engineering and the Center for the Neural Basis of Cognition. “We can one day, for example, design robotic arms that more accurately implement a patient’s intended movement because we now better understand how our brain adjusts on a moment-by-moment basis when we are in motion.”

The study explored the change in brain activity during simple motor tasks performed in virtual reality in both 2-D and 3-D. The researchers wanted to know whether motor cortical neurons would automatically adjust their sensitivity to direction when presented with a wide range of possible directions instead of a narrow one. Previous research has shown that this phenomenon, called dynamic range adaptation, occurs in neurons sensitive to sound, touch and light — prompting the researchers to ask whether the same phenomenon would apply to neurons in the motor system that are associated with movement.

“When you walk out into the bright summer sun, you squint, and the neurons in your retina use dynamic range adaptation to automatically increase their sensitivity so that you can clearly see until the clouds pass over again,” explains Robert Rasmussen, MD/Ph.D. student at the University of Pittsburgh School of Medicine and first author of the study. “This feature allows the brain to better encode information by using its limited resources efficiently. We wanted to find out if our brain encodes movement in the same way.”
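
The gain-rescaling idea behind dynamic range adaptation can be sketched in a few lines: a neuron with a fixed output range spreads that range over whatever span of inputs it is currently exposed to, so its sensitivity per unit input shrinks as the input range widens. The firing-rate numbers below are invented and the linear form is a simplification, not the study’s encoding model.

```python
import numpy as np

def adapted_rate(stimulus, stim_min, stim_max, rate_min=5.0, rate_max=50.0):
    """Generic dynamic range adaptation: the neuron rescales its gain so that its
    limited firing-rate range (rate_min..rate_max) spans whatever stimulus range
    (stim_min..stim_max) it is currently exposed to."""
    x = np.clip((stimulus - stim_min) / (stim_max - stim_min), 0.0, 1.0)
    return rate_min + x * (rate_max - rate_min)

# The same stimulus value is encoded differently under a narrow vs. a wide stimulus range
print(adapted_rate(10.0, 0.0, 20.0))   # narrow range: steeper gain -> 27.5 Hz
print(adapted_rate(10.0, 0.0, 100.0))  # wide range: shallower gain -> 9.5 Hz
```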

The results revealed that dynamic range adaptation did indeed occur in the motor cortical neurons. Based on these findings, the researchers concluded that this feature is widespread throughout the brain.

“We found that dynamic range adaptation isn’t restricted to sensory areas of the brain. Instead, it is a ubiquitous encoding feature of the cortex,” explains Andrew Schwartz, distinguished professor of neurobiology and chair in systems neuroscience at the University of Pittsburgh School of Medicine, and a member of the University of Pittsburgh Brain Institute. “Our findings show that it is a feature of information processing, which your brain uses to efficiently process whatever information it is given — whether that is light, sound, touch, or movement. This is an exciting result that will motivate further research into motor learning and future clinical applications.”

Story Source:

Materials provided by College of Engineering, Carnegie Mellon University.

Deep brain stimulation decreases tics in young adults with severe Tourette syndrome

A surgical technique that sends electrical impulses to a specific area of the brain reduces the “tics,” or involuntary movements and vocal outbursts, experienced by young adults with severe cases of Tourette syndrome, according to a new study led by investigators from NYU Langone Medical Center.

The study, published April 7 in the Journal of Neurosurgery, is a retrospective review of Tourette patients who underwent an experimental technique known as thalamic deep brain stimulation (DBS) at NYU Langone. The findings, according to the researchers, add to a growing body of evidence supporting DBS as a safe and effective treatment for severe cases of Tourette syndrome — and may ultimately lead to approval by the U.S. Food and Drug Administration.

“Our study shows that deep brain stimulation is a safe, effective treatment for young adults with severe Tourette syndrome that cannot be managed with current therapies,” says Alon Mogilner, MD, PhD, an associate professor in the departments of neurosurgery and anesthesiology at NYU Langone and director of its Center for Neuromodulation. “This treatment has the potential to improve the quality of life for patients who are debilitated through their teenage years and young adulthood.”

Tourette syndrome begins in childhood, and many patients improve as they get older. However, for some, the symptoms become so severe that patients are left socially isolated and unable to work or attend school.

Dr. Mogilner and his colleague, Michael H. Pourfar, MD, an assistant professor in the departments of neurosurgery and neurology and co-director of the Center for Neuromodulation, have pioneered the largest U.S. case series of thalamic DBS to treat severe Tourette syndrome in young adults. Worldwide, only an estimated 160 such procedures have been performed to date.

In a multi-stage procedure, they insert two electrodes into a region of the brain called the medial thalamus, part of the brain circuit that functions abnormally in Tourette’s. During a second surgery the following day or a few days later, a pacemaker-like device called a neurostimulator is connected to the electrodes to emit electrical impulses into the medial thalamus. These impulses are adjusted during a series of follow-up outpatient visits to find the combination of settings that best control symptoms.

In the study, the NYU Langone team followed 13 patients with at least six months of follow-up visits. Study participants ranged in age from 16 years to 33 years. To determine the effectiveness of the procedure, the researchers measured the severity of tics before and after surgery using the Yale Global Tic Severity Scale (YGTSS). They found that the severity of tics decreased on average 37 percent from the time of the operations to the first follow-up visit. At their latest visit, patients’ tic scores decreased by an average of 50 percent.
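
Those figures are simple percent reductions in the before-and-after tic scores; the worked example below uses hypothetical YGTSS values, not data from individual patients.

```python
def percent_reduction(before, after):
    """Percent decrease from a pre-surgery score to a follow-up score."""
    return 100.0 * (before - after) / before

# Hypothetical YGTSS tic scores for one patient (not from the study)
print(percent_reduction(before=80, after=50))  # 37.5 percent reduction at first follow-up
print(percent_reduction(before=80, after=40))  # 50.0 percent reduction at a later visit
```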

Equally significant, all patients reported in a survey six months after surgery that their symptoms improved either “much” or “very much,” and all said they would have the surgery again — even those who had complications or experienced relatively less pronounced responses. “The survey represents an important aspect of the study,” Dr. Pourfar says, “because the YGTSS, though a validated scale, may not fully capture the impact of DBS on quality of life for a person with Tourette syndrome.”

Since the study was completed, four more patients have undergone DBS surgery for Tourette syndrome.

DBS has been used to treat other neurological conditions that cannot be adequately controlled by medication, including Parkinson’s disease, essential tremor, dystonia and epilepsy.

Because the FDA has not yet approved DBS for the treatment of Tourette syndrome, it is still considered investigational. As a result, a committee of independent specialists reviews each case to ensure that patients have tried alternate therapies and that the disability is severe enough to warrant undergoing the procedure.

In addition to Drs. Mogilner and Pourfar, Richard S. Dowd, MD, of Tufts Medical Center served as a co-author.

Dr. Pourfar reports receiving consulting fees and teaching honoraria from Medtronic Neurological, the manufacturer of the DBS device used in the study. Dr. Mogilner reports receiving consulting fees, honoraria and grant support from Medtronic Neurological, and consulting fees from Alpha-Omega Engineering.

Getting a leg up: Hand task training transfers motor knowledge to feet

The human brain’s cerebellum controls the body’s ability to tightly and accurately coordinate and time movements as fine as picking up a pin and as muscular as running a foot race. Now, Johns Hopkins researchers have added to evidence that this structure also helps transfer so-called motor learning from one part of the body to another.

One implication of the research, the Johns Hopkins investigators say, is that practicing a newly learned task involving the hands can also improve a person’s ability to do the same task with the foot, and vice versa.

“Our study gives us new insight into the cerebellum’s role in the learning process, information that maybe someday we can use to enhance the learning transfer between limbs so that we can rehabilitate patients who have lost function in hands, feet, arms or legs,” says Pablo Celnik, M.D., director of physical medicine and rehabilitation at the Johns Hopkins University School of Medicine.

The study, described in The Journal of Neuroscience on March 1, was primarily designed to demonstrate the value of a brain stimulation technique called cerebellar inhibition that can be used to investigate how connections in the brain change as people learn new motor skills.

For the study, investigators recruited 32 healthy subjects with an average age of 23.9 years. The subjects were asked to learn to play a computer-based game in which they needed to move a cursor from a starting point to a target. However, the researchers adjusted the movement of the cursor so that it moved at a 30-degree angle from the position of the mouse, forcing the subjects to adapt their movements to reach the target with the cursor.
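
Conceptually, the perturbation is a fixed rotation applied to each mouse movement before the cursor is drawn, so subjects must learn to aim off-target to hit the target. The sketch below illustrates that transformation; it is not the study’s task code.

```python
import numpy as np

def rotate_cursor(dx, dy, angle_deg=30.0):
    """Apply a visuomotor rotation: the cursor moves in a direction rotated
    angle_deg away from the actual mouse/hand movement (dx, dy)."""
    theta = np.radians(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return rot @ np.array([dx, dy])

# A straight rightward hand movement is displayed 30 degrees off its true direction,
# so the subject must aim 30 degrees the other way to hit the target.
print(rotate_cursor(1.0, 0.0))  # -> approximately [0.87, 0.50]
```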

Each subject learned the new task either with the hand or the foot. During this process, the researchers used magnetic stimulation to measure activity in two areas of the brain, the motor cortex and the cerebellum. The electrical brain activity between these areas was used to calculate the degree of connectivity between them.
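
Cerebello-cortical connectivity of this kind is commonly quantified as a ratio: the size of the muscle response evoked by a motor cortex pulse when a cerebellar pulse precedes it, divided by the response to the motor cortex pulse alone. Whether this exact ratio matches the authors’ computation is an assumption; the sketch below, with invented amplitudes, shows the general form of the measure.

```python
import numpy as np

def cerebellar_inhibition(conditioned_meps, test_meps):
    """Cerebellar-brain inhibition expressed as the ratio of the average motor evoked
    potential (MEP) when a cerebellar pulse precedes the motor-cortex pulse to the
    average MEP from the motor-cortex pulse alone. Values below 1 indicate the
    cerebellum is suppressing motor cortex output."""
    return np.mean(conditioned_meps) / np.mean(test_meps)

# Hypothetical MEP amplitudes in millivolts (not data from the study)
cbi_before = cerebellar_inhibition([0.6, 0.5, 0.7], [1.0, 1.1, 0.9])
cbi_after = cerebellar_inhibition([0.9, 0.8, 1.0], [1.0, 1.1, 0.9])
print(f"CBI before learning: {cbi_before:.2f}, after learning: {cbi_after:.2f}")
```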

In one part of the experiment, Celnik’s team tested to see if learning a new task incited change in the connection between the motor cortex and the cerebellum. Twenty subjects trained at the task with their right hand. After measuring the subjects’ baseline performance, the researchers switched to the angle-adjusted mouse, and the subjects completed 144 more trials. The measurements from these subjects showed that the connectivity between the cerebellum and the motor cortex changed not just for the areas of the motor cortex that controlled the right hand, but also in the areas known to control the right foot.

The researchers then explored whether this change in activity resulted in an actual transfer of skills between the hand and the foot. Ten subjects completed 48 training trials with the angle-adjusted cursor using their right foot, followed by similar test trials for the right hand.

“Without first training the right hand, the subjects’ ability to complete the task improved from the baseline measurements, showing that the learning transferred from the foot,” Celnik says.

In a third part of the experiment, the researchers investigated whether the brain changes were exclusive to learning a new task. Instead of having the subjects train using the adjusted mouse, they instructed subjects to perform a task they already knew, like lifting a finger. Researchers measured the connectivity between the motor cortex and the cerebellum and found that unlike learning a new task, the activity between the motor cortex and the cerebellum did not change when executing a familiar task.

“This shows us there is something special about learning something new that changes how areas of the brain interact that does not happen when we do a movement we already knew how to do,” says Danny Spampinato, a biomedical engineering graduate student at the Johns Hopkins University School of Medicine.

In the future, the researchers say they hope to use the same cerebellar measurements to get a better understanding of this brain area’s role in executing everyday tasks useful to those undergoing rehabilitation after injury or stroke, for example.

 

Blind tadpoles learn visually with eyes grafted onto tail, neurotransmitter drug treatment

Blind tadpoles were able to process visual information from eyes grafted onto their tails after being treated with a small molecule neurotransmitter drug that augmented innervation, integration, and function of the transplanted organs, according to a paper published online today by researchers at the Allen Discovery Center at Tufts University in npj Regenerative Medicine, a Nature Research journal. The work, which used a pharmacological reagent already approved for use in humans, provides a potential road map for promoting innervation — the supply of nerves to a body part — in regenerative medicine.

The researchers sought to better understand how the nascent nerves of re-grown or implanted structures integrate into a host. A lack of innervation and integration can be a barrier in regenerative medicine, particularly for sensory organs that must form connections with the host to communicate auditory, visual and tactile information.

In an effort to identify ways to increase innervation, researchers grafted eyes onto the trunk of the tails of blind tadpoles. They then treated the animals with Zolmitriptan, a compound that activates serotonin receptors 1B and 1D (5-HT1B/D), which have been associated with neural development. Treated tadpoles showed a significant increase in graft innervation without changes to the host’s original nervous system.

The researchers then tested the tadpoles’ ability to distinguish color by creating a test in which the tadpoles were discouraged from occupying a red space in favor of a blue one. Seventy-six percent of sighted tadpoles passed the test. Only 3 percent of blind tadpoles passed the test, while 11 percent of blind tadpoles with eye grafts did so. But among tadpoles with eye grafts that had been treated with the 5-HT1B/D agonist and showed graft innervation as a result, 29 percent passed the test.

In addition to testing the tadpoles’ ability to detect color, the researchers also tested true image-forming vision, by determining the tadpoles’ ability to follow optical patterns rotating in clockwise and counterclockwise directions. Dishes of tadpoles were placed above an LCD screen displaying patterns of triangular clusters rotating slightly every second. Eighty percent of sighted tadpoles followed the pattern compared to 38 percent of blind tadpoles and 32 percent of tadpoles with untreated eye grafts. By contrast, 57 percent of tadpoles with innervated eye grafts induced by exposure to the 5-HT1B/D agonist drug were able to follow the rotating patterns. Importantly, this drug is currently in use for treatment of migraine in human patients, providing a proof-of-principle of repurposing neurotransmitter drugs for regenerative medicine in general, and for control of innervation and transplanted organ functionality specifically.

“For regenerative medicine to move forward and enable the repair of damaged tissues and organ systems, we need to understand how to promote innervation and integration of transplanted organs,” said the paper’s corresponding author, Michael Levin, Ph.D., Vannevar Bush professor of biology and director of the Allen Discovery Center at Tufts and the Tufts Center for Regenerative and Developmental Biology. “This research helps illuminate one way to promote innervation and establish neural connections between a host central nervous system and an implant, using a human-approved small molecule drug.”

While studies have examined how human-machine interfaces — including cochlear implants and retinal prosthetics — may be used to treat deafness and blindness, this research examines brain-body plasticity, using novel neurogenesis to integrate biological implants, added co-author Douglas Blackiston, a postdoctoral associate in the Department of Biology and the Center for Regenerative and Developmental Biology at Tufts University.

“The fact that the grafted eyes in our model system could transmit visual information, even when direct connections to the brain were absent, suggests the central nervous system contains a remarkable ability to adapt to changes both in function and connectivity,” said Blackiston.

Story Source:

Materials provided by Tufts University.

 

Natural chemical helps brain adapt to stress

A natural signaling molecule that activates cannabinoid receptors in the brain plays a critical role in stress-resilience — the ability to adapt to repeated and acute exposures to traumatic stress, according to researchers at Vanderbilt University Medical Center.

The findings in a mouse model could have broad implications for the potential treatment and prevention of mood and anxiety disorders, including major depression and post-traumatic stress disorder (PTSD), they reported in the journal Nature Communications.

“The study suggests that deficiencies in natural cannabinoids could result in a predisposition to developing PTSD and depression,” said Sachin Patel, M.D., Ph.D., director of the Division of Addiction Psychiatry at Vanderbilt University School of Medicine and the paper’s corresponding author.

“Boosting this signaling system could represent a new treatment approach for these stress-linked disorders,” he said.

Patel, the James G. Blakemore Professor of Psychiatry, received a Presidential Early Career Award for Scientists and Engineers last year for his pioneering studies of the endocannabinoid family of signaling molecules that activate the CB1 and CB2 cannabinoid receptors in the brain.

Tetrahydrocannabinol (THC), the active compound in marijuana, binds the CB1 receptor, which may explain why relief of tension and anxiety is the most common reason cited by people who use marijuana chronically.

Patel and his colleagues previously have found CB1 receptors in the amygdala, a key emotional hub in the brain involved in regulating anxiety and the fight-or-flight response. They also showed in animal models that anxiety increases when the CB1 receptor is blocked by a drug or its gene is deleted.

More recently they reported anxiety-like and depressive behaviors in genetically modified mice that had an impaired ability to produce 2-arachidonoylglycerol (2-AG), the most abundant endocannabinoid. When the supply of 2-AG was increased by blocking an enzyme that normally breaks it down, the behaviors were reversed.

In the current study, the researchers tested the effects of increasing or depleting the supply of 2-AG in the amygdala in two populations of mice: one previously determined to be susceptible to the adverse consequences of acute stress, and the other determined to be stress-resilient.

Augmenting the 2-AG supply increased the proportion of stress-resilient mice overall and promoted resilience in mice that were previously susceptible to stress. Depleting 2-AG, by contrast, rendered previously stress-resilient mice susceptible to developing anxiety-like behaviors after exposure to acute stress.

Taken together, these results suggest that 2-AG signaling through the CB1 receptor in the amygdala promotes resilience to the adverse effects of acute traumatic stress exposure, and support previous findings in animal models and humans suggesting that 2-AG deficiency could contribute to development of stress-related psychiatric disorders.

Marijuana use is frequently cited by patients with PTSD as a way to control symptoms. Similarly, the Vanderbilt researchers found that THC promoted stress-resilience in previously susceptible mice.

However, marijuana use in psychiatric disorders has obvious drawbacks including possible addiction and cognitive side effects, among others. The Vanderbilt study suggests that increasing production of natural cannabinoids may be an alternative strategy to harness the therapeutic potential of this signaling system.

If further research finds that some people with stress-related mood and anxiety disorders have low levels of 2-AG, replenishing the supply of this endocannabinoid could represent a novel treatment approach and might enable some of them to stop using marijuana, the researchers concluded.

Story Source:

Materials provided by Vanderbilt University Medical Center. Original written by Bill Snyder.

Identifying genes key to human memory: Insights from genetics and cognitive neuroscience

Researchers have identified more than 100 genes important for memory in people. The study is the first to identify correlations between gene data and brain activity during memory processing, providing a new window into human memory.

“This is very exciting because the identification of these gene-to-behavior relationships opens up new research avenues for testing the role of these genes in specific aspects of memory function and dysfunction,” says Genevieve Konopka of UT Southwestern, who is presenting this new work in San Francisco today at the Cognitive Neuroscience Society (CNS) annual conference. “It means we are closer to understanding the molecular mechanisms supporting human memory and thus will be able to use this information someday to assist with all kinds of memory issues.”

The study is part of the nascent but growing field of “imaging genetics,” which aims to relate genetic variation to variation in brain anatomy and function. “Genes shape the anatomy and functional organization of the brain, and these structural and functional characteristics of the brain give rise to the observable behaviors,” says Evelina Fedorenko of Harvard Medical School and Massachusetts General Hospital.

While past work has aimed to connect behavior to genes, researchers have lacked neural markers, which can provide a powerful bridge between the two. “Probing the genes-brain relationship is likely to yield a rich understanding of the human cognitive and neural architecture, including insights into human uniqueness in the animal kingdom,” says Fedorenko, who is chairing the symposium on imaging genetics at the CNS conference.

The new field is now possible because genotyping has gotten progressively cheaper and easier, while large brain-imaging and electrophysiology datasets have become increasingly available. At the same time, there has been a rise in the number of large-scale international collaborations (e.g., ENIGMA) that “foster novel theorizing, further methodological innovations, as well as allow aggregating datasets across labs, countries and continents,” she says.

Combining cognitive neuroscience with genetics can involve several different approaches, Fedorenko says. Researchers, for example, can search for neural differences in individuals with developmental disorders that are associated with certain genetic variants and compare them to a control group. Others may compare brain anatomy and function in identical versus fraternal twins. Still others look for patterns of gene expression across the cortex and relate the observed patterns to other data on brain architecture — the approach Konopka and colleagues used for the new memory-gene study.

The goal of the study, performed in collaboration with neurosurgeon Dr. Bradley Lega, was to identify genes important for “normal cognition” such as learning and memory. Previous work established that certain groups of genes have altered gene expression in individuals with cognitive deficits. The work also builds on prior analyses by Konopka’s team of fMRI data, linking resting-state brain behavior to specific genes.

The researchers used two sets of data: RNA in post-mortem brain tissue and intracranial EEG (iEEG) data from epilepsy patients. “We measure RNA as a proxy for gene expression in the brain,” Konopka explains. “Quantitating RNA in the brain requires extracting RNA from the brain tissue itself. Thus, we are limited to accessing brain tissue post-mortem or, on rare occasions, can obtain tissue from surgical resections of the brain.”

The iEEG dataset includes data from epilepsy patients performing an episodic memory task while they were undergoing electrode monitoring to localize seizures. Collected over 10 years from the University of Pennsylvania and Thomas Jefferson University, it is one of the largest available datasets for such memory data across the brain. “While the subjects all suffer from epilepsy, we take several precautions to include the intracranial data that is not affected by epileptic activity,” Konopka says. “Thus, we believe the resulting genes we identify are generalizable beyond the epilepsy population.” Both the RNA and iEEG data are from neocortical regions of the left hemisphere of the brain, allowing for population-level analysis.
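
One plausible form such a cross-dataset comparison could take is a region-by-region correlation between a gene’s expression level and a memory-related activity measure. The sketch below uses invented region labels and numbers and a simple rank correlation; the study’s actual statistical pipeline is more involved.

```python
import numpy as np
from scipy import stats

# Hypothetical per-region values for left-hemisphere neocortical regions (invented)
regions = ["IFG", "MTG", "STG", "IPL", "FuG", "ITG"]
gene_expression = np.array([5.2, 4.8, 6.1, 5.5, 4.9, 6.3])  # normalized RNA level of one gene
memory_effect = np.array([0.4, 0.2, 0.7, 0.5, 0.3, 0.8])    # iEEG activity difference, remembered vs. forgotten

# Rank correlation across regions: does this gene's expression track memory-related activity?
rho, p = stats.spearmanr(gene_expression, memory_effect)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```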

The genes the researchers identified as being important for human memory are distinct from genes previously correlated with other types of cognitive processing and resting state fMRI activity. “At this point, we cannot say whether the gene expression itself might drive memory or whether it is simply a reflection of the brain activity patterns needed for proper memory formation,” Konopka says.

The memory genes also overlap with several genes associated with autism, which means “we have identified a window into the molecular pathways important for normal memory function that are at risk from a genetic perspective in autism,” she says. The new study will inform future work, specifically identifying gene targets for further experimentation in animal models of memory function.

Fedorenko is excited by these and many other early findings from the new imaging genetics field. But she points out that their robustness and replicability have yet to be established. “We, as a field, need to increase our standards of rigor and require results to be replicated at least across two datasets before they are published, so as not to flood the literature with false positives,” she says.

Another challenge, Fedorenko says, surrounds the issue of “big data.” Understanding genetic variability intrinsically requires large numbers, she says. Indeed, her own work on the human language system has accumulated large datasets with good individual-level neural markers of language activity, enabling brain-genetic investigations. “Oftentimes, however, data-mining bottom-up approaches need to be supplemented with more targeted hypothesis-driven carefully controlled experimental studies,” she says.

Still, Fedorenko emphasizes that the marriage between cognitive neuroscience and genetics is likely to be a fruitful one. “Given the inherently interdisciplinary nature of this emerging field of research, I hope many young neuroscientists and geneticists will get excited about possibilities of new critical discoveries and join our efforts by bringing in fresh energy and revolutionary ideas, so that together we can understand how genes give rise to our neural and cognitive architecture.”

 

New gene discovered associated with Tau, a common form of brain pathology

Investigators at Rush University Medical Center and Brigham and Women’s Hospital in Boston reported the discovery of a new gene associated with susceptibility to a common form of brain pathology involving the protein Tau, which accumulates in several different conditions, including Alzheimer’s disease, certain forms of dementia and Parkinsonian syndromes, as well as chronic traumatic encephalopathy, which occurs with repeated head injuries.

Published in Molecular Psychiatry, the manuscript describes the identification and validation of a genetic variant within the protein tyrosine phosphatase receptor-type delta (PTPRD) gene.

“Aging leads to the accumulation of many different pathologies in the brain,” said co-principal investigator Dr. David Bennett who directs the Alzheimer Disease Center at Rush. “One of the most common forms of pathology is the neurofibrillary tangle (NFT) that was at the center of our study,” he said. “The NFT is thought to be more closely related to memory decline than other forms of aging-related pathologies, but there are still very few genes that have been implicated in the accumulation of this key feature of Alzheimer’s disease and other brain diseases.”

Using autopsies from 909 individuals participating in studies of aging based at Rush University, the team of investigators assessed the human genome for evidence that a genetic variant could affect NFT. Lead author Dr. Lori Chibnik of Brigham and Women’s Hospital said that “The variant that we discovered is common: most people have one or two copies of the version of the gene that is linked to accumulating more pathology as you get older. Interestingly, tangles can accumulate through several different mechanisms, and the variant that we discovered appears to affect more than one of these mechanisms.”

The reported results offer an important new lead as the field of neurodegeneration searches for robust novel targets for drug development. This is especially true given the recent disappointing results in Alzheimer’s disease trials targeting amyloid, the other major form of pathology related to Alzheimer’s disease.

Tau pathology is more closely connected to loss of brain function with advancing age and may be more impactful as a target. The advent of new techniques to measure Tau in the brains of living individuals with positron emission tomography offers a biomarker for therapies targeting Tau. Dr. De Jager, co-principal investigator at Brigham and Women’s Hospital, notes, “This study is an important first step. However, the result needs further validation, and the mechanism by which the PTPRD gene and the variant that we have discovered contribute to the accumulation of NFT remains elusive. Other studies in mice and flies implicate PTPRD in memory dysfunction and worsening of Tau pathology, suggesting that altering the level of PTPRD activity could be helpful in reducing an individual’s burden of Tau pathology.”

Tau pathology is one of the defining features of Alzheimer’s disease, which is the most common form of dementia in older age. While symptomatic treatments exist, there are currently no preventive therapies. PTPRD is an intriguing new candidate that deserves further evaluation in the search for Alzheimer’s disease therapies.

Story Source:

Materials provided by Rush University Medical Center.

 

Surprising culprit in nerve cell damage identified

In many neurodegenerative conditions — Parkinson’s disease, amyotrophic lateral sclerosis (ALS) and peripheral neuropathy among them — an early defect is the loss of axons, the wiring of the nervous system. When axons are lost, nerve cells can’t communicate as they should, and nervous system function is impaired. In peripheral neuropathy in particular, and perhaps other diseases, sick axons trigger a self-destruct program.

In new research, scientists at Washington University School of Medicine in St. Louis have implicated a specific molecule in the self-destruction of axons. Understanding just how that damage occurs may help researchers find a way to halt it.

The study is published March 22 in the journal Neuron.

“Axons break down in a lot of neurodegenerative diseases,” said senior author Jeffrey D. Milbrandt, MD, PhD, the James S. McDonnell Professor and head of the Department of Genetics. “Despite the fact these diseases have different causes, they are all likely rooted in the same pathway that triggers axon degeneration. If we could find a way to block the pathway, it could be beneficial for many different kinds of patients.”

Since the molecular pathway that leads to loss of axons appears to do more harm than good, it’s not clear what role this self-destruct mechanism plays in normal life. But scientists suspect that if the pathway that destroys axons could be paused or halted, it would slow or prevent the gradual loss of nervous system function and the debilitating symptoms that result. One such condition, peripheral neuropathy, affects about 20 million people in the United States. It often develops following chemotherapy or from nerve damage associated with diabetes, and can cause persistent pain, burning, stinging, itching, numbness and muscle weakness.

“Peripheral neuropathy is by far the most common neurodegenerative disease,” said co-author Aaron DiAntonio, MD, PhD, the Alan A. and Edith L. Wolff Professor of Developmental Biology. “Patients don’t die from it, but it has a huge impact on quality of life.”

In previous studies, Stefanie Geisler, MD, an assistant professor of neurology, working with DiAntonio and Milbrandt, showed that blocking this axon self-destruction pathway prevented the development of peripheral neuropathy in mice treated with the chemotherapy agent vincristine. The hope is that if methods are developed to block this pathway in people, then it might be possible to slow or prevent the development of neuropathy in patients.

Toward that end, the Milbrandt and DiAntonio labs showed that a molecule called SARM1 is a central player in the self-destruct pathway of axons. In healthy neurons, SARM1 is present but inactive. For reasons that are unclear, injury or disease activates SARM1, which sets off a series of events that drains a key cellular fuel — called nicotinamide adenine dinucleotide (NAD) — and leads to destruction of the axon. Though the researchers previously had shown SARM1 was required for this chain of events to play out, the details of the process were unknown.

SARM1 and similar molecules — those containing what are called TIR domains — most often are studied in the context of immunity, where these domains serve as scaffolds. Essentially, TIR domains provide a haven for the assembly of molecules or proteins to perform their work.

The researchers had assumed that SARM1 acted as a scaffold to provide support for the work of destroying axons, beginning with the rapid loss of cellular fuel that occurs minutes after SARM1 becomes active. The scientists set about searching for the demolition crew — the active molecule or molecules that use the SARM1 scaffold to carry out the demolition. The study’s first author, Kow A. Essuman, a Howard Hughes Medical Institute Medical Research Fellow and an MD/PhD student in Milbrandt’s lab, performed a litany of cellular and biochemical experiments searching for the demolition crew and came up empty.

“We performed multiple experiments but could not identify molecules that are traditionally known to consume NAD,” Essuman said.

But as a last resort, the investigators tested SARM1 itself. To their great surprise, they found it was doing more than simply providing a passive platform. Specifically, the researchers showed SARM1’s TIR domain acts as an enzyme, a molecule that carries out biochemical reactions, in this case destroying axons by first burning all their NAD cellular fuel.

“There are more than 1,000 papers describing the function of proteins containing TIR domains,” DiAntonio said. “No one had ever shown that this type of molecule could be an enzyme. So we went into our experiments assuming SARM1 was only a scaffold and that there must be some other enzyme responsible for demolition of the axon. We essentially searched for a demolition crew, only to discover that the scaffold itself is destroying the structure. It’s the last thing you would expect.”

The findings suggest molecules similar to SARM1 — those with TIR domains and known to serve as scaffolds in the immune system — may prove to have additional functions that go beyond their structural roles. The research also invites a search for drugs that block the SARM1 enzyme from triggering axonal destruction.

 

Optical tool monitors brain’s circulatory response to pain

Functional near-infrared spectroscopy (fNIRS), an optical imaging tool for monitoring regional blood flow and tissue oxygenation, is being explored as a way to track the brain’s response to acute pain in adults and infants.

In “Functional near-infrared spectroscopy study on tonic pain activation by cold pressor test,” published today in Neurophotonics, a journal of SPIE, the international society for optics and photonics, researchers from Drexel University describe their work in identifying the relationship between pain threshold and tolerance, and the associated hemodynamic response in the cerebral cortex.

The authors — Zeinab Barati, Issa Zakeri, and Kambiz Pourrezaei — report on using cold pressor tests (CPT) at various temperatures to assess whether the perception of pain is proportional to the evoked hemodynamic response at a given water temperature.

Hemodynamic response is the response of the circulatory system to stimuli such as exercise or emotional stress. The fNIRS technique may have several advantages over other hemodynamic-based imaging techniques, including its portability and noninvasiveness. It also requires no ionizing radiation or drug injection, and it can withstand a certain amount of motion, which is desirable for patients such as infants, small children, and elders with involuntary movement disorders.
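
Under the hood, fNIRS typically converts light-attenuation changes at two wavelengths into changes in oxygenated and deoxygenated hemoglobin via the modified Beer-Lambert law. The sketch below shows that conversion with placeholder extinction coefficients, source-detector distance and path-length factors; a real instrument would use calibrated, wavelength-specific constants.

```python
import numpy as np

def hemoglobin_changes(delta_od, extinction, distance_cm, dpf):
    """Modified Beer-Lambert law: solve
        delta_OD(lambda) = [eps_HbO(lambda), eps_HbR(lambda)] @ [dHbO, dHbR] * distance * DPF
    for the concentration changes dHbO and dHbR, given measurements at two wavelengths."""
    effective_path = distance_cm * np.asarray(dpf)   # per-wavelength effective path length
    A = extinction * effective_path[:, None]         # 2x2 linear system
    return np.linalg.solve(A, delta_od)

# Placeholder values (illustrative, not instrument-calibrated)
extinction = np.array([[1.5, 3.8],    # ~690 nm: [eps_HbO, eps_HbR]
                       [2.5, 1.8]])   # ~830 nm: [eps_HbO, eps_HbR]
delta_od = np.array([0.010, 0.015])   # measured optical-density changes at the two wavelengths
d_hbo, d_hbr = hemoglobin_changes(delta_od, extinction, distance_cm=3.0, dpf=[6.0, 5.5])
print(f"change in HbO: {d_hbo:.4f}, change in HbR: {d_hbr:.4f}")
```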

“This is one of a few recent studies demonstrating that fNIRS can afford noninvasive, objective measures of cortical responses at the prefrontal region to noxious pain resulting from either cold or hot thermal stimulations,” said Hanli Liu, professor of bioengineering at the University of Texas at Arlington. “Interestingly, this is the first scientific report on gender difference in hemodynamic responses to noxious cold stimuli seen in the prefrontal cortical region, although no gender difference was found in pain threshold, tolerance, or scores. It sheds light on hidden differences in biological variables in the human brain.”

Applications of the technique could include pre- and postoperative uses, study of spontaneous pain under natural conditions, examination of patients who cannot remain motionless in a magnet or positron emission tomography (PET) scanner, assessment of patients under sedation or who may have reduced perception of pain, and study of conditions affecting pain tolerance, such as drug abuse, smoking, and alcohol use.

Advanced brain imaging methods such as PET and functional magnetic resonance imaging (fMRI) have revealed the activation of brain regions during physical or psychological experiences of pain. Previous studies have found a relationship between the intensity of perceived pain and the neuronal activation in several cortical areas.

Story Source:

Materials provided by SPIE–International Society for Optics and Photonics.