Playing video games in childhood improves working memory years later

By Chrissy Sexton

Playing video games as a child leads to long-lasting cognitive benefits, according to new research from the Universitat Oberta de Catalunya (UOC). The study suggests that gaming improves working memory and concentration.

Previous studies have shown that gaming improves attention, enhances visual-spatial skills, and causes structural changes in the brain – even increasing the size of some regions. The current study is the first to show that video games promote positive cognitive changes that can take place years after people stop playing them.

“People who were avid gamers before adolescence, despite no longer playing, performed better with the working memory tasks, which require mentally holding and manipulating information to get a result,” said study lead author Dr. Marc Palaus.

The research focused on 27 people between the ages of 18 and 40, some with and some without prior video gaming experience.

The experts analyzed cognitive skills, including working memory, at three points during the study period: before training the volunteers to play Nintendo’s Super Mario 64, at the end of the training, and fifteen days later.

The findings revealed that participants who had not played video games in childhood did not benefit from improvements in processing and inhibiting irrelevant stimuli. As expected, these individuals were initially slower than those who had played games as children.

“People who played regularly as children performed better from the outset in processing 3D objects, although these differences were mitigated after the period of training in video gaming, when both groups showed similar levels,” said Dr. Palaus.

The experts also administered 10 sessions of a non-invasive brain stimulation technique known as transcranial magnetic stimulation to the participants.

“It uses magnetic waves which, when applied to the surface of the skull, are able to produce electrical currents in underlying neural populations and modify their activity,” explained Palaus.

The researchers theorized that combining video gaming with this type of stimulation could improve cognitive performance, but that was not the case.

“We aimed to achieve lasting changes. Under normal circumstances, the effects of this stimulation can last from milliseconds to tens of minutes. We wanted to achieve improved performance of certain brain functions that lasted longer than this.”

The game used in the study was a 3D platformer, but there are many types of video games that can influence cognitive functions. According to Dr. Palaus, what most video games have in common is that they involve elements that make people want to continue playing, and that they gradually get harder and present a constant challenge.

“These two things are enough to make it an attractive and motivating activity, which, in turn, requires constant and intense use of our brain’s resources,” said Dr. Palaus. “Video games are a perfect recipe for strengthening our cognitive skills, almost without our noticing.”

The study is published in the journal Frontiers in Human Neuroscience.

https://www.earth.com/news/playing-video-games-in-childhood-improves-working-memory-years-later/

Poor Sleep Linked with Future Amyloid-β Build Up

by Abby Olena

There’s evidence in people and animals that short-term sleep deprivation can change the levels of amyloid-β, a peptide that can accumulate in the aging brain and cause Alzheimer’s disease. Scientists now show long-term consequences may also result from sustained poor sleep. In a study published September 3 in Current Biology, researchers found that healthy individuals with lower-quality sleep were more likely to have amyloid-β accumulation in the brain years later. The study could not say whether poor sleep caused amyloid-β accumulation or vice versa, but the authors say that sleep could be an indicator of present and future amyloid-β levels.

“Traditionally, sleep disruptions have been accepted as a symptom of Alzheimer’s disease,” says Ksenia Kastanenka, a neuroscientist at Massachusetts General Hospital who was not involved in the work. Her group showed in 2017 that improving sleep in a mouse model of Alzheimer’s disease, in which the animals’ slow wave sleep is disrupted as it usually is in people with the disease, halted disease progression.

Collectively, the results from these studies and others raise the possibility that “sleep rhythm disruptions are not an artifact of disease progression, but actually are active contributors, if not a cause,” she says, hinting at the prospect of using these sleep measures as a biomarker for Alzheimer’s disease.

As a graduate student at the University of California, Berkeley, Joseph Winer, who is now a postdoc at Stanford University, and his colleagues were interested in whether sleep could predict how the brain changes over time. They collaborated with the team behind the Berkeley Aging Cohort Study, which includes a group of 32 cognitively healthy adults averaging about 75 years of age. These participants took part in a sleep study, then had periodic cognitive assessments and between two and five positron emission tomography (PET) scans to check for the presence of amyloid-β in their brains for an average of about four years after the sleep study.

The researchers found at the baseline PET scan, which happened within six months of the sleep study, that 20 of the 32 participants already had some amyloid-β accumulation, which was not unexpected given their average age. They also showed that slow wave sleep, an indicator of depth of sleep, and sleep efficiency, the amount of time spent sleeping relative to time in bed, were both predictive of the rate of amyloid change several years later. In other words, people with lower levels of slow wave sleep and sleep efficiency were more likely to have faster amyloid buildup.
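To make the two sleep measures concrete, here is a minimal sketch (illustrative only; the night's numbers are hypothetical and this is not the study's analysis code) of how sleep efficiency and the slow wave sleep fraction can be computed from a single night's timings:

```python
# Illustrative sketch only -- not the study's actual analysis code.
# Sleep efficiency is, as described in the article, time asleep divided by
# time in bed; slow wave sleep is expressed as a fraction of total sleep time.
# All numbers below are hypothetical.

def sleep_efficiency(minutes_asleep: float, minutes_in_bed: float) -> float:
    """Fraction of time in bed actually spent asleep (0-1)."""
    return minutes_asleep / minutes_in_bed

def slow_wave_fraction(minutes_sws: float, minutes_asleep: float) -> float:
    """Fraction of total sleep time spent in slow wave (deep) sleep."""
    return minutes_sws / minutes_asleep

# Example night: 390 min asleep out of 480 min in bed, 60 min of slow wave sleep
print(round(sleep_efficiency(390, 480), 2))   # 0.81
print(round(slow_wave_fraction(60, 390), 2))  # 0.15
```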

The subjects all remained cognitively healthy over the duration of the study, says Winer. “We do expect that they’re at higher risk for developing Alzheimer’s in their lifetime because of the amyloid plaque.”

The strengths of the study include the well-characterized participants with detailed sleep assessments, as well as cognitive testing and longitudinal amyloid PET imaging, says Brendan Lucey, a sleep neurologist at Washington University in St. Louis who did not participate in the work.

There are still open questions about the link between sleep and amyloid deposition over time. “Amyloid accumulation on PET increases at different rates in amyloid-negative and amyloid-positive individuals, and even within amyloid-positive individuals,” Lucey explains. “Without adjusting for participants’ starting amyloid [levels], we don’t know if some participants would have been more likely to have increased amyloid compared to others, independent of sleep.”

“It is very hard to untangle this question of baselines,” acknowledges Winer. Because the sleep measures the team identified in the study are related to amyloid levels, to actually tease apart the effect of sleep quality on amyloid deposition and vice versa, it’d be necessary to study people starting as early as their fifties, when they’re much less likely to have amyloid accumulation, he says.

This study is “a great start,” David Holtzman, a neurologist and collaborator of Lucey at Washington University in St. Louis who did not participate in the work, tells The Scientist. In addition to controlling for the amount of amyloid deposition that is present in a subject’s brain at the beginning of the study, it would be important to see if the findings bear out in larger numbers of people and what role genetic factors play.

“The most important question down the road is to test the idea in some sort of a treatment paradigm,” Holtzman adds. “You can do something to improve the quality of sleep or increase slow wave sleep, and then determine if it actually slows down the onset of Alzheimer’s disease clinically.”

J.R. Winer et al., “Sleep disturbance forecasts β-amyloid accumulation across subsequent years,” Current Biology, doi:10.1016/j.cub.2020.08.017, 2020.

https://www.the-scientist.com/news-opinion/poor-sleep-linked-with-future-amyloid-build-up-67923

Long-term usage of antidepressant medications may protect from dementia

Long-term treatment with certain antidepressants appeared associated with reduced dementia incidence, according to results of a case-control study published in Journal of Clinical Psychiatry.

“Depression could represent one of these potentially modifiable risk factors for all-cause dementia,” Claudia Bartels, PhD, of the department of psychiatry and psychotherapy at University Medical Center Goettingen in Germany, and colleagues wrote. “Numerous studies have concordantly demonstrated a strong association between depression and an increased risk [for] subsequent dementia. Selective serotonin reuptake inhibitors (SSRIs) are commonly used to treat depressive symptoms in [Alzheimer’s disease] dementia.

“Preclinical research in recent years has suggested that SSRIs reduce amyloid plaque burden in transgenic mouse models of [Alzheimer’s disease] and in cognitively healthy humans, attenuate amyloid-[beta]1-42–induced tau hyperphosphorylation in cell culture and improve cognition in mice.”

However, randomized clinical trials of the effects of SSRIs on cognition in Alzheimer’s disease dementia have mostly yielded negative results; research is sparse on which antidepressants may influence the risk of developing dementia; and evidence on how treatment duration affects this risk is particularly scarce. Thus, Bartels and colleagues sought to determine the effects of antidepressant drug classes and individual compounds, at various treatment durations, on the risk of developing dementia. The researchers analyzed data from 62,317 individuals with an incident dementia diagnosis who were included in the German Disease Analyzer database, and they compared outcomes to those of controls matched by age, sex and physician. They conducted logistic regression analyses, adjusted for health insurance status and comorbid diseases linked to dementia or antidepressant use, to evaluate the association between dementia incidence and treatment with four major classes of antidepressant drug, as well as 14 of the most commonly prescribed individual antidepressants.

Results showed an association between treatment for 2 years or longer with any antidepressant and a lower risk for dementia vs. short-term treatment in 17 of 18 comparisons. Particularly for long-term treatment, herbal and tricyclic antidepressants were linked to a decrease in the incidence of dementia. Among individual antidepressants, long-term treatment with escitalopram (OR = 0.66; 95% CI, 0.5-0.89) and with Hypericum perforatum (OR = 0.6; 95% CI, 0.51-0.7) was associated with the lowest risks for dementia.
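For readers unfamiliar with how such figures arise, the sketch below (illustrative only, not the study's code; the coefficient and standard error are hypothetical) shows how an odds ratio and its 95% confidence interval, such as the OR = 0.66 (0.5-0.89) cited for long-term escitalopram, are derived from a logistic regression coefficient:

```python
# Illustrative sketch only -- not the study's analysis code. An odds ratio from
# a logistic regression is exp(beta), and its 95% CI is exp(beta +/- 1.96 * SE).
# The coefficient and standard error below are hypothetical.
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Return (OR, lower 95% bound, upper 95% bound) for a logit coefficient."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for long-term vs. short-term treatment
or_, lo, hi = odds_ratio_ci(beta=-0.416, se=0.15)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR = 0.66 (95% CI 0.49-0.89)
```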

“Clinical trials — although well acknowledged as the gold standard procedure — have debunked numerous promising compounds and become increasingly challenging with longer treatment durations,” Bartels and colleagues wrote. “Thus, and in awareness of the controversy of this suggestion, analyzing data from registries in a naturalistic setting may be an attractive and feasible alternative. If individual datasets could be combined in a multinational effort, even more powerful analyses of merged big databases could be performed and an additive contribution with naturalistic data could be made.”

https://www.healio.com/news/psychiatry/20200828/longterm-treatment-with-certain-antidepressants-may-reduce-dementia-incidence

Alzheimer’s risk factors may be measurable in adolescents and young adults

Risk factors for Alzheimer’s dementia may be apparent as early as our teens and 20s, according to new research reported at the Alzheimer’s Association International Conference® (AAIC®) 2020.

These risk factors, many of which are disproportionately apparent in African Americans, include heart health factors — such as high blood pressure, high cholesterol and diabetes — and social factors like education quality. According to the Alzheimer’s Association Alzheimer’s Disease Facts and Figures report, older African Americans are about twice as likely to have Alzheimer’s or other dementias as older whites.

“By identifying, verifying, and acting to counter those Alzheimer’s risk factors that we can change, we may reduce new cases and eventually the total number of people with Alzheimer’s and other dementia,” said Maria C. Carrillo, Ph.D., Alzheimer’s Association chief science officer. “Research like this is important in addressing health inequities and providing resources that could make a positive impact on a person’s life.”

“These new reports from AAIC 2020 show that it’s never too early, or too late, to take action to protect your memory and thinking abilities,” Carrillo said.

The Alzheimer’s Association is leading the U.S. Study to Protect Brain Health Through Lifestyle Intervention to Reduce Risk (U.S. POINTER), a two-year clinical trial to evaluate whether lifestyle interventions that simultaneously target many risk factors protect cognitive function in older adults who are at increased risk for cognitive decline. U.S. POINTER is the first such study to be conducted in a large, diverse group of Americans across the United States.

African American Youth At Higher Risk of Dementia

In a population of 714 African Americans in the Study of Healthy Aging in African Americans (STAR), Kristen George, Ph.D., MPH, of the University of California, Davis, and colleagues found that high blood pressure and diabetes, or a combination of multiple heart health-related factors, are common in adolescence and are associated with worse late-life cognition. Study participants were adolescents (n=165; ages 12-20), young adults (n=439; ages 21-34) and adults (n=110; ages 35-56). Mean age at cognitive assessment was 68.

Cognition was measured using in-person tests of memory and executive function. The researchers found that, in this study population, having diabetes, high blood pressure, or two or more heart health risk factors in adolescence, young adulthood, or mid-life was associated with statistically significantly worse late-life cognition. These differences persisted after accounting for age, gender, years since risk factors were measured, and education.

Before this report, little was known about whether cardiovascular disease (CVD) risk factors developed prior to mid-life were associated with late-life cognition. This is an important question because African Americans have a higher risk of CVD risk factors compared to other racial/ethnic groups from adolescence through adulthood.

According to the researchers, these findings suggest that CVD risk factors as early as adolescence influence late-life brain health in African Americans. Efforts to promote heart and brain healthy lifestyles must not only include middle-aged adults, but also younger adults and adolescents who may be especially susceptible to the negative impact of poor vascular health on the brain.

Early Adult BMI Associated With Late Life Dementia Risk

In what the authors say is the first study to report on the issue, higher early adulthood (age 20-49) body mass index (BMI) was associated with higher late-life dementia risk.

Relatively little is known about the role of early-life BMI in the risk of Alzheimer’s and other dementias. The scientists studied a total of 5,104 older adults from two studies, including 2,909 from the Cardiovascular Health Study (CHS) and 2,195 from the Health, Aging and Body Composition study (Health ABC). Of the total sample, 18% were Black and 56% were women. Using pooled data from four established cohorts spanning the adult life course, including the two cohorts under study, the scientists estimated BMI beginning at age 20 for all older adults in CHS and Health ABC.
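For reference, BMI is weight in kilograms divided by height in meters squared, and the normal/overweight/obese categories used in the comparisons that follow correspond to the standard cutoffs of 18.5, 25 and 30 kg/m². The sketch below is purely illustrative; the example person is hypothetical:

```python
# Illustrative sketch only. BMI = weight (kg) / height (m)^2, classified with
# the standard cutoffs (18.5, 25, 30). The example person is hypothetical.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

b = bmi(82, 1.70)                    # ~28.4 kg/m^2
print(round(b, 1), bmi_category(b))  # 28.4 overweight
```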

For women, dementia risk increased with higher early adulthood BMI. Compared to women with normal BMI in early adulthood, dementia risk was 1.8 times higher among those who were overweight, and 2.5 times higher among those who were obese. Analyses were adjusted for midlife and late life BMI.

They found no association between midlife BMI and dementia risk among women.

For men, dementia risk was 2.5 times higher among those who were obese in early adulthood, 1.5 times higher among those who were overweight in mid-life and 2.0 times higher among those who were obese in mid-life, in models also adjusted for late life BMI.

For both women and men, dementia risk decreased with higher late life BMI.

Adina Zeki Al Hazzouri, Ph.D. of Columbia University and colleagues found that high BMI in adulthood is a risk factor for dementia in late life. The researchers suggest that efforts aimed at reducing dementia risk may need to begin earlier in life with a focus on obesity prevention and treatment.

Quality of Early-Life Education Influences Dementia Risk

In a diverse group of more than 2,400 people followed up to 21 years, higher quality early-life education was associated with better language and memory performance, and lower risk of late-life dementia. Results were somewhat different between men and women, and between Blacks and Whites in the study.

The study included 2,446 Black and White men and women, age 65 and older, enrolled in the Washington Heights/Inwood Columbia Aging Project who attended elementary school in the United States. A school quality variable was constructed from historical measures including mandatory school enrollment age, minimum dropout age, school term length, student-teacher ratio, and student attendance.

People who attended school in states with lower-quality education showed more rapid decline in memory and language as older adults. Black women and men and White women who attended schools in states with higher-quality education were less likely to develop dementia. According to the scientists, the results were explained, in part, by the fact that people who attend higher-quality schools tend to complete more years of schooling.

Justina Avila-Rieger, PhD, a postdoctoral research scientist at Columbia University Irving Medical Center, and colleagues say the findings provide evidence that later-life dementia risk and cognitive function are influenced by early-life state educational policies.

https://www.sciencedaily.com/releases/2020/07/200730092616.htm

Neuroscientists identify the brain cells that help humans adapt to change


Ph.D. candidate Kianoush Banaie Boroujeni at his neuroscience setup at Vanderbilt University, explaining a main result of the study he conducted in the laboratory of Thilo Womelsdorf at Vanderbilt University.

by Marissa Shapiro, Vanderbilt University

There are 86 billion neurons in the human brain. Of these, only a vanishingly small fraction handle cognitive flexibility—our ability to adjust to new environments and concepts.

A team of researchers with interdisciplinary expertise in psychology, informatics (the application of information science to solve problems with data) and engineering along with the Vanderbilt Brain Institute (VBI) gained critical insights into one of the biggest mysteries in neuroscience, identifying the location and critical nature of these neurons.

The article was published in the journal Proceedings of the National Academy of Sciences (PNAS) on July 13. The discovery presents an opportunity to enhance researchers’ understanding and treatment of mental illnesses rooted in impaired cognitive flexibility.

Brain circuits created by these neurons have given humans an evolutionary advantage in adapting to changing environments. When these neurons are weakened, people may have trouble adjusting to changes in their environment, including difficulty overcoming traditions, biases and fears. Typically, people oscillate between repeating rewarding behavior and exploring newer and potentially better rewards. The cost-benefit ratio of repeating versus exploring is an equation that the brain is constantly working to resolve, particularly when a person’s environment changes. A lack of cognitive flexibility can result in debilitating mental conditions.

The consequences of this research could be multifold. “These cells could be part of the switch that determines your best attentional strategy,” said Thilo Womelsdorf, associate professor of psychology and computer science, and the paper’s principal investigator. “Weakening these brain cells could make it difficult to switch attention strategies, which can ultimately result in obsessive-compulsive behaviors or a struggle to adjust to new situations. On the opposite end, if such a switch is ‘loose’ attention might become ‘loose’ and people will experience a continuously uncertain world and be unable to concentrate on important information for any amount of time.”

The researchers hypothesized that within the area of the brain that helps people learn fine motor skills like playing an instrument, there exists a subregion that could enable the same flexible processes for thoughts.

The group of brain cells, located below the outer cortical mantle in the basal ganglia, was identified by measuring the activity of brain cells during computer-simulated real-world tasks. To mimic many real-world situations, the researchers, including scientists from the Centre for Vision Research at York University, developed a simulation that presented more than one object at a time and changed which object was rewarded. This required flexible, trial-and-error learning about which objects were linked to a reward. By measuring the activity of brain cells, the team observed an interesting pattern: brain cell activity was heightened amid change and diminished as confidence in the outcome grew. “These neurons seem to help the brain circuits to reconfigure and transition from formerly relevant information, and a tenuous connection to attend to new, relevant information,” said Kianoush Banaie Boroujeni, the study’s first author and a Ph.D. candidate in the Womelsdorf lab.

“There is a technological revolution in neuroscience,” said Lisa Monteggia, Barlow Family Director of the Vanderbilt Brain Institute and professor of pharmacology. “The ability to use technology to control a single cell with molecular and genetic tools can only work when scientists know where to look. Dr. Womelsdorf and his collaborators have given us the ability to do such work and significantly move the field of neuroscience forward.”

https://medicalxpress.com/news/2020-07-neuroscientists-brain-cells-humans.html

Boosting a liver protein may mimic the brain benefits of exercise

By Laura Sanders

Exercise’s power to boost the brain might require a little help from the liver.

A chemical signal from the liver, triggered by exercise, helps elderly mice keep their brains sharp, suggests a study published in the July 10 Science. Understanding this liver-to-brain signal may help scientists develop a drug that benefits the brain the way exercise does.

Lots of studies have shown that exercise helps the brain, buffering the memory declines that come with old age, for instance. Scientists have long sought an “exercise pill” that could be useful for elderly people too frail to work out or for whom exercise is otherwise risky. “Can we somehow get people who can’t exercise to have the same benefits?” asks Saul Villeda, a neuroscientist at the University of California, San Francisco.

Villeda and colleagues took an approach similar to experiments that revealed the rejuvenating effects of blood from young mice (SN: 5/5/14). But instead of youthfulness, the researchers focused on fitness. The researchers injected sedentary elderly mice with plasma from elderly mice that had voluntarily run on wheels over the course of six weeks. After eight injections over 24 days, the sedentary elderly mice performed better on memory tasks, such as remembering where a hidden platform was in a pool of water, than elderly mice that received injections from sedentary mice.

Comparing the plasma of exercised mice with that of sedentary mice showed an abundance of proteins produced by the liver in mice that ran on wheels.

The researchers closely studied one of these liver proteins produced in response to exercise, called GPLD1. GPLD1 is an enzyme, a type of molecular scissors. It snips other proteins off the outsides of cells, releasing those proteins to go do other jobs. Targeting these biological jobs with a molecule that behaves like GPLD1 might be a way to mimic the brain benefits of exercise, the researchers suspect.

Old mice that were genetically engineered to make more GPLD1 in their livers performed better on the memory tasks than other old sedentary mice, the researchers found. The genetically engineered sedentary mice did about as well in the pool of water as the mice that exercised. “Getting the liver to produce this one enzyme can actually recapitulate all these beneficial effects we see in the brain with exercise,” Villeda says.

Blood samples from elderly people also hint that exercise raises GPLD1 levels. Elderly people who were physically active (defined as walking more than 7,100 steps a day) had more of the protein than elderly people who were more sedentary, data on step-counters showed.

GPLD1 seems to exert its effects from outside of the brain, perhaps by changing the composition of the blood in some way, the researchers suspect.

But the role of GPLD1 is far from settled, cautions Irina Conboy, a researcher at the University of California, Berkeley who studies aging. There’s evidence that GPLD1 levels are higher in people with diabetes, she points out, hinting that the protein may have negative effects. And different experiments suggest that GPLD1 levels might actually fall in response to certain kinds of exercise in rats with markers of diabetes.

“We know for sure that exercise is good for you,” Conboy says. “And we know that this protein is present in the blood.” But whether GPLD1 is good or bad, or whether it goes up or down with exercise, she says, “we don’t know yet.”

CITATIONS
A. M. Horowitz et al. Blood factors transfer beneficial effects of exercise on neurogenesis and cognition to the aged brain. Science. Vol. 369, July 10, 2020, p. 167. doi: 10.1126/science.aaw2622.


Case Western Reserve University-led team develops new approach to treat certain neurological diseases


Paul Tesar, professor of genetics and genome sciences, School of Medicine


Regeneration of myelin in the brain, shown in blue, after ASO drug treatment

A team led by Case Western Reserve University medical researchers has developed a potential treatment method for Pelizaeus-Merzbacher disease (PMD), a fatal neurological disorder that produces severe movement, motor and cognitive dysfunction in children. It results from genetic mutations that prevent the body from properly making myelin, the protective insulation around nerve cells.

Using mouse models, the researchers identified and validated a new treatment target—a toxic protein resulting from the genetic mutation. Next, they successfully used a family of drugs known as ASOs (antisense oligonucleotides) to target the ribonucleic acid (RNA) strands that created the abnormal protein to stop its production. This treatment reduced PMD’s hallmark symptoms and extended lifespan, establishing the clinical potential of this approach.

By demonstrating effective delivery of the ASOs to myelin-producing cells in the nervous system, researchers raised the prospect for using this method to treat other myelin disorders that result from dysfunction within these cells, including multiple sclerosis (MS).

Their research was published online July 1 in the journal Nature.

“The pre-clinical results were profound. PMD mouse models that typically die within a few weeks of birth were able to live a full lifespan after treatment,” said Paul Tesar, principal investigator on the research, a professor in the Department of Genetics and Genome Sciences at the School of Medicine and the Dr. Donald and Ruth Weber Goodman Professor of Innovative Therapeutics. “Our results open the door for the development of the first treatment for PMD as well as a new therapeutic approach for other myelin disorders.”

Study co-authors include an interdisciplinary team of researchers from the medical school, Ionis Pharmaceuticals Inc., a Carlsbad, California-based pioneer developer of RNA-targeted therapies, and Cleveland Clinic. First author Matthew Elitt worked in Tesar’s lab as a Case Western Reserve medical and graduate student.

PMD attacks the young

PMD is a rare genetic condition involving the brain and spinal cord that primarily affects boys. Symptoms can appear in early infancy and begin with jerky eye movements and abnormal head movements. Over time, children develop severe muscle weakness and stiffness, cognitive dysfunction and difficulty walking, and they fail to reach developmental milestones such as speaking. The disease shortens life expectancy, and people with the most severe cases die in childhood.

The disease results from errors in a gene called proteolipid protein 1 (PLP1). Normally, this gene produces proteolipid protein (PLP), a major component of myelin, which wraps and insulates nerve fibers to allow proper transmission of electrical signals in the nervous system. But a faulty PLP1 gene produces toxic proteins that kill myelin-producing cells and prevent myelin from developing and functioning properly—resulting in the severe neurological dysfunction seen in PMD patients.

PMD impacts a few thousand people around the world. So far, no therapy has lessened symptoms or extended lifespans.

For nearly a decade, Tesar and his team have worked to better understand and develop new therapies for myelin disorders. They have had a series of successes, and their myelin-regenerating drugs for MS are now in commercial development.

Latest research

In the current laboratory work, the researchers found that suppressing mutant PLP1 and its toxic protein restored myelin-producing cells, produced functioning myelin, reduced disease symptoms and extended lifespans.

After validating that PLP1 was their therapeutic target, the researchers pursued pre-clinical treatment options. They knew mutations in the PLP1 gene produced faulty RNA strands that, in turn, created the toxic PLP protein.

So they teamed with Ionis Pharmaceuticals, a leader in RNA-targeted therapeutics and pioneer of ASOs. These short strings of chemically modified DNA can be designed to bind to a specific RNA target and block production of its protein product.
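As a toy illustration of that design principle (the sequence below is made up, not the PLP1 transcript, and real ASOs use chemically modified backbones this sketch ignores), an antisense oligo is essentially the reverse complement of a stretch of its target RNA:

```python
# Toy sketch only: the target sequence is hypothetical, not the PLP1 mRNA,
# and real ASOs carry chemical modifications not modeled here. It shows the
# basic idea: an antisense oligo is the reverse complement of a stretch of
# its target RNA, so it can base-pair with (bind) that stretch.

PAIR = {"A": "T", "U": "A", "G": "C", "C": "G"}

def antisense_dna(rna_target: str) -> str:
    """DNA oligo complementary (and antiparallel) to an RNA target, 5'->3'."""
    return "".join(PAIR[base] for base in reversed(rna_target))

target = "AUGGCUCCUUGA"        # hypothetical 12-nt stretch of a target mRNA
print(antisense_dna(target))   # TCAAGGAGCCAT
```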

And that’s exactly what happened in their studies. The result was improved myelin and locomotion, and substantial extension of lifespan. “ASOs provided an opportunity to cut the disease-causing protein off at its source,” Elitt said.

The successful clinical use of ASOs is relatively new, yet recent developments seem promising. In 2016, the U.S. Food and Drug Administration approved the first ASO drug for a neurological disorder, spinal muscular atrophy. The drug, Spinraza, was developed by Ionis and commercialized by Biogen Inc. More ASO therapies are in development and in clinical trials, and they hold promise for addressing many neurological diseases that currently have no effective treatment options.

Tesar said that ongoing and planned experiments in his laboratory will help guide future clinical development of ASO therapy for PMD. For example, researchers want to understand more about how well the treatment works after the onset of symptoms, how long it lasts, how often treatment needs to be given and whether it might be effective for all PMD patients, regardless of their specific form of the disease.

“While important research questions remain, I’m cautiously optimistic about the prospect for this method to move into clinical development and trials for PMD patients,” Tesar said. “I truly hope our work can make a difference for PMD patients and families.”


For The First Time, Scientists Have Captured Video of Brains Clearing Out Dead Neurons

by DAVID NIELD

We already know that our brains have a waste disposal system that keeps dead and toxic neurons from clogging up our biological pathways. Now, scientists have managed to capture a video of the process for the first time, in laboratory tests on mice.

There’s still a lot we don’t know about how dead neurons are cleared out, and how the brain reacts to them, so the new research could be a significant step forward in figuring some of that out – even if we’ve not yet confirmed that human brains work in the exact same way.

“This is the first time the process has ever been seen in a live mammalian brain,” says neurologist Jaime Grutzendler from the Yale School of Medicine in Connecticut.

Further down the line, these findings might even inform treatments for age-related brain decline and neurological disorders – once we know more about how brain clean-up is supposed to work, scientists can better diagnose what happens when something goes wrong.

The team focused on the glial cells responsible for doing the clean-up work in the brain; they used a technique called 2Phatal to target a single brain cell for apoptosis (cell death) in a mouse and then tracked the glial cells using fluorescent markers.

“Rather than hitting the brain with a hammer and causing thousands of deaths, inducing a single cell to die allows us to study what is happening right after the cells start to die and watch the many other cells involved,” says Grutzendler.

“This was not possible before. We are able to show with great clarity what exactly is going on and understand the process.”

Three types of glial cells – microglia, astrocytes, and NG2 cells – were shown to be involved in a highly coordinated cell removal process, which removed both the dead neuron and any connecting pathways to the rest of the brain. The researchers observed one microglia engulf the neuron body and its main branches (dendrites), while astrocytes targeted smaller connecting dendrites for removal. They suspect NG2 may help prevent the dead cell debris from spreading.

The researchers also demonstrated that if one type of glial cell missed the dead neuron for whatever reason, other types of cells would take over their role in the waste removal process – suggesting some sort of communication is occurring between the glial cells.

Another interesting finding from the research was that older mouse brains were less efficient at clearing out dead neural cells, even though the garbage removal cells seemed to be just as aware that a dying cell was there.

This is a good opportunity for future research, and it could give experts insight into how older brains start to fail in various ways as the garbage disposal service slows down or even breaks down.

New treatments might one day be developed that can take over this clearing process on the brain’s behalf – not just in elderly people, but also those who have suffered trauma to the head, for example.

“Cell death is very common in diseases of the brain,” says neurologist Eyiyemisi Damisah, from the Yale School of Medicine.

“Understanding the process might yield insights on how to address cell death in an injured brain from head trauma to stroke and other conditions.”

The research has been published in Science Advances.

https://www.sciencealert.com/for-the-first-time-scientists-capture-video-of-brains-clearing-out-dead-neurons

Human brain size gene triggers bigger brain in monkeys


Microscopy image of a section through one brain hemisphere of a 101-day-old ARHGAP11B-transgenic marmoset fetus. Cell nuclei are visualized by DAPI (white). Arrows indicate a sulcus and a gyrus. Credit: Heide et al. / MPI-CBG

The expansion of the human brain during evolution, specifically of the neocortex, is linked to cognitive abilities such as reasoning and language. A certain gene called ARHGAP11B that is only found in humans triggers brain stem cells to form more stem cells, a prerequisite for a bigger brain. Past studies have shown that ARHGAP11B, when expressed in mice and ferrets to unphysiologically high levels, causes an expanded neocortex, but its relevance for primate evolution has been unclear.

Researchers at the Max Planck Institute of Molecular Cell Biology and Genetics (MPI-CBG) in Dresden, together with colleagues at the Central Institute for Experimental Animals (CIEA) in Kawasaki and the Keio University in Tokyo, both located in Japan, now show that this human-specific gene, when expressed to physiological levels, causes an enlarged neocortex in the common marmoset, a New World monkey. This suggests that the ARHGAP11B gene may have caused neocortex expansion during human evolution. The researchers published their findings in the journal Science.

The human neocortex, the evolutionarily youngest part of the cerebral cortex, is about three times bigger than that of the closest human relatives, chimpanzees, and its folding into wrinkles increased during evolution to fit inside the restricted space of the skull. A key question for scientists is how the human neocortex became so big. In a 2015 study, the research group of Wieland Huttner, a founding director of the MPI-CBG, found that under the influence of the human-specific gene ARHGAP11B, mouse embryos produced many more neural progenitor cells and could even undergo folding of their normally unfolded neocortex. The results suggested that the gene ARHGAP11B plays a key role in the evolutionary expansion of the human neocortex.

The rise of the human-specific gene

The human-specific gene ARHGAP11B arose through a partial duplication of the ubiquitous gene ARHGAP11A approximately five million years ago along the evolutionary lineage leading to Neanderthals, Denisovans, and present-day humans, and after this lineage had segregated from that leading to the chimpanzee. In a follow-up study in 2016, the research group of Wieland Huttner uncovered a surprising reason why the ARHGAP11B protein contains a sequence of 47 amino acids that is human-specific, not found in the ARHGAP11A protein, and essential for ARHGAP11B’s ability to increase brain stem cells.

Specifically, a single C-to-G base substitution found in the ARHGAP11B gene leads to the loss of 55 nucleotides from the ARHGAP11B messenger RNA, which causes a shift in the reading frame resulting in the human-specific, functionally critical 47 amino acid sequence. This base substitution probably happened much later than when this gene arose about 5 million years ago, anytime between 1.5 million and 500,000 years ago. Such point mutations are not rare, but in the case of ARHGAP11B its advantages of forming a bigger brain seem to have immediately influenced human evolution.
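A toy calculation makes the frameshift point concrete: because 55 is not a multiple of three, every codon downstream of the deletion is re-partitioned. The sketch below uses a made-up sequence, not the real ARHGAP11B mRNA:

```python
# Toy sketch only: the sequence below is invented, not the ARHGAP11B mRNA.
# It illustrates why deleting 55 nucleotides (not a multiple of 3) shifts the
# reading frame: codons downstream of the deletion are re-partitioned, so the
# encoded amino acid sequence changes from that point onward.

def codons(seq: str):
    """Split a sequence into successive 3-nucleotide codons."""
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

seq = "ATG" + "ACG" * 40          # hypothetical 123-nt open reading frame
deleted = seq[:9] + seq[9 + 55:]  # remove 55 nt starting after the 3rd codon

print(55 % 3)                 # 1 -> the frame shifts by one position
print(codons(seq)[3:6])       # ['ACG', 'ACG', 'ACG']
print(codons(deleted)[3:6])   # ['CGA', 'CGA', 'CGA'] -> different downstream codons
```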


Wildtype (normal) and ARHGAP11B-transgenic fetal (101 days) marmoset brains. Yellow lines, boundaries of cerebral cortex; white lines, developing cerebellum; arrowheads, folds. Scale bars, 1 mm. Credit: Heide et al. / MPI-CBG

The gene’s effect in monkeys

However, it has been unclear until now if the human-specific gene ARHGAP11B would also cause an enlarged neocortex in non-human primates. To investigate this, the researchers in the group of Wieland Huttner teamed up with Erika Sasaki at the Central Institute for Experimental Animals (CIEA) in Kawasaki and Hideyuki Okano at the Keio University in Tokyo, both located in Japan, who had pioneered the development of a technology to generate transgenic non-human primates. The first author of the study, postdoc Michael Heide, traveled to Japan to work with the colleagues directly on-site.

They generated transgenic common marmosets, a New World monkey, that expressed the human-specific gene ARHGAP11B, which they normally do not have, in the developing neocortex. Japan has ethical standards and regulations for animal research and animal welfare that are as stringent as Germany’s. The brains of 101-day-old common marmoset fetuses (50 days before the normal birth date) were obtained in Japan and exported to the MPI-CBG in Dresden for detailed analysis.

Michael Heide explains: “We found indeed that the neocortex of the common marmoset brain was enlarged and the brain surface folded. Its cortical plate was also thicker than normal. Furthermore, we could see increased numbers of basal radial glia progenitors in the outer subventricular zone and increased numbers of upper-layer neurons, the neuron type that increases in primate evolution.” The researchers now had functional evidence that ARHGAP11B causes an expansion of the primate neocortex.

Ethical consideration

Wieland Huttner, who led the study, adds: “We confined our analyses to marmoset fetuses, because we anticipated that the expression of this human-specific gene would affect the neocortex development in the marmoset. In light of potential unforeseeable consequences with regard to postnatal brain function, we considered it a prerequisite—and mandatory from an ethical point of view—to first determine the effects of ARHGAP11B on the development of fetal marmoset neocortex.”

The researchers conclude that these results suggest that the human-specific ARHGAP11B gene may have caused neocortex expansion in the course of human evolution.

More information: “Human-specific ARHGAP11B increases size and folding of primate neocortex in the fetal marmoset” Science (2020). science.sciencemag.org/cgi/doi … 1126/science.abb2401

https://medicalxpress.com/news/2020-06-human-brain-size-gene-triggers.html

Researchers Make Mice Smell Odors that Aren’t Really There

by Ruth Williams

By activating a particular pattern of nerve endings in the brain’s olfactory bulb, researchers can make mice smell a non-existent odor, according to a paper published June 18 in Science. Manipulating these activity patterns reveals which aspects are important for odor recognition.

“This study is a beautiful example of the use of synthetic stimuli . . . to probe the workings of the brain in a way that is just not possible currently with natural stimuli,” neuroscientist Venkatesh Murthy of Harvard University who was not involved with the study writes in an email to The Scientist.

A fundamental goal of neuroscience is to understand how a stimulus—a sight, sound, taste, touch, or smell—is interpreted, or perceived, by the brain. While a large number of studies have shown the various ways in which such stimuli activate brain cells, very little is understood about what these activations actually contribute to perception.

In the case of smell, for example, it is well-known that odorous molecules traveling up the nose bind to receptors on cells that then transmit signals along their axons to bundles of nerve endings—glomeruli—in a brain area called the olfactory bulb. A single molecule can cause a whole array of different glomeruli to fire in quick succession, explains neurobiologist Kevin Franks of Duke University who also did not participate in the research. And because these activity patterns “have many different spatial and temporal features,” he says, “it is difficult to know which of those features is actually most relevant [for perception].”

To find out, neuroscientist Dmitry Rinberg of New York University and colleagues bypassed the nose entirely. “The clever part of their approach is to gain direct control of these neurons with light, rather than by sending odors up the animal’s nose,” Caltech neurobiologist Markus Meister, who was not involved in the work, writes in an email to The Scientist.

The team used mice genetically engineered to produce light-sensitive ion channels in their olfactory bulb cells. They then used precisely focused lasers to activate a specific pattern of glomeruli in the region of the bulb closest to the top of the animal’s head, through a surgically implanted window in the skull. The mice were trained to associate this activation pattern with a reward—water, delivered via a lick-tube. The same mice did not associate random activation patterns with the reward, suggesting they had learned to distinguish the reward-associated pattern, or synthetic smell, from others.

Although the activation patterns were not based on any particular odors, they were designed to be as life-like as possible. For example, the glomeruli were activated one after the other within the space of 300 milliseconds from the time at which the mouse sniffed—detected by a sensor. “But, I’ll be honest with you, I have no idea if it stinks [or] it is pleasant” for the mouse, Rinberg says.

Once the mice were thoroughly trained, the team made methodical alterations to the activity pattern—changing the order in which the glomeruli were activated, switching out individual activation sites for alternatives, and changing the timing of the activation relative to the sniff. They tried “hundreds of different combinations,” Rinberg says. He likened it to altering the notes in a tune. “If you change the notes, or the timing of the notes, does the song remain the same?” he asks. That is, would the mice still be able to recognize the induced scent?

From these experiments, a general picture emerged: alterations to the earliest-activated regions caused the most significant impairment to the animal’s ability to recognize the scent. “What they showed is that, even though an odor will [induce] a very complex pattern of activity, really it is just the earliest inputs, the first few glomeruli that are activated that are really important for perception,” says Franks.

Rinberg says he thinks these early glomeruli most likely represent the receptors to which an odorant binds most strongly.

With these insights into the importance of glomeruli firing times for scent recognition, “the obvious next question,” says Franks, is to go deeper into the brain to where the olfactory bulb neurons project and ask, “How does the cortex make sense of this?”

E. Chong et al., “Manipulating synthetic optogenetic odors reveals the coding logic of olfactory perception,” Science, 368:eaba2357, 2020.

https://www.the-scientist.com/news-opinion/researchers-make-mice-smell-odors-that-arent-really-there-67643