Too much or too little sleep may increase dementia risk

By Brian P. Dunleavy

Getting too much or too little sleep may increase the risk for cognitive decline and dementia in older adults, according to a study published Monday by JAMA Network Open.

In an analysis of the sleep habits of more than 20,000 English and Chinese adults age 48 to 75, people who slept for fewer than four hours or more than 10 hours per day showed evidence of declines in cognitive function, including memory and language comprehension, researchers said.

“This study is an observational study and cannot demonstrate a causal relationship,” study co-author Yanjun Ma told UPI, so the findings don’t necessarily prove that lack of sleep or excessive sleep causes a decline in cognitive function.

Observational studies assess the effect of an exposure (in this case, sleep) on study participants without attempting to modify it, so they cannot establish cause and effect the way a controlled trial can.

It’s possible that diminished or excessive sleep is an early sign of cognitive decline or dementia, as opposed to a risk factor, researchers said.

“Future mechanism studies, as well as intervention studies examining the association between sleep duration and cognitive decline are required,” said Ma, of the Peking University Clinical Research Institute in China.

As many as 6 million Americans have some form of dementia, and changes in sleep patterns are common among them, according to the Alzheimer’s Association.

To date, research has shown that sleep disturbances can result from cognitive impairment, while animal studies have found links between lack of sleep and increased levels of brain proteins thought to be markers of Alzheimer’s disease, said Dr. Yue Leng, who authored a commentary on the study findings.

Leng is an assistant professor of psychiatry at the University of California-San Francisco.

For their research, Ma and colleagues analyzed data on sleep behaviors and cognitive function in 20,065 adults from the English Longitudinal Study of Aging and the China Health and Retirement Longitudinal Study, and tracked them for about eight years, on average.

In addition to finding higher levels of cognitive decline among those who slept fewer than four or more than 10 hours per day, the researchers also observed that people with these sleep habits had “faster cognitive decline” than those who slept seven to nine hours per day, Ma said.

“It’s usually believed that sleep deprivation might lead to cognitive decline, but it’s unclear why too much sleep might be bad for cognitive health,” Leng added. “Older adults should pay more attention to their sleep habits, as these might have implications for their cognitive health.”

https://www.upi.com/Health_News/2020/09/21/Too-much-or-too-little-sleep-may-increase-dementia-risk/5341600697569/

Why do we sleep? The answer may change right before we turn 3.

By Nicoletta Lanese

Humans spend about a third of our lives sleeping, and scientists have long debated why slumber takes up such a huge slice of our time. Now, a new study hints that our main reason for sleeping starts off as one thing, then changes at a surprisingly specific age.

Two leading theories as to why we sleep focus on the brain: One theory says that the brain uses sleep to reorganize the connections between its cells, building electrical networks that support our memory and ability to learn; the other theory says that the brain needs time to clean up the metabolic waste that accumulates throughout the day. Neuroscientists have quibbled over which of these functions is the main reason for sleep, but the new study reveals that the answer may be different for babies and adults.

In the study, published Sept. 18 in the journal Science Advances, researchers used a mathematical model to show that infants spend most of their sleeping hours in “deep sleep,” also known as rapid eye movement (REM) sleep, while their brains rapidly build new connections between cells and grow ever larger. Then, just before toddlers reach age 2-and-a-half, their amount of REM sleep dips dramatically as the brain switches into maintenance mode, mostly using sleep time for cleaning and repair.

“It was definitely shocking to us that this transition was so sharp,” from growth mode to maintenance mode, senior author Van Savage, a professor of ecology and evolutionary biology and of computational medicine at the University of California, Los Angeles and the Santa Fe Institute, told Live Science in an email. The researchers also collected data in other mammals — namely rabbits, rats and guinea pigs — and found that their sleep might undergo a similar transformation; however, it’s too soon to tell whether these patterns are consistent across many species.

That said, “I think in actuality, it may not be really so sharp” a transition, said Leila Tarokh, a neuroscientist and group leader at the University Hospital of Child and Adolescent Psychiatry and Psychotherapy at the University of Bern, who was not involved in the study. The pace of brain development varies widely between individuals, and the researchers had fairly “sparse” data points between the ages of 2 and 3, she said. If they followed individuals over time as they aged, they might find that the transition is smoother, or that the age of transition varies between individuals, she said.

An emerging hypothesis

In a previous study, published in 2007 in the journal Proceedings of the National Academy of Sciences, Savage and theoretical physicist Geoffrey West found that an animal’s brain size and brain metabolic rate accurately predict the amount of time the animal sleeps — more so than the animal’s overall body size. In general, big animals with big brains and low brain metabolic rates sleep less than small animals with the opposite features.

This rule holds up across different species and between members of the same species; for instance, mice sleep more than elephants, and newborn babies sleep more than adult humans. However, knowing that sleep time decreases as brains get bigger, the authors wondered how quickly that change occurs in different animals, and whether that relates to the function of sleep over time.
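As a back-of-the-envelope illustration of how a scaling rule like this can be written down (the functional form, constant and exponent below are hypothetical placeholders for illustration, not fitted values from the 2007 paper):

```python
# Toy power-law sketch: daily sleep rises with mass-specific brain
# metabolic rate (big brains with low specific rates sleep less).
# C and ALPHA are illustrative placeholders, NOT parameters fitted
# in the 2007 PNAS paper.
C, ALPHA = 9.0, 0.35

def predicted_sleep_hours(brain_metabolic_rate: float, brain_mass: float) -> float:
    """Daily sleep (hours) from brain metabolic rate per unit brain mass
    (arbitrary but consistent units)."""
    specific_rate = brain_metabolic_rate / brain_mass
    return C * specific_rate ** ALPHA

# A small, metabolically "hot" brain (mouse-like) vs. a large, "cool"
# one (elephant-like): the small brain is predicted to sleep far more.
print(predicted_sleep_hours(2.0, 0.4))      # roughly 16 hours
print(predicted_sleep_hours(50.0, 5000.0))  # roughly 2 hours
```

The sketch only captures the direction of the relationship the authors describe: a higher mass-specific brain metabolic rate predicts more sleep.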

To begin answering these questions, the researchers pooled existing data on how much humans sleep, compiling several hundred data points from newborn babies and children up to age 15. They also gathered data on brain size and metabolic rate, the density of connections between brain cells, body size and metabolic rate, and the ratio of time spent in REM sleep versus non-REM sleep at different ages; the researchers drew these data points from more than 60 studies, overall.

Babies sleep about twice as much as adults, and they spend a larger proportion of their sleep time in REM, but there’s been a long-standing question as to what function that serves, Tarokh noted.

The study authors built a mathematical model to track all these shifting data points through time and see what patterns emerged between them. They found that the metabolic rate of the brain was high during infancy when the organ was building many new connections between cells, and this in turn correlated with more time spent in REM sleep. They concluded that the long hours of REM in infancy support rapid remodeling in the brain, as new networks form and babies pick up new skills. Then, between age 2 and 3, “the connections are not changing nearly as quickly,” and the amount of time spent in REM diminishes, Savage said.
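One way to picture what a “sharp” transition means quantitatively is to fit a sigmoid to REM fraction as a function of age and read off the fitted steepness. A minimal sketch follows; the data points are synthetic and merely echo the qualitative pattern described above, not the study’s several hundred measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic, illustrative data only: fraction of sleep spent in REM vs.
# age in years, shaped to echo the reported pattern (high REM in
# infancy, dropping before age ~2.5). NOT the study's data.
age = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 5.0, 10.0, 15.0])
rem = np.array([0.50, 0.48, 0.45, 0.42, 0.35, 0.27, 0.24, 0.23, 0.22, 0.21])

def sigmoid(t, hi, lo, t0, k):
    """REM fraction declining from `hi` to `lo` around transition age t0;
    larger k means a sharper drop."""
    return lo + (hi - lo) / (1.0 + np.exp(k * (t - t0)))

(hi, lo, t0, k), _ = curve_fit(sigmoid, age, rem, p0=[0.5, 0.2, 2.5, 2.0])
print(f"transition age ~{t0:.1f} years, steepness k = {k:.1f}")
# A large fitted k corresponds to the abrupt growth-to-maintenance
# switch the authors report; a small k would mean a gradual decline.
```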

At this time, the metabolic rate of cells in the cerebral cortex — the wrinkled surface of the brain — also changes. In infancy, the metabolic rate is proportional to the number of existing connections between brain cells plus the energy needed to fashion new connections in the network. As the rate of construction slows, the relative metabolic rate slows in turn.

“In the first few years of life, you see that the brain is making tons of new connections … it’s blossoming, and that’s why we see all those skills coming on,” Tarokh said. Developmental psychologists refer to this as a “critical period” of neuroplasticity — the ability of the brain to forge new connections between its cells. “It’s not that plasticity goes away” after that critical period, but the construction of new connections slows significantly, as the new mathematical model suggests, Tarokh said. At the same time, the ratio of non-REM to REM sleep increases, supporting the idea that non-REM is more important to brain maintenance than neuroplasticity.

Looking forward, the authors plan to apply their mathematical model of sleep to other animals, to see whether a similar switch from reorganization to repair occurs early in development, Savage said.

“Humans are known to be unusual in the amount of brain development that occurs after birth,” lead author Junyu Cao, an assistant professor in the Department of Information, Risk, and Operations Management at The University of Texas at Austin, told Live Science in an email. (Cao played a key role in compiling data and performing computations for the report.) “Therefore, it is conceivable that the phase transition described here for humans may occur earlier in other species, possibly even before birth.”

In terms of human sleep, Tarokh noted that different patterns of electrical activity, known as oscillations, occur in REM versus non-REM sleep; future studies could reveal whether and how particular oscillations shape the brain as we age, given that the amount of time spent in REM changes, she said. Theoretically, disruptions in these patterns could contribute to developmental disorders that emerge in infancy and early childhood, she added — but again, that’s just a hypothesis.

https://www.livescience.com/why-we-sleep-brain-study.html

Playing video games in childhood improves working memory years later

By Chrissy Sexton

Playing video games as a child may lead to long-lasting cognitive benefits, according to new research from the Universitat Oberta de Catalunya (UOC). The study suggests that gaming improves working memory and concentration.

Previous studies have shown that gaming improves attention, enhances visual-spatial skills, and causes structural changes in the brain – even increasing the size of some regions. The current study is the first to show that video games can promote positive cognitive changes that persist years after people stop playing them.

“People who were avid gamers before adolescence, despite no longer playing, performed better with the working memory tasks, which require mentally holding and manipulating information to get a result,” said study lead author Dr. Marc Palaus.

The research focused on 27 people between the ages of 18 and 40, with and without prior video gaming experience.

The experts analyzed cognitive skills, including working memory, at three points during the study period: before training the volunteers to play Nintendo’s Super Mario 64, at the end of the training, and fifteen days later.

The findings revealed that participants who had not played video games in childhood showed no improvement in processing and inhibiting irrelevant stimuli. As expected, these individuals were initially slower than those who had played games as children.

“People who played regularly as children performed better from the outset in processing 3D objects, although these differences were mitigated after the period of training in video gaming, when both groups showed similar levels,” said Dr. Palaus.

The experts also performed 10 sessions of a non-invasive brain stimulation technique known as transcranial magnetic stimulation on the participants.

“It uses magnetic waves which, when applied to the surface of the skull, are able to produce electrical currents in underlying neural populations and modify their activity,” explained Palaus.

The researchers theorized that combining video gaming with this type of stimulation could improve cognitive performance, but that was not the case.

“We aimed to achieve lasting changes. Under normal circumstances, the effects of this stimulation can last from milliseconds to tens of minutes. We wanted to achieve improved performance of certain brain functions that lasted longer than this.”

The game used in the study was a 3D platformer, but many other types of video games can also influence cognitive functions. According to Dr. Palaus, what most video games have in common is that they involve elements that make people want to continue playing, and that they gradually get harder and present a constant challenge.

“These two things are enough to make it an attractive and motivating activity, which, in turn, requires constant and intense use of our brain’s resources,” said Dr. Palaus. “Video games are a perfect recipe for strengthening our cognitive skills, almost without our noticing.”

The study is published in the journal Frontiers in Human Neuroscience.


Poor Sleep Linked with Future Amyloid-β Build Up

by Abby Olena

There’s evidence in people and animals that short-term sleep deprivation can change the levels of amyloid-β, a peptide that can accumulate in the aging brain and cause Alzheimer’s disease. Scientists now show long-term consequences may also result from sustained poor sleep. In a study published September 3 in Current Biology, researchers found that healthy individuals with lower-quality sleep were more likely to have amyloid-β accumulation in the brain years later. The study could not say whether poor sleep caused amyloid-β accumulation or vice versa, but the authors say that sleep could be an indicator of present and future amyloid-β levels.

“Traditionally, sleep disruptions have been accepted as a symptom of Alzheimer’s disease,” says Ksenia Kastanenka, a neuroscientist at Massachusetts General Hospital who was not involved in the work. Her group showed in 2017 that improving sleep in a mouse model of Alzheimer’s disease, in which the animals’ slow wave sleep is disrupted as it usually is in people with the disease, halted disease progression.

Collectively, the results from these studies and others raise the possibility that “sleep rhythm disruptions are not an artifact of disease progression, but actually are active contributors, if not a cause,” she says, hinting at the prospect of using these sleep measures as a biomarker for Alzheimer’s disease.

As a graduate student at the University of California, Berkeley, Joseph Winer, who is now a postdoc at Stanford University, and his colleagues were interested in whether sleep could predict how the brain changes over time. They collaborated with the team behind the Berkeley Aging Cohort Study, which includes a group of 32 cognitively healthy adults averaging about 75 years of age. The participants completed a sleep study, then had periodic cognitive assessments and between two and five positron emission tomography (PET) scans to check for the presence of amyloid-β in their brains over an average of about four years after the sleep study.

At the baseline PET scan, performed within six months of the sleep study, the researchers found that 20 of the 32 participants already had some amyloid-β accumulation, which was not unexpected given their average age. They also showed that slow wave sleep, an indicator of depth of sleep, and sleep efficiency, the amount of time spent sleeping relative to time in bed, were both predictive of the rate of amyloid change several years later. In other words, people with lower levels of slow wave sleep and sleep efficiency were more likely to have faster amyloid buildup.
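Both predictors have simple operational definitions, so a minimal sketch can make them concrete (the function and variable names here are mine, not the study’s):

```python
def sleep_efficiency(minutes_asleep: float, minutes_in_bed: float) -> float:
    """Sleep efficiency: fraction of time in bed actually spent asleep."""
    return minutes_asleep / minutes_in_bed

def slow_wave_fraction(minutes_slow_wave: float, minutes_asleep: float) -> float:
    """Share of total sleep time spent in slow wave (deep) sleep."""
    return minutes_slow_wave / minutes_asleep

# Example night: 6.5 hours asleep out of 8 hours in bed,
# 40 minutes of it in slow wave sleep.
print(f"sleep efficiency: {sleep_efficiency(390, 480):.0%}")  # 81%
print(f"slow wave share: {slow_wave_fraction(40, 390):.0%}")  # 10%
```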

The subjects all remained cognitively healthy over the duration of the study, says Winer. “We do expect that they’re at higher risk for developing Alzheimer’s in their lifetime because of the amyloid plaque.”

The strengths of the study include the well-characterized participants with detailed sleep assessments, as well as cognitive testing and longitudinal amyloid PET imaging, says Brendan Lucey, a sleep neurologist at Washington University in St. Louis who did not participate in the work.

There are still open questions about the link between sleep and amyloid deposition over time. “Amyloid accumulation on PET increases at different rates in amyloid-negative and amyloid-positive individuals, and even within amyloid-positive individuals,” Lucey explains. “Without adjusting for participants’ starting amyloid [levels], we don’t know if some participants would have been more likely to have increased amyloid compared to others, independent of sleep.”

“It is very hard to untangle this question of baselines,” acknowledges Winer. Because the sleep measures the team identified in the study are related to amyloid levels, to actually tease apart the effect of sleep quality on amyloid deposition and vice versa, it’d be necessary to study people starting as early as their fifties, when they’re much less likely to have amyloid accumulation, he says.

This study is “a great start,” David Holtzman, a neurologist and collaborator of Lucey at Washington University in St. Louis who did not participate in the work, tells The Scientist. In addition to controlling for the amount of amyloid deposition that is present in a subject’s brain at the beginning of the study, it would be important to see if the findings bear out in larger numbers of people and what role genetic factors play.

“The most important question down the road is to test the idea in some sort of a treatment paradigm,” Holtzman adds. “You can do something to improve the quality of sleep or increase slow wave sleep, and then determine if it actually slows down the onset of Alzheimer’s disease clinically.”

J.R. Winer et al., “Sleep disturbance forecasts β-amyloid accumulation across subsequent years,” Current Biology, doi:10.1016/j.cub.2020.08.017, 2020.

https://www.the-scientist.com/news-opinion/poor-sleep-linked-with-future-amyloid-build-up-67923

Long-term usage of antidepressant medications may protect from dementia

Long-term treatment with certain antidepressants appeared associated with reduced dementia incidence, according to results of a case-control study published in Journal of Clinical Psychiatry.

“Depression could represent one of these potentially modifiable risk factors for all-cause dementia,” Claudia Bartels, PhD, of the department of psychiatry and psychotherapy at University Medical Center Goettingen in Germany, and colleagues wrote. “Numerous studies have concordantly demonstrated a strong association between depression and an increased risk [for] subsequent dementia. Selective serotonin reuptake inhibitors (SSRIs) are commonly used to treat depressive symptoms in [Alzheimer’s disease] dementia.

“Preclinical research in recent years has suggested that SSRIs reduce amyloid plaque burden in transgenic mouse models of [Alzheimer’s disease] and in cognitively healthy humans, attenuate amyloid-[beta]1-42–induced tau hyperphosphorylation in cell culture and improve cognition in mice.”

However, randomized clinical trials of SSRIs for cognition in Alzheimer’s disease dementia have mostly yielded negative results; research is sparse regarding which antidepressants may influence the risk for developing dementia; and evidence is particularly rare for the effects of treatment duration on this risk. Thus, Bartels and colleagues sought to determine the effects of antidepressant drug classes and individual compounds, at various treatment durations, on the risk for developing dementia. The researchers analyzed data from 62,317 individuals with an incident dementia diagnosis included in the German Disease Analyzer database and compared their outcomes with those of controls matched by age, sex and physician. They conducted logistic regression analyses, adjusted for health insurance status and comorbid diseases linked to dementia or antidepressant use, to evaluate the association between dementia incidence and treatment with four major classes of antidepressant drug, as well as 14 of the most commonly prescribed individual antidepressants.

Results showed an association between treatment for 2 years or longer with any antidepressant and a lower risk for dementia vs. short-term treatment in 17 of 18 comparisons. Particularly with long-term treatment, herbal and tricyclic antidepressants were linked to a decreased incidence of dementia. Among individual antidepressants, long-term treatment with escitalopram (OR = 0.66; 95% CI, 0.5-0.89) and Hypericum perforatum (OR = 0.6; 95% CI, 0.51-0.7) was associated with the lowest risks for dementia.
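For readers decoding the notation: an odds ratio (OR) below 1 whose 95% confidence interval excludes 1, as in both figures above, indicates a significantly lower risk in the treated group. The study’s ORs come from adjusted logistic regression; the sketch below shows only the textbook crude calculation, on a made-up 2x2 table rather than the study’s counts:

```python
import math

# Hypothetical counts (NOT the study's data): dementia cases vs.
# matched controls, split by long-term antidepressant exposure.
exposed_cases, exposed_controls = 120, 400
unexposed_cases, unexposed_controls = 300, 600

# Crude odds ratio and Wald 95% CI computed on the log-odds scale.
odds_ratio = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
se_log_or = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                      + 1 / unexposed_cases + 1 / unexposed_controls)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}; 95% CI, {ci_low:.2f}-{ci_high:.2f}")
# -> OR = 0.60; 95% CI, 0.47-0.77: exposure associated with lower odds.
```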

“Clinical trials — although well acknowledged as the gold standard procedure — have debunked numerous promising compounds and become increasingly challenging with longer treatment durations,” Bartels and colleagues wrote. “Thus, and in awareness of the controversy of this suggestion, analyzing data from registries in a naturalistic setting may be an attractive and feasible alternative. If individual datasets could be combined in a multinational effort, even more powerful analyses of merged big databases could be performed and an additive contribution with naturalistic data could be made.”

https://www.healio.com/news/psychiatry/20200828/longterm-treatment-with-certain-antidepressants-may-reduce-dementia-incidence

Alzheimer’s risk factors may be measurable in adolescents and young adults

Risk factors for Alzheimer’s dementia may be apparent as early as our teens and 20s, according to new research reported at the Alzheimer’s Association International Conference® (AAIC®) 2020.

These risk factors, many of which are disproportionately apparent in African Americans, include heart health factors — such as high blood pressure, high cholesterol and diabetes — and social factors like education quality. According to the Alzheimer’s Association Alzheimer’s Disease Facts and Figures report, older African Americans are about twice as likely to have Alzheimer’s or other dementias as older whites.

“By identifying, verifying, and acting to counter those Alzheimer’s risk factors that we can change, we may reduce new cases and eventually the total number of people with Alzheimer’s and other dementia,” said Maria C. Carrillo, Ph.D., Alzheimer’s Association chief science officer. “Research like this is important in addressing health inequities and providing resources that could make a positive impact on a person’s life.”

“These new reports from AAIC 2020 show that it’s never too early, or too late, to take action to protect your memory and thinking abilities,” Carrillo said.

The Alzheimer’s Association is leading the U.S. Study to Protect Brain Health Through Lifestyle Intervention to Reduce Risk (U.S. POINTER), a two-year clinical trial to evaluate whether lifestyle interventions that simultaneously target many risk factors protect cognitive function in older adults who are at increased risk for cognitive decline. U.S. POINTER is the first such study to be conducted in a large, diverse group of Americans across the United States.

African American Youth At Higher Risk of Dementia

In a population of 714 African Americans in the Study of Healthy Aging in African Americans (STAR), Kristen George, Ph.D., MPH, of the University of California, Davis, and colleagues found that high blood pressure and diabetes, or a combination of multiple heart health-related factors, are common in adolescence and are associated with worse late-life cognition. Study participants were adolescents (n=165; ages 12-20), young adults (n=439; ages 21-34) and adults (n=110; ages 35-56). Mean age at cognitive assessment was 68.

Cognition was measured using in-person tests of memory and executive function. The researchers found that, in this study population, having diabetes, high blood pressure, or two or more heart health risk factors in adolescence, young adulthood, or mid-life was associated with statistically significantly worse late-life cognition. These differences persisted after accounting for age, gender, years since risk factors were measured, and education.

Before this report, little was known about whether cardiovascular disease (CVD) risk factors developed prior to mid-life were associated with late-life cognition. This is an important question because African Americans have a higher risk of CVD risk factors compared to other racial/ethnic groups from adolescence through adulthood.

According to the researchers, these findings suggest that CVD risk factors as early as adolescence influence late-life brain health in African Americans. Efforts to promote heart and brain healthy lifestyles must not only include middle-aged adults, but also younger adults and adolescents who may be especially susceptible to the negative impact of poor vascular health on the brain.

Early Adult BMI Associated With Late Life Dementia Risk

In what the authors say is the first study to report on the issue, higher early adulthood (age 20-49) body mass index (BMI) was associated with higher late-life dementia risk.

Relatively little is known about the role of early-life BMI in the risk for Alzheimer’s and other dementias. The scientists studied a total of 5,104 older adults from two studies: 2,909 from the Cardiovascular Health Study (CHS) and 2,195 from the Health, Aging and Body Composition study (Health ABC). Of the total sample, 18% were Black and 56% were women. Using pooled data from four established cohorts spanning the adult life course, including the two cohorts under study, the scientists estimated BMI beginning at age 20 for all older adults in CHS and Health ABC.

For women, dementia risk increased with higher early adulthood BMI. Compared to women with normal BMI in early adulthood, dementia risk was 1.8 times higher among those who were overweight, and 2.5 times higher among those who were obese. Analyses were adjusted for midlife and late life BMI.

They found no association between midlife BMI and dementia risk among women.

For men, dementia risk was 2.5 times higher among those who were obese in early adulthood, 1.5 times higher among those who were overweight in mid-life and 2.0 times higher among those who were obese in mid-life, in models also adjusted for late life BMI.

For both women and men, dementia risk decreased with higher late life BMI.

Adina Zeki Al Hazzouri, Ph.D. of Columbia University and colleagues found that high BMI in adulthood is a risk factor for dementia in late life. The researchers suggest that efforts aimed at reducing dementia risk may need to begin earlier in life with a focus on obesity prevention and treatment.
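BMI itself is simple arithmetic: weight in kilograms divided by the square of height in meters, with standard cut points defining the categories used above. Below is a minimal sketch applying the early-adulthood multipliers reported for women; treating them as a lookup table is purely illustrative of the reported associations, not a risk calculator:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: kg / m^2."""
    return weight_kg / height_m ** 2

def bmi_category(b: float) -> str:
    """Standard WHO adult BMI categories."""
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"

# Early-adulthood dementia risk multipliers reported for women,
# relative to normal BMI. These are associations, not causal estimates.
REPORTED_RISK_VS_NORMAL_WOMEN = {"normal": 1.0, "overweight": 1.8, "obese": 2.5}

b = bmi(weight_kg=82.0, height_m=1.65)
cat = bmi_category(b)
print(f"BMI {b:.1f} ({cat}): {REPORTED_RISK_VS_NORMAL_WOMEN.get(cat)}x reference")
# -> BMI 30.1 (obese): 2.5x reference
```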

Quality of Early-Life Education Influences Dementia Risk

In a diverse group of more than 2,400 people followed up to 21 years, higher quality early-life education was associated with better language and memory performance, and lower risk of late-life dementia. Results were somewhat different between men and women, and between Blacks and Whites in the study.

The study included 2,446 Black and White men and women, age 65 and older, enrolled in the Washington Heights/Inwood Columbia Aging Project who attended elementary school in the United States. A school quality variable was constructed from historical measures including mandatory school enrollment age, minimum dropout age, school term length, student-teacher ratio and student attendance.

People who attended school in states with lower-quality education showed more rapid decline in memory and language as older adults. Black women and men and White women who attended schools in states with higher-quality education were less likely to develop dementia. According to the scientists, the results were explained, in part, by the fact that people who attend higher-quality schools tend to complete more years of school.

Justina Avila-Rieger, PhD, a postdoctoral research scientist at Columbia University Irving Medical Center and colleagues say the findings provide evidence that later life dementia risk and cognitive function is influenced by early-life state educational policies.

https://www.sciencedaily.com/releases/2020/07/200730092616.htm

Neuroscientists identify the brain cells that help humans adapt to change


Ph.D. candidate Kianoush Banaie Boroujeni at his neuroscience setup at Vanderbilt University, explaining a main result of the study he conducted in the laboratory of Thilo Womelsdorf.

by Marissa Shapiro, Vanderbilt University

There are 86 billion neurons in the human brain. Of these, a vanishingly small fraction handles cognitive flexibility: our ability to adjust to new environments and concepts.

A team of researchers with interdisciplinary expertise in psychology, informatics (the application of information science to solve problems with data) and engineering, working with the Vanderbilt Brain Institute (VBI), gained critical insight into one of the biggest mysteries in neuroscience by identifying the location and function of these neurons.

The article was published in the journal Proceedings of the National Academy of Sciences (PNAS) on July 13. The discovery presents an opportunity to improve researchers’ understanding and treatment of mental illnesses rooted in impaired cognitive flexibility.

Brain circuits created by these neurons give humans an evolutionary advantage in adapting to changing environments. When these neurons are weakened, people may have trouble adjusting to changes in their environment, including difficulty overcoming traditions, biases and fears. Typically, people oscillate between repeating rewarding behavior and exploring newer and potentially better rewards. The cost-benefit ratio of repeating versus exploring is an equation that the brain is constantly working to resolve, particularly when a person’s environment changes. A lack of cognitive flexibility can result in debilitating mental conditions.

The consequences of this research could be multifold. “These cells could be part of the switch that determines your best attentional strategy,” said Thilo Womelsdorf, associate professor of psychology and computer science, and the paper’s principal investigator. “Weakening these brain cells could make it difficult to switch attention strategies, which can ultimately result in obsessive-compulsive behaviors or a struggle to adjust to new situations. On the opposite end, if such a switch is ‘loose’ attention might become ‘loose’ and people will experience a continuously uncertain world and be unable to concentrate on important information for any amount of time.”

The researchers hypothesized that within the area of the brain that helps people learn fine motor skills like playing an instrument, there exists a subregion that could enable the same flexible processes for thoughts.

The group of brain cells, located below the outer cortical mantle in the basal ganglia, was identified by measuring the activity of brain cells during computer-simulated real-world tasks. To mimic many real-world situations, the researchers, including scientists from the Centre for Vision Research at York University, developed a simulation that presented more than one object at a time and changed which object was rewarded. This forced subjects to learn, through trial and error, which objects were linked to a reward. By measuring the activity of brain cells, the team observed an interesting pattern: brain cell activity was heightened amid change and diminished as confidence in the outcome grew. “These neurons seem to help the brain circuits to reconfigure and transition from formerly relevant information, and a tenuous connection to attend to new, relevant information,” said Kianoush Banaie Boroujeni, the study’s first author and a Ph.D. candidate in the Womelsdorf lab.
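The task described above is essentially a reversal learning problem: the rewarded object changes without warning, and the learner must notice and redirect. Here is a minimal sketch of that dynamic with a generic epsilon-greedy value learner, a textbook model used for illustration, not the authors’ analysis code:

```python
import random

random.seed(1)
N_OBJECTS, EPSILON, LEARNING_RATE = 3, 0.1, 0.3
values = [0.0] * N_OBJECTS  # learned value estimate per object
rewarded = 0                # which object currently pays off

for trial in range(400):
    if trial == 200:
        rewarded = 2        # uncued reversal: the reward contingency changes
    # epsilon-greedy: mostly exploit the best-valued object, sometimes explore
    if random.random() < EPSILON:
        choice = random.randrange(N_OBJECTS)
    else:
        choice = max(range(N_OBJECTS), key=lambda i: values[i])
    reward = 1.0 if choice == rewarded else 0.0
    # Prediction-error update. The error spikes right after the reversal,
    # loosely mirroring the heightened cell activity observed amid change.
    values[choice] += LEARNING_RATE * (reward - values[choice])

print([round(v, 2) for v in values])  # object 2 ends up with the highest value
```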

“There is a technological revolution in neuroscience,” said Lisa Monteggia, Barlow Family Director of the Vanderbilt Brain Institute and professor of pharmacology. “The ability to use technology to control a single cell with molecular and genetic tools can only work when scientists know where to look. Dr. Womelsdorf and his collaborators have given us the ability to do such work and significantly move the field of neuroscience forward.”

https://medicalxpress.com/news/2020-07-neuroscientists-brain-cells-humans.html

Boosting a liver protein may mimic the brain benefits of exercise

By Laura Sanders

Exercise’s power to boost the brain might require a little help from the liver.

A chemical signal from the liver, triggered by exercise, helps elderly mice keep their brains sharp, suggests a study published in the July 10 Science. Understanding this liver-to-brain signal may help scientists develop a drug that benefits the brain the way exercise does.

Lots of studies have shown that exercise helps the brain, buffering the memory declines that come with old age, for instance. Scientists have long sought an “exercise pill” that could be useful for elderly people too frail to work out or for whom exercise is otherwise risky. “Can we somehow get people who can’t exercise to have the same benefits?” asks Saul Villeda, a neuroscientist at the University of California, San Francisco.

Villeda and colleagues took an approach similar to experiments that revealed the rejuvenating effects of blood from young mice (SN: 5/5/14). But instead of youthfulness, the researchers focused on fitness. The researchers injected sedentary elderly mice with plasma from elderly mice that had voluntarily run on wheels over the course of six weeks. After eight injections over 24 days, the sedentary elderly mice performed better on memory tasks, such as remembering where a hidden platform was in a pool of water, than elderly mice that received injections from sedentary mice.

Comparing the plasma of exercised mice with that of sedentary mice showed an abundance of proteins produced by the liver in mice that ran on wheels.

The researchers closely studied one of these liver proteins produced in response to exercise, called GPLD1. GPLD1 is an enzyme, a type of molecular scissors. It snips other proteins off the outsides of cells, releasing those proteins to go do other jobs. Targeting these biological jobs with a molecule that behaves like GPLD1 might be a way to mimic the brain benefits of exercise, the researchers suspect.

Old mice that were genetically engineered to make more GPLD1 in their livers performed better on the memory tasks than other old sedentary mice, the researchers found. The genetically engineered sedentary mice did about as well in the pool of water as the mice that exercised. “Getting the liver to produce this one enzyme can actually recapitulate all these beneficial effects we see in the brain with exercise,” Villeda says.

Blood samples from elderly people also hint that exercise raises GPLD1 levels. Step-counter data showed that elderly people who were physically active (defined as walking more than 7,100 steps a day) had more of the protein than those who were more sedentary.

GPLD1 seems to exert its effects from outside of the brain, perhaps by changing the composition of the blood in some way, the researchers suspect.

But the role of GPLD1 is far from settled, cautions Irina Conboy, a researcher at the University of California, Berkeley who studies aging. There’s evidence that GPLD1 levels are higher in people with diabetes, she points out, hinting that the protein may have negative effects. And different experiments suggest that GPLD1 levels might actually fall in response to certain kinds of exercise in rats with markers of diabetes.

“We know for sure that exercise is good for you,” Conboy says. “And we know that this protein is present in the blood.” But whether GPLD1 is good or bad, or whether it goes up or down with exercise, she says, “we don’t know yet.”

CITATIONS
A. M. Horowitz et al. Blood factors transfer beneficial effects of exercise on neurogenesis and cognition to the aged brain. Science. Vol. 369, July 10, 2020, p. 167. doi: 10.1126/science.aaw2622.


Case Western Reserve University-led team develops new approach to treat certain neurological diseases


Paul Tesar, professor of genetics and genome sciences, School of Medicine


Regeneration of myelin in the brain, shown in blue, after ASO drug treatment

A team led by Case Western Reserve University medical researchers has developed a potential treatment method for Pelizaeus-Merzbacher disease (PMD), a fatal neurological disorder that produces severe motor and cognitive dysfunction in children. It results from genetic mutations that prevent the body from properly making myelin, the protective insulation around nerve cells.

Using mouse models, the researchers identified and validated a new treatment target—a toxic protein resulting from the genetic mutation. Next, they successfully used a family of drugs known as ASOs (antisense oligonucleotides) to target the ribonucleic acid (RNA) strands that created the abnormal protein to stop its production. This treatment reduced PMD’s hallmark symptoms and extended lifespan, establishing the clinical potential of this approach.

By demonstrating effective delivery of the ASOs to myelin-producing cells in the nervous system, researchers raised the prospect for using this method to treat other myelin disorders that result from dysfunction within these cells, including multiple sclerosis (MS).

Their research was published online July 1 in the journal Nature.

“The pre-clinical results were profound. PMD mouse models that typically die within a few weeks of birth were able to live a full lifespan after treatment,” said Paul Tesar, principal investigator on the research, a professor in the Department of Genetics and Genome Sciences at the School of Medicine and the Dr. Donald and Ruth Weber Goodman Professor of Innovative Therapeutics. “Our results open the door for the development of the first treatment for PMD as well as a new therapeutic approach for other myelin disorders.”

Study co-authors include an interdisciplinary team of researchers from the medical school; Ionis Pharmaceuticals Inc., a Carlsbad, California-based pioneering developer of RNA-targeted therapies; and Cleveland Clinic. First author Matthew Elitt worked in Tesar’s lab as a Case Western Reserve medical and graduate student.

PMD attacks the young

PMD is a rare, genetic condition involving the brain and spinal cord that primarily affects boys. Symptoms can appear in early infancy and begin with jerky eye movements and abnormal head movements. Over time, children develop severe muscle weakness and stiffness, cognitive dysfunction, difficulty walking and fail to reach developmental milestones such as speaking. The disease shortens life-expectancy, and people with the most severe cases die in childhood.

The disease results from errors in a gene called proteolipid protein 1 (PLP1). Normally, this gene produces proteolipid protein (PLP), a major component of myelin, which wraps and insulates nerve fibers to allow proper transmission of electrical signals in the nervous system. But a faulty PLP1 gene produces toxic proteins that kill myelin-producing cells and prevent myelin from developing and functioning properly—resulting in the severe neurological dysfunction in PMD patients.

PMD impacts a few thousand people around the world. So far, no therapy has lessened symptoms or extended lifespans.

For nearly a decade, Tesar and his team have worked to better understand and develop new therapies for myelin disorders. They have had a series of successes, and their myelin-regenerating drugs for MS are now in commercial development.

Latest research

In the current laboratory work, the researchers found that suppressing mutant PLP1 and its toxic protein restored myelin-producing cells, produced functioning myelin, reduced disease symptoms and extended lifespans.

After validating that PLP1 was their therapeutic target, the researchers pursued pre-clinical treatment options. They knew mutations in the PLP1 gene produced faulty RNA strands that, in turn, created the toxic PLP protein.

So they teamed with Ionis Pharmaceuticals, a leader in RNA-targeted therapeutics and pioneer of ASOs. These short strings of chemically modified DNA can be designed to bind to a specific RNA target and block production of its protein product.
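At its core, choosing the sequence of such an oligo is reverse-complement arithmetic against the target RNA. A toy sketch of just that step follows; the sequence below is an arbitrary placeholder, not the actual PLP1 target, and real ASOs carry chemical modifications this ignores:

```python
# Watson-Crick pairing for a DNA oligo against an RNA target:
# RNA A pairs with DNA T, U with A, G with C, C with G.
RNA_TO_DNA_COMPLEMENT = {"A": "T", "U": "A", "G": "C", "C": "G"}

def antisense_oligo(rna_target_5to3: str) -> str:
    """Return the DNA oligo (written 5'->3') that base-pairs with the
    given RNA target; complementing and reversing gives the antisense
    strand in conventional orientation."""
    return "".join(RNA_TO_DNA_COMPLEMENT[base] for base in reversed(rna_target_5to3))

# Arbitrary illustrative fragment, NOT the therapeutic PLP1 sequence:
print(antisense_oligo("AUGGCCCUGUUGGUAGAGUUC"))
```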

And that’s exactly what happened in their studies. The result was improved myelin and locomotion, and substantial extension of lifespan. “ASOs provided an opportunity to cut the disease-causing protein off at its source,” Elitt said.

The successful clinical use of ASOs is relatively new, but recent developments seem promising. In 2016, the U.S. Food and Drug Administration approved the first ASO drug for a neurological disorder, spinal muscular atrophy. The drug, Spinraza, was developed by Ionis and commercialized by Biogen Inc. More ASO therapies are in development and clinical trials, and they hold promise for addressing many neurological diseases that currently have no effective treatment options.

Tesar said that ongoing and planned experiments in his laboratory will help guide future clinical development of ASO therapy for PMD. For example, researchers want to understand more about how well the treatment works after the onset of symptoms, how long it lasts, how often treatment needs to be given and whether it might be effective for all PMD patients, regardless of their specific form of the disease.

“While important research questions remain, I’m cautiously optimistic about the prospect for this method to move into clinical development and trials for PMD patients,” Tesar said. “I truly hope our work can make a difference for PMD patients and families.”


For The First Time, Scientists Have Captured Video of Brains Clearing Out Dead Neurons

by David Nield

We already know that our brains have a waste disposal system that keeps dead and toxic neurons from clogging up our biological pathways. Now, scientists have managed to capture a video of the process for the first time, in laboratory tests on mice.

There’s still a lot we don’t know about how dead neurons are cleared out, and how the brain reacts to them, so the new research could be a significant step forward in figuring some of that out – even if we’ve not yet confirmed that human brains work in the exact same way.

“This is the first time the process has ever been seen in a live mammalian brain,” says neurologist Jaime Grutzendler from the Yale School of Medicine in Connecticut.

Further down the line, these findings might even inform treatments for age-related brain decline and neurological disorders – once we know more about how brain clean-up is supposed to work, scientists can better diagnose what happens when something goes wrong.

The team focused on the glial cells responsible for doing the clean-up work in the brain; they used a technique called 2Phatal to target a single brain cell for apoptosis (cell death) in a mouse and then tracked the responding glial cells using fluorescent markers.

“Rather than hitting the brain with a hammer and causing thousands of deaths, inducing a single cell to die allows us to study what is happening right after the cells start to die and watch the many other cells involved,” says Grutzendler.

“This was not possible before. We are able to show with great clarity what exactly is going on and understand the process.”

Three types of glial cells – microglia, astrocytes, and NG2 cells – were shown to be involved in a highly coordinated cell removal process, which removed both the dead neuron and any connecting pathways to the rest of the brain. The researchers observed one microglia engulf the neuron body and its main branches (dendrites), while astrocytes targeted smaller connecting dendrites for removal. They suspect NG2 cells may help prevent the dead cell debris from spreading.

The researchers also demonstrated that if one type of glial cell missed the dead neuron for whatever reason, other types of cells would take over their role in the waste removal process – suggesting some sort of communication is occurring between the glial cells.

Another interesting finding from the research was that older mouse brains were less efficient at clearing out dead neural cells, even though the garbage removal cells seemed to be just as aware that a dying cell was there.

This is a good opportunity for future research, and it could give experts insight into how older brains start to fail in various ways as the garbage disposal service slows down or even breaks down.

New treatments might one day be developed that can take over this clearing process on the brain’s behalf – not just in elderly people, but also those who have suffered trauma to the head, for example.

“Cell death is very common in diseases of the brain,” says neurologist Eyiyemisi Damisah, from the Yale School of Medicine.

“Understanding the process might yield insights on how to address cell death in an injured brain from head trauma to stroke and other conditions.”

The research has been published in Science Advances.

https://www.sciencealert.com/for-the-first-time-scientists-capture-video-of-brains-clearing-out-dead-neurons