Kill Zombie Neurons to Prevent Alzheimer’s Disease

Senescent cells (represented here in green) no longer function but can broadcast inflammatory signals to the cells around them. These cells are implicated in a number of age-related diseases. Credit: The Mayo Clinic

Darren Baker, Ph.D., a Mayo Clinic molecular biologist and senior author of the paper, and first author Tyler Bussian, a Mayo Clinic Graduate School of Biomedical Sciences student.

Zombie cells are the ones that can’t die but are equally unable to perform the functions of a normal cell. These zombie, or senescent, cells are implicated in a number of age-related diseases. And with a new letter in Nature, Mayo Clinic researchers have expanded that list.

In a mouse model of brain disease, scientists report that senescent cells accumulate in certain brain cells prior to cognitive loss. By preventing the accumulation of these cells, they were able to diminish tau protein aggregation, neuronal death and memory loss.

“Senescent cells are known to accumulate with advancing natural age and at sites related to diseases of aging, including osteoarthritis; atherosclerosis; and neurodegenerative diseases, such as Alzheimer’s and Parkinson’s,” says Darren Baker, Ph.D., a Mayo Clinic molecular biologist and senior author of the paper. “In prior studies, we have found that elimination of senescent cells from naturally aged mice extends their healthy life span.”

In the current study, the team used a model that imitates aspects of Alzheimer’s disease.

“We used a mouse model that produces sticky, cobweb-like tangles of tau protein in neurons and has genetic modifications to allow for senescent cell elimination,” explains first author Tyler Bussian, a Mayo Clinic Graduate School of Biomedical Sciences student who is part of Dr. Baker’s lab. “When senescent cells were removed, we found that the diseased animals retained the ability to form memories, eliminated signs of inflammation, did not develop neurofibrillary tangles, and had maintained normal brain mass.” They also report that pharmacological intervention to remove senescent cells modulated the clumping of tau proteins.

Also, the team was able to identify the specific type of cell that became senescent, says Dr. Baker.

“Two different brain cell types called ‘microglia’ and ‘astrocytes’ were found to be senescent when we looked at brain tissue under the microscope,” says Bussian. “These cells are important supporters of neuronal health and signaling, so it makes sense that senescence in either would negatively impact neuron health.”

The finding was somewhat surprising, explains Dr. Baker, because at the time their research started, a causal link between senescent cells and neurodegenerative disease had not been established.

“We had no idea whether senescent cells actively contributed to disease pathology in the brain, and to find that it’s the astrocytes and microglia that are prone to senescence is somewhat of a surprise, as well,” says Dr. Baker.

In terms of future work, Dr. Baker explains that this research lays out the best-case scenario, where prevention of damage to the brain avoided the disease state. “Clearly, this same approach cannot be applied clinically, so we are starting to treat animals after disease establishment and working on new models to examine the specific molecular alterations that occur in the affected cells,” says Dr. Baker.

In addition to Dr. Baker and Bussian, the other authors are Asef Aziz, a medical student formerly at Mayo Clinic; Charlton Meyer, Mayo Clinic; Barbara Swenson, Ph.D., Mayo Clinic; and Jan van Deursen, Ph.D., Mayo Clinic. Dr. van Deursen is the Vita Valley Professor of Cellular Senescence. Drs. Baker and van Deursen are inventors on patents licensed to Unity Biotechnology by Mayo Clinic, and Dr. van Deursen is a co-founder of Unity Biotechnology.

Funding for this research was provided by the Ellison Medical Foundation, the Glenn Foundation for Medical Research, the National Institutes of Health, the Mayo Clinic Children’s Research Center, and the Alzheimer’s Disease Research Center of Mayo Clinic.

https://newsnetwork.mayoclinic.org/discussion/senescent-cells-found-in-brains-of-mice-prior-to-cognitive-loss/

Bravery-associated brain cells identified in the hippocampus

The hippocampus is a region of the brain largely responsible for memory formation.

Why can some people comfortably walk between skyscrapers on a high-wire or fearlessly raft Niagara Falls in a wooden barrel, whereas others freeze at the mere thought of climbing off escalators in a shopping mall? In a new study, scientists have found that a certain type of cell in the hippocampus plays a key role.

People differ when it comes to trying dangerous or exhilarating activities. Even siblings can show dramatic differences in risk-taking behaviour. The neural mechanisms that drive risk-taking behaviour are largely unknown. However, scientists from the Department of Neuroscience of Uppsala University in Sweden and the Brain Institute of the Federal University of Rio Grande do Norte in Brazil have found that some cells in the hippocampus play a key role in risk-taking behaviour and anxiety.

In an article published in the journal Nature Communications, the authors show that neurons known as OLM cells, when stimulated, produce a brain rhythm that is present when animals feel safe in a threatening environment (for example, when they are hiding from a predator but aware of the predator’s proximity). The study, produced by Drs. Sanja Mikulovic, Ernesto Restrepo, Klas Kullander and Richardson Leao, among others, showed that anxiety and risk-taking behaviour can be controlled by the manipulation of OLM cells. To find a pathway that quickly and robustly modulates risk-taking behaviour is very important for treatment of pathological anxiety, since reduced risk-taking behaviour is a trait in people with high anxiety levels.

Adaptive (or normal) anxiety is essential for survival because it protects us from harm. Unfortunately, in a large number of people, anxiety can be dysfunctional and severely interfere with daily life. In these cases, doctors often rely on antidepressants to help patients recover from the dysfunctional state. However, these drugs act in the entire brain and not only in the areas where it is needed, and may therefore cause severe side-effects. Thus, drugs that affect a single brain region or a very specific group of cells may be a major breakthrough in treating anxiety and associated disorders like depression. Another interesting finding in the study is that OLM cells can be controlled by pharmacological agents. In the past, the same group of scientists found that OLM cells were the gatekeepers of memories in the hippocampus, and that these cells were very sensitive to nicotine.

“This finding may explain why people binge-smoke when they are anxious,” says Dr. Richardson Leao, researcher at the Brain Institute of the Federal University of Rio Grande do Norte.

The participation of the hippocampus in emotions is much less studied than its role in memory and cognition. In 2014, for example, the Nobel prize was awarded for the discovery of “place cells” that represent a biological GPS and underlie the memories of where we are located in our surroundings. In the past decade, scientists have also started to appreciate the role of the hippocampus in regulating emotions.

“It is fascinating how different regions of the same brain structure control distinct behaviours and how they interact with each other. Identifying specific circuits that underlie either cognitive or emotional processes is crucial for the general understanding of brain function and for more specific drug development to treat disorders,” says Dr. Sanja Mikulovic, Uppsala University.

The discovery of these neurons and their role in anxiety and risk-taking may open a path for the development of highly efficient anxiolytics and antidepressants without common side-effects, such as apathy.

Sanja Mikulovic et al, Ventral hippocampal OLM cells control type 2 theta oscillations and response to predator odor, Nature Communications (2018). DOI: 10.1038/s41467-018-05907-w

https://medicalxpress.com/news/2018-09-bravery-associated-cells-hippocampus.html

Scientists Have Detected an Entirely New Visual Phenomenon in The Human Eye

by DAVID NIELD

New research suggests the human eye and brain are capable of seeing ghosted images, a new type of visual phenomenon that scientists previously thought could only be detected by a computer. It turns out our eyes are more powerful than we thought.

The discovery could teach us more about the inner workings of the eye and brain and how they process information, as well as changing our thinking on what we human beings can truly see of the world around us.

Ghost imaging was developed as a low-cost way of capturing images in light outside the visible spectrum. The patterns it produces are usually processed by software algorithms – but, surprisingly, our eyes have the same capability.

“Ghost-imaging with the eye opens up a number of completely novel applications such as extending human vision into invisible wavelength regimes in real-time, bypassing intermediary screens or computational steps,” write the researchers.

“Perhaps even more interesting are the opportunities that ghost imaging offers for exploring neurological processes.”

Ghost imaging works using a camera with a single pixel, rather than the millions of pixels used by the sensors inside today’s digital cameras and smartphones. When it comes to capturing light beyond the visible spectrum, it’s an even more cost-effective method.

These single pixel cameras capture light as it reflects from an object – by watching different random patterns of bouncing light, and crunching through some calculations, the camera can gradually build up a picture of something even with just one pixel.
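To make that reconstruction step concrete, here is a minimal sketch of computational ghost imaging in Python. It assumes a toy object and random binary illumination patterns rather than the study’s exact apparatus: each pattern is “shown” to the scene, the single pixel records one total-brightness value, and correlating those values with the patterns gradually recovers the picture.

```python
# Minimal computational ghost-imaging sketch (illustrative assumptions only:
# a toy object and random binary patterns, not the study's apparatus).
import numpy as np

rng = np.random.default_rng(0)

# The "object" the single-pixel detector never images directly: a bright square.
size = 32
obj = np.zeros((size, size))
obj[10:22, 10:22] = 1.0

# Random binary illumination patterns.
n_patterns = 4000
patterns = rng.integers(0, 2, size=(n_patterns, size, size)).astype(float)

# One measurement per pattern: the total light the single pixel collects,
# i.e. the pattern multiplied by the object and summed over every position.
signals = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))

# Correlation reconstruction: weight each pattern by its mean-subtracted
# signal and average; bright object pixels line up with high signals.
weights = signals - signals.mean()
recon = np.tensordot(weights, patterns, axes=(0, 0)) / n_patterns

print("reconstructed value inside the square :", round(recon[16, 16], 3))
print("reconstructed value outside the square:", round(recon[2, 2], 3))
```

The study used structured Hadamard patterns instead of random ones; the same correlation recipe applies, the structured patterns simply sample the scene more efficiently.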

In some setups, the single pixel camera is used in combination with a second light, modulated in response to the first, and beamed back on the original random patterns. The advantage is that fewer patterns are needed to produce an image.

In this case a second camera using some smart algorithms can pick up the image without having looked at the object at all – just by looking at the patterns being cast and the light being produced from them.

That’s the ghosted image that was previously thought to be visible only to computers running specialist software. However, the new study shows that the human visual system can make sense of these patterns, called Hadamard patterns.

This diagram from the research paper should give you an idea of what’s happening:

It’s a little bit like when our eyes and brains look at a series of still images and treat them as a moving picture – the same sort of subconscious processing seems to be going on.

Of the four volunteers who took part in the study, all four could make out an image of Albert Einstein sticking out his tongue from the Hadamard patterns. Interestingly, though, the illusion only appeared when the patterns were projected quickly enough.

If the rate dropped below 200 patterns per 20 milliseconds, the image couldn’t be seen by the study participants.

As the researchers point out, this is potentially hugely exciting – it means we might be able to devise simple systems to see light outside the visible spectrum, with no computer processing required in the middle.

That’s all to come – and this is really preliminary stuff, so we can’t get too carried away. For now, the team of researchers is using the findings to explore more about how our visual systems work, and whether our eyes and brains have yet-undiscovered superpowers for looking at the world around us.

The research has yet to be peer-reviewed, but you can read it on the pre-print server arXiv.

https://www.sciencealert.com/human-eye-sees-ghosted-images-reflected-light

Scientists Determine Four Personality Types Based on New Data

Researchers led by Northwestern Engineering’s Luis Amaral sifted through data from more than 1.5 million questionnaire respondents to find that at least four distinct clusters of personality types exist — average, reserved, self-centered, and role model — challenging existing paradigms in psychology.

“People have tried to classify personality types since Hippocrates’s time, but previous scientific literature has found that to be nonsense,” said co-author William Revelle, professor of psychology at Northwestern University’s Weinberg College of Arts and Sciences.

“Now, these data show there are higher densities of certain personality types,” said Revelle, who specializes in personality measurement, theory, and research.

The new study appears in Nature Human Behaviour. The findings potentially could be of interest to hiring managers and mental healthcare providers.

Initially, Revelle was skeptical of the study’s premise. The concept of personality types remains controversial in psychology, with hard scientific proof difficult to find. Previous attempts based on small research groups created results that often were not replicable.

“Personality types only existed in self-help literature and did not have a place in scientific journals,” said Amaral, Erastus Otis Haven Professor of Chemical and Biological Engineering at the McCormick School of Engineering. “Now, we think this will change because of this study.”

The new research combined an alternative computational approach with data from four questionnaires, attracting more than 1.5 million respondents from around the world. The questionnaires, developed by the research community over the decades, have between 44 and 300 questions. People voluntarily take the online quizzes, attracted by the opportunity to receive feedback about their own personality.

These data are now being made available to other researchers for independent analyses.

“A study with a dataset this large would not have been possible before the web,” Amaral said. “Previously, researchers would recruit undergrads on campus and maybe get a few hundred people. Now, we have all these online resources available, and data is being shared.”

Average

Average people are high in neuroticism and extraversion, while low in openness. “I would expect that the typical person would be in this cluster,” said Martin Gerlach, a postdoctoral fellow in Amaral’s lab and the paper’s first author. Females are more likely than males to fall into the Average type.

Reserved

The Reserved type is emotionally stable, but not open or neurotic. They are not particularly extraverted but are somewhat agreeable and conscientious.

Role Models

Role Models score low in neuroticism and high in all the other traits. The likelihood that someone is a role model increases dramatically with age. “These are people who are dependable and open to new ideas,” Amaral said. “These are good people to be in charge of things. In fact, life is easier if you have more dealings with role models.” More women than men are likely to be role models.

Self-Centered

Self-Centered people score very high in extraversion and below average in openness, agreeableness and conscientiousness. “These are people you don’t want to hang out with,” Revelle said. There is a very dramatic decrease in the number of self-centered types as people age, both with women and men.

The group’s first attempt to sort the data used traditional clustering algorithms, but that yielded inaccurate results, Amaral said.

“At first, they came to me with 16 personality types, and there’s enough literature that I’m aware of that says that’s ridiculous,” Revelle said. “I believed there were no types at all.”

He challenged Amaral and Gerlach to refine their data.

“Machine learning and data science are promising but can be seen as a little bit of a religion,” Amaral said. “You still need to test your results. We developed a new method to guide people to solve the clustering problem to test the findings.”

Their algorithm first searched for many clusters using traditional clustering methods, but then winnowed them down by imposing additional constraints. This procedure revealed the four groups they reported.
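The paper’s exact pipeline isn’t reproduced here, but a rough sketch of that “over-cluster, then winnow by density” idea, on synthetic stand-in data and with illustrative settings, could look like this:

```python
# Sketch of a two-stage clustering idea similar in spirit to the one described
# above (not the authors' published code): over-cluster with a Gaussian mixture,
# then keep only cluster centres denser than a trait-shuffled null would allow.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 5))  # stand-in for z-scored Big Five trait scores

# Stage 1: deliberately ask for many candidate clusters.
gmm = GaussianMixture(n_components=16, random_state=0).fit(X)

# Stage 2: compare the density at each cluster centre with the density in a
# null dataset where each trait column is shuffled independently (this keeps
# the marginal distributions but destroys correlations between traits).
kde = KernelDensity(bandwidth=0.5).fit(X)
X_null = np.column_stack([rng.permutation(X[:, j]) for j in range(X.shape[1])])
kde_null = KernelDensity(bandwidth=0.5).fit(X_null)

dens = kde.score_samples(gmm.means_)        # log-density at real centres
dens_null = kde_null.score_samples(gmm.means_)
kept = gmm.means_[dens > dens_null]         # centres denser than chance
print(f"kept {len(kept)} of {gmm.n_components} candidate clusters")
```

On real questionnaire data the surviving centres would be the candidate personality types; with the purely random input used here, few if any should beat the shuffled baseline.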

“The data came back, and they kept coming up with the same four clusters of higher density and at higher densities than you’d expect by chance, and you can show by replication that this is statistically unlikely,” Revelle said.

“I like data, and I believe these results,” he added. “The methodology is the main part of the paper’s contribution to science.”

To be sure the new clusters of types were accurate, the researchers used a notoriously self-centered group—teenaged boys—to validate their information.

“We know teen boys behave in self-centered ways,” Amaral said. “If the data were correct and sifted for demographics, they would turn out to be the biggest cluster of people.”

Indeed, young males are overrepresented in the Self-Centered group, while females over 15 years old are vastly underrepresented.

Along with serving as a tool that can help mental health service providers assess for personality types with extreme traits, Amaral said the study’s results could be helpful for hiring managers looking to ensure a potential candidate is a good fit or for people who are dating and looking for an appropriate partner.

And good news for parents of teenagers everywhere: As people mature, their personality types often shift. For instance, older people tend to be less neurotic yet more conscientious and agreeable than those under 20 years old.

“When we look at large groups of people, it’s clear there are trends, that some people may be changing some of these characteristics over time,” Amaral said. “This could be a subject of future research.”

This article has been republished from materials provided by Northwestern University. Note: material may have been edited for length and content. For further information, please contact the cited source.

Reference:

Martin Gerlach, Beatrice Farb, William Revelle, Luís A. Nunes Amaral. A robust data-driven approach identifies four personality types across four large data sets. Nature Human Behaviour, 2018; DOI: 10.1038/s41562-018-0419-z

‘Mindful people’ feel less pain; MRI imaging pinpoints supporting brain activity

Greater deactivation of the posterior cingulate cortex, a brain region associated with processing self-related thoughts, was associated with lower pain and higher trait mindfulness. Credit: Wake Forest Baptist Medical Center

Ever wonder why some people seem to feel less pain than others? A study conducted at Wake Forest School of Medicine may have found one of the answers—mindfulness. “Mindfulness is related to being aware of the present moment without too much emotional reaction or judgment,” said the study’s lead author, Fadel Zeidan, Ph.D., assistant professor of neurobiology and anatomy at the medical school, part of Wake Forest Baptist Medical Center. “We now know that some people are more mindful than others, and those people seemingly feel less pain.”

The study is an article in press, published ahead-of-print in the journal Pain.

The researchers analyzed data obtained from a study published in 2015 that compared mindfulness meditation to placebo analgesia. In this follow-up study, Zeidan sought to determine if dispositional mindfulness, an individual’s innate or natural level of mindfulness, was associated with lower pain sensitivity, and to identify what brain mechanisms were involved.

In the study, 76 healthy volunteers who had never meditated first completed the Freiburg Mindfulness Inventory, a reliable clinical measurement of mindfulness, to determine their baseline levels. Then, while undergoing functional magnetic resonance imaging, they were administered painful heat stimulation (120°F).

Whole brain analyses revealed that higher dispositional mindfulness during painful heat was associated with greater deactivation of a brain region called the posterior cingulate cortex, a central neural node of the default mode network. Further, in those that reported higher pain, there was greater activation of this critically important brain region.

The default mode network extends from the posterior cingulate cortex to the medial prefrontal cortex of the brain. These two brain regions continuously feed information back and forth. This network is associated with processing feelings of self and mind wandering, Zeidan said.

“As soon as you start performing a task, the connection between these two brain regions in the default mode network disengages and the brain allocates information and processes to other neural areas,” he said.

“Default mode deactivates whenever you are performing any kind of task, such as reading or writing. Default mode network is reactivated whenever the individual stops performing a task and reverts to self-related thoughts, feelings and emotions. The results from our study showed that mindful individuals are seemingly less caught up in the experience of pain, which was associated with lower pain reports.”

The study provided novel neurobiological information that showed people with higher mindfulness ratings had less activation in the central nodes (posterior cingulate cortex) of the default network and experienced less pain. Those with lower mindfulness ratings had greater activation of this part of the brain and also felt more pain, Zeidan said.

“Now we have some new ammunition to target this brain region in the development of effective pain therapies. Importantly this work shows that we should consider one’s level of mindfulness when calculating why and how one feels less or more pain,” Zeidan said. “Based on our earlier research, we know we can increase mindfulness through relatively short periods of mindfulness meditation training, so this may prove to be an effective way to provide pain relief for the millions of people suffering from chronic pain.”

https://medicalxpress.com/news/2018-09-mindful-people-pain-mri-imaging.html

40,000 Volunteers Needed for Largest Ever Study of the Genetics of Anxiety and Depression

The NIHR and King’s College London are calling for 40,000 people diagnosed with depression or anxiety to enrol online for the Genetic Links to Anxiety and Depression (GLAD) Study and join the NIHR Mental Health Bioresource.

Researchers hope to establish the largest ever database of volunteers who can be called up to take part in research exploring the genetic factors behind the two most common mental health conditions – anxiety and depression.

Video: https://youtu.be/wzgvS8gU2Ss

The GLAD study will make important strides towards better understanding of these disorders and provide a pool of potential participants for future studies, reducing the time-consuming process of recruiting patients for research.

Research has shown 30-40% of the risk for both depression and anxiety is genetic and 60-70% due to environmental factors. Only by having a large, diverse group of people available for studies will researchers be able to determine how genetic and environmental triggers interact to cause anxiety and depression.

Leader of the GLAD study and the NIHR Mental Health BioResource, Dr Gerome Breen of King’s College London, said: “It’s a really exciting time to become involved in mental health research, particularly genetic research which has made incredible strides in recent years – we have so far identified 46 genetic links for depression and anxiety.

“By recruiting 40,000 volunteers willing to be re-contacted for research, the GLAD Study will take us further than ever before. It will allow researchers to solve the big unanswered questions, address how genes and environment act together and help develop new treatment options.”

The GLAD Study, a collaboration between the NIHR BioResource and King’s College London, has been designed to be particularly accessible, with a view to motivating more people to take part in mental health research.

Research psychologist and study lead Professor Thalia Eley, King’s College London, said: “We want to hear from all different backgrounds, cultures, ethnic groups and genders, and we are especially keen to hear from young adults. By including people from all parts of the population, what we learn will be relevant to everyone. This is a unique opportunity to participate in pioneering medical science.”

https://www.nihr.ac.uk/news/nihr-launches-largest-ever-study-of-genetic-links-to-depression-and-anxiety/9201

Infectious Theory Of Alzheimer’s Disease Draws Fresh Interest

by BRET STETKA

Dr. Leslie Norins is willing to hand over $1 million of his own money to anyone who can clarify something: Is Alzheimer’s disease, the most common form of dementia worldwide, caused by a germ?

By “germ” he means microbes like bacteria, viruses, fungi and parasites. In other words, Norins, a physician turned publisher, wants to know if Alzheimer’s is infectious.

It’s an idea that just a few years ago would’ve seemed to many an easy way to drain your research budget on bunk science. Money has poured into Alzheimer’s research for years, but until very recently not much of it went toward investigating whether infection plays a role in causing dementia.

But this “germ theory” of Alzheimer’s, as Norins calls it, has been fermenting in the literature for decades. Even early 20th century Czech physician Oskar Fischer — who, along with his German contemporary Dr. Alois Alzheimer, was integral in first describing the condition — noted a possible connection between the newly identified dementia and tuberculosis.

If the germ theory gets traction, even in some Alzheimer’s patients, it could trigger a seismic shift in how doctors understand and treat the disease.

For instance, would we see a day when dementia is prevented with a vaccine, or treated with antibiotics and antiviral medications? Norins thinks it’s worth looking into.

Norins received his medical degree from Duke in the early 1960s, and after a stint at the Centers for Disease Control and Prevention he fell into a lucrative career in medical publishing. He eventually settled in an admittedly aged community in Naples, Fla., where he took an interest in dementia and began reading up on the condition.

After scouring the medical literature he noticed a pattern.

“It appeared that many of the reported characteristics of Alzheimer’s disease were compatible with an infectious process,” Norins tells NPR. “I thought for sure this must have already been investigated, because millions and millions of dollars have been spent on Alzheimer’s research.”

But aside from scattered interest through the decades, this wasn’t the case.

In 2017, Norins launched Alzheimer’s Germ Quest Inc., a public benefit corporation he hopes will drive interest into the germ theory of Alzheimer’s, and through which his prize will be distributed. A white paper he penned for the site reads: “From a two-year review of the scientific literature, I believe it’s now clear that just one germ — identity not yet specified, and possibly not yet discovered — causes most AD. I’m calling it the ‘Alzheimer’s Germ.’ ”

Norins is quick to cite sources and studies supporting his claim, among them a 2010 study published in the Journal of Neurosurgery showing that neurosurgeons die from Alzheimer’s at a nearly 2 1/2 times higher rate than they do from other disorders.

Another study from that same year, published in the Journal of the American Geriatrics Society, found that people whose spouses have dementia are at a 1.6 times greater risk for the condition themselves.

Contagion does come to mind. And Norins isn’t alone in his thinking.

In 2016, 32 researchers from universities around the world signed an editorial in the Journal of Alzheimer’s Disease calling for “further research on the role of infectious agents in [Alzheimer’s] causation.” Based on much of the same evidence Norins encountered, the authors concluded that clinical trials with antimicrobial drugs in Alzheimer’s are now justified.

NPR reported on an intriguing study published in Neuron in June that suggested that viral infection can influence the progression of Alzheimer’s. Led by Mount Sinai genetics professor Joel Dudley, the work was intended to compare the genomes of healthy brain tissue with that affected by dementia.

But something kept getting in the way: herpes.

Dudley’s team noticed an unexpectedly high level of viral DNA from two human herpes viruses, HHV-6 and HHV-7. The viruses are common and cause a rash called roseola in young children (not the sexually transmitted disease caused by other strains).

Some viruses have the ability to lie dormant in our neurons for decades by incorporating their genomes into our own. The classic example is chickenpox: A childhood viral infection resolves and lurks silently, returning years later as shingles, an excruciating rash. Like it or not, nearly all of us are chimeras with viral DNA speckling our genomes.

But having the herpes viruses alone doesn’t mean inevitable brain decline. After all, up to 75 percent of us may harbor HHV-6.

But Dudley also noticed that herpes appeared to interact with human genes known to increase Alzheimer’s risk. Perhaps, he says, there is some toxic combination of genetic and infectious influence that results in the disease; a combination that sparks what some feel is the main contributor to the disease, an overactive immune system.

The hallmark pathology of Alzheimer’s is accumulation of a protein called amyloid in the brain. Many researchers have assumed these aggregates, or plaques, are simply a byproduct of some other process at the core of the disease. Other scientists posit that the protein itself contributes to the condition in some way.

The theory that amyloid is the root cause of Alzheimer’s is losing steam. But the protein may still contribute to the disease, even if it winds up being deemed infectious.

Work by Harvard neuroscientist Rudolph Tanzi suggests it might be a bit of both. Along with colleague Robert Moir, Tanzi has shown that amyloid is lethal to viruses and bacteria in the test tube, and also in mice. He now believes the protein is part of our ancient immune system that, like antibodies, ramps up its activity to help fend off unwanted bugs.

So does that mean that the microbe is the cause of Alzheimer’s, and amyloid a harmless reaction to it? According to Tanzi it’s not that simple.

Tanzi believes that in many cases of Alzheimer’s, microbes are probably the initial seed that sets off a toxic tumble of molecular dominoes. Early in the disease amyloid protein builds up to fight infection, yet too much of the protein begins to impair function of neurons in the brain. The excess amyloid then causes another protein, called tau, to form tangles, which further harm brain cells.

But as Tanzi explains, the ultimate neurological insult in Alzheimer’s is the body’s reaction to this neurotoxic mess. All the excess protein revs up the immune system, causing inflammation — and it’s this inflammation that does the most damage to the Alzheimer’s-afflicted brain.

So what does this say about the future of treatment? Possibly a lot. Tanzi envisions a day when people are screened at, say, 50 years old. “If their brains are riddled with too much amyloid,” he says, “we knock it down a bit with antiviral medications. It’s just like how you are prescribed preventative drugs if your cholesterol is too high.”

Tanzi feels that microbes are just one possible seed for the complex pathology behind Alzheimer’s. Genetics may also play a role, as certain genes produce a type of amyloid more prone to clumping up. He also feels environmental factors like pollution might contribute.

Dr. James Burke, professor of medicine and psychiatry at Duke University’s Alzheimer’s Disease Research Center, isn’t willing to abandon the amyloid theory altogether, but agrees it’s time for the field to move on. “There may be many roads to developing Alzheimer’s disease and it would be shortsighted to focus just on amyloid and tau,” he says. “A million-dollar prize is attention-getting, but the reward for identifying a treatable target to delay or prevent Alzheimer’s disease is invaluable.”

Any treatment that disrupts the cascade leading to amyloid, tau and inflammation could theoretically benefit an at-risk brain. The vast majority of Alzheimer’s treatment trials have failed, including many targeting amyloid. But it could be that the patients included were too far along in their disease to reap any therapeutic benefit.

If a microbe is responsible for all or some cases of Alzheimer’s, perhaps future treatments or preventive approaches will prevent toxic protein buildup in the first place. Both Tanzi and Norins believe Alzheimer’s vaccines against viruses like herpes might one day become common practice.

In July of this year, in collaboration with Norins, the Infectious Diseases Society of America announced that they plan to offer two $50,000 grants supporting research into a microbial association with Alzheimer’s. According to Norins, this is the first acknowledgement by a leading infectious disease group that Alzheimer’s may be microbial in nature – or at least that it’s worth exploring.

“The important thing is not the amount of the money, which is a pittance compared with the $2 billion NIH spends on amyloid and tau research,” says Norins, “but rather the respectability and more mainstream status the grants confer on investigating of the infectious possibility. Remember when we thought ulcers were caused by stress?”

Ulcers, we now know, are caused by a germ.

https://www.npr.org/sections/health-shots/2018/09/09/645629133/infectious-theory-of-alzheimers-disease-draws-fresh-interest?ft=nprml&f=1001

Scientists Say They’ve Found The Driver of False Beliefs, And It’s Not a Lack of Intelligence

by DAVID NIELD

Why is it sometimes so hard to convince someone that the world is indeed a globe, or that climate change is actually caused by human activity, despite the overwhelming evidence?

Scientists think they might have the answer, and it’s less to do with lack of understanding, and more to do with the feedback they’re getting.

Getting positive or negative reactions to something you do or say is a greater influence on your thinking than logic and reasoning, the new research suggests – so if you’re in a group of like-minded people, that’s going to reinforce your thinking.

Receiving good feedback also encourages us to think we know more than we actually do.

In other words, the more sure we become that our current position is right, the less likely we are to take into account other opinions or even cold, hard scientific data.

“If you think you know a lot about something, even though you don’t, you’re less likely to be curious enough to explore the topic further, and will fail to learn how little you know,” says one of the team members behind the new study, Louis Marti from the University of California, Berkeley.

For the research, more than 500 participants were recruited and shown a series of colored shapes. As each shape appeared, the participants were asked whether it was a “Daxxy” – a word made up for these experiments.

The test takers had no clues as to what a Daxxy was or wasn’t, but they did get feedback after guessing one way or the other – the system would tell them if the shape they were looking at qualified as a Daxxy or not. At the same time they were also asked how sure they were about what a Daxxy actually was.

In this way the researchers were able to measure certainty in relation to feedback. Results showed the confidence of the participants was largely based on the results of their last four or five guesses, not their performance overall.

The team behind the tests says this plays into something we already know about learning – that for it to happen, learners need to recognise that there is a gap between what they currently know and what they could know. If they don’t think that gap is there, they won’t take on board new information.

“What we found interesting is that they could get the first 19 guesses in a row wrong, but if they got the last five right, they felt very confident,” says Marti. “It’s not that they weren’t paying attention, they were learning what a Daxxy was, but they weren’t using most of what they learned to inform their certainty.”
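Using the numbers in that quote, a toy calculation (an illustration only, not the study’s model) shows how a confidence estimate built from the last few trials diverges from overall accuracy:

```python
# Toy illustration of recency-weighted confidence vs. overall accuracy,
# using the 19-wrong-then-5-right sequence mentioned in the quote above.
guesses = [0] * 19 + [1] * 5                 # 0 = wrong, 1 = right

overall_accuracy = sum(guesses) / len(guesses)
recent_confidence = sum(guesses[-5:]) / 5    # based only on the last 5 trials

print(f"overall accuracy:           {overall_accuracy:.2f}")   # about 0.21
print(f"recency-based 'confidence': {recent_confidence:.2f}")  # 1.00
```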

This recent feedback is having more of an effect than hard evidence, the experiments showed, and that might apply in a broader sense too. It could apply to learning something new or trying to differentiate between right and wrong.

And while in this case the study participants were trying to identify a made-up shape, the same cognitive processes could be at work when it comes to echo chambers on social media or on news channels – where views are constantly reinforced.

“If you use a crazy theory to make a correct prediction a couple of times, you can get stuck in that belief and may not be as interested in gathering more information,” says one of the team, psychologist Celeste Kidd from UC Berkeley.

So if you think vaccinations are harmful, for example, the new study suggests you might be basing that on the most recent feedback you’ve had on your views, rather than the overall evidence one way or the other.

Ideally, the researchers say, learning should be based on more considered observations over time – even if that’s not quite how the brain works sometimes.

“If your goal is to arrive at the truth, the strategy of using your most recent feedback, rather than all of the data you’ve accumulated, is not a great tactic,” says Marti.

The research has been published in Open Mind.

https://www.sciencealert.com/feedback-study-explains-why-false-beliefs-stick

The Brain’s Immune Cells are to Blame for Obesity-associated Cognitive Decline

Obesity leads to cognitive impairment by activating microglial cells, which consume otherwise functional synapses in the hippocampus, according to a study of male mice published in JNeurosci. The research suggests that microglia may be a potential therapeutic target for one of the lesser known effects of this global health epidemic on the brain.

Nearly two billion adults worldwide are overweight, more than 600 million of whom are obese. In addition to increasing risk of conditions such as diabetes and heart disease, obesity is also a known risk factor for cognitive disorders including Alzheimer’s disease. The cellular mechanisms that contribute to cognitive decline in obesity, however, are not well understood.

Elise Cope and colleagues replicated previous research by demonstrating that diet-induced obesity in mice impairs performance on cognitive tasks dependent on the hippocampus, causes loss of dendritic spines — the neuronal protrusions that receive signals from other cells — and activates microglia. Using genetic and pharmacological approaches to block microglial activity, the researchers established that microglia are causally linked to obesity-induced dendritic spine loss and cognitive decline. The results suggest obesity may drive microglia into a synapse-eating frenzy that contributes to the cognitive deficits observed in this condition.

https://www.technologynetworks.com/neuroscience/news/brains-immune-cells-to-blame-for-obesity-associated-cognitive-decline-309339?utm_campaign=NEWSLETTER_TN_Neuroscience_2017&utm_source=hs_email&utm_medium=email&utm_content=65859986&_hsenc=p2ANqtz-8GahP4LE2EOoHR4ShLvP0WjIDGrQksSkIDt93_VTrGea3qFC8v7VaOr9RXxmjjl8VDuNn2DK1PVOXEa5FBOdgl-GvhlA&_hsmi=65859986

ADHD Tied to Raised Risk of Early Parkinson’s Disease

By Alan Mozes

People with attention-deficit/hyperactivity disorder (ADHD) may be more than twice as likely to develop an early-onset form of Parkinson’s, new research warns.

What’s more, among “those ADHD patients who had a record of being treated with amphetamine-like drugs — especially Ritalin [methylphenidate] — the risk dramatically increased, to between eight- to nine-fold,” said senior study author Glen Hanson.

But his team did not prove that ADHD or its medications actually caused Parkinson’s risk to rise, and one ADHD expert noted that the absolute risk of developing Parkinson’s remains very small.

For the study, researchers analyzed nearly 200,000 Utah residents. All had been born between 1950 and 1992, with Parkinson’s onset tracked up until the age of 60.

Prior to any Parkinson’s diagnosis, roughly 32,000 had been diagnosed with ADHD.

Hanson, a professor of pharmacology and toxicology at the University of Utah, said that ADHD patients were found to be “2.4 times more likely to develop Parkinson’s disease-like disorders prior to the age of 50 to 60 years,” compared with those with no history of ADHD. That finding held up even after accounting for a number of influential factors, including smoking, drug and alcohol abuse, and other psychiatric disorders.

“Although we cannot accurately say how much time elapsed between ADHD and [a] Parkinson’s-like disorder diagnosis, it was probably between 20 to 50 years,” he said.

As to what might explain the link, Hanson said that both ADHD and most forms of Parkinson’s trace back to a “functional disorder of central nervous system dopamine pathways.”

In addition, Hanson said that “the drugs used to treat ADHD apparently work because of their profound effects on the activity of these dopamine pathways.” Theoretically, the treatment itself might trigger a metabolic disturbance, promoting dopamine pathway degeneration and, ultimately, Parkinson’s, he explained.

Still, Hanson pointed out that, for now, “we are not able to determine if the increased risk associated with stimulant use is due to the presence of the drug or the severity of the ADHD,” given that those treated with ADHD drugs tend to have more severe forms of the disorder.

And while demonstrating “a very strong association” between ADHD and Parkinson’s risk, the findings are preliminary, the study authors added.

Also, the absolute risk of developing Parkinson’s remained low, even in the most pessimistic scenario.

For example, the findings suggest that the risk of developing early-onset Parkinson’s before the age of 50 would be eight or nine people out of every 100,000 with ADHD. This compares with one or two out of every 100,000 among those with no history of ADHD, the researchers said.
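Put as arithmetic (illustrative figures only, taken from the rough per-100,000 numbers above), the gap between relative and absolute risk looks like this:

```python
# Illustrative absolute vs. relative risk, using the rough per-100,000
# figures quoted above rather than exact study estimates.
with_adhd = 8.5 / 100_000        # roughly 8-9 per 100,000
without_adhd = 1.5 / 100_000     # roughly 1-2 per 100,000

print(f"absolute risk with ADHD:    {with_adhd:.4%}")     # ~0.0085%
print(f"absolute risk without ADHD: {without_adhd:.4%}")  # ~0.0015%
print(f"relative risk (ratio):      about {with_adhd / without_adhd:.0f}x")
```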

But the scientists noted that the results should raise eyebrows, because Parkinson’s primarily strikes people over the age of 60. Given the age range of those tracked so far in the study, Hanson said that his team was not yet able to ascertain Parkinson’s risk among ADHD patients after the age of 60.

Hanson also pointed out that because ADHD was only first diagnosed in the 1960s, only about 1.5 percent of the people in the study had an ADHD diagnosis, despite current estimates that peg ADHD prevalence at 10 percent. That suggests that the current findings may underestimate the scope of the problem.

“Clearly, there are some critical questions left to be answered concerning what is the full impact of this increased risk,” Hanson said.

Dr. Andrew Adesman is chief of developmental and behavioral pediatrics at Cohen Children’s Medical Center of New York with Northwell Health in New York City. He was not involved with the study and said the findings “surprised” him.

But, “we need to keep in mind that this study needs to be replicated and that the incidence of these conditions was very low, even among those with ADHD,” Adesman said. “The reality is that this would not affect 99.99 percent of individuals with ADHD.”

Meanwhile, Adesman said, “given that this study needs to be replicated, given that it is unclear whether ADHD medications further increase the risks of Parkinson’s, and given the very low risk in an absolute sense, I believe individuals with ADHD should not be hesitant to pursue or continue medical treatment for their ADHD.”

The report was published online Sept. 12 in the journal Neuropsychopharmacology.

Glen Hanson, DDS, Ph.D., vice dean and professor, pharmacology, School of Dentistry, University of Utah, Salt Lake City; Andrew Adesman, M.D., chief, developmental and behavioral pediatrics, Steven & Alexandra Cohen Children’s Medical Center of New York, Northwell Health, New York City; Sept. 12, 2018, Neuropsychopharmacology, online

https://consumer.healthday.com/cognitive-health-information-26/parkinson-s-news-526/adhd-tied-to-raised-risk-of-early-parkinson-s-737637.html