What Is Your First Memory – and Did It Ever Really Happen?


By Dr. Lucy Justice

I can remember being a baby. I recall being in a vast room inside a doctor’s surgery. I was passed to a nurse and then placed in cold metal scales to be weighed. I was always aware that this memory was unusual because it was from so early in my life, but I thought that perhaps I just had a really good memory, or that perhaps other people could remember being so young, too.

What is the earliest event that you can remember? How old do you think you are in this memory? How do you experience the memory? Is it vivid or vague? Positive or negative? Are you re-experiencing the memory as it originally happened, through your own eyes, or are you watching yourself “acting” in the memory?

In our recent study, we asked more than 6,000 people of all ages to do the same, to tell us what their first autobiographical memory was, how old they were when the event happened, to rate how emotional and vivid it was and to report what perspective the memory was “seen” from. We found that on average people reported their first memory occurring during the first half of the third year of their lives (3.24 years to be precise). This matches well with other studies that have investigated the age of early memories.

What does this mean for my memory of being a baby then? Perhaps I do just have a really good memory and can remember those early months of life. Indeed, in our study, we found that around 40% of participants reported remembering events from the age of two or below – and 14% of people recalled memories from age one and below. However, psychological research suggests that memories occurring below the age of three are highly unusual – and indeed, highly improbable.
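The percentages above are simple tabulations over the reported ages. As a minimal sketch (using made-up ages, not the study’s actual responses), this is the kind of computation involved:

```python
# Illustrative only: fabricated ages-at-first-memory, not the study's data.
ages = [3.5, 1.0, 2.0, 4.2, 0.8, 3.0, 2.5, 5.0, 1.5, 3.2]

n = len(ages)
mean_age = sum(ages) / n                              # average reported age
share_two_or_below = sum(a <= 2.0 for a in ages) / n  # proportion dated to age two or below
share_one_or_below = sum(a <= 1.0 for a in ages) / n  # proportion dated to age one or below

print(f"mean age of first memory: {mean_age:.2f} years")
print(f"reported at age two or below: {share_two_or_below:.0%}")
print(f"reported at age one or below: {share_one_or_below:.0%}")
```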

The origin of memory

Researchers who have investigated memory development suggest that the neurological processes needed to form autobiographical memories are not fully developed until between the ages of three and four years. Other research has suggested that memories are linked to language development. Language allows children to share and discuss the past with others, enabling memories to be organised in a personal autobiography.

So how can I remember being a baby? And why did 2,487 people from our study remember events that they dated from the age of two years and younger?

One explanation is that people simply gave incorrect estimates of their age in the memory. After all, unless confirmatory evidence is present, guesswork is all we have when it comes to dating memories from across our lives, including the very earliest.

But if incorrect dating explained the presence of these memories, we would expect that they would be about similar events to those memories from ages three and above. But this was not the case – we found that very early reported memories were of events and objects from infancy (pram, cot, learning to walk) whereas older memories were of things typical of childhood (toys, school, holidays). This finding meant that these two groups of memories were qualitatively different and ruled out the misdating explanation.

If research tells us that these very early memories are highly unlikely, and we have ruled out a misdating explanation, then why do people, including me, have them?

Pure fiction?

We concluded that these memories are likely to be fictional – that is, that they never in fact occurred. Perhaps, rather than recalling an experienced event, we recall imagery derived from photographs, home movies, shared family stories or events and activities that frequently happen in infancy. These facts are then, we suggest, linked with some fragmentary visual imagery and are combined together to form the basis of these fictitious early memories. Over time, this combination of imagery and fact begins to be experienced as a memory.

Although 40% of participants in our study retrieved these fictitious memories, they are not altogether surprising. Contemporary theories of memory highlight the constructive nature of memory; memories are not “records” of events, but rather psychological representations of the self in the past.

In other words, all of our memories contain some degree of fiction – indeed, this is the sign of a healthy memory system in action. But perhaps, for reasons not yet known, we have a psychological need to fictionalise memories from times of our lives that we are unable to remember. For now, these “stories” remain a mystery.

https://theconversation.com/what-is-your-first-memory-and-did-it-ever-really-happen-95953

New tiny sensors track dopamine in the brain for more than a year, and could be useful for monitoring patients with Parkinson’s and other diseases.


By Anne Trafton

Dopamine, a signaling molecule used throughout the brain, plays a major role in regulating our mood, as well as controlling movement. Many disorders, including Parkinson’s disease, depression, and schizophrenia, are linked to dopamine deficiencies.

MIT neuroscientists have now devised a way to measure dopamine in the brain for more than a year, which they believe will help them to learn much more about its role in both healthy and diseased brains.

“Despite all that is known about dopamine as a crucial signaling molecule in the brain, implicated in neurologic and neuropsychiatric conditions as well as our ability to learn, it has been impossible to monitor changes in the online release of dopamine over time periods long enough to relate these to clinical conditions,” says Ann Graybiel, an MIT Institute Professor, a member of MIT’s McGovern Institute for Brain Research, and one of the senior authors of the study.

Michael Cima, the David H. Koch Professor of Engineering in the Department of Materials Science and Engineering and a member of MIT’s Koch Institute for Integrative Cancer Research, and Robert Langer, the David H. Koch Institute Professor and a member of the Koch Institute, are also senior authors of the study. MIT postdoc Helen Schwerdt is the lead author of the paper, which appears in the Sept. 12 issue of Communications Biology.

Long-term sensing

Dopamine is one of many neurotransmitters that neurons in the brain use to communicate with each other. Traditional systems for measuring dopamine — carbon electrodes with a shaft diameter of about 100 microns — can only be used reliably for about a day because they produce scar tissue that interferes with the electrodes’ ability to interact with dopamine.

In 2015, the MIT team demonstrated that tiny microfabricated sensors could be used to measure dopamine levels in a part of the brain called the striatum, which contains dopamine-producing cells that are critical for habit formation and reward-reinforced learning.

Because these probes are so small (about 10 microns in diameter), the researchers could implant up to 16 of them to measure dopamine levels in different parts of the striatum. In the new study, the researchers wanted to test whether they could use these sensors for long-term dopamine tracking.

“Our fundamental goal from the very beginning was to make the sensors work over a long period of time and produce accurate readings from day to day,” Schwerdt says. “This is necessary if you want to understand how these signals mediate specific diseases or conditions.”

To develop a sensor that can be accurate over long periods of time, the researchers had to make sure that it would not provoke an immune reaction, to avoid the scar tissue that interferes with the accuracy of the readings.

The MIT team found that their tiny sensors were nearly invisible to the immune system, even over extended periods of time. After the sensors were implanted, populations of microglia (immune cells that respond to short-term damage), and astrocytes, which respond over longer periods, were the same as those in brain tissue that did not have the probes inserted.

In this study, the researchers implanted three to five sensors per animal, about 5 millimeters deep, in the striatum. They took readings every few weeks, after stimulating dopamine release from the brainstem, which travels to the striatum. They found that the measurements remained consistent for up to 393 days.

“This is the first time that anyone’s shown that these sensors work for more than a few months. That gives us a lot of confidence that these kinds of sensors might be feasible for human use someday,” Schwerdt says.

Paul Glimcher, a professor of physiology and neuroscience at New York University, says the new sensors should enable more researchers to perform long-term studies of dopamine, which is essential for studying phenomena such as learning, which occurs over long time periods.

“This is a really solid engineering accomplishment that moves the field forward,” says Glimcher, who was not involved in the research. “This dramatically improves the technology in a way that makes it accessible to a lot of labs.”

Monitoring Parkinson’s

If developed for use in humans, these sensors could be useful for monitoring Parkinson’s patients who receive deep brain stimulation, the researchers say. This treatment involves implanting an electrode that delivers electrical impulses to a structure deep within the brain. Using a sensor to monitor dopamine levels could help doctors deliver the stimulation more selectively, only when it is needed.

The researchers are now looking into adapting the sensors to measure other neurotransmitters in the brain, and to measure electrical signals, which can also be disrupted in Parkinson’s and other diseases.

“Understanding those relationships between chemical and electrical activity will be really important to understanding all of the issues that you see in Parkinson’s,” Schwerdt says.

The research was funded by the National Institute of Biomedical Imaging and Bioengineering, the National Institute of Neurological Disorders and Stroke, the Army Research Office, the Saks Kavanaugh Foundation, the Nancy Lurie Marks Family Foundation, and Dr. Tenley Albright.

https://news.mit.edu/2018/brain-dopamine-tracking-sensors-0912

Identification of types of chronic pain patients who will respond to placebo.


Someday doctors may prescribe sugar pills for certain chronic pain patients based on their brain anatomy and psychology. And the pills will reduce their pain as effectively as any powerful drug on the market, according to new research.

Northwestern Medicine scientists have shown they can reliably predict which chronic pain patients will respond to a sugar placebo pill based on the patients’ brain anatomy and psychological characteristics.

“Their brain is already tuned to respond,” said senior study author A. Vania Apkarian, professor of physiology at Northwestern University Feinberg School of Medicine. “They have the appropriate psychology and biology that puts them in a cognitive state that as soon as you say, ‘this may make your pain better,’ their pain gets better.”

There’s no need to fool the patient, Apkarian said.

“You can tell them, ‘I’m giving you a drug that has no physiological effect but your brain will respond to it,'” he said. “You don’t need to hide it. There is a biology behind the placebo response.”

The study was published Sept. 12 in Nature Communications.

The findings have three potential benefits:

Prescribing non-active drugs rather than active drugs. “It’s much better to give someone a non-active drug rather than an active drug and get the same result,” Apkarian said. “Most pharmacological treatments have long-term adverse effects or addictive properties. Placebo becomes as good an option for treatment as any drug we have on the market.”

Eliminating the placebo effect from drug trials. “Drug trials would need to recruit fewer people, and identifying the physiological effects would be much easier,” Apkarian said. “You’ve taken away a big component of noise in the study.”

Reduced health care costs. A sugar pill prescription for chronic pain patients would result in vast cost savings for patients and the health care system, Apkarian said.

How the study worked

About 60 chronic back pain patients were randomized into two arms of the study. In one arm, subjects didn’t know if they got the drug or the placebo. Researchers didn’t study the people who got the real drug. The other study arm included people who came to the clinic but didn’t get a placebo or drug. They were the control group.

The individuals whose pain decreased as a result of the sugar pill had a similar brain anatomy and psychological traits. The right side of their emotional brain was larger than the left, and they had a larger cortical sensory area than people who were not responsive to the placebo. The chronic pain placebo responders also were emotionally self-aware, sensitive to painful situations and mindful of their environment.

“Clinicians who are treating chronic pain patients should seriously consider that some will get as good a response to a sugar pill as any other drug,” Apkarian said. “They should use it and see the outcome. This opens up a whole new field.”

https://news.northwestern.edu/stories/2018/september/sugar-pills-relieve-pain-for-chronic-pain-patients/

Alzheimer’s one day may be predicted during eye exam

By Jim Dryden

It may be possible in the future to screen patients for Alzheimer’s disease using an eye exam.

Using technology similar to what is found in many eye doctors’ offices, researchers at Washington University School of Medicine in St. Louis have detected evidence suggesting Alzheimer’s in older patients who had no symptoms of the disease.

Their study, involving 30 patients, is published Aug. 23 in the journal JAMA Ophthalmology.

“This technique has great potential to become a screening tool that helps decide who should undergo more expensive and invasive testing for Alzheimer’s disease prior to the appearance of clinical symptoms,” said the study’s first author, Bliss E. O’Bryhim, MD, PhD, a resident physician in the Department of Ophthalmology & Visual Sciences. “Our hope is to use this technique to understand who is accumulating abnormal proteins in the brain that may lead them to develop Alzheimer’s.”

Significant brain damage from Alzheimer’s disease can occur years before any symptoms such as memory loss and cognitive decline appear. Scientists estimate that Alzheimer’s-related plaques can build up in the brain two decades before the onset of symptoms, so researchers have been looking for ways to detect the disease sooner.

Physicians now use PET scans and lumbar punctures to help diagnose Alzheimer’s, but they are expensive and invasive.

In previous studies, researchers examining the eyes of people who had died from Alzheimer’s have reported that the eyes of such patients showed signs of thinning in the center of the retina and degradation of the optic nerve.

In the new study, the researchers used a noninvasive technique — called optical coherence tomography angiography — to examine the retinas in eyes of 30 study participants with an average age in the mid 70s, none of whom exhibited clinical symptoms of Alzheimer’s.

Those participants were patients in The Memory and Aging Project at Washington University’s Knight Alzheimer’s Disease Research Center. About half of those in the study had elevated levels of the Alzheimer’s proteins amyloid or tau as revealed by PET scans or cerebrospinal fluid, suggesting that although they didn’t have symptoms, they likely would develop Alzheimer’s. In the other subjects, PET scans and cerebrospinal fluid analyses were normal.

“In the patients with elevated levels of amyloid or tau, we detected significant thinning in the center of the retina,” said co-principal investigator Rajendra S. Apte, MD, PhD, the Paul A. Cibis Distinguished Professor of Ophthalmology and Visual Sciences. “All of us have a small area devoid of blood vessels in the center of our retinas that is responsible for our most precise vision. We found that this zone lacking blood vessels was significantly enlarged in people with preclinical Alzheimer’s disease.”

The eye test used in the study shines light into the eye, allowing a doctor to measure retinal thickness, as well as the thickness of fibers in the optic nerve. A form of that test often is available in ophthalmologists’ offices.

For this study, however, the researchers added a new component to the more common test: angiography, which allows doctors to distinguish red blood cells from other tissue in the retina.

“The angiography component allows us to look at blood-flow patterns,” said the other co-principal investigator, Gregory P. Van Stavern, MD, a professor of ophthalmology and visual sciences. “In the patients whose PET scans and cerebrospinal fluid showed preclinical Alzheimer’s, the area at the center of the retina without blood vessels was significantly larger, suggesting less blood flow.”

Added Apte: “The retina and central nervous system are so interconnected that changes in the brain could be reflected in cells in the retina.”

Of the patients studied, 17 had abnormal PET scans and/or lumbar punctures, and all of them also had retinal thinning and significant areas without blood vessels in the centers of their retinas. The retinas appeared normal in the patients whose PET scans and lumbar punctures were within the typical range.

More studies in patients are needed to replicate the findings, Van Stavern said, but he noted that if changes detected with this eye test can be used as markers for Alzheimer’s risk, it may be possible one day to screen people as young as their 40s or 50s to see whether they are at risk for the disease.

“We know the pathology of Alzheimer’s disease starts to develop years before symptoms appear, but if we could use this eye test to notice when the pathology is beginning, it may be possible one day to start treatments sooner to delay further damage,” he said.

O’Bryhim BE, Apte RS, Kung N, Coble D, Van Stavern GP. Optical coherence tomography angiography findings in pre-clinical Alzheimer’s disease. JAMA Ophthalmology, Aug. 23, 2018.

Tau protein damages brain cells by interfering with their internal communications

Scientists have found a previously unknown mechanism in which the protein tau, which is implicated in Alzheimer’s disease, damages brain cells by interfering with their internal communications.

The discovery sheds new light on the origins of this most common cause of dementia, a hallmark of which is the buildup of tangled tau protein filaments in the brain.

The finding could also lead to new treatments for Alzheimer’s and other diseases that progressively destroy brain tissue, conclude the researchers in a paper about their work that now features in the journal Neuron.

Scientists from Massachusetts General Hospital (MGH) in Charlestown and the Johns Hopkins School of Medicine in Baltimore, MD, led the study, which set out to investigate how tau protein might contribute to brain cell damage.

Alzheimer’s disease is irreversible and worsens over time. It is the sixth most common cause of death in adults in the United States, where an estimated 5.7 million people have the disease.

Exact causes of Alzheimer’s still unknown

Exactly what causes Alzheimer’s and other forms of dementia is still a mystery to science. Evidence suggests that a combination of environment, genes, and lifestyle is involved, with different factors having different amounts of influence in different people.

Most cases of Alzheimer’s do not show symptoms until people are in their 60s and older. The risk of getting the disease rises rapidly with age after this.

Brain studies of people with the disease — together with postmortem analyses of brain tissue — have revealed much about how Alzheimer’s changes and harms the brain.

“Age-related changes” include: inflammation; shrinkage in some brain regions; creation of unstable, short-lived molecules known as free radicals; and disruption of cellular energy production.

The brain of a person with Alzheimer’s disease also has two distinguishing features: plaques of amyloid protein that form between cells, and tangles of tau protein that form inside cells. The recent study concerns the latter.

Changes to tau behavior

Brain cells, or neurons, have internal structures known as microtubules that support the cell and its function. They are highly active cell components that help carry substances from the body of the cell out to the parts that connect it to other cells.

In healthy brain cells, tau protein normally “binds to and stabilizes” the microtubules. Tau behaves differently, however, in Alzheimer’s disease.

Changes in brain chemistry make tau protein molecules come away from the microtubules and stick to each other instead.

Eventually, the detached tau molecules form long filaments, or neurofibrillary tangles, that disrupt the brain cell’s ability to communicate with other cells.

The new study introduces the possibility that, in Alzheimer’s disease, tau disrupts yet another mechanism that involves communication between the nucleus of the brain cell and its body.

Communication with cell nucleus

The cell nucleus communicates with the rest of the cell using structures called nuclear pores, which comprise more than 400 different proteins and control the movement of molecules.

Studies on the causes of amyotrophic lateral sclerosis, frontotemporal dementia, and other neurodegenerative diseases have suggested that flaws in these nuclear pores are involved somehow.

The recent study reveals that animal and human cells with Alzheimer’s disease have faulty nuclear pores, and that the fault is linked to tau accumulation in the brain cell.

“Under disease conditions,” explains co-senior study author Bradley T. Hyman, the director of the Alzheimer’s Unit at MGH, “it appears that tau interacts with the nuclear pore and changes its properties.”

He and his colleagues discovered that the presence of tau disrupts the orderly structure of nuclear pores containing the major structural protein Nup98. In Alzheimer’s disease cells, there were fewer of these pores and those that were there tended to be stuck to each other.

‘Mislocalized’ Nup98

They also observed another curious change involving Nup98 inside Alzheimer’s disease brain cells. In cells with aggregated tau, the Nup98 was “mislocalized” instead of staying in the nuclear pore.

They found that this feature was more pronounced in brain tissue from people who had died with more severe forms of Alzheimer’s disease.

Finally, when they added human tau to living cultures of rodent brain cells, the researchers found that it caused mislocalization of Nup98 in the cell body and disrupted the transport of molecules into the nucleus.

This was evidence of a “functional link” between the presence of tau protein and damage to the nuclear transport mechanism.

The authors note, however, that it is not clear whether the Nup98-tau interaction uncovered in the study just occurs because of disease or whether it is a normal mechanism that behaves in an extreme fashion under disease conditions.

They conclude:

“Taken together, our data provide an unconventional mechanism for tau-induced neurodegeneration.”

https://www.medicalnewstoday.com/articles/322991.php

Over 70s’ cognition skills get worse during cold months – and dementia-related proteins flare up

by Wilson Jacob

Dementia screening is more effective in winter and spring because that’s when tell-tale proteins flare up, new research suggests.

In a large-scale study of elderly people in the US, France and Canada, researchers found cognitive ability of over-70s in general is much sharper in summer and fall.

Brain-wise, healthy subjects seemed on average 4.8 years younger during those months than they did between November and May.

Those with Alzheimer’s pathology also experienced ‘dips’ in the winter, due to ‘seasonal rhythms’ in certain proteins, which seemed to make dementia-related genes more expressed in the brain.

The findings suggest that assessing people for the neurodegenerative disease in the winter and early spring, when symptoms are likely to be most pronounced, might be the most effective way to detect it; there is still no definitive test for the disease.


According to lead author Dr Andrew Lim, assistant professor of neurology at the University of Toronto, the findings are a significant step towards improving Alzheimer’s diagnosis and treatment.

‘This association was independent of mood, sleep, physical activity, and thyroid status,’ he explained.

‘It was clinically significant, as reflected in a nearly 30 percent higher odds of meeting criteria for mild cognitive impairment or dementia in winter and spring compared to summer and fall, and it persisted in cases with pathologically confirmed Alzheimer’s disease.’

He adds: ‘There may be value in increasing dementia-related clinical resources in the winter and early spring when symptoms are likely to be most pronounced.

‘By shedding light on the mechanisms underlying the seasonal improvement in cognition in the summer and early fall, these findings also open the door to new avenues of treatment for Alzheimer’s disease.’
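The “nearly 30 percent higher odds” Lim describes is an odds ratio. With made-up counts (not the study’s data), the arithmetic looks like this:

```python
# Hypothetical counts, not the study's data: participants meeting criteria
# for mild cognitive impairment or dementia, by season of assessment.
winter_cases, winter_controls = 125, 875   # winter/spring assessments
summer_cases, summer_controls = 100, 900   # summer/fall assessments

odds_winter = winter_cases / winter_controls  # odds of meeting criteria in winter/spring
odds_summer = summer_cases / summer_controls  # odds of meeting criteria in summer/fall
odds_ratio = odds_winter / odds_summer        # ~1.29, i.e. roughly 30% higher odds

print(f"odds ratio (winter/spring vs summer/fall): {odds_ratio:.2f}")
```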

Several studies have suggested that season may be associated with cognitive function in some populations of younger adults.

But studies of the seasonal impact in older adults were lacking, and little was known about the underlying mechanisms.

So the researchers recruited 3,353 adults over 70, with and without Alzheimer’s, from three cohort studies in the United States, Canada, and France.

They tested the participants’ thinking and concentration, and measured the Alzheimer’s-related proteins in their spinal fluid.

For participants who died during the study, autopsies were performed and their brains examined.

Average cognitive functioning was higher in the summer and autumn than in the winter and spring, a difference equivalent in cognitive effect to 4.8 years of age-related decline.

In addition, the odds of meeting the diagnostic criteria for mild cognitive impairment or dementia were higher in the winter and spring than summer or autumn.

There are several theories as to what could be the cause for these peaks and troughs.

First, Dr Lim explains, there are environmental factors like more light and warmer temperatures which could boost general cognition in the summer and fall.

‘If true, then interventions such as phototherapy or temperature modification may be effective in sustaining this peak year-round,’ he says.

Second, in the summer, we tend to be more active, with a better diet, and better sleeping habits.

‘In this study, the association between season and cognition was independent of self-reported sleep and physical activity, although studies incorporating objective markers of these and other behaviors may reveal a more important role for behavioral factors.’

Third, there is the ominous seasonal depressive disorder, which afflicts so many in the winter months. Those seasonal rhythms in psychological state, he says, may also drive the association between season and cognition.

‘In this study, the seasonality of cognition was independent of depression; however, other psychological factors, such as negative affect, which has been associated with mild cognitive impairment and dementia, may be important.’

Lastly, there are the things going on inside the body. All those factors – environmental, lifestyle and psychological – impact our hormone and vitamin levels.

‘In our study adjusting for serum levels of thyroid-stimulating hormone did not substantially attenuate estimates of the association between season and cognition,’ Dr Lim explains.

‘However, additional metabolic factors that may potentially link season to cognition are vitamin D, sex hormones like testosterone, and melatonin.’

The study, published today in the journal PLOS Medicine, had one clear limitation, among others: the participants were only assessed once a season, and only included data on individuals from temperate northern-hemisphere regions, not from southern-hemisphere or equatorial regions.

However, Dr Lim insists they are on to something.

‘The persistence of a robust summer/fall peak in cognition suggests that even in pathologically confirmed Alzheimer’s disease, there remains substantial cognitive plasticity.

‘Identifying drivers or mediators of this effect may enable leveraging this plasticity to improve cognition year-round.’

Dr Rosa Sancho, Head of Research of Alzheimer’s Research UK, concurred. The study is just one piece of the puzzle but sheds light on a rarely discussed element of the lives of dementia patients.

‘For most people with dementia, symptoms get steadily worse over the course of several years but there are things that can also impact memory and thinking ability in the short term. We know that factors like sleep quality and mood can affect cognitive performance whether or not someone has dementia, and this study suggests that the time of year may also influence these skills,’ she said.

https://www.habaricloud.today/2018/09/05/over-70s-cognition-skills-get-worse-during-cold-months-and-dementia-related-proteins-flare-up/

Machine-learning based model may identify dementia in primary care


A machine learning-based model using data routinely gathered in primary care identified patients with dementia in such settings, according to research recently published in BJGP Open.

“Improving dementia care through increased and timely diagnosis is a priority, yet almost half of those living with dementia do not receive a timely diagnosis,” Emmanuel A. Jammeh, PhD, of the science and engineering department at Plymouth University in the United Kingdom, and colleagues wrote.

“A cost-effective tool that can be used by [primary care providers] to identify patients likely to be living with dementia, based only on routine data would be extremely useful. Such a tool could be used to select high-risk patients who could be invited for targeted screening,” they added.

The researchers used Read codes, a set of clinical terms used in the U.K. to summarize data for general practice, to develop a machine learning-based model to identify patients with dementia. The Read codes were selected based on their significant association with patients with dementia, and included codes for risk factors, symptoms and behaviors that are collected in primary care. To test the model, researchers collected Read-encoded data from 26,483 patients living in England aged 65 years and older.

Jammeh and colleagues found that their machine learning-based model achieved a sensitivity of 84.47% and a specificity of 86.67% for identifying dementia.
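Sensitivity and specificity are derived from a confusion matrix of true and false positives and negatives. The counts below are hypothetical (the paper’s raw counts are not given here), chosen only to reproduce the reported percentages:

```python
# Hypothetical confusion-matrix counts, not the paper's actual figures.
tp, fn = 870, 160    # patients with dementia: flagged correctly / missed
tn, fp = 1300, 200   # patients without dementia: cleared correctly / flagged wrongly

sensitivity = tp / (tp + fn)  # share of true dementia cases the model detects
specificity = tn / (tn + fp)  # share of non-cases the model correctly rules out

print(f"sensitivity: {sensitivity:.2%}")  # 84.47%
print(f"specificity: {specificity:.2%}")  # 86.67%
```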

“This is the first demonstration of a machine-learning approach to identifying dementia using routinely collected [National Health Service] data,” the researchers wrote.

“With the expected growth in dementia prevalence, the number of specialist memory clinics may be insufficient to meet the expected demand for diagnosis. Furthermore, although current ‘gold standards’ in dementia diagnosis may be effective, they involve the use of expensive neuroimaging (for example, positron emission tomography scans) and time-consuming neuropsychological assessments which is not ideal for routine screening of dementia,” they continued.

The model will be evaluated with other datasets, and have its validation tested “more extensively” at general practitioner practices in the future, Jammeh and colleagues added. – by Janel Miller

https://www.healio.com/family-medicine/geriatric-medicine/news/online/%7B62392171-6ad7-481a-9289-bd69df49d4a4%7D/machine-learning-based-model-may-identify-dementia-in-primary-care

Researchers report startling inflammasome discovery in Alzheimer’s study

Diagram of the brain of a person with Alzheimer’s Disease.

In recent years, researchers have largely converged on the role of inflammation in the development and progression of Alzheimer’s disease (AD). Studies over the past decade have revealed unexpected interactions between the brain and the immune system, and metabolic conditions such as obesity and diabetes may activate inflammatory responses that contribute to the development and progression of AD.

The activation of the inflammatory response is controlled by the inflammasome, a multi-protein oligomer that promotes the release of several pro-inflammatory cytokines including interleukin 1β (IL-1β) and interleukin 18 (IL-18). In an earlier study, a group of researchers with the University of Massachusetts Medical School, the University of Tokyo and the University of Bonn reported that mice modeling Alzheimer’s disease that were additionally bred to knock out NLRP3, a gene encoding a core component of the inflammasome, were completely protected from the neurodegenerative effects of the disease. The researchers presumed that this was the result of their inability to produce IL-1β and IL-18.

This finding was quite promising, suggesting that targeting components of the inflammasome might be a path to Alzheimer’s treatments. In their new study, they sought to determine the effect of IL-18 by breeding IL-18 knockout mice. The researchers considered IL-18 to be a promising target, because levels are elevated in the cerebrospinal fluid of AD patients with mild cognitive impairment. Additionally, it is known to increase the production of amyloid peptide.

But the result of the new mouse study was startling, and completely unprecedented in Alzheimer’s research. The IL-18 knockout mice developed a lethal seizure disorder that the researchers attribute to an increase in neuronal network transmission. The authors write, “… the effects of IL-18 deletion were so dramatic that we were unable to identify previous evidence to help understand the phenomena.”

The finding that a proinflammatory cytokine might in some way ameliorate seizure-inducing neural activity seems counterintuitive, since inflammation is theorized to promote neurodegenerative symptoms in AD. The researchers believe that epilepsy is understudied in AD patients, even though it is a common complication; they point out that two-thirds of AD patients experience both motor and non-motor seizures. Additionally, AD patients with epilepsy are more likely to develop memory loss and other cognitive symptoms, and experience a more widespread loss of brain cells than AD patients without epilepsy, according to the researchers.

They theorize that IL-18 may be counteracting seizure-promoting effects of IL-1β, and suppressing IL-18 thus induced seizures in the test mice. “In fact,” they write, “the countereffect of IL-18 and IL-1β has been documented in a mouse model of cerebellar ataxia. Importantly, we found that the acute application of IL-18 protein reduced excitatory synaptic transmission in the hippocampus, providing evidence that IL-18 has a protective function in neuronal excitability. Thus, we speculate that IL-18 directly suppresses these proepileptogenic effects of IL-1β in APP/PS1 mice.”

However, the most important implication of the study may be that, while the inflammasome is a promising therapeutic target for Alzheimer’s, inhibiting specific cytokines could negatively affect people with the disease.

More information: Inflammasome-derived cytokine IL18 suppresses amyloid-induced seizures in Alzheimer-prone mice. Proceedings of the National Academy of Sciences (2018). doi.org/10.1073/pnas.1801802115

Abstract
Alzheimer’s disease (AD) is characterized by the progressive destruction and dysfunction of central neurons. AD patients commonly have unprovoked seizures compared with age-matched controls. Amyloid peptide-related inflammation is thought to be an important aspect of AD pathogenesis. We previously reported that NLRP3 inflammasome KO mice, when bred into APPswe/PS1ΔE9 (APP/PS1) mice, are completely protected from amyloid-induced AD-like disease, presumably because they cannot produce mature IL1β or IL18. To test the role of IL18, we bred IL18KO mice with APP/PS1 mice. Surprisingly, IL18KO/APP/PS1 mice developed a lethal seizure disorder that was completely reversed by the anticonvulsant levetiracetam. IL18-deficient AD mice showed a lower threshold in chemically induced seizures and a selective increase in gene expression related to increased neuronal activity. IL18-deficient AD mice exhibited increased excitatory synaptic proteins, spine density, and basal excitatory synaptic transmission that contributed to seizure activity. This study identifies a role for IL18 in suppressing aberrant neuronal transmission in AD.

A new map of the brain’s serotonin system

A 3-D rendering of the serotonin system in the left hemisphere of the mouse brain reveals two groups of serotonin neurons in the dorsal raphe that project to either cortical regions (blue) or subcortical regions (green) while rarely crossing into the other’s domain.

As Liqun Luo was writing his introductory textbook on neuroscience in 2012, he found himself in a quandary. He needed to include a section about a vital system in the brain controlled by the chemical messenger serotonin, which has been implicated in everything from mood to movement regulation. But the research was still far from clear on what effect serotonin has on the mammalian brain.

“Scientists were reporting divergent findings,” said Luo, who is the Ann and Bill Swindells Professor in the School of Humanities and Sciences at Stanford University. “Some found that serotonin promotes pleasure. Another group said that it increases anxiety while suppressing locomotion, while others argued the opposite.”

Fast forward six years, and Luo’s team thinks it has reconciled those earlier confounding results. Using neuroanatomical methods that they invented, his group showed that the serotonin system is actually composed of at least two, and likely more, parallel subsystems that work in concert to affect the brain in different, and sometimes opposing, ways. For instance, one subsystem promotes anxiety, whereas the other promotes active coping in the face of challenges.

“The field’s understanding of the serotonin system was like the story of the blind men touching the elephant,” Luo said. “Scientists were discovering distinct functions of serotonin in the brain and attributing them to a monolithic serotonin system, which at least partly accounts for the controversy about what serotonin actually does. This study allows us to see different parts of the elephant at the same time.”

The findings, published online on August 23 in the journal Cell, could have implications for the treatment of depression and anxiety, which involves prescribing drugs such as Prozac that target the serotonin system – so-called SSRIs (selective serotonin reuptake inhibitors). However, these drugs often trigger a host of side effects, some of which are so intolerable that patients stop taking them.

“If we can target the relevant pathways of the serotonin system individually, then we may be able to eliminate the unwanted side effects and treat only the disorder,” said study first author Jing Ren, a postdoctoral fellow in Luo’s lab.

Organized projections of neurons

The Stanford scientists focused on a region of the brainstem known as the dorsal raphe, which contains the largest single concentration of serotonin-releasing neurons in the mammalian brain (about 9,000).

The nerve fibers, or axons, of these dorsal raphe neurons send out a sprawling network of connections to many critical forebrain areas that carry out a host of functions, including thinking, memory, and the regulation of moods and bodily functions. By injecting viruses that infect serotonin axons in these regions, Ren and her colleagues were able to trace the connections back to their origin neurons in the dorsal raphe.

This allowed them to create a visual map of projections from the dense concentration of serotonin-releasing neurons in the brainstem to the various regions of the forebrain that they influence. The map revealed two distinct groups of serotonin-releasing neurons in the dorsal raphe, which connected to cortical and subcortical regions in the brain.

“Serotonin neurons in the dorsal raphe project to a bunch of places throughout the brain, but those bunches of places are organized,” Luo said. “That wasn’t known before.”

Two parts of the elephant

In a series of behavioral tests, the scientists also showed that serotonin neurons from the two groups can respond differently to stimuli. For example, neurons in both groups fired when mice received rewards like sips of sugar water, but the two groups showed opposite responses to punishments like mild foot shocks.

“We now understand why some scientists thought serotonin neurons were activated by punishment, while others thought they were inhibited by punishment. Both are correct – it just depends on which subtype you’re looking at,” Luo said.

What’s more, the group found that the serotonin neurons themselves were more complex than originally thought. Rather than just transmitting messages with serotonin, the cortical-projecting neurons also released a chemical messenger called glutamate – making them one of the few known examples of neurons in the brain that release two different chemicals.

“It raises the question of whether we should even be calling these serotonin neurons because neurons are named after the neurotransmitters they release,” Ren said.

Taken together, these findings indicate that the brain’s serotonin system is not made up of a homogenous population of neurons but rather many subpopulations acting in concert. Luo’s team has identified two groups, but there could be many others.

In fact, Robert Malenka, a professor and associate chair of psychiatry and behavioral sciences at Stanford’s School of Medicine, and his team recently discovered a group of serotonin neurons in the dorsal raphe that project to the nucleus accumbens, the part of the brain that promotes social behaviors.

“The two groups that we found don’t send axons to the nucleus accumbens, so this is clearly a third group,” Luo said. “We identified two parts of the elephant, but there are more parts to discover.”

https://medicalxpress.com/news/2018-08-brain-serotonin.html

Schizophrenia, bipolar disorder, alcohol use, and cannabis use all shown to age the brain prematurely


For what is thought to be the largest study of its kind, the researchers analyzed brain scans of 31,227 people aged 9 months–105 years.

In a paper published in the Journal of Alzheimer’s Disease, they describe how they identified “patterns of aging” from the brain scans.

The scans were done using single photon emission computed tomography (SPECT) and came from people with psychiatric conditions such as attention deficit hyperactivity disorder (ADHD), schizophrenia, and bipolar disorder, all of whom attended a psychiatric clinic with several locations.

Each participant underwent two SPECT brain scans — one during a resting state, and another during completion of “a concentration task” — giving a total of 62,454 scans.

The scientists found that they could predict a person’s age from the pattern of blood flow in their brain.
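The article does not describe the study's actual prediction model, but "predicting age from blood flow" is, in outline, a regression from regional perfusion features to chronological age, with the residual often called a "brain age gap." A minimal sketch on synthetic stand-in data (all numbers here are simulated, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: rows = subjects, columns = regional
# blood-flow measurements (the study derived these from SPECT).
n_subjects, n_regions = 200, 10
X = rng.normal(size=(n_subjects, n_regions))
true_w = rng.normal(size=n_regions)
age = X @ true_w + 50 + rng.normal(scale=2.0, size=n_subjects)

# Ordinary least-squares fit: predict age from the blood-flow features.
A = np.column_stack([X, np.ones(n_subjects)])   # append an intercept column
coef, *_ = np.linalg.lstsq(A, age, rcond=None)
pred = A @ coef

# "Brain age gap": predicted minus chronological age. A consistently
# positive gap for a group would suggest premature brain aging.
gap = pred - age
print(f"mean absolute error: {np.mean(np.abs(gap)):.2f} years")
```

In the study's framing, disorders such as schizophrenia would show up as groups whose predicted brain age systematically exceeds their chronological age.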

Brain circulation varied over lifespan
They observed that blood flow varied throughout the lifespan, from childhood into older age. They also saw that brain aging was more visible in scans from men and from people with schizophrenia, anxiety, bipolar disorder, and ADHD.

Brain aging was also more strongly associated with use of cannabis and alcohol.

“Based on one of the largest brain imaging studies ever done,” says lead study author Dr. Daniel G. Amen, a psychiatrist and founder of Amen Clinics in Costa Mesa, CA, “we can now track common disorders and behaviors that prematurely age the brain.”

He suggests that improving the treatment of these disorders could “slow or even halt the process of brain aging.”

https://www.medicalnewstoday.com/articles/322852.php