Genetic basis of synesthesia shown to relate to the ability of neurons to form connections in the brain

By Tereza Pultarova

About 4 percent of the people on Earth experience a mysterious phenomenon called synesthesia: They hear a sound and automatically see a color; or, they read a certain word, and a specific hue enters their mind’s eye. The condition has long puzzled scientists, but a small new study may offer some clues.

The study, published March 5 in the journal Proceedings of the National Academy of Sciences, offers insight into what might be happening in the brains of people with synesthesia.

Previous “studies of brain function using magnetic resonance imaging confirm that synesthesia is a real biological phenomenon,” said senior study author Simon Fisher, director of the Max Planck Institute for Psycholinguistics in the Netherlands. For example, when people with synesthesia “hear” color, brain scans show that there’s activity in the parts of the brain linked to both sight and sound, he said. (Not all people with the condition “hear” sights, however; the condition can also link other senses.)

Indeed, the brains of people with synesthesia have previously been shown to be more connected across different regions than the brains of people whose senses are not cross-linked, Fisher told Live Science. The question, however, was what causes this different brain wiring, he said.

To answer that question, Fisher and his team looked to genetics.

Synesthesia frequently runs in families, so the researchers looked for genes that might be responsible for the development of the condition. They chose three families in which multiple members across at least three generations had a specific type of the condition, sound-color synesthesia, in which hearing sounds evokes perceptions of colors. Typically, a specific sound or musical tone is consistently associated with a specific color for people who have this type of synesthesia. However, different members of a single family can see different colors when hearing the same sound, Fisher said.

The scientists used DNA sequencing to study the participants’ genes, Fisher said. Then, to identify genes that might be responsible for the condition, the scientists compared the genes of family members with synesthesia to the genes of family members without it, he said.

But the findings didn’t yield a straightforward result: “There was not a single gene that could explain synesthesia in all three families,” Fisher said. Instead, “there were 37 candidate variants,” or possible gene variations, he said.

Because the study included only a small number of people, there wasn’t enough data to single out the specific genes, of the 37 possibilities, that played a role in synesthesia. So, instead, the scientists looked at the biological functions of each gene to see how it could be related to the development of the condition. “There were just a few biological themes that were significantly enriched across the candidate genes identified,” Fisher said. “One of those was axonogenesis, a crucial process helping neurons get wired up to each other in the developing brain.” Axonogenesis is the process by which neurons grow axons, the long fibers they use to form connections with other neurons.

This is consistent with prior findings of altered connectivity in brain scans of people with synesthesia, Fisher said. In other words, the genes identified in the study play a role in how the brain is wired, offering a potential explanation for why the brains of people with synesthesia appear to be wired differently.

https://www.livescience.com/61930-synesthesia-hear-colors-genes.html

Possible reason why ‘magic’ mushrooms evolved

By Rafi Letzter

“Magic” mushrooms seem to have passed their genes for mind-altering substances around among distant species as a survival mechanism: By making fungus-eating insects “trip,” the bugs become less hungry — and less likely to feast on mushrooms.

That’s the upshot of a paper published Feb. 27 in the journal Evolution Letters by a team of biologists at The Ohio State University and the University of Tennessee.

The researchers studied a group of mushrooms that all produce psilocybin — the chemical agent that causes altered states of consciousness in human beings — but aren’t closely related. The scientists found that the clusters of genes that caused the ‘shrooms to fill themselves with psilocybin were very similar to one another, more similar even than clusters of genes found in closely related species of mushrooms.

That’s a sign, the researchers wrote, that the genes weren’t inherited from a common ancestor, but instead were passed directly between distant species in a phenomenon known as “horizontal gene transfer” or HGT.

HGT isn’t really one process, as the biologist Alita Burmeister explained in the journal Evolution, Medicine, and Public Health in 2015. Instead, it’s the term for a group of more or less well-understood processes — like viruses picking up genes from one species and dropping them in another — that can cause groups of genes to jump between species.

However, HGT is believed to be pretty uncommon in complex, mushroom-forming fungi, turning up much more often in single-celled organisms.

When a horizontally transferred gene takes hold and spreads after landing in a new species, the paper’s authors wrote, scientists believe that’s a sign that the gene offered a solution to some crisis the organism’s old genetic code couldn’t solve on its own.

The researchers suggested — but didn’t claim to prove — that the crisis in this case was droves of insects feasting on the defenseless mushrooms. Most of the species the scientists studied grew on animal dung and rotting wood — insect-rich environments (and environments full of opportunities to perform HGT). Psilocybin, the scientists wrote, might suppress insects’ appetites or otherwise induce the bugs to stop munching quite so much mush’.

https://www.livescience.com/61877-magic-mushrooms-evolution.html

Humans sleep much less than other primates

By Bruce Bower

People have evolved to sleep much less than chimps, baboons or any other primate studied so far.

A large comparison of primate sleep patterns finds that most species get somewhere between nine and 15 hours of shut-eye daily, while humans average just seven. An analysis of several lifestyle and biological factors, however, predicts people should get 9.55 hours, researchers reported recently in the American Journal of Physical Anthropology. Most other primates in the study typically sleep as much as the scientists’ statistical models predict they should.

Two long-standing features of human life have contributed to unusually short sleep times, argue evolutionary anthropologists Charles Nunn of Duke University and David Samson of the University of Toronto Mississauga. First, when humans’ ancestors descended from the trees to sleep on the ground, individuals probably had to spend more time awake to guard against predator attacks. Second, humans have faced intense pressure to learn and teach new skills and to make social connections at the expense of sleep.

As sleep declined, rapid-eye movement, or REM — sleep linked to learning and memory (SN: 6/11/16, p. 15) — came to play an outsize role in human slumber, the researchers propose. Non-REM sleep accounts for an unexpectedly small share of human sleep, although it may also aid memory (SN: 7/12/14, p. 8), the scientists contend.

“It’s pretty surprising that non-REM sleep time is so low in humans, but something had to give as we slept less,” Nunn says.

Humans may sleep for a surprisingly short time, but Nunn and Samson’s sample of 30 species is too small to reach any firm conclusions, says evolutionary biologist Isabella Capellini of the University of Hull in England. Estimated numbers of primate species often reach 300 or more.

If the findings hold up, Capellini suspects that sleeping for the most part in one major bout per day, rather than in several episodes of varying durations as some primates do, substantially lessened human sleep time.

Nunn and Samson used two statistical models to calculate expected daily amounts of sleep for each species. For 20 of those species, enough data existed to estimate expected amounts of REM and non-REM sleep.

Estimates of all sleep times relied on databases of previous primate sleep findings, largely involving captive animals wearing electrodes that measure brain activity during slumber. To generate predicted sleep values for each primate, the researchers consulted earlier studies of links between sleep patterns and various aspects of primate biology, behavior and environments. For instance, nocturnal animals tend to sleep more than those awake during the day. Species traveling in small groups or inhabiting open habitats along with predators tend to sleep less.

Based on such factors, the researchers predicted humans should sleep an average of 9.55 hours each day. People today sleep an average of seven hours daily, and even less in some small-scale groups (SN: 2/18/17, p. 13). The 36 percent shortfall between predicted and actual sleep is far greater than for any other primate in the study.

Nunn and Samson estimated that people now spend an average of 1.56 hours of snooze time in REM, about as much as the models predict should be spent in that sleep phase. An apparent rise in the proportion of human sleep devoted to REM resulted mainly from a hefty decline in non-REM sleep, the scientists say. By their calculations, people should spend an average of 8.42 hours in non-REM sleep daily, whereas the actual figure reaches only 5.41 hours.
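The figures above fit together arithmetically; a quick sketch checking them with the numbers reported in the article (note the 36 percent shortfall is expressed relative to actual sleep, not predicted sleep):

```python
# Predicted vs. actual human sleep times (hours/day), as reported above
predicted_total = 9.55
actual_total = 7.0

# Shortfall relative to actual sleep, matching the article's 36 percent figure
shortfall = (predicted_total - actual_total) / actual_total
print(f"Shortfall: {shortfall:.0%}")  # 36%

# REM vs. non-REM decomposition of actual sleep
actual_rem = 1.56
actual_nonrem = 5.41
print(f"Total from components: {actual_rem + actual_nonrem:.2f} h")  # 6.97 h, ~7
print(f"REM share of sleep: {actual_rem / (actual_rem + actual_nonrem):.0%}")  # 22%
```

The REM share of roughly a fifth of total sleep, against a much smaller predicted share, is what the authors describe as REM's "outsize role" after non-REM time shrank.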

One other primate, South America’s common marmoset (Callithrix jacchus), sleeps less than predicted. Common marmosets sleep an average of 9.5 hours and also exhibit less non-REM sleep than expected. One species sleeps more than predicted: South America’s nocturnal three-striped night monkey (Aotus trivirgatus) catches nearly 17 hours of shut-eye every day. Why these species’ sleep patterns don’t match up with expectations is unclear, Nunn says. Neither monkey departs from predicted sleep patterns to the extent that humans do.

Citations
C.L. Nunn and D.R. Samson. Sleep in a comparative context: Investigating how human sleep differs from sleep in other primates. American Journal of Physical Anthropology. Published online February 14, 2018. doi:10.1002/ajpa.23427.


How to tell if your child is a future psychopath

By Jane Ridley

Four years ago, Lillyth Quillan cowered behind a padlocked door as her teenage son, taller and stronger than she is, paced back and forth in a rage.

Suddenly he went quiet. “Don’t let me hurt you, Mom,” he said, his voice sounding chillingly calm.

It was the first time the high school freshman had used that particular tone, but he continued to deploy it as he menaced his mom and dad.

“He used the kind of language of abusive husbands — manipulating and controlling,” says Quillan, who had installed locks on every door in her house except her son’s bedroom. “I was terrified of what he was going to do next.”

The boy — whom Quillan chooses to call Kevin in her interview with The Post in reference to the unnerving Lionel Shriver novel “We Need To Talk About Kevin” about a school shooter in upstate New York — was out of control.

After years of cruel and violent behavior plus multiple suspensions and expulsions from school, psychiatrists finally diagnosed the then-14-year-old Kevin with “conduct disorder,” which, in its most extreme form, can be a precursor to psychopathy.

Psychopathy, which is often used interchangeably with the term sociopathy, is believed to affect 1 percent of adults. Key attributes that sociopaths and psychopaths have in common include a disregard for laws, social mores and the rights of others, a failure to feel remorse or guilt and, in some but not all cases, a tendency to violence.

The Diagnostic and Statistical Manual of Mental Disorders (DSM-5) dictates that people under the age of 18 cannot be labeled psychopaths. However, in 2013 the American Psychiatric Association decided to include the condition “conduct disorder with callous and unemotional traits” for children ages 12 and over.

According to a 2001 report published in the journal American Family Physician, approximately 6 to 16 percent of boys and 2 to 9 percent of girls meet the diagnostic criteria for conduct disorder — only a fraction of whom have the “callous and unemotional” label that can potentially lead to psychopathy in adulthood.

More than 50 studies have found that kids with the latter diagnosis are more likely to become criminals or display aggressive, psychopathic traits later in life. It has been reported that Nikolas Cruz, the 19-year-old who allegedly shot and killed 17 people at Marjory Stoneman Douglas High School in Parkland, Fla., last month, showed classic signs of the disorder as a child, including abusing animals.

“Psychopaths don’t just appear when they are 20. They are always different from an early age,” Kent Kiehl, a psychology professor at the University of New Mexico and the author of “The Psychopath Whisperer,” tells The Post.

Characteristics to look for — as detailed in the widely used Hare Psychopathy Checklist: Youth Version, considered by clinicians and researchers to be the “gold standard” in assessing psychopathy — include lack of empathy, lack of guilt and regret, pathological lying, grandiose self-worth and failure to accept responsibility for actions such as fighting and bullying.

“Individuals who score high on those traits are more likely to produce further violence,” adds Kiehl. “If they are sanctioned but continue on the same path, it’s not a perfect indicator, but it’s enough to cause concern.”

Kiehl notes that research has shown that roughly half of the variation in psychopathic traits is heritable. But his own breakthrough was the discovery that the psychopathic brain has a different structure than a “normal” one.

In 2014, he conducted a major study that found at least two abnormalities in the brains of adult psychopaths. There was a lack of gray matter in the section involved in processing emotions, while the area that reacts to excitement and thrills is overactive. Although the research has not been carried out yet, the pattern is likely to also occur in the brains of “callous and unemotional” children. “Brain science has helped us understand what is different about these kids,” adds Kiehl.

At the moment, there is no such thing as a “cure” for psychopathy or conduct disorder. But early intervention can be key for harm reduction, even with children as young as 2 or 3.

Paul Frick, a psychology professor at Louisiana State University and the author of “Conduct Disorder and Severe Antisocial Behavior,” recommends a range of therapies, most of which revolve around rewards systems rather than punishments.

“There are so-called ‘emotion coaching’ techniques that parents and therapists can employ to help children pay attention to the feelings of others,” he explains. “We find that they miss the cues that another child is upset.

“By saying: ‘Can you see how Johnny is feeling?’ [when a toy is snatched from him] and getting them to respond correctly, you can motivate them. You give them a star or a sticker as an incentive.

“Even though it doesn’t come naturally to them, they can learn others’ perspectives.”

Experts can identify a callous and unemotional child when they are as young as 3 or 4. Faced with a crying peer, typically developing children either try to comfort them or take flight. But those with the mental condition remain in place, showing apathy and coldness.

Remarkably, researchers in the psychology department at King’s College London have been able to trace the characteristics back to infancy. They tested more than 200 babies at 5 months old, tracking whether the infants preferred looking at a person’s face or at a red ball. The tots who favored the ball displayed more callous traits two and a half years later.

For Quillan, hindsight is 20/20, but she distinctly recalls the first signs that Kevin had behavioral issues at the age of just 8 months.

“He had teeth and would bite me while he was breast-feeding and he would laugh. He thought it was hilarious. I tried looking very sad and mimicking crying to show it was hurting me, but he would only laugh,” says Quillan, who ended up having to put him on formula.

“It didn’t occur to me until much later that this was a child for whom the amusement of my reaction when he bit me was a greater reward than food.”

Now 18, Kevin, who has had numerous run-ins with police, including for shoplifting, was made a ward of the state and no longer lives with his parents. He lives in a residential school for “at-risk” youth in California, where he is on a waiting list to receive treatment, such as therapy, to build empathy.

“Because there is no real treatment for conduct disorder. All you can do is wait for your child to be arrested and enter the juvenile system and hope they get better,” says his 40-year-old homemaker mom.

“Luckily, Kevin is no longer violent and is actually cooperative.”

He is doing so well that he is about to receive his high school diploma, recently won an award for wrestling and has encouraged his mother to tell his story.

Now Quillan, who has no other kids, is focusing on advocacy and encouraging parents facing similar nightmares to hers. Three years ago, she formed a support group for families of kids with conduct disorder that has 420 members worldwide. More recently, she launched the Society for Treatment Options for Potential Psychopaths to bring awareness and to campaign for treatment for these children before they cause serious harm.

Adds Quillan: “As every news article came out about Parkland and Nikolas Cruz, I thought: ‘My God, this could easily be one of our kids.’”

https://nypost.com/2018/03/07/how-to-tell-if-your-child-is-a-future-psychopath/

Identification of genes that are involved in age-related brain deterioration

A group of genes and genetic switches involved in age-related brain deterioration have been identified by scientists at the Babraham Institute, Cambridge and Sapienza University, Rome. The research, published online today (5th March) in Aging Cell, found that changes to one of these genes, called Dbx2, could prematurely age brain stem cells, causing them to grow more slowly. The study was led jointly by Giuseppe Lupo and Emanuele Cacci in Italy and Peter Rugg-Gunn in the UK.

Cells in the brain are constantly dying and being replaced with new ones produced by brain stem cells. As we age, it becomes harder for these stem cells to produce new brain cells and so the brain slowly deteriorates. By comparing the genetic activity in brain cells from old and young mice, the scientists identified over 250 genes that changed their level of activity with age. Older cells turn some genes, including Dbx2, on and they turn other genes off.

By increasing the activity of Dbx2 in young brain stem cells, the team were able to make them behave more like older cells. Changes to the activity of this one gene slowed the growth of brain stem cells. These prematurely aged stem cells are not the same as old stem cells but have many key similarities. This means that many of the genes identified in this study are likely to have important roles in brain ageing.

The research also identified changes in several epigenetic marks – a type of genetic switch – in the older stem cells that might contribute to their deterioration with age. Epigenetic marks are chemical tags attached to the genome that affect the activity of certain genes. The placement of these marks in the genome changes as we age, and this alters how cells behave. The researchers think that some of these epigenetic changes in the ageing brain may alter gene activity, causing brain stem cells to grow more slowly.

First author on the paper, Dr Giuseppe Lupo, Assistant Professor at Sapienza University said: “The genes and gene regulators that we identified are corrupted in neural stem cells from older mice. By studying the Dbx2 gene we have shown that these changes may contribute to ageing in the brain by slowing the growth of brain stem cells and by switching on the activity of other age-associated genes.”

Co-lead scientist Dr Peter Rugg-Gunn at the Babraham Institute said: “Ageing ultimately affects all of us and the societal and healthcare burden of neurodegenerative diseases is enormous. By understanding how ageing affects the brain, at least in mice, we hope to identify ways to spot neural stem cell decline. Eventually, we may find ways to slow or even reverse brain deterioration – potentially by resetting the epigenetic switches – helping more of us to stay mentally agile for longer into old age.”

Co-lead scientist Dr Emanuele Cacci at Sapienza University said: “We hope this research will lead to benefits for human health. We have succeeded in accelerating parts of the ageing process in neural stem cells. By studying these genes more closely, we now plan to try turning back the clock for older cells. If we can do this in mice, then the same thing could also be possible for humans.”

This article has been republished from materials provided by the Babraham Institute. Note: material may have been edited for length and content. For further information, please contact the cited source.

Reference: Lupo, G., Nisi, P. S., Esteve, P., Paul, Y.-L., Novo, C. L., Sidders, B., … Rugg-Gunn, P. J. (2018). Molecular profiling of aged neural progenitors identifies Dbx2 as a candidate regulator of age-associated neurogenic decline. Aging Cell. https://doi.org/10.1111/acel.12745

https://www.technologynetworks.com/genomics/news/these-genes-are-involved-in-age-linked-brain-deterioration-298221

New research in the human brain indicates that after age 13 no new neurons are made in the hippocampus, unlike in other species.


Young neurons (green) are shown in the human hippocampus at the ages of (from left) birth, 13 years old and 35 years old. Images by Arturo Alvarez-Buylla lab

by Nicholas Weiler

One of the liveliest debates in neuroscience over the past half century surrounds whether the human brain renews itself by producing new neurons throughout life, and whether it may be possible to rejuvenate the brain by boosting its innate regenerative capacity.

Now UC San Francisco scientists have shown that in the human hippocampus – a region essential for learning and memory and one of the key places where researchers have been seeking evidence that new neurons continue to be born throughout the lifespan – neurogenesis declines throughout childhood and is undetectable in adults.

“We find that if neurogenesis occurs in the adult hippocampus in humans, it is an extremely rare phenomenon, raising questions about its contribution to brain repair or normal brain function,” said Arturo Alvarez-Buylla, PhD, the Heather and Melanie Muss Professor of Neurological Surgery at UCSF, whose lab published the new study March 7, 2018, in Nature.

Alvarez-Buylla – a member of the Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research at UCSF, the UCSF Weill Institute for Neuroscience, and the UCSF Helen Diller Family Comprehensive Cancer Center – is a leading expert in brain development who over the past 30 years has played a key role in convincing the scientific establishment that new neurons are born throughout life in animals such as songbirds and rodents. In recent years, however, the Alvarez-Buylla lab and others had already cast doubt on whether neurogenesis persists into adulthood in the human olfactory bulb, as it does in rodents, and have shown that while new neurons integrate into the human frontal lobe after birth, this process also ends during early infancy.

The lab’s new research, based on careful analysis of 59 samples of human hippocampus from UCSF and collaborators around the world, suggests new neurons may not be born in the adult human brain at all. The findings present a challenge to a large body of research that has proposed that boosting the birth of new neurons could help to treat brain diseases such as Alzheimer’s disease and depression. But the authors said it also opens the door to exciting new questions about how the human brain learns and adapts without a supply of new neurons, as is seen in mice and other animals.

Rodents, Songbirds Produce New Neurons Throughout Life
It was once neuroscientific dogma that the brain stops producing new neurons before birth. In the 1960s, experiments in rodents by Joseph Altman, PhD, at MIT first suggested that new neurons could be born in the adult mammalian brain, but these results remained highly controversial until the 1980s, when Fernando Nottebohm, PhD, at Rockefeller University, conclusively showed that new neurons are born and put to use throughout life in several parts of the songbird brain. As a graduate student in the Nottebohm lab at the time, Alvarez-Buylla contributed to understanding the mechanism of adult neurogenesis in songbirds.

These findings launched a whole field of research aimed at understanding how new neurons contribute to brain function in other animals and exploring the potential therapeutic effects of boosting brain regeneration in humans. Much work has focused on a region of the hippocampus called the dentate gyrus (DG), where rodents produce newborn neurons throughout life that are thought to help them form distinct new memories, among other cognitive functions.

Rodent studies have shown that DG neurogenesis declines with age, but is otherwise quite malleable — increasing with exercise, but decreasing with stress, for example — leading to popular claims that we can boost brain regeneration by living a healthy lifestyle. Animal experiments have also suggested that neurogenesis-boosting therapies could treat brain disorders of aging such as Alzheimer’s disease, and leading researchers have proposed that antidepressant medications like fluoxetine (Prozac) may work by increasing DG neurogenesis.

Beginning in the late ’90s, a handful of studies reported evidence of adult neurogenesis in the human brain, either by estimating the birth dates of cells present in postmortem brain specimens or by labeling telltale molecular markers of newborn neurons or dividing neural stem cells. However, these findings, some of which were based on small numbers of brain samples, have remained controversial.

In particular, researchers have questioned whether the limited number of markers used in each study were truly specific to newborn neurons, and have suggested alternative explanations, such as the inadvertent labeling of dividing non-neuronal cells called glia (which are well known to continue regenerating through life).

Early Loss of Neural Stem Cell Niche in Human Brain
In the new study, Shawn Sorrells, PhD, a senior researcher in the Alvarez-Buylla lab, and Mercedes Paredes, PhD, a UCSF assistant professor of neurology, led a team that collected and analyzed samples of the human hippocampus obtained by clinical collaborators on three continents: Zhengang Yang, PhD, in China; José Manuel García Verdugo, PhD, in Spain; Gary Mathern, MD, at UCLA; and Edward Chang, MD, and Kurtis Auguste, MD, of UCSF Health. The brain specimens included 37 postmortem brain samples, some from the UCSF Pediatric Neuropathology Consortium run by Eric Huang, MD, PhD, as well as 22 surgically excised tissue samples from patients who had been treated for epilepsy.

Sorrells and Paredes analyzed changes in the number of newborn neurons and neural stem cells present in these samples, from before birth to adulthood, using a variety of antibodies to identify cells of different types and states of maturity, including neural stem cells and progenitors, newborn and mature neurons, and non-neuronal glial cells. The researchers also examined the cells they labeled based on their shape and structure – including imaging with high-resolution electron microscopy for a subset of tissue samples – in order to confirm their identity as neurons, neuronal stem cells, or glial cells.

The researchers found plentiful evidence of neurogenesis in the dentate gyrus during prenatal brain development and in newborns, observing an average of 1,618 young neurons per square millimeter of brain tissue at the time of birth. But the number of newborn cells sharply declined in samples obtained during early infancy: dentate gyrus samples from year-old infants contained fivefold fewer new neurons than were seen in samples from newborns. The decline continued into childhood, with the number of new neurons declining by 23-fold between one and seven years of age, followed by a further fivefold decrease by 13 years, at which point neurons also appeared more mature than those seen in samples from younger brains. The authors observed only about 2.4 new cells per square millimeter of DG tissue in early adolescence, and found no evidence of newborn neurons in any of the 17 adult post-mortem DG samples or in surgically extracted tissue samples from 12 adult patients with epilepsy.
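The successive fold-declines quoted above compound to roughly the density reported in adolescence; a back-of-the-envelope check using the study's figures:

```python
# Young neurons per square millimeter of dentate gyrus at birth
neurons_at_birth = 1618

# Successive declines reported in the study
by_age_1 = neurons_at_birth / 5   # fivefold drop during the first year
by_age_7 = by_age_1 / 23          # 23-fold drop between ages 1 and 7
by_age_13 = by_age_7 / 5          # further fivefold drop by age 13

print(f"Implied density in early adolescence: ~{by_age_13:.1f} per mm^2")  # ~2.8
# The study reports ~2.4 per mm^2, consistent with these compounded declines
```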

“In young children, we were able to see that substantial numbers of new neurons continue to be made and integrated into the dentate gyrus, but neurogenesis fades away completely by early adolescence,” Paredes said. “The fact that we could compare newborn brains, where new neurons were clearly present, to the adult, where we saw no evidence for young neurons, gave us added confidence that what we were seeing was correct.”

The researchers then turned to studying the stem cells that give birth to new neurons. They found that neural progenitors are plentiful during early prenatal brain development, but become extremely rare by early childhood. They noted that these cells fail to cluster early on into a concentrated “niche” in a region of the human DG known as the subgranular zone (SGZ). The researchers suspect that this configuration, which is seen in mice, could be necessary for prolonged neurogenesis, suggesting a potential explanation for why neurogenesis falters by adulthood in humans.

New Fundamental Questions for Neuroscientists
The authors acknowledge that however comprehensively and carefully they searched, it would be impossible to definitively show that there are never any new neurons in the adult hippocampus. “But I think that we need to step back and ask what that means,” Sorrells said. “If neurogenesis is so rare that we can’t detect it, can it really be playing a major role in plasticity or learning and memory in the hippocampus?”

The absence of neurogenesis in the adult human brain may not be a bad thing, the researchers point out; instead, it may point the way to understanding what makes the human brain distinct from other animals and set researchers on a better path to developing treatments for human brain diseases.

After coming full circle in the study of neurogenesis, from playing a role in proving its existence in other animals, to demonstrating that it appears not to play a major role in humans, Alvarez-Buylla is philosophical. “I always try to work against my assumptions in lab,” he said. “We’ve been working on adult neurogenesis so long, it is hard to see that it may not happen in humans, but we follow where the data leads us.”

Reference:
Sorrells, S. F., Paredes, M. F., Cebrian-Silla, A., Sandoval, K., Qi, D., Kelley, K. W., . . . Alvarez-Buylla, A. (2018). Human hippocampal neurogenesis drops sharply in children to undetectable levels in adults. Nature. doi:10.1038/nature25975

https://www.technologynetworks.com/neuroscience/news/birth-of-new-neurons-in-the-human-hippocampus-ends-in-childhood-298394

A simple score may be able to determine the personal risk of developing Alzheimer’s disease

For the first time, an international team of scientists, led by researchers at the University of California San Diego School of Medicine, has determined that an Alzheimer’s disease (AD) polygenic risk score can be used to correctly identify adults with mild cognitive impairment (MCI) who were only in their 50s. MCI is considered a precursor to AD.

Findings were published in the February 27 online edition of Molecular Psychiatry.

The AD polygenic risk score was created from genome-wide association studies (GWAS) of AD: it combines many gene variants, each weighted by the strength of the association between its single nucleotide polymorphism (SNP) and AD. SNPs are variations of a single nucleotide, or DNA building block, that occur at a specific position in the genome. All humans carry some SNP variation, which affects individual susceptibility to disease.
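In arithmetic terms, a polygenic risk score is just a weighted sum of an individual’s risk-allele counts, with the weights drawn from GWAS effect sizes. The sketch below is a hypothetical illustration, not the study’s actual score: the SNP IDs and weights are invented, and a real AD score combines many variants from published GWAS results.

```python
# Hypothetical sketch of how a polygenic risk score (PRS) is built:
# a weighted sum of risk-allele counts, with weights taken from GWAS
# effect sizes. The SNP IDs and weights here are illustrative only.
GWAS_WEIGHTS = {       # SNP ID -> illustrative log-odds weight
    "rs429358": 1.20,
    "rs7412": -0.45,
    "rs6656401": 0.15,
}

def polygenic_risk_score(genotype):
    """genotype maps SNP ID -> risk-allele count (0, 1 or 2)."""
    return sum(w * genotype.get(snp, 0) for snp, w in GWAS_WEIGHTS.items())

# 2*1.20 + 0*(-0.45) + 1*0.15 = 2.55
score = polygenic_risk_score({"rs429358": 2, "rs7412": 0, "rs6656401": 1})
print(round(score, 2))  # 2.55
```

Scores computed this way are then compared across a cohort, which is how the study could contrast the upper and lowest quartiles of risk.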

“Current studies of the AD polygenic risk score typically occur in adults in their 70s, but the AD pathological process begins decades before the onset of dementia,” said William S. Kremen, PhD, professor of psychiatry and co-director of the Center for Behavior Genetics of Aging at UC San Diego School of Medicine. “By focusing on a younger population with cognitive impairment, we may be better able to identify patients for critical early interventions and clinical trials.”

Kremen and team found that someone with an AD polygenic risk score in the upper quartile was 2.5 to 3 times more likely to have MCI than someone with a score in the lowest quartile. Signs of MCI may include difficulty with word recall, forgetting appointments, or often losing personal belongings. The type of MCI most associated with memory loss is called amnestic MCI.

According to the National Institute on Aging, people with MCI are more likely than those without it to go on to develop Alzheimer’s. Approximately eight of every 10 people who fit the definition of amnestic MCI develop Alzheimer’s disease within seven years.

“Our research team found that the polygenic score could differentiate individuals with mild cognitive impairment from those who were cognitively normal,” said Kremen. “We also noticed that for study participants who had cognitive deficits other than memory problems, diabetes was three-fold more likely.”

Kremen added that while this test is not yet available to primary care physicians, it may be an important tool to aid researchers in predicting MCI and AD, and, eventually, reducing the number of future cases.

“The Alzheimer’s Association and others have modeled how the impact of delaying the onset of AD by five years could reduce the number of cases by nearly 50 percent by 2050. We want to do what we can to make this projection a reality,” said Kremen.

Data for this study were collected from 1,329 men who participated in the Vietnam Era Twin Study of Aging (VETSA), a national sample comparable to U.S. men in their age range with respect to health and lifestyle characteristics. Approximately 90 percent of the subjects in this analysis were in their 50s. Diagnosis of MCI was based on the Jak-Bondi actuarial/neuropsychological approach.

This article has been republished from materials provided by UCSD. Note: material may have been edited for length and content. For further information, please contact the cited source.

Reference: Logue, M. W., Panizzon, M. S., Elman, J. A., Gillespie, N. A., Hatton, S. N., Gustavson, D. E., … Kremen, W. S. (2018). Use of an Alzheimer’s disease polygenic risk score to identify mild cognitive impairment in adults in their 50s. Molecular Psychiatry, 1. https://doi.org/10.1038/s41380-018-0030-8

Case Western Reserve University scientists generate first microscopic image of full-length serotonin receptor


3D reconstruction of a serotonin receptor generated by cryo-electron microscopy

by Rebecca Pool

Claiming a world first and using cryo-electron microscopy, researchers from Case Western Reserve University School of Medicine, US, have observed full-length serotonin receptors. The proteins are common drug targets, and the new images provide details about molecular binding sites that could lead to more precise drug design. Serotonin receptors, which reside in cell membranes throughout the body, are highly dynamic and difficult to image. In the past, the receptors have been sectioned into pieces to study, but by capturing full-length samples, researchers have revealed how different portions interact.

Dr Sandip Basak from Physiology and Biophysics, and colleagues, describe ‘a finely tuned orchestration of three domain movements’ that allows the receptors to elegantly control passageways across cell membranes. “The serotonin receptor acts as a gateway, or channel, from outside the cell to inside,” he says. “When serotonin binds onto the receptor, the channel switches conformation from closed to open. It eventually twists into a ‘desensitized’ state, where the channel closes but serotonin remains attached,” he adds. “This prevents it from being reactivated.”

For this study, the researchers used a FEI Titan Krios microscope, operating at 300 kV, and equipped with a Gatan K2-Summit direct detector camera, at the National Cryo-Electron Microscopy Facility in Frederick, Maryland.

“Successful design of safer therapeutics [for cancer therapies and gastrointestinal diseases] has slowed because there is currently a limited understanding of the structure of the serotonin receptor itself, and what happens after serotonin binds,” says research leader, Professor Sudha Chakrapani. “Our new structure of the serotonin receptor in the resting state has the potential to serve as a structural blueprint to drive targeted drug design and better therapeutic strategies.”

This research is published in Nature Communications.

https://microscopy-analysis.com/editorials/editorial-listings/first-images-full-length-receptor-structure

How flashing lights and pink noise might banish Alzheimer’s, improve memory and more


Illustration by Paweł Jońca

by Helen Thomson

In March 2015, Li-Huei Tsai set up a tiny disco for some of the mice in her laboratory. For an hour each day, she placed them in a box lit only by a flickering strobe. The mice — which had been engineered to produce plaques of the peptide amyloid-β in the brain, a hallmark of Alzheimer’s disease — crawled about curiously. When Tsai later dissected them, those that had been to the mini dance parties had significantly lower levels of plaque than mice that had spent the same time in the dark.

Tsai, a neuroscientist at Massachusetts Institute of Technology (MIT) in Cambridge, says she checked the result; then checked it again. “For the longest time, I didn’t believe it,” she says. Her team had managed to clear amyloid from part of the brain with a flickering light. The strobe was tuned to 40 hertz and was designed to manipulate the rodents’ brainwaves, triggering a host of biological effects that eliminated the plaque-forming proteins. Although promising findings in mouse models of Alzheimer’s disease have been notoriously difficult to replicate in humans, the experiment offered some tantalizing possibilities. “The result was so mind-boggling and so robust, it took a while for the idea to sink in, but we knew we needed to work out a way of trying out the same thing in humans,” Tsai says.

Scientists identified the waves of electrical activity that constantly ripple through the brain almost 100 years ago, but they have struggled to assign these oscillations a definitive role in behaviour or brain function. Studies have strongly linked brainwaves to memory consolidation during sleep, and implicated them in processing sensory inputs and even coordinating consciousness. Yet not everyone is convinced that brainwaves are all that meaningful. “Right now we really don’t know what they do,” says Michael Shadlen, a neuroscientist at Columbia University in New York City.

Now, a growing body of evidence, including Tsai’s findings, hints at a meaningful connection to neurological disorders such as Alzheimer’s and Parkinson’s diseases. The work offers the possibility of forestalling or even reversing the damage caused by such conditions without using a drug. More than two dozen clinical trials are aiming to modulate brainwaves in some way — some with flickering lights or rhythmic sounds, but most through the direct application of electrical currents to the brain or scalp. They aim to treat everything from insomnia to schizophrenia and premenstrual dysphoric disorder.

Tsai’s study was the first glimpse of a cellular response to brainwave manipulation. “Her results were a really big surprise,” says Walter Koroshetz, director of the US National Institute of Neurological Disorders and Stroke in Bethesda, Maryland. “It’s a novel observation that would be really interesting to pursue.”


A powerful wave

Brainwaves were first noticed by German psychiatrist Hans Berger. In 1929, he published a paper describing the repeating waves of current he observed when he placed electrodes on people’s scalps. It was the world’s first electroencephalogram (EEG) recording — but nobody took much notice. Berger was a controversial figure who had spent much of his career trying to identify the physiological basis of psychic phenomena. It was only after his colleagues began to confirm the results several years later that Berger’s invention was recognized as a window into brain activity.

Neurons communicate using electrical impulses created by the flow of ions into and out of each cell. Although a single firing neuron cannot be picked up through the electrodes of an EEG, when a group of neurons fires again and again in synchrony, it shows up as oscillating electrical ripples that sweep through the brain.
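A toy simulation (not from the article) makes this concrete: thousands of tiny in-phase signals sum constructively, while randomly timed ones largely cancel, which is why only synchronized firing registers at the scalp. All numbers here are illustrative.

```python
# Toy illustration of why an EEG picks up only synchronized activity:
# many tiny in-phase sinusoids add linearly, while randomly phased
# ones mostly cancel out.
import math
import random

random.seed(0)

N_NEURONS = 1000
FREQ = 40.0  # Hz, in the gamma band discussed below

def summed_amplitude(synchronized):
    """Peak of the summed sinusoidal contributions over one cycle."""
    phases = [0.0 if synchronized else random.uniform(0.0, 2.0 * math.pi)
              for _ in range(N_NEURONS)]
    times = [i / (FREQ * 100.0) for i in range(100)]  # one 40 Hz cycle
    return max(abs(sum(math.sin(2.0 * math.pi * FREQ * t + p) for p in phases))
               for t in times)

print(summed_amplitude(True))   # 1000.0: in-phase signals add linearly
print(summed_amplitude(False))  # much smaller: random phases mostly cancel
```

The synchronized sum grows with the number of neurons, while the unsynchronized sum grows only with its square root, so synchrony dominates the recorded signal.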

Those of the highest frequency are gamma waves, which range from 25 to 140 hertz. People often show a lot of this kind of activity when they are at peak concentration. At the other end of the scale are delta waves, which have the lowest frequency — around 0.5 to 4 hertz. These tend to occur in deep sleep (see ‘Rhythms of the mind’).
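For reference, the conventional bands can be collected into a single lookup. The delta and gamma ranges below follow the figures quoted above; the theta, alpha, and beta cut-offs are common approximations I have added, and they vary between sources.

```python
# Minimal lookup from a dominant EEG frequency (Hz) to its conventional
# band name. Delta and gamma ranges follow the article; the theta, alpha
# and beta cut-offs are assumed approximations.
EEG_BANDS = [            # (name, low Hz inclusive, high Hz exclusive)
    ("delta", 0.5, 4),
    ("theta", 4, 8),     # assumed boundary
    ("alpha", 8, 12),    # assumed boundary
    ("beta", 12, 25),    # assumed boundary
    ("gamma", 25, 140),
]

def band_name(freq_hz):
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz < hi:
            return name
    return "outside conventional bands"

print(band_name(40))  # gamma -- the frequency of Tsai's strobe
print(band_name(2))   # delta -- dominant in deep sleep
```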

At any point in time, one type of brainwave tends to dominate, although other bands are always present to some extent. Scientists have long wondered what purpose, if any, this hum of activity serves, and some clues have emerged over the past three decades. For instance, in 1994, discoveries in mice indicated that the distinct patterns of oscillatory activity during sleep mirrored those during a previous learning exercise. Scientists suggested that these waves could be helping to solidify memories.

Brainwaves also seem to influence conscious perception. Randolph Helfrich at the University of California, Berkeley, and his colleagues devised a way to enhance or reduce gamma oscillations of around 40 hertz using a non-invasive technique called transcranial alternating current stimulation (tACS). By tweaking these oscillations, they were able to influence whether a person perceived a video of moving dots as travelling vertically or horizontally.

The oscillations also provide a potential mechanism for how the brain creates a coherent experience from the chaotic symphony of stimuli hitting the senses at any one time, a puzzle known as the ‘binding problem’. By synchronizing the firing rates of neurons responding to the same event, brainwaves might ensure that all of the relevant information relating to one object arrives at the correct area of the brain at exactly the right time. Coordinating these signals is the key to perception, says Robert Knight, a cognitive neuroscientist at the University of California, Berkeley. “You can’t just pray that they will self-organize.”


Healthy oscillations

But these oscillations can become disrupted in certain disorders. In Parkinson’s disease, for example, the brain generally starts to show an increase in beta waves in the motor regions as body movement becomes impaired. In a healthy brain, beta waves are suppressed just before a body movement. But in Parkinson’s disease, neurons seem to get stuck in a synchronized pattern of activity. This leads to rigidity and movement difficulties. Peter Brown, who studies Parkinson’s disease at the University of Oxford, UK, says that current treatments for the symptoms of the disease — deep-brain stimulation and the drug levodopa — might work by reducing beta waves.

People with Alzheimer’s disease show a reduction in gamma oscillations. So Tsai and others wondered whether gamma-wave activity could be restored, and whether this would have any effect on the disease.

They started by using optogenetics, in which brain cells are engineered to respond directly to a flash of light. In 2009, Tsai’s team, in collaboration with Christopher Moore, also at MIT at the time, demonstrated for the first time that it is possible to use the technique to drive gamma oscillations in a specific part of the mouse brain.

Tsai and her colleagues subsequently found that tinkering with the oscillations sets in motion a host of biological events. It initiates changes in gene expression that cause microglia — immune cells in the brain — to change shape. The cells essentially go into scavenger mode, enabling them to better dispose of harmful clutter in the brain, such as amyloid-β. Koroshetz says that the link to neuroimmunity is new and striking. “The role of immune cells like microglia in the brain is incredibly important and poorly understood, and is one of the hottest areas for research now,” he says.

If the technique was to have any therapeutic relevance, however, Tsai and her colleagues had to find a less-invasive way of manipulating brainwaves. Flashing lights at specific frequencies has been shown to influence oscillations in some parts of the brain, so the researchers turned to strobe lights. They started by exposing young mice with a propensity for amyloid build-up to flickering LED lights for one hour. This created a drop in free-floating amyloid, but it was temporary, lasting less than 24 hours, and restricted to the visual cortex.

To achieve a longer-lasting effect on animals with amyloid plaques, they repeated the experiment for an hour a day over the course of a week, this time using older mice in which plaques had begun to form. Twenty-four hours after the end of the experiment, these animals showed a 67% reduction in plaque in the visual cortex compared with controls. The team also found that the technique reduced tau protein, another hallmark of Alzheimer’s disease.

Alzheimer’s plaques tend to have their earliest negative impacts on the hippocampus, however, not the visual cortex. To elicit oscillations where they are needed, Tsai and her colleagues are investigating other techniques. Playing rodents a 40-hertz noise, for example, seems to cause a decrease in amyloid in the hippocampus — perhaps because the hippocampus sits closer to the auditory cortex than to the visual cortex.

Tsai and her colleague Ed Boyden, a neuroscientist at MIT, have now formed a company, Cognito Therapeutics in Cambridge, to test similar treatments in humans. Last year, they started a safety trial, which involves testing a flickering light device, worn like a pair of glasses, on 12 people with Alzheimer’s.

Caveats abound. The mouse model of Alzheimer’s disease is not a perfect reflection of the disorder, and many therapies that have shown promise in rodents have failed in humans. “I used to tell people — if you’re going to get Alzheimer’s, first become a mouse,” says Thomas Insel, a neuroscientist and psychiatrist who led the US National Institute of Mental Health in Bethesda, Maryland, from 2002 until 2015.

Others are also looking to test how manipulating brainwaves might help people with Alzheimer’s disease. “We thought Tsai’s study was outstanding,” says Emiliano Santarnecchi at Harvard Medical School in Boston, Massachusetts. His team had already been using tACS to stimulate the brain, and he wondered whether it might elicit stronger effects than a flashing strobe. “This kind of stimulation can target areas of the brain more specifically than sensory stimulation can — after seeing Tsai’s results, it was a no-brainer that we should try it in Alzheimer’s patients.”

His team has begun an early clinical trial in which ten people with Alzheimer’s disease receive tACS for one hour daily for two weeks. A second trial, in collaboration with Boyden and Tsai, will look for signals of activated microglia and levels of tau protein. Results are expected from both trials by the end of the year.

Knight says that Tsai’s animal studies clearly show that oscillations have an effect on cellular metabolism — but whether the same effect will be seen in humans is another matter. “In the end, it’s data that will win out,” he says.

The studies may reveal risks, too. Gamma oscillations are the type most likely to induce seizures in people with photosensitive epilepsy, says Dora Hermes, a neuroscientist at Stanford University in California. She recalls a famous episode of a Japanese cartoon that featured flickering red and blue lights, which induced seizures in some viewers. “So many people watched that episode that there were almost 700 extra visits to the emergency department that day.”

A brain boost

Nevertheless, there is clearly a growing excitement around treating neurological diseases using neuromodulation, rather than pharmaceuticals. “There’s pretty good evidence that by changing neural-circuit activity we can get improvements in Parkinson’s, chronic pain, obsessive–compulsive disorder and depression,” says Insel. This is important, he says, because so far, pharmaceutical treatments for neurological disease have suffered from a lack of specificity. Koroshetz adds that funding institutes are eager for treatments that are innovative, non-invasive and quickly translatable to people.

Since publishing their mouse paper, Boyden says, he has had a deluge of requests from researchers wanting to use the same technique to treat other conditions. But there are a lot of details to work out. “We need to figure out what is the most effective, non-invasive way of manipulating oscillations in different parts of the brain,” he says. “Perhaps it is using light, but maybe it’s a smart pillow or a headband that could target these oscillations using electricity or sound.” One of the simplest methods that scientists have found is neurofeedback, which has shown some success in treating a range of conditions, including anxiety, depression and attention-deficit hyperactivity disorder. People who use this technique are taught to control their brainwaves by measuring them with an EEG and getting feedback in the form of visual or audio cues.

Phyllis Zee, a neurologist at Northwestern University in Chicago, Illinois, and her colleagues delivered pulses of ‘pink noise’ — audio frequencies that together sound a bit like a waterfall — to healthy older adults while they slept. They were particularly interested in eliciting the delta oscillations that characterize deep sleep. This aspect of sleep decreases with age, and is associated with a decreased ability to consolidate memories.
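Pink (1/f) noise can be approximated with the classic Voss–McCartney trick of summing random sources that refresh at progressively halved rates, so slow fluctuations carry as much energy as fast ones. The sketch below is a generic illustration of that technique, not Zee’s actual stimulus.

```python
# Sketch of the Voss-McCartney approximation to pink (1/f) noise:
# sum several random sources, each updated half as often as the last.
import random

random.seed(1)

def pink_noise(n, rows=16):
    """Return n samples of approximately 1/f noise in [-1, 1]."""
    sources = [random.uniform(-1.0, 1.0) for _ in range(rows)]
    out = []
    for i in range(n):
        for r in range(rows):
            if i % (1 << r) == 0:        # row r refreshes every 2**r samples
                sources[r] = random.uniform(-1.0, 1.0)
        out.append(sum(sources) / rows)  # average keeps samples in [-1, 1]
    return out

samples = pink_noise(1024)
print(len(samples))  # 1024
```

Because each row changes half as often as the one before it, the summed signal’s power falls off roughly as 1/f, which is what gives pink noise its waterfall-like sound.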

So far, her team has found that stimulation increased the amplitude of the slow waves, and was associated with a 25–30% improvement in recall of word pairs learnt the night before, compared with a fake treatment. Her team is midway through a clinical trial to see whether longer-term acoustic stimulation might help people with mild cognitive impairment.

Although relatively safe, these kinds of technologies do have limitations. Neurofeedback is easy to learn, for instance, but it can take time to have an effect, and the results are often short-lived. In experiments that use magnetic or acoustic stimulation, it is difficult to know precisely what area of the brain is being affected. “The field of external brain stimulation is a little weak at the moment,” says Knight. Many approaches, he says, are open loop, meaning that they don’t track the effect of the modulation using an EEG. Closed loop, he says, would be more practical. Some experiments, such as Zee’s and those involving neurofeedback, already do this. “I think the field is turning a corner,” Knight says. “It’s attracting some serious research.”

In addition to potentially leading to treatments, these studies could break open the field of neural oscillations in general, helping to link them more firmly to behaviour and how the brain works as a whole.

Shadlen says he is open to the idea that oscillations play a part in human behaviour and consciousness. But for now, he remains unconvinced that they are directly responsible for these phenomena — referring to the many roles people ascribe to them as “magical incantations”. He says he fully accepts that these brain rhythms are signatures of important brain processes, “but to posit the idea that synchronous spikes of activity are meaningful, that by suddenly wiggling inputs at a specific frequency, it suddenly elevates activity onto our conscious awareness? That requires more explanation.”

Whatever their role, Tsai mostly wants to discipline brainwaves and harness them against disease. Cognito Therapeutics has just received approval for a second, larger trial, which will look at whether the therapy has any effect on Alzheimer’s disease symptoms. Meanwhile, Tsai’s team is focusing on understanding more about the downstream biological effects and how to better target the hippocampus with non-invasive technologies.

For Tsai, the work is personal. Her grandmother, who raised her, was affected by dementia. “Her confused face made a deep imprint in my mind,” Tsai says. “This is the biggest challenge of our lifetime, and I will give it all I have.”

https://www.nature.com/articles/d41586-018-02391-6

Heavy drinking linked to early-onset dementia

Research published in The Lancet Public Health indicated that alcohol use disorder is a major risk factor for dementia, especially early-onset dementia.

“The relationships between alcohol use and cognitive health in general, and dementia in particular, are complex,” Michaël Schwarzinger, MD, of the Translational Health Economics Network, France, and colleagues wrote. “Moderate drinking has been inconsistently associated with beneficial effects on brain structure, and nearly every review describes methodological problems of underlying studies, such as inconsistent measurement of alcohol use or dementia, or both, and insufficient control of potential confounders. By contrast, heavy drinking seems detrimentally related to dementia risk, whatever the dementia type.”

To determine how alcohol use disorders affect dementia risk, especially among those aged younger than 65 years, researchers conducted a nationwide retrospective cohort study of hospitalized adults in France discharged with alcohol-related brain damage, vascular dementia or other dementias between 2008 and 2013. Alcohol use disorder was the primary exposure, and dementia was the main outcome. Using the French National Hospital Discharge database, they studied the prevalence of early-onset dementia and determined whether alcohol use disorders or other risk factors were associated with dementia onset.

In total, 1,109,343 adults discharged from hospital in France were diagnosed with dementia and included in the study. Of those, 35,034 cases of dementia were attributable to alcohol-related brain damage, and 52,625 cases had other alcohol use disorders. Among the 57,353 early-onset dementia cases, 22,338 (38.9%) were attributable to alcohol-related brain damage and 10,115 (17.6%) had an additional diagnosis of alcohol use disorders.

Analysis revealed that alcohol use disorders were linked to a threefold increased risk for all types of dementia and “were the strongest modifiable risk factor for dementia onset” (adjusted HR = 3.34 [95% CI, 3.28–3.41] for women; HR = 3.36 [95% CI, 3.31–3.41] for men). Alcohol use disorders remained associated with an increased risk for vascular and other dementias even after excluding alcohol-related brain damage, according to the findings. Furthermore, chronic heavy drinking was also linked to all other independent risk factors for dementia onset, including tobacco smoking, high blood pressure, diabetes, lower education, depression and hearing loss.

“Our findings suggest that the burden of dementia attributable to alcohol use disorders is much larger than previously thought, suggesting that heavy drinking should be recognized as a major risk factor for all types of dementia,” Schwarzinger said in a press release. “A variety of measures are needed, such as reducing availability, increasing taxation and banning advertising and marketing of alcohol, alongside early detection and treatment of alcohol use disorders.”

Previous research has largely focused on modest alcohol use, and its possible beneficial effect, thus overlooking the effect of heavy alcohol use as a modifiable risk factor for dementia, according to a related comment written by Clive Ballard, MBChB, MRCPsych, and Iain Lang, PhD, of the University of Exeter Medical School, U.K.

“Although many questions remain, several can be answered using existing data, which would provide an opportunity to refine our understanding of the pathways of modifiable risk and develop optimal prevention strategies,” Ballard and Lang wrote. “In our view, this evidence is robust, and we should move forward with clear public health messages about the relationship between both alcohol use disorders and alcohol consumption, respectively, and dementia.” – by Savannah Demko

https://www.healio.com/psychiatry/alzheimers-disease-dementia/news/online/%7B90f5e375-9dd3-4715-9206-7c148d563d80%7D/heavy-drinking-may-increase-risk-for-dementia