New evidence that dogs can recognize vowel changes in words

by Gege Li

Dogs pay much closer attention to what humans say than we realised, even to words that are probably meaningless to them.

Holly Root-Gutteridge at the University of Sussex, UK, and her colleagues played audio recordings of people saying six words to 70 pet dogs of various breeds. The dogs had never heard these voices before, and the words differed only in their vowels, such as “had”, “hid” and “who’d”.

Each recording was altered so the voices were at the same pitch, ensuring that the only cue the dogs had was the difference between vowels, rather than how people said the words.
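
The article does not describe how the recordings were altered, so the sketch below is only an illustration of the general idea: estimate each speaker’s typical pitch and shift the recording toward a common target. It assumes Python with the librosa and soundfile libraries, and the target pitch, pitch range, and file names are hypothetical rather than taken from the study.

    # Illustrative only: one way to move recordings toward a common pitch.
    import numpy as np
    import librosa
    import soundfile as sf

    def normalize_pitch(in_path, out_path, target_f0=200.0):
        """Estimate a recording's median pitch and shift it to target_f0 (in Hz)."""
        y, sr = librosa.load(in_path, sr=None)
        f0, voiced_flag, voiced_prob = librosa.pyin(
            y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
        )
        median_f0 = np.nanmedian(f0)                   # typical pitch of this speaker
        n_steps = 12 * np.log2(target_f0 / median_f0)  # semitones needed to reach the target
        y_shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)
        sf.write(out_path, y_shifted, sr)

    # e.g. normalize_pitch("speaker1_had.wav", "speaker1_had_norm.wav")  # hypothetical files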

After hearing the recordings just once, 48 of the dogs reacted when either the same speaker said a new word or the same word was said by a different speaker. The remainder either didn’t visibly respond or got distracted.

The team based its assessment of the dogs’ reactions on how long they paid attention when the voice or word changed – if the dogs moved their ears or shifted eye contact, for example, it showed that they noticed the change. In contrast, when the dogs heard the same word repeated several times, their attention waned.

Until now, it was thought that only humans could detect vowels in words and realise that these sounds stay the same across different speakers. But the dogs could do both spontaneously without any previous training.

“I was surprised by how well some of the dogs responded to unfamiliar voices,” says Root-Gutteridge. “It might mean that they comprehend more than we give them credit for.”

This ability may be the result of domestication, says Root-Gutteridge, as dogs that pay closer attention to human sounds are more likely to have been chosen for breeding.

The work highlights the strength of social interactions between humans and dogs, says Britta Osthaus at Canterbury Christ Church University, UK. “It would be interesting to see whether a well-trained dog would react differently to the command of ‘sat’ instead of ‘sit’,” she says.

Journal reference: Biology Letters, DOI: 10.1098/rsbl.2019.0555

Read more: https://www.newscientist.com/article/2225746-dogs-have-a-better-ear-for-language-than-we-thought/#ixzz679cb3PFN

Ketamine Could Help Cut Alcohol Consumption by Rewiring Memory of Alcohol Reward


Preliminary findings from a clinical trial of heavy drinkers suggest that the drug can weaken certain memories tied to the reward of imbibing, although the mechanisms aren’t fully clear.

by CATHERINE OFFORD

The anesthetic drug ketamine could be used to rewire heavy drinkers’ memories and help them cut down on alcohol consumption, according to a study published yesterday (November 26) in Nature Communications. In a clinical trial of people who reported consuming around 590 grams of alcohol—equivalent to nearly two cases of beer—per week on average, researchers found that a procedure that involved administering the drug while people were thinking about drinking durably reduced consumption.
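
As a quick back-of-the-envelope check of the “nearly two cases of beer” equivalence, the arithmetic below (Python) uses assumed values that are not from the study: a 355 mL beer at 5 percent alcohol, an ethanol density of 0.789 g/mL, and 24 beers to a case.

    # Rough arithmetic behind "590 grams of alcohol ... nearly two cases of beer per week".
    GRAMS_PER_WEEK = 590
    grams_per_beer = 355 * 0.05 * 0.789          # ~14 g of ethanol in one assumed beer
    beers_per_week = GRAMS_PER_WEEK / grams_per_beer
    cases_per_week = beers_per_week / 24
    print(f"{beers_per_week:.0f} beers/week, about {cases_per_week:.1f} cases")
    # -> roughly 42 beers per week, about 1.8 cases: "nearly two cases of beer"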

While it’s not clear how the method works at a neurological level, the study represents “a really exciting development,” Amy Milton, a behavioral neuroscientist at the University of Cambridge who was not involved in the work, tells STAT. She adds that the findings mark “the first time it’s been shown in a clinical population that this can be effective.”

The study was designed to manipulate the brain’s retrieval and stabilization of memories—in this case, those linking the sight and thoughts of alcohol to the reward of drinking it, study coauthor Ravi Das, a psychopharmacologist at University College London, tells Science News. “We’re trying to break down those memories to stop that process from happening.”

To do that, the team asked 30 of the participants to look at a glass of beer, followed by a sequence of images of alcoholic and non-alcoholic drinks. On the first day of tests, the session ended with participants being invited to drink the beer. On the second day, after viewing the beer and images, the screen cut off, and instead of drinking the beer, participants were given a shot of ketamine.

Among various functions, ketamine blocks NMDA receptors—key proteins in the brain’s reward pathways—so the researchers hypothesized that administering the drug during memory retrieval would help weaken participants’ associations between the sight or contemplation of alcohol and the reward of drinking it. Their results somewhat support that hypothesis. Nine months following the several-day trial, the volunteers reported cutting their drinking back by half.

“To actually get changes in [participants’] behavior when they go home and they’re not in the lab is a big deal,” Mary Torregrossa, a neuroscientist at the University of Pittsburgh who was not involved in the work, tells Science. But she notes that it’s not clear whether it was the ketamine or some other part of the procedure that led to the effect.

Another 60 participants, split into two control groups, received slightly different procedures that involved either beer or ketamine and still showed, on average, a 35 percent decrease in alcohol consumption after nine months. The participants themselves were recruited to the study through online ads—meaning that the researchers may have selected for people already interested in reducing consumption.

Whatever the mechanisms behind the effect, the results so far suggest the method is worth investigating, David Epstein, an addiction researcher at the National Institute on Drug Abuse, tells Science News. “If a seemingly small one-time experience in a lab produces any effects that are detectable later in real life, the data are probably pointing toward something important.”

Catherine Offord is an associate editor at The Scientist. Email her at cofford@the-scientist.com.

https://www.the-scientist.com/news-opinion/ketamine-could-help-cut-alcohol-consumption-by-rewiring-memory-66792?utm_campaign=TS_DAILY%20NEWSLETTER_2019&utm_source=hs_email&utm_medium=email&utm_content=80070748&_hsenc=p2ANqtz-_mk5jB1Vyqx3xPsKPzk1WcGdxEqSmuirpfpluu4Opm4tMO6n7rXROJrCvQp0yKBw2eCo4R4TZ422Hk6FcfJ7tDWkMpyg&_hsmi=80070748

Study shows extra virgin olive oil staves off multiple forms of dementia in mice

Boosting brain function is key to staving off the effects of aging. And if there were one thing every person should consider doing right now to keep their brain young, it would be to add extra virgin olive oil (EVOO) to their diet, according to research by scientists at the Lewis Katz School of Medicine at Temple University (LKSOM). EVOO is a superfood, rich in cell-protecting antioxidants and known for its multiple health benefits, including helping put the brakes on diseases linked to aging, most notably cardiovascular disease. Previous LKSOM research on mice also showed that EVOO preserves memory and protects the brain against Alzheimer’s disease.

In a new study in mice published online in the journal Aging Cell, LKSOM scientists show that yet another group of aging-related diseases can be added to that list—tauopathies, which are characterized by the gradual buildup of an abnormal form of a protein called tau in the brain. This process leads to a decline in mental function, or dementia. The findings are the first to suggest that EVOO can defend against a specific type of mental decline linked to tauopathy known as frontotemporal dementia.

Alzheimer’s disease is itself one form of dementia. It primarily affects the hippocampus—the memory storage center in the brain. Frontotemporal dementia affects the areas of the brain near the forehead and ears. Symptoms typically emerge between ages 40 and 65 and include changes in personality and behavior, difficulties with language and writing, and eventual deterioration of memory and ability to learn from prior experience.

Senior investigator Domenico Praticò, MD, Scott Richards North Star Foundation Chair for Alzheimer’s Research, Professor in the Departments of Pharmacology and Microbiology, and Director of the Alzheimer’s Center at Temple at LKSOM, describes the new work as supplying another piece in the story about EVOO’s ability to ward off cognitive decline and to protect the junctions where neurons come together to exchange information, which are known as synapses.

“EVOO has been a part of the human diet for a very long time and has many benefits for health, for reasons that we do not yet fully understand,” he said. “The realization that EVOO can protect the brain against different forms of dementia gives us an opportunity to learn more about the mechanisms through which it acts to support brain health.”

In previous work using a mouse model in which animals were destined to develop Alzheimer’s disease, Dr. Praticò’s team showed that EVOO supplied in the diet protected young mice from memory and learning impairment as they aged. Most notably, when the researchers looked at brain tissue from mice fed EVOO, they did not see features typical of cognitive decline, particularly amyloid plaques—sticky proteins that gum up communication pathways between neurons in the brain. Rather, the animals’ brains looked normal.

The team’s new study shows that the same is true in the case of mice engineered to develop tauopathy. In these mice, normal tau protein turns defective and accumulates in the brain, forming harmful tau deposits, also called tangles. Tau deposits, similar to amyloid plaques in Alzheimer’s disease, block neuron communication and thereby impair thinking and memory, resulting in frontotemporal dementia.

Tau mice were put on a diet supplemented with EVOO at a young age, comparable to about age 30 or 40 in humans. Six months later, when the mice were the equivalent of 60-year-old humans, the tauopathy-prone animals had 60 percent less damaging tau deposition than littermates that were not fed EVOO. Animals on the EVOO diet also performed better on memory and learning tests than animals deprived of EVOO.

When Dr. Praticò and colleagues examined brain tissue from EVOO-fed mice, they found that improved brain function was likely facilitated by healthier synapse function, which in turn was associated with greater-than-normal levels of a protein known as complexin-1. Complexin-1 is known to play a critical role in maintaining healthy synapses.

Dr. Praticò and colleagues now plan to explore what happens when EVOO is fed to older animals that have begun to develop tau deposits and signs of cognitive decline, which more closely reflects the clinical scenario in humans. “We are particularly interested in knowing whether EVOO can reverse tau damage and ultimately treat tauopathy in older mice,” Dr. Praticò added.

More information: Elisabetta Lauretti et al, Extra virgin olive oil improves synaptic activity, short‐term plasticity, memory, and neuropathology in a tauopathy model, Aging Cell (2019). DOI: 10.1111/acel.13076

https://m.medicalxpress.com/news/2019-11-extra-virgin-olive-oil-staves.html

People who cannot read may be three times as likely to develop dementia

New research has found that people who are illiterate, meaning they never learned to read or write, may have nearly three times greater risk of developing dementia than people who can read and write. The study is published in the November 13, 2019, online issue of Neurology®, the medical journal of the American Academy of Neurology.

According to the United States Department of Education, approximately 32 million adults in the country are illiterate.

“Being able to read and write allows people to engage in more activities that use the brain, like reading newspapers and helping children and grandchildren with homework,” said study author Jennifer J. Manly, Ph.D., of Columbia University Vagelos College of Physicians and Surgeons in New York. “Previous research has shown such activities may reduce the risk of dementia. Our new study provides more evidence that reading and writing may be important factors in helping maintain a healthy brain.”

The study looked at people with low levels of education who lived in northern Manhattan. Many were born and raised in rural areas in the Dominican Republic where access to education was limited. The study involved 983 people with an average age of 77. Each person went to school for four years or less. Researchers asked each person, “Did you ever learn to read or write?” Researchers then divided people into two groups; 237 people were illiterate and 746 people were literate.

Participants had medical exams and took memory and thinking tests at the beginning of the study and at follow-up appointments that occurred every 18 months to two years. Testing included recalling unrelated words and producing as many words as possible when given a category like fruit or clothing.

Researchers found that of the people who were illiterate, 83 of 237, or 35 percent, had dementia at the start of the study. Of the people who were literate, 134 of 746, or 18 percent, had dementia. After adjusting for age, socioeconomic status and cardiovascular disease, people who could not read and write were nearly three times as likely to have dementia at the start of the study.

Among participants without dementia at the start of the study, during follow-up an average of four years later, 114 of 237 people who were illiterate, or 48 percent, had dementia. Of the people who were literate, 201 of 746 people, or 27 percent, had dementia. After adjusting for age, socioeconomic status and cardiovascular disease, researchers found that people who could not read and write were twice as likely to develop dementia during the study.
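
The percentages quoted above follow directly from the reported counts, as the short Python sketch below shows. Note that the “nearly three times” and “twice as likely” figures are the study’s adjusted estimates (for age, socioeconomic status and cardiovascular disease), so they cannot be recovered from the raw counts alone; the unadjusted ratios are smaller.

    # Reproduce the unadjusted proportions from the counts reported in the article.
    baseline = {"illiterate": (83, 237), "literate": (134, 746)}    # dementia at study start
    follow_up = {"illiterate": (114, 237), "literate": (201, 746)}  # dementia at follow-up

    for label, groups in [("start of study", baseline), ("follow-up", follow_up)]:
        p = {k: cases / n for k, (cases, n) in groups.items()}
        ratio = p["illiterate"] / p["literate"]
        print(label, {k: f"{v:.0%}" for k, v in p.items()}, f"unadjusted ratio {ratio:.1f}")
    # start of study: 35% vs 18%; follow-up: 48% vs 27% (denominators as reported above)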

When researchers evaluated language, speed, spatial, and reasoning skills, they found that adults who were illiterate had lower scores at the start of the study. But their test scores did not decline at a more rapid rate as the study progressed.

“Our study also found that literacy was linked to higher scores on memory and thinking tests overall, not just reading and language scores,” said Manly. “These results suggest that reading may help strengthen the brain in many ways that may help prevent or delay the onset of dementia.”

Manly continued, “Even if they only have a few years of education, people who learn to read and write may have lifelong advantages over people who never learn these skills.”

Manly said future studies should find out whether putting more resources into programs that teach people to read and write helps reduce the risk of dementia.

A limitation of the study was that researchers did not ask how or when literate study participants learned to read and write.

The study was supported by the National Institutes of Health and National Institute on Aging.

Story Source:

Materials provided by American Academy of Neurology.

Journal Reference:

Miguel Arce Rentería, Jet M.J. Vonk, Gloria Felix, Justina F. Avila, Laura B. Zahodne, Elizabeth Dalchand, Kirsten M. Frazer, Michelle N. Martinez, Heather L. Shouel, Jennifer J. Manly. Illiteracy, dementia risk, and cognitive trajectories among older adults with low education. Neurology, 2019; DOI: 10.1212/WNL.0000000000008587

https://www.sciencedaily.com/releases/2019/11/191114180033.htm

Scientists reverse cognitive symptoms in a mouse model of Down Syndrome

By Kristin Houser

Down syndrome is a chromosomal condition that can affect a person’s memory and ability to learn — intellectual impairments researchers traditionally thought were untreatable and irreversible.

But now, researchers from the University of California San Francisco and Baylor College of Medicine say they’ve reversed the impairments in mouse models of Down syndrome — potentially foreshadowing an ethically fraught future in which doctors can do the same for humans with the condition.

All people with Down syndrome share one thing in common: an extra copy of chromosome 21. For that reason, much of the research on Down syndrome has focused on genetics.

But for this new study, published Friday in the prestigious journal Science, researchers focused on protein production in the brains of a mouse model of Down syndrome. That led them to the discovery that the animals’ hippocampal regions produced 39 percent less protein than those of typical mice.

Further study led the researchers to conclude that the presence of an extra chromosome likely prompted the animals’ hippocampal cells to trigger the integrated stress response (ISR), which decreased protein production.

“The cell is constantly monitoring its own health,” researcher Peter Walter said in a press release. “When something goes wrong, the cell responds by making less protein, which is usually a sound response to cellular stress. But you need protein synthesis for higher cognitive functions, so when protein synthesis is reduced, you get a pathology of memory formation.”

By blocking the activity of PKR, the enzyme that prompted the ISR in the mouse model’s hippocampal cells, the researchers found they could not only reverse the decreased protein production but also improve the animals’ cognitive function.

Of course, just because something works in mice doesn’t mean it’ll work in humans.

However, when the researchers analyzed postmortem brain tissue samples of people with Down syndrome, they found evidence that the ISR had been activated. They also obtained a tissue sample from a person with Down syndrome who only had the extra copy of chromosome 21 in some of their cells — and those cells were the only ones with ISR activated.

“We started with a situation that looked hopeless,” Walter said. “Nobody thought anything could be done. But we may have struck gold.”

https://futurism.com/neoscope/scientists-reverse-cognitive-deficiets-of-down-syndrome-mice

Thanks to Kebmodee for bringing this to the It’s Interesting community.

Robert Provine, neuroscientist known for pioneering work on laughter, yawning, hiccupping, and tears, dies at 76

by EMILY MAKOWSKI

Neuroscientist Robert Provine, known for his groundbreaking research on common but mysterious human behavior such as laughter and yawning, died October 17 of complications from non-Hodgkin’s lymphoma, according to The Washington Post. He was 76.

Provine studied human social behaviors through innovative methods. In one 1993 study, his team observed people laughing outside of the lab setting, such as in shopping malls or while walking down the street. He found that, contrary to scientific belief at the time, most laughter occurred not in response to overt humor but as an effort to strengthen social bonds, acknowledge a superior’s authority, or, when used negatively, exclude someone from a group.

“Laughter is part of this universal human vocabulary. Everyone speaks this language. Just as birds of a given species all sing their species’ typical song, laughter is part of our own human song,” Provine once told NPR, according to the Post.

Born on May 11, 1943 in Tulsa, Oklahoma, Provine showed an aptitude for science at a young age when he built telescopes in high school. He received his bachelor’s degree in psychology from Oklahoma State University in 1965 and PhD in psychology from Washington University in St. Louis in 1971. He was a member of the American Association for the Advancement of Science, the Association for Psychological Science, and the Psychonomic Society, and wrote two popular science books: Laughter: A Scientific Investigation in 2000 and Curious Behavior: Yawning, Laughing, Hiccupping, and Beyond in 2012.

“Provine’s research on topics such as yawning, laughter, tickling, and emotional tears provided fascinating insights into the fundamental building blocks of human social behavior,” according to a memorial on the University of Maryland, Baltimore County’s (UMBC) website. Provine taught at UMBC for four decades before becoming a professor emeritus in 2013.

“His approach was just amazing. It was different than what pretty much anyone was doing,” Robert Spencer, one of Provine’s former PhD students and currently the chief of neuropsychology at the VA Ann Arbor Healthcare System, tells The Scientist. “He was opening up a whole new set of methods, things that he would refer to as ‘sidewalk neuroscience,’ which was essentially ethology as applied to humans. And he answered questions you just can’t answer in synthetic lab situations,” he says.

Spencer remembers Provine as having a quirky personality, a distinctive Oklahoma accent, and a lab that “looked like a museum,” adding that it was “just full of equipment that he had built himself” in order to conduct experiments. In addition, he had many interests outside of the lab, such as saxophone playing, race car driving, and martial arts.

He is survived by his wife of 23 years, his son and daughter from his first marriage, and three grandchildren.

https://www.the-scientist.com/news-opinion/robert-provine–researcher-of-universal-human-behavior–dies-66622?utm_campaign=TS_DAILY%20NEWSLETTER_2019&utm_source=hs_email&utm_medium=email&utm_content=78428564&_hsenc=p2ANqtz-9rEINL9K9sYAcnp9kSxgo46D44ioSo_zR3e3MkXqwoeczqjTYDR5a4v3X7Cc4X3sqANvMx6eWvkiUiGKo7lYg5Cj8Sjw&_hsmi=78428564

How information is like snacks, money, and drugs—to your brain

By Laura Counts

Can’t stop checking your phone, even when you’re not expecting any important messages? Blame your brain.

A new study by researchers at UC Berkeley’s Haas School of Business has found that information acts on the brain’s dopamine-producing reward system in the same way as money or food.

“To the brain, information is its own reward, above and beyond whether it’s useful,” says Assoc. Prof. Ming Hsu, a neuroeconomist whose research employs functional magnetic resonance imaging (fMRI), psychological theory, economic modeling, and machine learning. “And just as our brains like empty calories from junk food, they can overvalue information that makes us feel good but may not be useful—what some may call idle curiosity.”

The paper, “Common neural code for reward and information value,” was published this month by the Proceedings of the National Academy of Sciences. Authored by Hsu and graduate student Kenji Kobayashi, now a post-doctoral researcher at the University of Pennsylvania, it demonstrates that the brain converts information into the same common scale as it does for money. It also lays the groundwork for unraveling the neuroscience behind how we consume information—and perhaps even digital addiction.

“We were able to demonstrate for the first time the existence of a common neural code for information and money, which opens the door to a number of exciting questions about how people consume, and sometimes over-consume, information,” Hsu says.

Rooted in the study of curiosity

The paper is rooted in the study of curiosity and what it looks like inside the brain. While economists have tended to view curiosity as a means to an end, valuable when it can help us get information to gain an edge in making decisions, psychologists have long seen curiosity as an innate motivation that can spur actions by itself. For example, sports fans might check the odds on a game even if they have no intention of ever betting.

Sometimes, we want to know something, just to know.

“Our study tried to answer two questions. First, can we reconcile the economic and psychological views of curiosity, or why do people seek information? Second, what does curiosity look like inside the brain?” Hsu says.

The neuroscience of curiosity

To understand more about the neuroscience of curiosity, the researchers scanned the brains of people while they played a gambling game. Each participant was presented with a series of lotteries and needed to decide how much they were willing to pay to find out more about the odds of winning. In some lotteries, the information was valuable—for example, when what seemed like a longshot was revealed to be a sure thing. In other cases, the information wasn’t worth much, such as when little was at stake.

For the most part, the study subjects made rational choices based on the economic value of the information (how much money it could help them win). But that didn’t explain all their choices: people tended to overvalue information in general, particularly in higher-valued lotteries. It appeared that the higher stakes increased people’s curiosity about the information, even when the information had no effect on their decisions about whether to play.

The researchers determined that this behavior could only be explained by a model that captured both economic and psychological motives for seeking information. People acquired information based not only on its actual benefit, but also on the anticipation of its benefit, whether or not it had use.

Hsu says that’s akin to wanting to know whether we received a great job offer, even if we have no intention of taking it. “Anticipation serves to amplify how good or bad something seems, and the anticipation of a more pleasurable reward makes the information appear even more valuable,” he says.

Common neural code for information and money

How does the brain respond to information? Analyzing the fMRI scans, the researchers found that information about the games’ odds activated the brain regions specifically involved in valuation (the striatum and ventromedial prefrontal cortex, or VMPFC), the same dopamine-producing reward areas activated by food, money, and many drugs. This was the case whether or not the information was useful and changed the person’s original decision.

Next, using a machine-learning technique called support vector regression, the researchers determined that the brain uses the same neural code for information about the lottery odds as it does for money. The technique let them characterize the neural code for how the brain responds to varying amounts of money, and then ask whether that same code predicts how much a person will pay for information. It does.
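
The analysis code is not included in the article, but the logic of that cross-decoding step can be illustrated with a small, self-contained sketch using scikit-learn’s support vector regression on simulated data. Everything below (the number of “voxels”, the noise level, the value ranges) is made up for illustration and is not the authors’ pipeline.

    # Sketch of cross-decoding: train a decoder on "money" trials, test it on "information" trials.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    n_voxels = 50
    w = rng.normal(size=n_voxels)                # a shared "value code" across reward types

    def patterns(values, noise=1.0):
        """Simulate one activity pattern per trial: value encoded along w, plus noise."""
        values = np.asarray(values, dtype=float)
        return np.outer(values, w) + noise * rng.normal(size=(len(values), n_voxels))

    money = rng.uniform(1, 20, size=200)         # dollar amounts across simulated trials
    info_value = rng.uniform(1, 20, size=100)    # willingness to pay for information

    decoder = SVR(kernel="linear", C=1.0)
    decoder.fit(patterns(money), money)          # learn the neural code for money only

    predicted = decoder.predict(patterns(info_value))   # apply it to information trials
    r = np.corrcoef(predicted, info_value)[0, 1]
    print(f"cross-decoding correlation: {r:.2f}")       # a high value means the code transfers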

In other words, just as we can convert such disparate things as a painting, a steak dinner, and a vacation into a dollar value, the brain converts curiosity about information into the same common code it uses for concrete rewards like money, Hsu says.

“We can look into the brain and tell how much someone wants a piece of information, and then translate that brain activity into monetary amounts,” he says.

Raising questions about digital addiction

While the research does not directly address overconsumption of digital information, the fact that information engages the brain’s reward system is a necessary condition for the addiction cycle, he says. And it explains why we find those alerts saying we’ve been tagged in a photo so irresistible.

“The way our brains respond to the anticipation of a pleasurable reward is an important reason why people are susceptible to clickbait,” he says. “Just like junk food, this might be a situation where previously adaptive mechanisms get exploited now that we have unprecedented access to novel curiosities.”


Researchers implant a memory into a bird’s brain

by ABBY OLENA

Animals learn by imitating behaviors, such as when a baby mimics her mother’s speaking voice or a young male zebra finch copies the mating song of an older male tutor, often his father. In a study published today in Science, researchers identified the neural circuit that a finch uses to learn the duration of the syllables of a song and then manipulated this pathway with optogenetics to create a false memory that juvenile birds used to develop their courtship song.

“In order to learn from observation, you have to create a memory of someone doing something right and then use this sensory information to guide your motor system to learn to perform the behavior. We really don’t know where and how these memories are formed,” says Dina Lipkind, a biologist at York College who did not participate in the study. The authors “addressed the first step of the process, which is how you form the memory that will later guide [you] towards performing this behavior.”

“Our original goals were actually much more modest,” says Todd Roberts, a neuroscientist at UT Southwestern Medical Center. Initially, Wenchan Zhao, a graduate student in his lab, set out to test whether or not disrupting neural activity while a young finch interacted with a tutor could block the bird’s ability to form a memory of the interchange. She used light to manipulate cells genetically engineered to be sensitive to illumination in a brain circuit previously implicated in song learning in juvenile birds.

Zhao turned the cells on by shining a light into the birds’ brains while they spent time with their tutors and, as a control experiment, when the birds were alone. Then she noticed that the songs that the so-called control birds developed were unusual—different from the songs of birds that had never met a tutor but also unlike the songs of those that interacted with an older bird.

Once Zhao and her colleagues picked up on the unusual songs, they decided to “test whether or not the activity in this circuit would be sufficient to implant memories,” says Roberts.

The researchers stimulated birds’ neural circuits with sessions of 50- or 300-millisecond optogenetic pulses over five days during the time at which they would typically be interacting with a tutor but without an adult male bird present. When these finches grew up, they sang adult courtship songs that corresponded to the duration of light they’d received. Those that got the short pulses sang songs with sounds that lasted about 50 milliseconds, while the ones that received the extended pulses held their notes longer. Some song features—including pitch and how noisy harmonic syllables were in the song—didn’t seem to be affected by optogenetic manipulation. Another measure, entropy, which approximates the amount of information carried in the communication, was not distinguishable in the songs of normally tutored birds and those that received 50-millisecond optogenetic pulses, but was higher in the songs of birds who’d received tutoring than in the songs of either isolated birds or those that received the 300-millisecond light pulses.
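
The article does not say how entropy was computed; a common choice in birdsong analysis is spectral (Wiener) entropy, the log ratio of the geometric to the arithmetic mean of the power spectrum, so the sketch below uses that as a stand-in. It runs on synthetic signals and assumes Python with NumPy and SciPy; it is not the authors’ measure or code.

    # Frame-wise log Wiener entropy: near 0 for noise-like sound, strongly negative for tonal sound.
    import numpy as np
    from scipy.signal import spectrogram

    def wiener_entropy(y, sr, nperseg=512):
        _, _, Sxx = spectrogram(y, fs=sr, nperseg=nperseg)
        Sxx = Sxx + 1e-12                              # avoid log(0)
        geometric = np.exp(np.mean(np.log(Sxx), axis=0))
        arithmetic = np.mean(Sxx, axis=0)
        return np.log(geometric / arithmetic)          # one value per time frame, <= 0

    # Example on synthetic signals: a pure tone versus white noise.
    sr = 22050
    t = np.linspace(0, 0.05, int(sr * 0.05), endpoint=False)
    tone = np.sin(2 * np.pi * 3000 * t)
    noise = np.random.default_rng(0).normal(size=t.size)
    print("tone :", wiener_entropy(tone, sr).mean())   # strongly negative (tonal)
    print("noise:", wiener_entropy(noise, sr).mean())  # much closer to 0 (noisy)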

While the manipulation of the circuit affected the duration of the sounds in the finches’ songs, other elements of singing behavior—including the timeline of vocal development, how frequently the birds practiced, and in what social contexts they eventually used the songs—were similar to juveniles who’d learned from an adult bird.

The researchers then determined that when the birds received light stimulation at the same time as they interacted with a singing tutor, their adult songs were more like those of birds that had only received light stimulation, indicating that optogenetic stimulation can supplant tutoring.

When the team lesioned the circuit before young birds met their tutors, they didn’t make attempts to imitate the adult courtship songs. But if the juveniles were given a chance to interact with a tutor before the circuit was damaged, they had no problem learning the song. This finding points to an essential role for the pathway in forming the initial memory of the timing of vocalizations, but not in storing it long-term so that it can be referenced to guide song formation.

“What we were able to implant was information about the duration of syllables that the birds want to attempt to learn how to sing,” Roberts tells The Scientist. But there are many more characteristics birds have to attend to when they’re learning a song, including pitch and how to put the syllables in the correct order, he says. The next steps are to identify the circuits that are carrying other types of information and to investigate the mechanisms for encoding these memories and where in the brain they’re stored.

Sarah London, a neuroscientist at the University of Chicago who did not participate in the study, agrees that the strategies used here could serve as a template to tease apart where other characteristics of learned song come from. But more generally, this work in songbirds connects to the bigger picture of our understanding of learning and memory, she says.

Song learning “is a complicated behavior that requires multiple brain areas coordinating their functions over long stretches of development. The brain is changing anyway, and then on top of that the behavior’s changing in the brain,” she explains. Studying the development of songs in zebra finches can give insight into “how maturing neural circuits are influenced by the environment,” both the brain’s internal environment and the external, social environment, she adds. “This is a really unique opportunity, not just for song, not just for language, but for learning in a little larger context—of kids trying to understand and adopt behavioral patterns appropriate to their time and place.”

W. Zhao et al., “Inception of memories that guide vocal learning in the songbird,” Science, doi:10.1126/science.aaw4226, 2019.

https://www.the-scientist.com/news-opinion/researchers-implant-memories-in-zebra-finch-brains-66527?utm_campaign=TS_DAILY%20NEWSLETTER_2019&utm_source=hs_email&utm_medium=email&utm_content=77670023&_hsenc=p2ANqtz-87EBXf6eeNZge06b_5Aa8n7uTBGdQV0pm3iz03sqCnkbGRyfd6O5EXFMKR1hB7lhth1KN_lMxkB_08Kb9sVBXDAMT7gQ&_hsmi=77670023

Alzheimer’s Directly Kills Brain Cells That Keep You Awake


[Image caption: Brain tissue from deceased patients with Alzheimer’s has more tau protein buildup (brown spots) and fewer neurons (red spots) compared with healthy brain tissue.]

By Yasemin Saplakoglu

Alzheimer’s disease might be attacking the brain cells responsible for keeping people awake, resulting in daytime napping, according to a new study.

Excessive daytime napping might thus be considered an early symptom of Alzheimer’s disease, according to a statement from the University of California, San Francisco (UCSF).

Some previous studies suggested that such sleepiness in patients with Alzheimer’s results directly from poor nighttime sleep due to the disease, while others have suggested that sleep problems might cause the disease to progress. The new study suggests a more direct biological pathway between Alzheimer’s disease and daytime sleepiness.

In the current study, researchers studied the brains of 13 people who’d had Alzheimer’s and died, as well as the brains from seven people who had not had the disease. The researchers specifically examined three parts of the brain that are involved in keeping us awake: the locus coeruleus, the lateral hypothalamic area and the tuberomammillary nucleus. These three parts of the brain work together in a network to keep us awake during the day.

The researchers compared the number of neurons, or brain cells, in these regions in the healthy and diseased brains. They also measured the level of a telltale sign of Alzheimer’s: tau proteins. These proteins build up in the brains of patients with Alzheimer’s and are thought to slowly destroy brain cells and the connections between them.

The brains from patients who had Alzheimer’s in this study had significant levels of tau tangles in these three brain regions, compared to the brains from people without the disease. What’s more, in these three brain regions, people with Alzheimer’s had lost up to 75% of their neurons.

“It’s remarkable because it’s not just a single brain nucleus that’s degenerating, but the whole wakefulness-promoting network,” lead author Jun Oh, a research associate at UCSF, said in the statement. “This means that the brain has no way to compensate, because all of these functionally related cell types are being destroyed at the same time.”

The researchers also compared the brains from people with Alzheimer’s with tissue samples from seven people who had two other forms of dementia caused by the accumulation of tau: progressive supranuclear palsy and corticobasal degeneration. Results showed that despite the buildup of tau, these brains did not show damage to the neurons that promote wakefulness.

“It seems that the wakefulness-promoting network is particularly vulnerable in Alzheimer’s disease,” Oh said in the statement. “Understanding why this is the case is something we need to follow up in future research.”

Though amyloid proteins, and the plaques that they form, have been the major target in several clinical trials of potential Alzheimer’s treatments, increasing evidence suggests that tau proteins play a more direct role in promoting symptoms of the disease, according to the statement.

The new findings suggest that “we need to be much more focused on understanding the early stages of tau accumulation in these brain areas in our ongoing search for Alzheimer’s treatments,” senior author Dr. Lea Grinberg, an associate professor of neurology and pathology at the UCSF Memory and Aging Center, said in the statement.

The findings were published Monday (Aug. 12) in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association.

https://www.livescience.com/alzheimers-attacks-wakefulness-neurons.html?utm_source=notification

Study suggests neuronal origin of ‘they all look alike’

by Bob Yirka

A team of researchers from the University of California and Stanford University has found that the tendency to see people from different racial groups as interchangeable has a neuronal basis. In their paper published in Proceedings of the National Academy of Sciences, the group describes studies they conducted with volunteers and what they found.

One often-heard phrase connected with racial profiling is “they all look the same to me,” a phrase usually perceived as racist. It implies that people of one race have difficulty discerning the facial characteristics of people of another race. In this new effort, the researchers conducted experiments to find out if this is valid—at least among one small group of young, white men.

In the first experiment, young, white male volunteers looked at photographs of human faces, some depicting black people, others white, while undergoing an fMRI scan. Afterward, the researchers found that the part of the brain involved in facial recognition activated more for white faces than it did for black faces.

In the second experiment, the same volunteers looked at photographs of faces that had been doctored to make the subjects appear more alike, regardless of skin color. The researchers report that the brains of the volunteers activated when dissimilarities were spotted, regardless of skin color, though it was more pronounced when the photo was of a white face.

In a third set of experiments, the volunteers rated how different they found the faces in a series of photographs, or indicated whether they had seen a given face before. The researchers report that the volunteers tended to rate the black faces as more similar to one another than the white faces, and found it easier to tell whether they had seen a particular white face before.

The researchers suggest that the results of their experiments indicate a neural basis that makes it more difficult for people to see differences between individuals of other races. They note that they did account for social contexts such as whether the volunteers had friends and/or associates of other races. They suggest that more work is required to determine if such neuronal biases can be changed based on social behavior.

Brent L. Hughes et al. Neural adaptation to faces reveals racial outgroup homogeneity effects in early perception, Proceedings of the National Academy of Sciences (2019). DOI: 10.1073/pnas.1822084116

https://medicalxpress.com/news/2019-07-neuronal-alike.html