Why cats are crazy for catnip

By Sofia Moutinho

Cat owners flood the internet with videos of their kitties euphorically rolling and flipping out over catnip-filled bags and toys. But exactly how catnip—and a substitute, known as silver vine—produces this feline high has long been a mystery. Now, a study suggests the key intoxicating chemicals in the plants activate cats’ opioid systems much like heroin and morphine do in people. Moreover, the study concludes that rubbing the plants protects the felines against mosquito bites.

“This study essentially has revealed a new potential mosquito repellent” by examining the “pharmaceutical knowledge” of cats, says Emory University biologist Jacobus de Roode, who did not participate in the study.

Catnip (Nepeta cataria) and silver vine (Actinidia polygama) both contain chemical compounds called iridoids that protect the plants against aphids and are known to be the key to the euphoria produced in cats. To determine the physiological effect of these compounds, Iwate University biologist Masao Miyazaki spent 5 years running different experiments using the plants and their chemicals.

First, his team extracted chemicals present in both catnip and silver vine leaves and identified the most potent component that produces the feline high: a minty silver vine chemical called nepetalactol that had not been shown to affect cats until this study. (The substance is similar to nepetalactone, the key iridoid in catnip.) Then, they put 10 leaves’ worth of nepetalactol into paper pouches and presented them, together with pouches containing only a saline substance, to 25 domestic cats to gauge their response. Most of the animals only showed interest in the pouches with nepetalactol.

To make sure nepetalactol was the object of the felines’ attraction, they repeated the experiment with 30 feral cats—and one leopard, two lynxes, and two jaguars living in Japan’s Tennoji and Oji zoos. Big or small, the felines surrendered to the substance, rubbing their heads and bodies on the pouches for an average of 10 minutes. In contrast, dogs and mice that were tested showed no interest in the compound.

Next, the researchers measured beta-endorphins—one of the hormones that naturally relieves pain and induces pleasure by activating the body’s opioid system—in the bloodstreams of five cats 5 minutes before and after exposure. The researchers found that levels of this “happiness hormone” became significantly elevated after exposure to nepetalactol compared with controls. Five cats that had their opioid systems blocked did not rub on the nepetalactol-infused pouches.

But the researchers wanted to know whether there was a reason for the cats to go wild, beyond pure pleasure. That is when one of the scientists heard about the insect-repelling properties of nepetalactone, which about 2 decades ago was shown to be as good as the famed mosquito-stopper DEET. The researchers hypothesized that when felines in the wild rub on catnip or silver vine, they’re essentially applying an insect repellent.

They first showed cats can transfer the chemical to their skin, and then conducted a live mosquito challenge—similar to when people’s arms are used to evaluate insect repellents. They put the nepetalactol-treated heads of sedated cats into chambers full of mosquitoes and counted how many landed on them—it was about half the number that landed on feline heads treated with a neutral substance, they report today in Science Advances.

Most scientists and pet owners assumed the only reason that cats roll around in catnip was for the euphoric experience, Miyazaki says. “Our findings suggest instead that rolling is rather a functional behavior.”

The researchers speculate that cat ancestors might have rubbed their bodies against the plants by chance, enjoyed the feeling, and kept doing it. It is not clear, though, whether it was the euphoric response—or the insect-repelling properties of the plant—that kept them rolling. “Anyone who has ever sat in the field to observe animals ambushing prey knows just how difficult it is for them to keep still when there are many biting mosquitoes around,” Miyazaki says. “It does not seem unreasonable, therefore, to argue that there is a strong selection pressure” to keep away annoying bugs.

The team, which has already patented an insect repellent based on nepetalactol, plans next to identify the cat genes involved in the catnip response and examine the substance’s action against other insect pests. De Roode, who is impressed by how thorough the experiments were, says the work provides a “really interesting” example of how insects can shape animal behavior. “It is amazing how much we can learn from animals.”

https://www.sciencemag.org/news/2021/01/why-cats-are-crazy-catnip

A Tweak to Immune Cells Reverses Aging in Mice

by Abby Olena

Excess inflammation is a problem in aging, contributing to issues such as atherosclerosis, cancer, and cognitive decline. But the mechanisms behind age-related inflammation are not well understood. In a study published today (January 20) in Nature, researchers show that older immune cells have a defect in metabolism that, when corrected in a mouse model of Alzheimer’s disease, can decrease inflammation and restore cognitive function.

After a decade of progress in understanding metabolism and nutrient usage in immune cells and how that affects their function, this study is a “beautiful example” of now knowing enough to intervene, push buttons, and influence outcomes, says Eyal Amiel, who studies immune cell metabolism at the University of Vermont and was not involved in the new work. “To have a specific metabolic signature associated with a pathology is one thing. To be able to manipulate it is another thing. To be able to manipulate it and reverse the pathology is an incredible sequence of events.”

As a postdoc in the late 1990s, Katrin Andreasson, now a neurologist and researcher at Stanford University School of Medicine, was intrigued by epidemiological studies showing that people who took nonsteroidal anti-inflammatory drugs—such as ibuprofen and naproxen—occasionally for aches and pains had a decreased risk of Alzheimer’s disease. During her postdoc in Paul Worley’s lab at Johns Hopkins School of Medicine, she and her colleagues showed that overexpression of cyclooxygenase-2 (COX-2)—a major mediator of inflammation—in the brain led to Alzheimer’s disease-like symptoms in mice: age-dependent inflammation and cognitive loss.

COX-2 activation is the first step in the production of a lipid called prostaglandin E2 (PGE2), which can bind to one of its receptors, EP2, on immune cells and promote inflammation. To plug up the pathway, Andreasson’s group has shown that deleting the EP2 receptor in mouse macrophages and brain-specific microglia—the cells normally responsible for detecting and destroying immune invaders and cellular debris—reduces inflammation and increases neuronal survival in response to both a bacterial toxin and a neurotoxin. 

In the current study, the researchers wanted to understand how eliminating PGE2 signaling in macrophages could have these effects. They started by comparing macrophages from human blood donors either younger than 35 or older than 65. The cells from older donors made much more PGE2 and had higher abundance of the EP2 receptor than did macrophages from younger donors. When the researchers exposed human macrophages to PGE2, the cells altered their metabolism. Rather than using glucose to make energy, the cells converted it to glycogen and stored it, locking it up where the mitochondria couldn’t access it for ATP production.

“The result of that is that the cells are basically energy-depleted. They’re just fatigued, and they don’t work well,” explains Andreasson. “They don’t phagocytose. They don’t clear debris.” This debris includes misfolded proteins associated with neurodegeneration, the authors write in the paper.

When the scientists treated human macrophages from donors with an average age of about 48 with one of two EP2 receptor inhibitors, glycogen storage decreased, energy production increased, and the cells shifted to express anti-inflammatory markers. As in human cells, aged mice have higher levels of PGE2 in the blood and brain and higher EP2 receptor levels in macrophages than younger mice do. When the researchers knocked down the receptor in macrophages throughout the body in a mouse model of Alzheimer’s disease, or treated animals with either of two drugs to suppress EP2 function, the cells had improved metabolism. The mice’s age-associated inflammation also reversed and, with it, age-associated cognitive decline. Treating animals with an EP2 antagonist that couldn’t get into the brain, and thus only targeted the receptor in peripheral macrophages, also led to cognitive improvement in older mice.

“The most interesting thing that they were able to show is that the macrophages are causal in driving age-associated cognitive decline, and, in particular, that it’s sufficient to reprogram the macrophages outside of the brain,” says Jonas Neher, a neuroimmunologist at the German Center for Neurodegenerative Diseases and the University of Tübingen in Germany who authored an accompanying commentary. The next steps are “to figure out what the signal is that comes from the periphery and changes the microglia in the brain. If you can identify this particular signal, then you have another handle on how to reprogram microglia.”

“The hypothetical clinical promise of these findings is obviously outstanding because as you can imagine, it wouldn’t require brain surgery or any kind of gross-level, high-risk intervention,” says Amiel. “Rather, you can manipulate cells systemically and see these outcomes.”

Investigating how those systemic effects work is just one of the questions that Andreasson’s group is currently pursuing. They’re also interested in how and why metabolism declines during aging, as well as other mechanisms that might prevent it. In terms of translating the work to the clinic, one of the only ways to target the EP2 receptor is to go far upstream with COX-2 inhibitors, such as Vioxx, a drug that was withdrawn from the market after some people who took it experienced strokes or heart attacks. There aren’t any drugs that specifically block the EP2 receptor yet, Andreasson tells The Scientist. “There have been attempts made by pharmaceutical companies, but my understanding is it’s been very, very difficult to do.”

P.S. Minhas et al., “Restoring metabolism of myeloid cells reverses cognitive decline in ageing,” Nature, doi:10.1038/s41586-020-03160-0, 2021.

Gut bacteria help digest dietary fiber, release important antioxidant

Dietary fiber found in grains is a large component of many diets, but little is understood about how we digest the fiber, as humans lack enzymes to break down the complex molecules. Some species of gut bacteria break down the fiber in such a way that it not only becomes digestible, but releases ferulic acid, an important antioxidant with multiple health benefits, according to a new study led by researchers at the University of Illinois Urbana-Champaign.

Grains such as rice, oats, rye and wheat are rich in a class of dietary fiber called arabinoxylans, which humans cannot digest on their own. Many gut bacteria have enzymes to break down simple components of arabinoxylans; however, they lack the ability to break down complex ones—including those containing ferulic acid.

“Ferulic acid has been shown to have antioxidant, immunomodulatory and anti-inflammatory activities, and many reports have documented its protective activities in different disease conditions including diabetes, allergic inflammation, Alzheimer’s disease, cardiovascular disorders, microbial infections and cancer,” said study leader Isaac Cann, a professor of animal sciences and microbiology and a member of the Carl R. Woese Institute for Genomic Biology at Illinois.

“The question, then, is what is the benefit of arabinoxylans to us, since our human genomes do not encode the enzymes that can degrade them or access the ferulic acid they contain?” Cann said.

To answer that question, Cann’s group and collaborators at the University of Michigan and Mie University in Japan studied the genomes and digestive activity of bacteria in the intestine. They found that a group of Bacteroides bacteria have several enzymes that break down arabinoxylans, some of which had not been seen or catalogued before. One enzyme the group discovered is so active that it cuts off any ferulic acid it comes across, releasing large amounts of the antioxidant, Cann said. The group published its findings in the journal Nature Communications.

“These bacteria can sense the difference between simple and complex arabinoxylans to deploy a large set of enzymes that function like scissors to cut the linkages in complex arabinoxylans into their unit sugars, and at the same time release the ferulic acid,” Cann said.

Importantly, none of the bacteria the group studied used the ferulic acid after releasing it—thus making it available for absorption in the human gut.

Understanding this mechanism of how bacteria in the colon help the body break down dietary fiber and access ferulic acid has applications for personalized nutrition. With the compound’s protective activity against certain diseases and its role in modulating inflammation and immune response, patients may benefit from probiotic ingestion of the ferulic acid-releasing bacteria or from consuming a diet rich in arabinoxylan fiber, Cann said.

“This is one example of how the microbiome impacts human health and nutrition,” Cann said.

https://medicalxpress.com/news/2021-01-gut-bacteria-digest-dietary-fiber.html

A blood test could diagnose depression and bipolar disorder

Researchers found that levels of a nerve growth factor were lower in people with depression or bipolar disorder than in healthy controls. Doctors could potentially use levels of the growth factor to monitor the effects of antidepressant treatment.

In adults, a protein called brain-derived neurotrophic factor (BDNF) promotes the growth and survival of nerve cells and is known to play a vital role in learning, memory, and maintaining brain flexibility, or “plasticity.”

Psychological stress reduces blood levels of one form of the protein, called mature BDNF (mBDNF), and low levels are associated with depression.

However, commercially available blood tests are unable to differentiate accurately between mBDNF and its precursor, known as proBDNF.

This matters because proBDNF binds to a different receptor and causes inflammation and nerve degeneration.

“Growing evidence indicates that inflammation in brain cells is linked with depressive behaviors, and proBDNF seems to activate the immune system,” says Prof. Xin-Fu Zhou of the University of South Australia in Adelaide. “Therefore, we must separate it from mature BDNF to get an accurate reading.”

Recent studies in animals by Prof. Zhou and his colleagues found that injecting proBDNF into the brain or muscle triggers depressive behaviors.

Prof. Zhou and his team have now developed a test that can measure mBDNF much more accurately.

In collaboration with the University of Adelaide and Kunming Medical University in Yunnan, China, they used the new test to show that people with depression or bipolar disorder have significantly lower levels of mBDNF in their blood than healthy controls.

In a paper that appears in the Journal of Psychiatric Research, the study authors say that doctors could use the test to diagnose these conditions and monitor the success of treatment.

“This could be an objective biomarker, in addition to a clinical assessment by a doctor,” says Prof. Zhou.

Antibody-based test

This type of test, known as an “enzyme-linked immunosorbent assay,” or ELISA, uses antibodies to detect the presence of specific proteins.

The researchers applied their new test to blood samples from 90 inpatients with major depressive disorder, 15 inpatients with bipolar disorder, and 96 healthy controls. The healthy controls were people who had attended the medical center at the psychiatric hospital for a general medical examination and did not have a severe mental illness.

They also tested samples from 14 other people with a history of suicide attempts in the past 10 years. All of these individuals were living in the community and should, therefore, have had better mental health than the current inpatients.

The test revealed that the participants with major depression or bipolar disorder had significantly lower levels of mBDNF in their blood compared with the controls.

Those with severe symptoms of depression had significantly lower levels than those with moderate symptoms.

In addition, people who were taking antidepressants had higher levels than those who were not.

Interestingly, there was no significant difference in mBDNF levels between the individuals who had attempted suicide in the past and the healthy controls. However, the former group was living in the community at the time of the study and may or may not have had symptoms of depression.

The authors estimate that a diagnostic test based on their assay, with a cutoff point of 12.4 nanograms per milliliter of serum, would have a sensitivity of 82.2% and a specificity of 83.3%. This means that the test would miss roughly 18% of people who have depression (about 1 in 5) and wrongly deem roughly 17% of people without depression (about 1 in 6) to be depressed.
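To see what those percentages mean in practice, here is a minimal back-of-the-envelope sketch in Python; the sensitivity and specificity are the figures reported above, while the groups of 100 people are purely hypothetical:

```python
# Back-of-the-envelope illustration only; the cohort sizes are hypothetical.
sensitivity = 0.822  # share of people with depression the test correctly flags
specificity = 0.833  # share of people without depression the test correctly clears

n_depressed = 100    # hypothetical group with depression
n_healthy = 100      # hypothetical group without depression

missed = (1 - sensitivity) * n_depressed      # false negatives
false_alarms = (1 - specificity) * n_healthy  # false positives

print(f"Missed cases per {n_depressed} people with depression: {missed:.0f}")
print(f"False alarms per {n_healthy} people without depression: {false_alarms:.0f}")
# Prints roughly 18 missed cases and 17 false alarms per 100 people in each group.
```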

There were similar findings in the small subgroup with bipolar disorder.

Electroconvulsive therapy

In the future, the team plans to investigate whether electroconvulsive therapy (ECT) can restore imbalances between proBDNF and mBDNF.

ECT is often effective in patients who do not respond to antidepressants or psychotherapy.

Prof. Zhou explains:

“Mood disorders affect millions of people worldwide. However, about one-third of people with depression and bipolar disorder are resistant to antidepressants or alternative therapies. The reasons are not understood, but it could have something to do with the imbalances between the different forms of BDNF, which we hope to investigate next.”

The authors acknowledge that their study had some limitations.

For example, they originally wanted to measure serum levels of proBDNF, in addition to mBDNF. However, for technical reasons, this was not possible. As a result, the researchers were unable to determine whether the balance between these two forms of BDNF or their absolute values had the most significant effect.

They also note that confounding variables, such as whether participants smoked and their body mass index (BMI), may have affected the levels of mBDNF in their blood.

It is important to note that the study participants with MDD were inpatients and, therefore, represent a very small proportion of all people with MDD. As the control group was attending a mental health hospital for a general medical examination, they do not represent the general population.

Further studies are necessary to see how the levels of mBDNF in people living in the community with MDD compare with those in the general population. By doing this, researchers could determine the relevance of these findings to psychiatric care for people with depression.

https://www.medicalnewstoday.com/articles/a-blood-test-could-diagnose-depression-and-bipolar-disorder#Electroconvulsive-therapy

Shared genetics across intelligence and depression

IN THE SCIENTIFIC REALM OF PSYCHIATRY, researchers often navigate more unknowns than knowns. Despite decades of scientific leaps and technological breakthroughs, the brain’s complexity remains elusive.

Researchers are still searching for the exact underlying mechanisms behind a range of mental health conditions including depression.

In a recent report, scientists announce a crucial piece of this puzzle lies deep in our genetic code. The team discovered a “surprising” shared genetic architecture between depression and another, seemingly counterintuitive factor: intelligence.

Cognitive ability is synonymous with intelligence in the study. The team detected overlapping gene variants associated with both cognitive ability and self-reported depression.

Each of the cohort studies included in this larger analysis measured intelligence differently, through various mathematical, knowledge, and verbal cognitive tests. Researchers also tested individuals on their memory, executive function, processing speed, and IQ. Collectively, this trove of data was treated as a measure of intelligence.

“The current findings suggest there is a genetic link between intelligence and mood disorders,” study co-author Ole Andreassen tells Inverse. Andreassen is a psychiatry genetics researcher at the University of Oslo.

“The nature of the genetic link is, however, not straightforward,” Andreassen explains.

This finding was published last Monday in the journal Nature Human Behaviour.

THE BACKSTORY — To date, the research exploring the connection between depression and intelligence has been mixed.

During a depressive episode, people often have reduced cognitive abilities, which is a key feature of the depressive phenomenon, as well as a diagnostic item, Andreassen explains. Depression can impair attention and memory, as well as decision-making skills.

However, people with depression may also have more “positive” cognitive associations including involvement in the arts and music, the researcher adds. Studies suggest creative people are more likely to experience mood disorders.

Based on the mixed data, researchers previously suspected there wasn’t a clear relationship between the two factors.

But in this study, scientists found a “dual relationship” that helps explain the seemingly conflicting positive and negative associations between depression and intelligence.

DIGGING INTO THE DETAILS — Andreassen and his colleagues used a statistical approach to analyze recent large genome-wide association study datasets on major depression and intelligence. They collected data on major depression from the Psychiatric Genomics Consortium and 23andMe, which included cases where people reported any depression symptoms. The sample consisted of 135,458 cases of major depression and 344,901 controls.

Data on general cognitive ability were based on 269,867 individuals drawn from 14 different cohorts, with 72 percent from the research database UK Biobank. Each of these people completed a battery of neurocognitive tests to capture their intelligence level.

The team harnessed statistical tools to analyze the genetic make-up of the participants. Specifically, using computational and mathematical models, they mapped out the gene sets and detected any overlapping variants between self-reported depression and cognitive function.

WHAT WAS DISCOVERED — Overall, the team found a large number of overlapping genes shared between depression and cognitive ability.

“We show that the traits share a substantial amount of genetic background though there is no genetic correlation between them,” Andreassen says.

The effects of the genes influencing both intelligence and mood are mixed, the researcher explains: roughly half of the shared genes work in coordination, promoting or suppressing both traits, while the other half promote one trait and suppress the other.

Essentially, the genes underpinning depression and intelligence appear to work in haphazard ways — sometimes the more depressed an individual is, the worse their cognitive function; other times, the more depressed, the higher their brainpower.

“The results provide insights into the shared genetic architecture between two important human traits, suggesting a shared neurobiological basis,” Andreassen and his co-authors write.

The study suggests similar genetic factors may be regulating brain pathways involved in regulating cognition and mood, a finding that provides clues for what causes depression in the first place.

Better understanding these shared mechanisms could lead to novel treatments or diagnostic strategies for depression, Andreassen says. For now, the psychiatrist just hopes it helps depressed individuals and their loved ones better understand their mood disorder.

WHAT’S NEXT — In the future, Andreassen and his team hope to characterize the overlapping genetic factors between a range of other brain-related traits and disorders, including substance use.

“Our hypothesis is that much of the clinical comorbidity could be due to overlapping molecular genetic factors,” Andreassen says.

Abstract: Genome-wide association studies (GWAS) have identified several common genetic variants influencing major depression and general cognitive abilities, but little is known about whether the two share any of their genetic aetiology. Here we investigate shared genomic architectures between major depression (MD) and general intelligence (INT) with the MiXeR statistical tool and their overlapping susceptibility loci with conjunctional false discovery rate (conjFDR), which evaluate the level of overlap in genetic variants and improve the power for gene discovery between two phenotypes. We analysed GWAS data on MD (n=480,359) and INT (n=269,867) to characterize polygenic architecture and identify genetic loci shared between these phenotypes. Despite non-significant genetic correlation (rg = −0.0148, p = 0.50), we observed large polygenic overlap and identified 92 loci shared between MD and INT at conjFDR < 0.05. Among the shared loci, 69 and 64 are new for MD and INT, respectively. Our study demonstrates polygenic overlap between these phenotypes with a balanced mixture of effect directions.

https://www.inverse.com/innovation/diy-brain-stimulation

Apathy can signal the onset of dementia years before other symptoms appear

Apathy can be a dangerous warning sign for many conditions tied to mental health. While the loss of interest and motivation typically signals the onset of depression, hormone changes, or other mental conditions, a new study finds it may also be a red flag for dementia. Researchers from the University of Cambridge say feelings of apathy can predict if someone will develop dementia years before symptoms like memory loss ever appear.

Frontotemporal dementia is one of the leading causes of dementia in younger patients. Doctors typically diagnose the condition in patients between 45 and 65 years old. This form of cognitive decline can also affect behavior, language, and personality. Patients can begin to act more impulsively and engage in inappropriate or compulsive behavior.

One of the common threads in frontotemporal dementia cases is patients become apathetic, losing interest in things they normally do. Researchers say this isn’t depression, even though physicians may mistake it for another condition. The study finds frontotemporal dementia is triggered by shrinkage in particular regions in the front of the brain. The worse the shrinkage gets, the more apathetic patients become. While this will eventually lead to cognitive decline, study authors say the process can begin years and possibly decades before dementia becomes visible.

“Apathy is one of the most common symptoms in patients with frontotemporal dementia. It is linked to functional decline, decreased quality of life, loss of independence and poorer survival,” says Maura Malpetti, a cognitive scientist at Cambridge’s Department of Clinical Neurosciences, in a university release.

“The more we discover about the earliest effects of frontotemporal dementia, when people still feel well in themselves, the better we can treat symptoms and delay or even prevent the dementia.”

‘Brain shrinkage in areas that support motivation and initiative’

The study reveals that frontotemporal dementia can be a genetic condition. Nearly a third of patients with this form of dementia have family members who also had it.

Researchers examined 304 healthy people who carry a faulty gene that can trigger frontotemporal dementia and 296 members of their families who have normal genes. The study followed each person for several years, and most of the participants didn’t know whether they had the gene abnormality or not. Researchers monitored each person for changes in apathy and memory and took MRI scans of their brains.

“By studying people over time, rather than just taking a snapshot, we revealed how even subtle changes in apathy predicted a change in cognition, but not the other way around,” Malpetti explains. “We also saw local brain shrinkage in areas that support motivation and initiative, many years before the expected onset of symptoms.”

Apathy’s impact on the brain

The results reveal that people with this genetic mutation display more apathy than their relatives who don’t carry the defect. Over two years, this behavior increased significantly more than in people with normal genes. Apathy also predicted the onset of cognitive decline as patients approached the typical age dementia symptoms tend to appear.

“Apathy progresses much faster for those individuals who we know are at greater risk of developing frontotemporal dementia, and this is linked to greater atrophy in the brain. At the start, even though the participants with a genetic mutation felt well and had no symptoms, they were showing greater levels of apathy. The amount of apathy predicted cognitive problems in the years ahead,” Professor Rogier Kievit says.

“From other research, we know that in patients with frontotemporal dementia, apathy is a bad sign in terms of independent living and survival. Here we show its importance in the decades before symptoms begin,” adds joint senior author Professor James Rowe.

Prof. Rowe says the study shows why it’s important to not only find out if someone is displaying apathy, but why they’re feeling this way.

“There are many reasons why someone feels apathetic. It may well be an easy to treat medical condition, such as low levels of thyroid hormone, or a psychiatric illness such as depression. But doctors need to keep in mind the possibility of apathy heralding a dementia, and increasing the chance of dementia if left unaddressed, particularly if someone has a family history of dementia,” he concludes.

“Treating dementia is a challenge, but the sooner we can diagnose the disease, the greater our window of opportunity to try and intervene and slow or stop its progress.”

The study appears in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association.

https://www.studyfinds.org/apathy-signals-dementia-years-before-symptoms/

Yes, more money will always make your life better, but that’s not all there is to happiness, says new study

by Alexandru Micu

In a twist that’s bound to surprise nobody, a new study finds that there is no income level past which more money stops making you happier. Yes, that sounds disheartening, but the authors also caution that money is far from the only thing that makes us happy. Chasing money at the expense of everything else might actually make us less happy.

The relationship between wealth and happiness has always fascinated researchers. One widely known piece of past research suggested that the magic number is $75,000 per year: earning more than that, it concluded, won’t make you any happier. But if you’ve had to bear through the pandemic jobless, or in a job you hate but had to take, struggling to make ends meet while watching rich people ‘suffer’ in mansions with gardens or spend their holidays on private islands, you might not put too much stock in that idea.

New research agrees with you.

The more the merrier

“[The relationship between money and well-being is] one of the most studied questions in my field,” says Matthew Killingsworth, a senior fellow at Penn’s Wharton School who studies human happiness, lead author of the paper. “I’m very curious about it. Other scientists are curious about it. Laypeople are curious about it. It’s something everyone is navigating all the time.”

Killingsworth set out to answer the question with a wealth of data. The technique he used is called experience sampling, and it involves having people fill out short surveys at random times of the day. These serve as ‘snapshots’ of their feelings and moods over time, and of how these fluctuate.

All in all, he collected 1.7 million data points (‘snapshots’) from more than 33,000 participants aged 18 to 65 from the US through an app called Track Your Happiness that he developed. This allowed him to obtain measurements from each participant a few times every day, with check-in times being randomized for each participant. These momentary feelings were reported on a scale ranging from “very bad” to “very good”, and every participant also answered the question “Overall, how satisfied are you with your life?” (on a scale of “not at all” to “extremely”) at least once; the latter measured evaluative well-being, he explains.

“It tells us what’s actually happening in people’s real lives as they live them, in millions of moments as they work and chat and eat and watch TV,” he explains.

But the study also tracked experienced well-being by asking about 12 specific feelings. Five were positive — confident, good, inspired, interested, and proud — and seven negative — afraid, angry, bad, bored, sad, stressed, and upset. Two other measures of life satisfaction collected on an intake survey were also factored in here. Evaluative well-being measures our overall satisfaction with life, while experienced well-being indicates how we feel in the moment.

All in all, Killingsworth says the findings suggest that there is no dollar value past which more money won’t matter to an individual’s well-being and happiness.

“It’s a compelling possibility, the idea that money stops mattering above that point, at least for how people actually feel moment to moment,” he adds. “But when I looked across a wide range of income levels, I found that all forms of well-being continued to rise with income. I don’t see any sort of kink in the curve, an inflection point where money stops mattering. Instead, it keeps increasing.”

“We would expect two people earning $25,000 and $50,000, respectively, to have the same difference in well-being as two people earning $100,000 and $200,000, respectively. In other words, proportional differences in income matter the same to everyone.”

Killingsworth used the logarithm of a person’s income, rather than the actual income, for his study. In essence, this takes into account how much money someone already has. Under this approach, rather than each dollar being equally important to everyone, each additional dollar matters less the more a person already earns.
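A minimal sketch of what that logarithmic relationship implies, assuming a simple model of the form well-being ≈ k · log(income); the slope k below is an arbitrary placeholder, not a value from the study:

```python
import math

K = 1.0  # arbitrary placeholder slope, not taken from the study

def wellbeing_gain(income_from: float, income_to: float, slope: float = K) -> float:
    """Modeled change in well-being when income moves between two levels."""
    return slope * (math.log(income_to) - math.log(income_from))

# Doubling income yields the same modeled gain regardless of the starting point:
print(wellbeing_gain(25_000, 50_000))    # ~0.693 * K
print(wellbeing_gain(100_000, 200_000))  # ~0.693 * K, identical to the line above
```

This is exactly the point of the $25,000/$50,000 versus $100,000/$200,000 comparison quoted above: proportional differences in income matter the same to everyone.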

He found that higher earners are happier in part because they feel more in control over their lives. More money means more choices, options, and possibilities in how we live and spend our time, as the pandemic brutally showed. Someone living paycheck to paycheck has less autonomy over their choices than someone who’s better off; they may have to take any job, even one they dislike, due to financial constraints. Still, in Killingsworth’s eyes, this doesn’t mean we should chase money, and I feel the same way.

“Although money might be good for happiness, I found that people who equated money and success were less happy than those who didn’t. I also found that people who earned more money worked longer hours and felt more pressed for time,” Killingsworth explains.

“If anything, people probably overemphasize money when they think about how well their life is going. Yes, this is a factor that might matter in a way that we didn’t fully realize before, but it’s just one of many that people can control and ultimately, it’s not one I’m terribly concerned people are undervaluing.”

He hopes the findings bring forth more pieces of that ever-elusive puzzle: what exactly makes us happy? Money definitely plays a part, but, according to the findings, only “modestly”, Killingsworth explains.

The paper has been published in the journal PNAS and covered on the University of Pennsylvania’s news site.

Eye Tests Predict Parkinson’s-Associated Cognitive Decline 18 Months Ahead

Simple vision tests can predict which people with Parkinson’s disease will develop cognitive impairment and possible dementia 18 months later, according to a study published in Movement Disorders.

The findings add to evidence that vision changes precede the cognitive decline that occurs in many, but not all, people with Parkinson’s disease.

In another study published in Communications Biology, the same research team found that structural and functional connections of brain regions become decoupled throughout the entire brain in people with Parkinson’s disease, particularly among people with vision problems.

The 2 studies together show how losses and changes to the brain’s wiring underlie the cognitive impairment experienced by many people with Parkinson’s disease.

“We have found that people with Parkinson’s disease who have visual problems are more likely to get dementia, and that appears to be explained by underlying changes to their brain wiring,” said lead author Angeliki Zarkali, MD, Dementia Research Centre, University College London Queen Square Institute of Neurology, London, United Kingdom. “Vision tests might provide us with a window of opportunity to predict Parkinson’s dementia before it begins, which may help us find ways to stop the cognitive decline before it’s too late.”

For the Movement Disorders paper, the researchers studied 77 patients with Parkinson’s disease and found that simple vision tests predicted who would go on to get dementia 1.5 years later. The study also found that those who went on to develop Parkinson’s dementia had losses in the wiring of the brain, including in areas relating to vision and memory. The researchers identified white matter damage to some of the long-distance wiring connecting the front and back of the brain, which helps the brain to function as a cohesive whole network.

The Communications Biology study involved 88 people with Parkinson’s disease (33 of whom had visual dysfunction and were thus judged to have a high risk of dementia) and 30 healthy adults as a control group, whose brains were imaged using MRI scans. The researchers found that people with Parkinson’s disease exhibited a higher degree of decoupling across the whole brain. Areas at the back of the brain, and less specialised areas, had the most decoupling in patients with Parkinson’s disease. Patients with Parkinson’s disease with visual dysfunction had more decoupling in some, but not all brain regions, particularly in memory-related regions in the temporal lobe.

The research team also found changes to the levels of some neurotransmitters in people at risk of cognitive decline, suggesting that receptors for those transmitters may be potential targets for new drug treatments for Parkinson’s dementia. Notably, while dopamine is known to be implicated in Parkinson’s, the researchers found that other neurotransmitters — acetylcholine, serotonin, and noradrenaline — were particularly affected in people at risk of cognitive decline.

“The 2 papers together help us to understand what’s going on in the brains of people with Parkinson’s who experience cognitive decline, as it appears to be driven by a breakdown in the wiring that connects different brain regions,” said Dr. Zarkali.

“Our findings could be valuable for clinical trials, by showing that vision tests can help us identify who we should be targeting for trials of new drugs that might be able to slow Parkinson’s — and ultimately if effective treatments are found, then these simple tests may help us identify who will benefit from which treatments,” said senior author Rimona Weil, MD, University College London.

References: https://onlinelibrary.wiley.com/doi/10.1002/mds.28477 and https://www.nature.com/articles/s42003-020-01622-9

Brain imaging study reveals blunted empathic response to others’ pain when following orders

BY BETH ELLWOOD 

A brain imaging study has found that inflicting pain on another person in compliance with an order is accompanied by reduced activation in parts of the brain associated with the perception of others’ pain. The study was published in NeuroImage.

There exists a well-documented psychological phenomenon where people will go to great lengths to comply with authority even if it means harming others. The most famous example is the Milgram experiment, where subjects pressed a button to deliver what they believed were increasingly painful electric shocks to strangers at the request of experimenters. While this experiment has been widely replicated, researchers Emilie A. Caspar and associates point out that studies have yet to uncover a neurological explanation for this effect.

Caspar and her colleagues set out to explore the possibility that causing someone pain under someone else’s direction reduces empathy for that pain. With a brain imaging study, they tested whether being coerced to inflict harm on someone would be associated with reduced activation in areas of the brain involved in the perception of others’ pain, when compared to inflicting the same harm out of one’s own free will.

The researchers recruited 40 subjects with an average age of 25 to take part in their study. The participants were paired up, and each took turns being the ‘agent’ and the ‘victim’ in a controlled experiment. During a series of trials, the agent had control over administering a mildly painful shock to the victim, who was seated in another room. The agent received a small monetary reward of €0.05 for every shock given.

Importantly, the agent went through two different conditions. In the coerced condition, an experimenter who was present in the room instructed the agent on whether or not to deliver a shock at a given trial. In the free condition, the experimenter remained in another room, and the agent was told that they could choose whether or not to give the other participant a shock. Throughout the entire task, the agent’s brain activity was recorded using a magnetic resonance imaging (MRI) scanner.

As expected, the agents delivered more shocks during the coerced condition than the free condition. While in the coerced condition the experimenters had ordered the subjects to deliver shocks on half of the trials, in the free condition the subjects delivered fewer than that, with an average of 23 shocks out of 60 trials. The agents also reported feeling more “bad”, more “sorry”, and more “responsible” for administering the shocks in the free condition, compared to the coerced condition.

Interestingly, when obeying orders, the subjects appeared to downplay the pain they were inflicting. While administering each shock, the subjects could see a live video of the victims’ hand reacting to the shock with a visible muscle twitch. After each shock, the agents rated how painful they believed it was. The researchers found that the subjects rated the shocks as less painful when they were administered as part of an order — despite having been told at the beginning of the experiment that the shocks would be of the same intensity at every trial. “Here,” Caspar and her team emphasize, “our results would support the fact that obeying orders has such a strong influence on the perception of pain felt by others that it even impacts perceptual reports of observed shock intensity rather than only modulating how the observer feels about the pain of the other.”

The MRI results offered further evidence that obeying orders alters one’s empathy response. When researchers zeroed in on areas of the brain associated with the processing of others’ pain — areas such as the anterior cingulate cortex (ACC), dorsal striatum, middle temporal gyrus (MTG), temporoparietal junction (TPJ), and insula — they found that these areas showed reduced activation during the coerced condition. As the authors illustrate, “even in the case of a pain that is fully caused by the participants’ own actions, brain activity is altered by a lack of responsibility.”

The authors note that previous research has suggested that parts of the ACC and insula show greater activation when people are uniquely to blame for others’ pain. This falls in line with the current findings since the coerced condition was linked to reduced feelings of responsibility and reduced activation of the ACC and insula.

Overall, the findings present the unsettling possibility that following someone else’s order “relaxes our aversion against harming others” even if we are the ones carrying out the action.

The study, “Obeying orders reduces vicarious brain activation towards victims’ pain”, was authored by Emilie A. Caspar, Kalliopi Ioumpa, Christian Keysers, and Valeria Gazzola.

Mothers of Children With Autism Found to Have Significantly Different Metabolite Levels

Summary: Two to five years after birth, mothers of children on the autism spectrum have several significantly different metabolite levels than mothers of typically developing children.

Blood sample analysis showed that, two to five years after they gave birth, mothers of children with autism spectrum disorder (ASD) had several significantly different metabolite levels compared to mothers of typically developing children. That’s according to new research recently published in BMC Pediatrics by a multidisciplinary team from Rensselaer Polytechnic Institute, Arizona State University, and the Mayo Clinic.

Researchers analyzed blood samples from 30 mothers whose young children had been diagnosed with ASD and 29 mothers of typically developing children. At the time that the samples were taken, the women’s children were between 2 and 5 years old. The team found differences in several metabolite levels between the two groups of mothers.

When examined further, researchers were able to group those differences into five subgroups of correlated metabolites. While the samples analyzed were taken several years after pregnancy, the findings raise the question of whether the differences in metabolites may also have been present during pregnancy, suggesting that further research is needed in this area.

Many of the variances, the researchers said, were linked to low levels of folate, vitamin B12, and carnitine-conjugated molecules. Carnitine can be produced by the body and can come from meat sources like pork or beef, but there was no correlation between mothers who ate more meat and mothers who had higher levels of carnitine.

According to Juergen Hahn, the head of the Department of Biomedical Engineering at Rensselaer and co-author on this paper, this finding suggests that the differences may be related to how carnitine is metabolized in some mothers’ bodies.

“We had multiple metabolites that were associated with the carnitine metabolism,” said Hahn, who is also a member of the Center for Biotechnology and Interdisciplinary Studies at Rensselaer. “This suggests that carnitine and mothers is something that should be looked at.”

The team’s big data approach proved to be highly accurate in using a blood sample analysis to predict which group a mother belonged to, which suggests that the development of a blood test to screen for mothers who are at a higher risk of having a child with ASD may be possible.
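The summary above does not describe the team’s actual analysis pipeline, but as a rough illustration of the general idea, the following scikit-learn sketch shows how metabolite measurements could in principle be used to classify maternal blood samples into two groups; the data here are random placeholders, not the study’s measurements:

```python
# Illustrative sketch only; this is NOT the authors' method or their data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(59, 20))      # placeholder: 59 mothers x 20 metabolite levels
y = np.array([1] * 30 + [0] * 29)  # placeholder labels: 30 ASD group, 29 comparison group

# Standardize metabolite levels, fit a simple linear classifier, and estimate
# accuracy with 5-fold cross-validation.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean cross-validated accuracy on placeholder data: {scores.mean():.2f}")
```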

“A blood test would not be able to tell if your child has autism or not, but it could tell if you’re at a higher risk,” Hahn said. “And the classification of higher risk, in this case, can actually be significant.”

“Based on these results, we are now conducting a new study of stored blood samples collected during pregnancy, to determine if those metabolites are also different during pregnancy,” said James Adams, a President’s Professor in the School of Engineering of Matter, Transport and Energy, and director of the Autism/Asperger’s Research Program, both at Arizona State University. Adams co-authored this paper with Hahn.

This research builds upon Hahn’s other work. He previously discovered patterns with certain metabolites in the blood of children with autism that can be used to successfully predict diagnosis. He has used this same method to investigate a mother’s risk for having a child with ASD. He and Adams have also done similar work studying children with autism who have chronic gastrointestinal issues.

Source: Rensselaer Polytechnic Institute

Original Research: Open access.
“Altered metabolism of mothers of young children with Autism Spectrum Disorder: a case control study” by Kathryn Hollowood-Jones, James B. Adams, Devon M. Coleman, Sivapriya Ramamoorthy, Stepan Melnyk, S. Jill James, Bryan K. Woodruff, Elena L. Pollard, Christine L. Snozek, Uwe Kruger, Joshua Chuah & Juergen Hahn. BMC Pediatrics.