New DNA-editing method effectively treats mouse model of progeria

Progeria is caused by a mutation in the nuclear lamin A gene in which one DNA base C is changed to a T. Researchers used the base editing method, which substitutes a single DNA letter for another without damaging the DNA, to reverse that change. Credit: Ernesto del Aguila III, NHGRI

Researchers have successfully used a DNA-editing technique to extend the lifespan of mice carrying the genetic mutation associated with progeria, a rare genetic disease that causes extreme premature aging in children and can significantly shorten their life expectancy. The study was published in the journal Nature and was a collaboration between the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health; the Broad Institute of MIT and Harvard in Cambridge, Massachusetts; and Vanderbilt University Medical Center in Nashville, Tennessee.

DNA is made up of four chemical bases—A, C, G and T. Progeria, which is also known as Hutchinson-Gilford progeria syndrome, is caused by a mutation in the nuclear lamin A (LMNA) gene in which one DNA base C is changed to a T. This change increases the production of the toxic protein progerin, which causes the rapid aging process.

Approximately 1 in 4 million children is diagnosed with progeria within the first two years of life, and virtually all of these children develop health issues in childhood and adolescence that are normally associated with old age, including cardiovascular disease (heart attacks and strokes), hair loss, skeletal problems, loss of subcutaneous fat and hardened skin.

For this study, researchers used a breakthrough DNA-editing technique called base editing, which substitutes a single DNA letter for another without damaging the DNA, to study how changing this mutation might affect progeria-like symptoms in mice.

“The toll of this devastating illness on affected children and their families cannot be overstated,” said Francis S. Collins, M.D., Ph.D., a senior investigator in NHGRI’s Medical Genomics and Metabolic Genetics Branch, NIH director and a corresponding author on the paper. “The fact that a single specific mutation causes the disease in nearly all affected children made us realize that we might have tools to fix the root cause. These tools could only be developed thanks to long-term investments in basic genomics research.”

The study follows another recent milestone for progeria research, as the U.S. Food and Drug Administration approved the first treatment for progeria in November 2020, a drug called lonafarnib. The drug therapy provides some life extension, but it is not a cure. The DNA-editing method may provide an additional and even more dramatic treatment option in the future.

David Liu, Ph.D., and his lab at the Broad Institute developed the base-editing method in 2016, funded in part by NHGRI.

“CRISPR editing, while revolutionary, cannot yet make precise DNA changes in many kinds of cells,” said Dr. Liu, a senior author on the paper. “The base-editing technique we’ve developed is like a find-and-replace function in a word processor. It is extremely efficient in converting one base pair to another, which we believed would be powerful in treating a disease like progeria.”
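
Liu’s find-and-replace analogy can be made concrete with a short sketch. The Python snippet below is only a programming metaphor for what base editing does conceptually; the sequence and position are hypothetical, and nothing here models the actual biochemistry of base editors.

```python
# A loose illustration of the find-and-replace analogy. The sequence and
# position are made up for illustration; this is not the real LMNA sequence,
# nor a model of the editing chemistry.

def base_edit(sequence: str, position: int, expected: str, replacement: str) -> str:
    """Swap a single base at a known position, leaving the rest untouched."""
    if sequence[position] != expected:
        raise ValueError(f"expected {expected!r} at position {position}, "
                         f"found {sequence[position]!r}")
    return sequence[:position] + replacement + sequence[position + 1:]

mutant = "GGCGTCACCC"  # hypothetical snippet carrying a progeria-style T
corrected = base_edit(mutant, position=4, expected="T", replacement="C")
print(corrected)  # GGCGCCACCC
```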

To test the effectiveness of their base-editing method, the team initially collaborated with the Progeria Research Foundation to obtain connective tissue cells from progeria patients. The team used the base editor on the LMNA gene within the patients’ cells in a laboratory setting. The treatment fixed the mutation in 90% of the cells.

“The Progeria Research Foundation was thrilled to collaborate on this seminal study with Dr. Collins’s group at the NIH and Dr. Liu’s group at Broad Institute,” said Leslie Gordon, M.D., Ph.D., a co-author and medical director of The Progeria Research Foundation, which partially funded the study. “These study results present an exciting new pathway for investigation into new treatments and the cure for children with progeria.”

Following this success, the researchers tested the gene-editing technique by delivering a single intravenous injection of the DNA-editing mix into nearly a dozen mice with the progeria-causing mutation soon after birth. The gene editor successfully restored the normal DNA sequence of the LMNA gene in a significant percentage of cells in various organs, including the heart and aorta.

Many of the mice’s cell types still maintained the corrected DNA sequence six months after the treatment. In the aorta, the results were even better than expected: the edited cells appeared to have replaced those that carried the progeria mutation and had died off from early deterioration. Most dramatically, the treated mice’s lifespan increased from seven months to almost 1.5 years. The average normal lifespan of the mice used in the study is two years.

“As a physician-scientist, it’s incredibly exciting to think that an idea you’ve been working on in the laboratory might actually have therapeutic benefit,” said Jonathan D. Brown, M.D., assistant professor of medicine in the Division of Cardiovascular Medicine at Vanderbilt University Medical Center. “Ultimately our goal will be to try to develop this for humans, but there are additional key questions that we need to first address in these model systems.”

More information: In vivo base editing rescues Hutchinson–Gilford progeria syndrome in mice, Nature (2021). DOI: 10.1038/s41586-020-03086-7, www.nature.com/articles/s41586-020-03086-7

https://medicalxpress.com/news/2021-01-dna-editing-method-mouse-progeria.html

Too Much Meat During Ice Age Winters Gave Rise to Dogs, New Research Suggests

by George Dvorsky

Two prevailing theories exist about the origin of domesticated dogs. One proposes that prehistoric humans used early dogs as hunting partners, and the other says that wolves were attracted to our garbage piles. New research suggests both theories are wrong and that the real reason has to do with our limited capacity to digest protein.

Dogs were domesticated from wild wolves during the last ice age between 14,000 and 29,000 years ago, and they were the first animals to be domesticated by humans. That humans and wolves should form a collaborative relationship is an odd result, given that both species are pack hunters who often target the same prey.

“The domestication of dogs has increased the success of both species to the point that dogs are now the most numerous carnivore on the planet,” wrote the authors of a new study published today in Scientific Reports. “How this mutually beneficial relationship emerged, and specifically how the potentially fierce competition between these two carnivores was ameliorated, needs to be explained.”

Indeed, given this context, it’s not immediately obvious why humans would want to keep wolves around. Moreover, the two prevailing theories about the origin of dogs—either as partners used for hunting or as self-domesticated animals attracted to our garbage—aren’t very convincing. Wolves, even when tamed, would’ve made for awful hunting partners, as they lacked the collaborative and advanced communication skills found in domesticated dogs. And sure, wild wolves were probably attracted to human scraps, but this would’ve required some unlikely interactions between humans and wolves.

“In our opinion, the self-domestication in this way is not fully explained,” Maria Lahtinen, a chemist and archaeologist at the Finnish Food Authority in Finland and the first author of the new study, said in an email. “Hunter-gatherers do not necessarily leave waste in the same place over and over again. And why would they tolerate a dangerous carnivore group in their close surroundings? Humans tend to kill their competitors and other carnivores.”

Lahtinen and her colleagues say there’s a more likely reason for the domestication of dogs, and it has to do with a surplus of protein during the harsh ice age winters, which reduced competition between the two species. This in turn allowed humans and incipient dogs to live in symbiotic harmony, paving the way for the ongoing evolution of both species.

The researchers have “introduced a really interesting hypothesis that seeks to address the long-debated mechanism by which early dog domestication occurred,” James Cole, an archaeologist at the University of Brighton who’s not involved with the new study, wrote in an email. “The idea is that human populations and wolves could have lived alongside each other during the harsh climatic conditions [of the last ice age] because human populations would have produced enough protein, through hunting activities, to keep both populations fed during the harsh winter months.”

Seems hard to believe, but humans likely had more food during ice age winters than they could handle. This is due to our inability to subsist exclusively on lean protein for months at a time—something wolves have no issues with. For humans, excessive consumption of protein can lead to hyperinsulinemia (excess insulin in the blood), hyperammonemia (excess ammonia in the blood), diarrhea, and in some extreme cases even death, according to the authors. To overcome this biological limitation, Pleistocene hunters adapted their diets during the winter months, targeting animal parts rich in fat, grease, and oils, such as lower limbs, organs, and the brain. And in fact, “there is evidence for such processing behavior during the Upper Palaeolithic,” according to the paper.

Consequently, wolves and humans were able to “share their game without competition in cold environments,” said Lahtinen. This in turn made it possible for humans to keep wolves as pets.

“Therefore, in the short term over the critical winter months, wolves and humans would not have been in competition over resources and may have mutually benefited from each other’s companionship,” wrote the authors. “This would have been critical in keeping the first proto-dogs for years and generations.”

It’s very possible, said Lahtinen, that the earliest dogs were wolf pups. Hunter-gatherers, she said, “do take pets in most cultures, and humans tend to find young animals cute,” so it would “not be a surprise if this would have happened.”

So dogs exist because wolf pups were cute and we had plenty of leftovers? Seems a legit theory, if you ask me.

Only later, due to traits introduced by artificial selection, were dogs used for hunting, guarding, pulling sleds, and so on, according to the researchers. This theory may also explain the complexity of early dog domestication, which appears to have occurred in Eurasia at multiple times, with dogs continuing to interbreed with wild wolves. The new theory may also explain why the domestication of dogs appears to have occurred in arctic and subarctic regions.

The summer months were less critical for humans, given the relative abundance of alternative foods. During the winter months, however, “hunter-gatherers tend to give up their pets if there is a need to give up resources from humans,” said Lahtinen.

Importantly, Lahtinen and her colleagues did not pull this theory from thin air. To reach this conclusion, the team performed energy content calculations to estimate the amount of energy that would be left over from prey animals also hunted by wolves, such as deer, moose, and horses. The authors reasoned that, if humans and wolves were having to compete for these resources, there would be little to no cooperation between the two species. But their calculations showed that, aside from animals like weasels, all animals preyed upon by humans would have provided more lean protein than required.
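
The arithmetic behind that reasoning can be sketched as a simple energy budget. The Python toy below assumes a hypothetical ceiling of 35% of human calories from protein and an invented carcass composition (the study’s actual thresholds and figures differ); it computes how much protein energy would be left over for wolves once humans have eaten all they can tolerate.

```python
# Toy energy-budget sketch of the "excess protein" idea. The 35% protein
# ceiling and the carcass numbers are illustrative assumptions, not the
# study's actual figures.

PROTEIN_CEILING = 0.35  # assumed max fraction of human calories from protein

def excess_protein_kcal(protein_kcal: float, fat_kcal: float,
                        ceiling: float = PROTEIN_CEILING) -> float:
    """Protein energy left over after humans eat as much as they can tolerate.

    If protein may supply at most `ceiling` of total intake, eating all the
    fat F caps usable protein at ceiling / (1 - ceiling) * F.
    """
    usable_protein = ceiling / (1 - ceiling) * fat_kcal
    return max(0.0, protein_kcal - usable_protein)

# Hypothetical lean winter carcass: 70,000 kcal from protein, 30,000 from fat.
leftover = excess_protein_kcal(protein_kcal=70_000, fat_kcal=30_000)
print(f"{leftover:,.0f} kcal of protein left over for wolves")  # ~53,846 kcal
```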

“Therefore, the early domesticated wolves could have survived living alongside human populations by consuming the excess protein from hunting that humans could not,” explained Cole. “By having enough food for both populations, the competitive niche between the species is eliminated, thereby paving the way to domestication and the benefits of such a relationship to the two species.”

Cole described it as a “really intriguing hypothesis” because it provides a “mechanism that can explain the domestication of the wolf across a wide geographic and temporal range,” and it does so by “explaining how two carnivorous species could overcome the competition…under harsh climatic conditions.” Looking ahead, Cole said a similar approach would be useful for studying the interactions of humans and other species on this planet over time.

As a relevant aside, Cole is the author of a fascinating Scientific Reports paper published in 2017 arguing that ancient humans didn’t turn to cannibalism for nutrition. Using an approach similar to the one taken in the Lahtinen paper, Cole showed that human flesh simply doesn’t pack the same amount of calories as wild animals, and cannibalism wouldn’t have been worth all the trouble.

https://gizmodo.com/too-much-meat-during-ice-age-winters-gave-rise-to-dogs-1846008092

Tiny Townhouses Proposed for Vancouver’s Homeless

Bryn Davidson is known to Treehugger for his wonderful laneway houses that I previously noted may not be the answer to the housing crisis. Vancouver, British Columbia, has a serious housing crisis now, and after seeing the encampments of homeless people, Davidson has been thinking about low-cost alternatives. Davidson tells Treehugger that real estate developers get a tax break for letting their undeveloped land be used for community allotment gardens, and he suggests that such sites could also be used for temporary housing.

“We bring a skill set that can fill a real need – sleeping shelters that can serve as a transitional stop. Tiny houses aren’t legal in Vancouver but a shed less than 100 square feet and under 15 feet high is.”


Tiny townhouse with suppliers
Credit to the suppliers who helped out. Bryn Davidson

Davidson has been pitching the idea for a while, but got fed up with the city talking about the problem with no action, and says he just decided “to get on with it.” He built the prototype himself with donations from his usual suppliers, his own resources, and donations from the public via GoFundMe.

Assembly
Bryn Davidson

The 8′-by-12’6″ units are built out of structural insulated panels (SIPs) and include a Zehnder heat recovery ventilator to control moisture buildup and deliver fresh air. The units could be built on-site or prefabricated and delivered in two pieces on a conventional flatbed trailer. They cost about $11,700 (C$15,000) to build.

Corridor between units
Bryn Davidson

There is no plumbing in the unit; the City of Vancouver is currently servicing the tent encampment with a shared bathroom module, similar to what would be used in the tiny townhouse community. The idea is that it would be short-term accommodation while the city goes through the process of buying or leasing hotel rooms for permanent accommodation.

Double-wide
Bryn Davidson

However, Davidson has designed a “double-wide” unit that would have a kitchen and bathroom if there were plumbing connections available.

Community
Bryn Davidson

An important question is “Why tiny townhouses and not a tiny house?” The reasons are the same ones we talk about all the time for urban housing: greater density, so you can fit more housing on a piece of land. It is also far more energy-efficient – the side walls are the biggest ones, and a townhouse attached on both sides uses 60% less energy. Sound transmission can be a problem in a wood party wall, but since these units are modular, there are two complete walls and 12 inches of material between two units, so this will be better than most modern apartments.

Looking up to the loft
Bryn Davidson

The one complaint I had with the design was the inclusion of lofts, which I consider to be dangerous and often uncomfortable. Davidson noted that because of the 15-foot height limit, the loft was a very inexpensive expansion of space that could be used for storage or other uses that did not necessarily involve climbing ladders in the middle of the night.

Single unit interior
Bryn Davidson

The accommodation here is pretty minimal, but as Davidson notes, it is meant to be transitional. The brilliance of the idea is that because it meets the definition of a shed, it’s legal; and because it is ephemeral, with no foundations, it can be picked up and moved on short notice. That’s critical if you are going to install a community without a massive NIMBY battle.

Exterior
Bryn Davidson

Years ago I was involved in a proposal to build temporary housing for homeless people on the Toronto waterfront, with a very similar solution. After months of work, my partner and I sat around a big table at city hall where the head of every department laid out their reasons why it couldn’t be done, whether it was health or safety or plumbing or, the final nail in the coffin, that the site was on a flood plain. But in the interim, the problem has only gotten worse, exacerbated by the COVID-19 crisis.

townhouse units in row
Bryn Davidson

Bryn Davidson has proposed a solution that addresses so many of the problems and complications faced when trying to tackle homelessness. Because of the unit sizes, it dances around building code and zoning issues. It can accommodate a lot of people on a small site. And unlike a tent, it is warm, dry, and secure.

From the looks of it, Bryn and his son enjoyed their night in it too. Help him finish the project by contributing through GoFundMe; I just did.

kid in front of house
Bryn Davidson

Self-control in childhood linked to slower aging

By Chrissy Sexton

Children with self-control are more likely to grow up to be healthier adults with younger brains and bodies, according to a new study from Duke University. The researchers tracked 1,000 individuals from birth to age 45 and found that people who had higher levels of self-control as children were aging more slowly than their peers. 

Self-control is the ability to control one’s own emotions and behaviors, even when faced with difficult situations. Interviews with the study participants indicated that those in the higher self-control group were better equipped to handle health, financial, and social challenges later in life. 

Furthermore, individuals who had more self-control in childhood expressed more positive views of aging and felt more satisfied with life in middle age.

“Our population is growing older, and living longer with age-related diseases,” said study first author Professor Leah Richmond-Rakerd. “It’s important to identify ways to help individuals prepare successfully for later-life challenges, and live more years free of disability. We found that self-control in early life may help set people up for healthy aging.”

The researchers emphasized that childhood is not destiny, and that some study participants had shifted their self-control levels as adults and had better health outcomes than their childhood assessments would have predicted.

Self-control can be taught, and the experts propose that a societal investment in such training could improve life span and quality of life, not only in childhood, but also perhaps in midlife. A growing collection of evidence suggests that changing behaviors in midlife, such as quitting smoking or taking up exercise, can lead to improved outcomes.

“Everyone fears an old age that’s sickly, poor, and lonely, so aging well requires us to get prepared, physically, financially, and socially,” said study co-author Professor Terrie Moffitt. “We found people who have used self-control since childhood are far more prepared for aging than their same-age peers.”

The Dunedin Multidisciplinary Health and Development Study, based in New Zealand, has tracked the individuals since they were born in 1972 and 1973. The participants were assessed using a variety of psychological and health assessments at regular intervals, most recently at age 45.

Childhood self-control was assessed by teachers, parents, and the children themselves from the ages of three to 11. The experts measured factors such as impulsive aggression, over-activity, perseverance, and inattention.

From ages 26 to 45, the participants were measured for physiological signs of aging in several organ systems, including the brain. Across all of these measures, higher self-control in childhood was associated with slower aging. In addition, the people with the highest self-control were found to walk faster and have younger-looking faces at age 45.

The study is published in the Proceedings of the National Academy of Sciences.

Your Partner’s Genome May Affect Your Health

A study using data from more than 80,000 couples finds evidence of indirect genetic effects on traits ranging from smoking habits to mental health.

By Catherine Offord

People’s health and lifestyle are influenced by the genes of their partners, according to a study published last month (December 14) in Nature Human Behaviour. Using data from more than 80,000 couples in the UK Biobank, researchers identified multiple correlations between individuals’ traits and their partners’ genomes, and concluded that around one-quarter of those associations were partly causal, with one person’s DNA having indirect effects on the other person’s health or behavior.

“I was really excited to see this paper,” says Emily McLean, an evolutionary biologist at Oxford College of Emory University in Georgia who was not involved in the work. “Intuitively, it seems like, of course our behaviors are influenced by the individuals around us, and likely by the genes that those individuals are carrying. So it was really great to see some empirical support for that intuitive idea.”

Unlike direct genetic effects, which reflect the influence of your own genes on your phenotype, indirect genetic effects are a form of environmental influence, driven by the genetic traits of people around you. In a simple hypothetical example, a person who is genetically predisposed to smoking might raise their partner’s risk of lung cancer via exposure to cigarette smoke or by encouraging them to smoke more.


Several studies have provided evidence of these indirect effects in nonhuman animal populations, and a couple of studies on specific traits in humans—including schoolchildren’s propensity to smoke and their educational attainment—have suggested that people, too, are affected by the genetic makeup of their peers. But it hasn’t been clear how widespread these effects are in human relationships, nor whether the associations themselves are causal rather than correlational.

In the current study, the University of Edinburgh’s Charley Xia, Albert Tenesa, and colleagues used data from 80,889 heterosexual couples of European ancestry whose genetic variation and health and lifestyle habits are recorded in the UK Biobank. The researchers selected 105 complex traits—those influenced by variation in multiple genes, such as height, smoking status, and susceptibility to mood swings—and used a statistical model to look for broad associations between each individual’s traits and their partner’s DNA.

The team found that around 50 percent of these traits showed some correlation with the partner’s genetic makeup. Many of those correlations could have been due to assortative mating, Xia says. For instance, people may be more likely to choose partners with traits similar to their own, creating spurious relationships in the data. Height is a typical example of a trait likely to be correlated in couples due to assortative mating rather than any indirect genetic effects, he adds.

The researchers ran computer simulations of mix-and-match combinations of individuals in their dataset to see if they could distinguish between associations due to assortative mating and those due to true indirect genetic effects. They concluded that around 25 percent of the associations did indeed involve at least some causation—that is, one person’s genotype was having a detectable effect on another person’s phenotype. 
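
The logic of separating the two explanations can be illustrated with a toy simulation. The sketch below (Python/NumPy) is not the authors’ model: the two scenarios, effect sizes, and the partial-correlation check are illustrative assumptions showing how assortative mating alone can mimic a partner-genotype association, while a genuine indirect effect survives controlling for one’s own genotype.

```python
# Toy contrast of assortative mating vs. a true indirect genetic effect.
# All parameters are illustrative; this is not the study's actual analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 80_000

def partial_corr(y, x, control):
    """Correlation of y and x after regressing each on `control` (OLS residuals)."""
    c = control - control.mean()
    def residual(v):
        v = v - v.mean()
        return v - (v @ c) / (c @ c) * c
    return np.corrcoef(residual(y), residual(x))[0, 1]

# Scenario A: assortative mating only. Mates have correlated genetic scores,
# but the partner's genotype has no causal effect on the phenotype.
g_self = rng.normal(size=n)
g_partner = 0.5 * g_self + np.sqrt(0.75) * rng.normal(size=n)
y = g_self + rng.normal(size=n)

# Scenario B: random mating, but the partner's genotype causally shifts
# the phenotype (a genuine indirect genetic effect).
h_self = rng.normal(size=n)
h_partner = rng.normal(size=n)
z = h_self + 0.45 * h_partner + rng.normal(size=n)

# The raw partner-genotype correlations look broadly similar...
print(np.corrcoef(g_partner, y)[0, 1])  # ~0.35, purely from assortment
print(np.corrcoef(h_partner, z)[0, 1])  # ~0.30, from a real indirect effect

# ...but controlling for one's own genotype removes the association only
# when it was a by-product of assortative mating.
print(partial_corr(y, g_partner, g_self))  # ~0: no indirect effect
print(partial_corr(z, h_partner, h_self))  # ~0.4: indirect effect survives
```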

These associations included several dietary traits (such as self-reported poultry and beef intake), time spent watching television, susceptibility to mood swings, and smoking habits, although the team did not explore specific traits or genes in detail. Height did not show evidence of a causal relationship in this analysis, Xia says, increasing the researchers’ confidence in their method.

While it’s hard to draw conclusions about individual traits from this kind of broad analysis, the team’s study represents a proof of concept that indirect genetic effects may be important in humans, says Daniel Belsky, an epidemiologist at Columbia University Mailman School of Public Health who was not involved in the work. He calls it a “creative application of a large and powerful database to address an important and open question in behavioral genetics.” 

Belsky adds that while the results “seem broadly sensible,” there remain some questions for future studies regarding the extent to which indirect genetic effects can be distinguished from assortative mating. “The design that [the authors] use is quite strong in controlling for assortment on the trait under analysis,” he notes, but it’s less effective “for controlling for assortment on traits that are genetically correlated with the trait under analysis but which may not be measured.” 

McLean says she’d be interested to learn more about the mechanisms behind the associations the team identified, and about which genes in one person are related to which trait in the other. She cautions that some of the UK Biobank data used in the study is self-reported, and so researchers would need to check that responses accurately reflect people’s traits. Determining the direction of genes’ effects on behavior—that is, whether a particular trait is positively or negatively associated with a genotype in the partner—could also be an interesting next step from an evolutionary perspective, McLean adds. 

Xia notes that to properly understand the mechanisms responsible for the effects the team identified, the researchers would need to focus more closely on individual traits, and use data on genes and lifestyle from the same people spanning many years—a follow-up project that some of the team members are currently considering, he adds.

Such data on indirect genetic effects could one day have applications in public health, Belsky says. “It may be possible, as genomes become a routine part of a person’s medical record, to provide clinical guidance and risk management information to patients based on partner genotypes,” he says.

More immediately, the study is an important reminder of the complexity of genotype-phenotype relationships, Belsky adds. “Observations like this . . . illustrate ways in which a wide range of environments—in this case, another person you’ve chosen to share your life with—intercede between the genetic risk a person is born with and the health outcome that we’re interested in protecting them from. This is another argument against a deterministic interpretation of a person’s genetic background, when you think about the kind of life that they’re going to lead and the sort of health risks they’re going to have.”

C. Xia et al., “Evidence of horizontal indirect genetic effects in humans,” Nat Hum Behav, doi:10.1038/s41562-020-00991-9, 2020.

AI algorithm can predict Alzheimer’s disease in 1 minute

Left: the top five brain regions whose feature impacts push the model’s decision toward AD. Right: the top five brain regions whose feature impacts push the model’s decision toward healthy controls. (Credit: American Journal of Neuroradiology)


A study by Vuno, a Korean artificial intelligence (AI) developer, showed that a deep learning algorithm could predict Alzheimer’s disease (AD) within one minute.

Jointly with Asan Medical Center, Vuno verified the AI algorithm using MRI scans of 2,727 patients registered at Korean medical institutions, and found that it accurately predicted AD and mild cognitive impairment (MCI).

The performance of Vuno’s deep learning-based algorithm in predicting dementia was measured by the area under the receiver operating characteristic curve (AUC). The closer the AUC value is to 1, the better the algorithm’s performance. The AUC was 0.840–0.982 for AD and 0.668–0.870 for MCI, Vuno said.
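
For context, AUC measures how well a model’s scores rank true cases above controls: 1.0 is perfect separation, 0.5 is chance. Below is a minimal sketch of the computation with scikit-learn, using hypothetical labels and scores rather than Vuno’s data.

```python
# Minimal AUC sketch with hypothetical labels and scores (not Vuno's data).
from sklearn.metrics import roc_auc_score

y_true = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]   # 1 = AD, 0 = healthy control
y_score = [0.92, 0.85, 0.64, 0.30, 0.11, 0.78, 0.60, 0.22, 0.56, 0.38]

# AUC = fraction of (case, control) pairs the model ranks correctly.
print(roc_auc_score(y_true, y_score))  # 0.96 here
```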

Vuno’s algorithm quickly flagged microscopic brain atrophy for physicians and shortened the analysis time compared to “FreeSurfer,” the conventional brain MRI analysis tool. FreeSurfer took seven hours to analyze a brain MRI, while Vuno’s algorithm yielded results in one minute, the company said.

Vuno said its algorithm could be used for pre-screening before the positron emission tomography (PET) test used for early detection of AD.

Jung Kyu-hwan, chief technology officer of Vuno, said the study proved Vuno’s brain MRI analysis was practically helpful in diagnosing dementia. “We will make an effort to expand the pipelines of AI solutions related to brain diseases, including the latest algorithm that we have verified,” he said.

Professor Kim Sang-Joon of radiology at Asan Medical Center said the findings had high clinical value because the study verified the AI algorithm’s performance based on large-scale clinical data for the first time in the world.

If Vuno’s algorithm is more refined through further research and introduced to clinical settings, it will significantly contribute to the early diagnosis of dementia, he added.

Vuno and Asan Medical Center’s joint study was published in the SCI-indexed journal American Journal of Neuroradiology (AJNR).

Source: KBR (http://www.koreabiomed.com)

http://www.koreabiomed.com/news/articleView.html?idxno=10038

High Blood Pressure While Pregnant Linked to Poorer Memory Years Later

High blood pressure and pre-eclampsia during pregnancy may follow women through the years, causing lower scores on tests of memory and thinking skills, a Dutch study suggests.

The study of nearly 600 pregnant women included 481 with normal blood pressure and 115 who developed high blood pressure during their pregnancies.

Of those 115 women, 70% had gestational hypertension, which is high blood pressure that starts after 20 weeks of pregnancy in women who previously had normal readings. The other 30% had pre-eclampsia, a pregnancy complication marked by high blood pressure and elevated protein levels in the urine that develop after 20 weeks of pregnancy.

“Women with high blood pressure that starts in pregnancy, as well as women with pre-eclampsia, should be monitored closely after their pregnancy, and they and their physicians should consider lifestyle changes and other treatments that may help reduce their risk of decline in their thinking and memory skills later in life,” said study author Dr. Maria Adank. She is with the department of obstetrics and gynecology at Erasmus University Medical Center in Rotterdam, the Netherlands.

Adank’s team tested the study participants after 15 years, asking them to recall a list of 15 words, first right away and then again after 20 minutes.

On the immediate recall test, which was given three times, women who had no high blood pressure problems 15 years earlier scored an average of 28 points out of a possible 45. The women who had high blood pressure during pregnancy posted an average score of 25.

After adjusting for other factors that could affect thinking skills, such as a woman’s weight before pregnancy, her education and ethnicity, the researchers found that women who had high blood pressure during pregnancy performed worse on the immediate and delayed recall task.

The investigators found no differences between the two groups on tests of fine motor skills, verbal fluency, processing speed and visual-spatial ability.

The women were not given memory or thinking tests before or during their pregnancies, the authors noted in the report published online Dec. 30 in the journal Neurology.

Adank said the study does not show a cause-and-effect relationship between high blood pressure and test scores, only an association.

“It’s important to consider gestational hypertension and pre-eclampsia as risk factors for cognitive impairment that are specific to women,” Adank said in a news release from the American Academy of Neurology. “Many women may think of this as a temporary issue during pregnancy and not realize that it could potentially have long-lasting effects.”

More study is needed to learn whether early treatment can prevent thinking and memory problems in women with a history of high blood pressure in pregnancy, she added.

More information

The U.S. Centers for Disease Control and Prevention has more on high blood pressure during pregnancy.

SOURCE: American Academy of Neurology, news release, Dec. 30, 2020

https://consumer.healthday.com/b-1-4-high-blood-pressure-while-pregnant-linked-to-poorer-memory-years-later-2649660979.html

Study of 50,000 people finds brown fat may protect against numerous chronic diseases

In these PET scans, the person on the left has abundant brown fat around the neck and cervical spine. The person on the right has no detectable brown fat. Credit: Andreas G. Wibmer and Heiko Schöder.

Brown fat is that magical tissue that you would want more of. Unlike white fat, which stores calories, brown fat burns energy and scientists hope it may hold the key to new obesity treatments. But it has long been unclear whether people with ample brown fat truly enjoy better health. For one thing, it has been hard to even identify such individuals since brown fat is hidden deep inside the body.

Now, a new study in Nature Medicine offers strong evidence: among over 52,000 participants, those who had detectable brown fat were less likely than their peers to suffer cardiac and metabolic conditions ranging from type 2 diabetes to coronary artery disease, which is the leading cause of death in the United States.

The study, by far the largest of its kind in humans, confirms and expands the health benefits of brown fat suggested by previous studies. “For the first time, it reveals a link to lower risk of certain conditions,” says Paul Cohen, the Albert Resnick, M.D., Assistant Professor and senior attending physician at The Rockefeller University Hospital. “These findings make us more confident about the potential of targeting brown fat for therapeutic benefit.”

A valuable resource

Although brown fat has been studied for decades in newborns and animals, it was only in 2009 that scientists appreciated that it can also be found in some adults, typically around the neck and shoulders. Since then, researchers have scrambled to study the elusive fat cells, which possess the power to burn calories to produce heat in cold conditions.

Large-scale studies of brown fat, however, have been practically impossible because this tissue shows up only on PET scans, a special type of medical imaging. “These scans are expensive, but more importantly, they use radiation,” says Tobias Becher, the study’s first author and formerly a Clinical Scholar in Cohen’s lab. “We don’t want to subject many healthy people to that.”

A physician-scientist, Becher came up with an alternative. Right across the street from his lab, many thousands of people visit Memorial Sloan Kettering Cancer Center each year to undergo PET scans for cancer evaluation. Becher knew that when radiologists detect brown fat on these scans, they routinely make note of it to make sure it is not mistaken for a tumor. “We realized this could be a valuable resource to get us started with looking at brown fat at a population scale,” Becher says.

Protective fat

In collaboration with Heiko Schoder and Andreas Wibmer at Memorial Sloan Kettering, the researchers reviewed 130,000 PET scans from more than 52,000 patients, and found the presence of brown fat in nearly 10 percent of individuals. (Cohen notes that this figure is likely an underestimate because the patients had been instructed to avoid cold exposure, exercise, and caffeine, all of which are thought to increase brown fat activity).

Several common and chronic diseases were less prevalent among people with detectable brown fat. For example, only 4.6 percent had type 2 diabetes, compared with 9.5 percent of people who did not have detectable brown fat. Similarly, 18.9 percent had abnormal cholesterol, compared to 22.2 percent in those without brown fat.

Moreover, the study revealed three more conditions for which people with brown fat have lower risk: hypertension, congestive heart failure, and coronary artery disease—links that had not been observed in previous studies.

Another surprising finding was that brown fat may mitigate the negative health effects of obesity. In general, obese people have increased risk of heart and metabolic conditions; but the researchers found that among obese people who have brown fat, the prevalence of these conditions was similar to that of non-obese people. “It almost seems like they are protected from the harmful effects of white fat,” Cohen says.

More than an energy burning powerhouse

The actual mechanisms by which brown fat may contribute to better health are still unclear, but there are some clues. For example, brown-fat cells consume glucose in order to burn calories, and it’s possible that this lowers blood glucose levels, a major risk factor for developing diabetes.

The role of brown fat is more mysterious in other conditions like hypertension, which is tightly connected to the hormonal system. “We are considering the possibility that brown fat tissue does more than consume glucose and burn calories, and perhaps actually participates in hormonal signaling to other organs,” Cohen says.

The team plans to further study the biology of brown fat, including by looking for genetic variants that may explain why some people have more of it than others—potential first steps toward developing pharmacological ways to stimulate brown fat activity to treat obesity and related conditions.

“The natural question that everybody has is, ‘What can I do to get more brown fat?'” Cohen says. “We don’t have a good answer to that yet, but it will be an exciting space for scientists to explore in the upcoming years.”

https://medicalxpress.com/news/2021-01-people-brown-fat-numerous-chronic.html

Ted DeLaney, Conscience of a Roiled University, Dies at 77

Professor Ted DeLaney on the campus of Washington and Lee University in Virginia in 2015. His fondness for the school, his alma mater, was both wholehearted and complicated. Credit…Kevin Remington/Washington and Lee University

By Clay Risen

Ted DeLaney, who began his nearly 60-year career at Washington and Lee University as a custodian, accumulated enough credits to graduate at 41, returned a decade later as a history professor, became the school’s first Black department head and later helped lead its reckoning with the Confederate general its very name honored, Robert E. Lee, died on Dec. 18 at his home in Lexington, Va. He was 77.

His son, Damien DeLaney, said the cause was pancreatic cancer.

Professor DeLaney’s fondness for his alma mater was both wholehearted and complicated. He took pride in his decades of hard work — overcoming obstacles, he often pointed out, that a white academic would never have had to face — and he bristled at suggestions that he was a poster child for the university’s racial liberalization.

In fact, he was a prime mover in driving what was still a very conservative institution forward. As a member of countless faculty committees, he urged the university to recognize its own difficult past — it once owned scores of slaves — and to increase students’ exposure to Black history and culture.

“He was always willing to call out the institution on its failure to live up to its promise,” said Molly Michelmore, the chairwoman of the Washington and Lee history department.

But Professor DeLaney’s primary target was Lee himself, and Lee’s defining role in the university’s identity.

Lee, a slaveowner, resigned from the U.S. Army at the start of the Civil War to fight for the Confederacy. In 1865, just months after surrendering to Gen. Ulysses S. Grant, he accepted the job as president of what was then Washington College. When he died, in 1870, the school changed its name to Washington and Lee and had him buried in a crypt on campus; his horse, Traveller, is buried nearby. Generations of freshmen have had to file into Lee Chapel to sign the “honor book” near a recumbent statue of the general.

Professor DeLaney attacked Lee’s legacy with the tools of his profession. A common story about Lee has him kneeling to pray alongside a Black congregant — proof, his defenders say, of his colorblind heart. But Professor DeLaney’s research showed that the incident had almost certainly never happened.

“If it had been written into a history essay, we would have given it an F,” he said in a 2019 conversation with the Rev. Robert W. Lee IV, a descendant of the general.

In 2017, in the wake of the white supremacist Unite the Right rally in Charlottesville, about an hour from Lexington, Washington and Lee created a commission to address the university’s troubled history. Professor DeLaney was one of three faculty members appointed to it, and by many accounts its motive force.

The commission’s report, delivered in May 2018, made a number of recommendations, among them that Lee Chapel be turned into a museum and that the university “discontinue programming at the chapel that celebrates the mythic Lee, particularly events with characters in period costumes and horses that resemble Traveller.”

According to one account, the university rejected 75 percent of the commission’s recommendations, including anything having to do with Lee Chapel. But a Washington and Lee spokeswoman said the university had accepted at least 50 percent of the recommendations and that additional steps were underway.

Respectful of the institution he called home for so many decades, Professor DeLaney muted his criticism — perhaps, his son speculated, because his training as a historian had taught him to take the long view.

“Knowing my dad and the arc of his career, I don’t believe he thought it was over,” Damien DeLaney said.

Professor DeLaney in 2019. When one colleague poked fun at his fancy wardrobe, he shot back, “I don’t have the white privilege to dress the way you do.” Credit…Kevin Remington/Washington and Lee University

Theodore Carter DeLaney Jr. was born in Lexington on Oct. 18, 1943. His mother, Theodora (Franklin) DeLaney, was a barber in Lexington. His father was a doorman at a local hotel. His parents divorced when Ted was 11. (His mother later remarried and became Theodora Morgan.)

In addition to his son, he is survived by his wife, Patricia (Scott) DeLaney; three sisters, Carla Cooks, Janet Jones and Theresa Morgan; and two grandchildren. His brother, Charles DeLaney, died in 1992.

Professor DeLaney graduated from high school in 1961 and planned to attend Morehouse College, in Atlanta, on a scholarship. But his mother refused to let him go, fearing that the direct-action tactics of the city’s civil rights movement could spur a violent backlash, with her son caught in the middle.

Instead, he worked a series of jobs around town, including as a butler for a Washington and Lee fraternity. Having converted to Catholicism in high school, he spent seven months as a postulant at a Franciscan monastery in upstate New York, but left after he grew frustrated with the rules.

Returning to Lexington, he got a job as a custodian in the biology department at Washington and Lee in 1963. Within a year he was working as a lab assistant and, once the school allowed Black students, taking night classes. (Today the university has an undergraduate student body of about 2,000.)

By 1981 he had accumulated enough credits to become a full-time student. He and his wife sold their house to pay for his studies. Ms. DeLaney was the city treasurer for Lexington, and Professor DeLaney often brought his infant son to class when day care wasn’t available.

After graduating cum laude in 1985, he taught at a private school for two years before pursuing a doctorate at William & Mary in Williamsburg, Va. He didn’t plan to focus his career on Black history; he wrote his dissertation on Julia Gardiner Tyler, the second wife of President John Tyler.

But when he returned to Washington and Lee, he found a university caught between its legacy and its future, between alumni pressure to honor the Lost Cause and a diversifying student body critical of the school’s racist past.

Professor DeLaney pushed the school to add courses on slavery and civil rights, as well as gay and lesbian history and even the history of the university itself. In 2005 he co-founded its African-American studies program, which he later expanded to include African studies. He was chairman of the university’s history department from 2007 to 2013.

Professor DeLaney, who was retiring, accepted applause after holding his last class at Washington and Lee in December 2019. He had a nearly 60-year career there. Credit…Kevin Remington/Washington and Lee University

He became a fixture on campus, a natty dresser with a soft drawl whose classes counted among the most popular on campus, even though they often indicted the wealthy, white Southern society that produced a majority of his students. But he also suffered under the pressure he felt to play the model minority. When one colleague poked fun at his fancy wardrobe, according to a 2019 profile, he shot back, “I don’t have the white privilege to dress the way you do.”

For all his intellectual activism, Professor DeLaney was also a realist; he knew the limits to what he could achieve on a campus that, even well into the 21st century, remains tradition-bound.

When he joined the university’s post-Charlottesville commission, there was pressure to recommend dropping Lee from its name. But Professor DeLaney demurred, recognizing that such a call from a small group of faculty and students could backfire, and that only widespread, grassroots activism could force real change.

Professor DeLaney retired in 2019, and although he taught a class that fall, he was increasingly occupied with his fight against cancer, and could only watch from the sidelines as the racial tumult over the summer of 2020 brought renewed calls to remove Lee’s name.

In early July, the student government, which plays a large role in the university’s governance, voted overwhelmingly in favor of changing the university’s name; days later the faculty did the same. The board of trustees has formed a committee to consider the idea.

Professor DeLaney, his son said, was pleased.

“The reckoning,” he said, “will go on without him.”

Jack Steinberger (1921–2020)

Particle physicist who shared Nobel for discovering muon neutrinos.

By Christine Sutton

When particle physicist Jack Steinberger began his career in 1945, scientists knew about only a handful of subatomic particles. Today, dozens are known, and their basic building blocks are codified in the standard model of particle physics. Steinberger, who has died aged 99, contributed throughout — from discovering particles to grouping them. He shared the Nobel Prize in Physics in 1988 (with Melvin Schwartz and Leon Lederman) for a 1962 experiment that revealed the existence of two distinct types of an enigmatic particle: the neutrino.

Neutrinos barely interact — they can pass right through Earth. They have no electric charge and respond to the ‘weak’ nuclear force, which acts within atomic nuclei and governs radioactivity. They were predicted in the 1930s, to account for the unexplained energy released alongside electrons in radioactive decay. Steinberger and his colleagues showed that there was more than one type of neutrino. They found a second, associated with the muon — a particle similar to the stable electron but 200 times heavier and with a shorter lifetime. Steinberger also helped to pin down the properties of quarks, the ultimate constituents of protons and neutrons.

Born into a Jewish family in Germany, Steinberger was evacuated soon after the Nazis came to power, arriving in the United States in 1934. His foster parents in Chicago, Illinois, ensured that he received a high-school education and reunited his family there in 1938. Steinberger first studied chemistry at the University of Chicago. He joined the US Army on his graduation in 1942, less than a year after the United States entered the Second World War. He worked with physicists at the Massachusetts Institute of Technology in Cambridge on the use of radar to improve the accuracy of aircraft bombing.

After the war, Steinberger returned to the University of Chicago to pursue physics research. He was supervised by Nobel laureate Enrico Fermi, who had demonstrated the first nuclear chain reaction. Fermi, who had also worked on the theory of the neutrino and coined its name, pointed him to a puzzle concerning the decay of the muon, which had been found in cosmic rays in 1936. The particle broke down into an electron and missing energy. Steinberger attributed the energy to not one but two neutrinos, a hypothesis he confirmed experimentally in 1948.

The desire to exploit new facilities and techniques was a hallmark of Steinberger’s research. In 1949, he joined the Radiation Laboratory at the University of California, Berkeley, where he used an innovative accelerator to study another cosmic-ray particle, the pion. He showed the existence of the short-lived electrically neutral pion, the lifetime of which he had earlier calculated theoretically. But he left for Columbia University in New York City in 1950, owing in part to his refusal to sign an anti-communist oath. There, he exploited the newly invented bubble chamber — which reveals trails of fast-moving particles in liquid propane or hydrogen — to make discoveries about the plethora of new particles that were being unearthed. These included ‘strange’ particles, so called because they decay more slowly than expected.

It was at Columbia that Steinberger and his colleagues conducted their Nobel-prize-winning experiment. Steinberger’s former student Schwartz worked out how to make a beam of high-energy neutrinos. The team harnessed a new accelerator at the Brookhaven National Laboratory, New York, and built a fast-acting detector that made particle tracks visible as trails of sparks. It was on a massive scale. A 13.5-metre-thick steel shield, made from armour plates from scrapped warships, was used to block all particles except neutrinos, and a 10-tonne spark chamber was constructed to spot the ones produced.

Steinberger moved to Europe in 1968 to CERN, the particle-physics laboratory near Geneva, Switzerland. The multiwire chamber, which had just been invented there, was able to collect data thousands of times faster than a bubble chamber could. Steinberger used it to extend his studies of strange particles. In 1983, he led the design and construction of a large experiment called ALEPH to exploit CERN’s Large Electron–Positron collider. In 1989, ALEPH helped to demonstrate that there can be no more than three types of neutrino — the electron and the muon neutrinos, and a third associated with the tau particle, another ‘heavy electron’ discovered in 1975. This neatly completed the story that Steinberger had begun with his PhD thesis.

Steinberger retired from CERN in 1986, but continued to work with ALEPH researchers until the mid-1990s. He extended his interests to astrophysics and climate change, and in 2015 joined other Nobel prizewinners in signing the Mainau Declaration, urging governments to limit greenhouse-gas emissions.

Jack Steinberger was admired for his instinct and prowess as an experimental physicist, his intellect as a teacher and supervisor, and for being a great friend. He was not always right — after losing one bet with theorist friends about an aspect of physics, he paid up with good wine. He had a deep interest in how nature works, and enjoyed mountaineering and sailing.

Uninterested in prizes, he often reiterated his belief that “the pretension that some of us are better than others [is not] a good thing”. He felt he had been dealt lucky cards in his life, and expressed his deep gratitude to the Chicago family who gave him opportunities as a child. In his words: “You have only one life: whatever crops up, crops up.”