Algorithm Spots COVID-19 Cases from Eye Images


A small study shows artificial intelligence can pick out individuals with coronavirus infections, but ophthalmologists and AI experts say the approach is far from proven to be capable of distinguishing infections with SARS-CoV-2 from other ills.

by Anthony King

Scientists describe a potential screening method for COVID-19 based on eye images analyzed by artificial intelligence. Scanning a set of images from several hundred individuals with and without COVID-19, the tool accurately diagnosed coronavirus infections more than 90 percent of the time, the developers reported in a preprint posted to medRxiv September 10.

“Our model is quite fast,” Yanwei Fu, a computer scientist at Fudan University in Shanghai, China, who led the study, tells The Scientist. “In less than a second it can check results.”

Currently, screening for coronavirus infection involves CT imaging of the lungs or analyzing samples from the nose or throat, both of which take time and require professional effort. A system based on a few images of the eyes that could triage or even diagnose people would save on both costs and time, says Fu. But the investigation by Fu’s team is preliminary and both ophthalmologists and AI specialists say they’d want to see much more information on the technique—and its performance—before being convinced it could work.

Volunteers at the Shanghai Public Health Clinical Centre, which is affiliated with Fudan University, each had five photos of their eyes taken using common CCD or CMOS cameras. Of 303 patients, 104 had COVID-19, 131 had other pulmonary conditions, and 68 had eye diseases. A neural network tool extracted and quantified features from different regions of the eye, and an algorithm recognized the ocular characteristics of each disease. A neural network is a machine learning model that learns patterns from data in a way loosely inspired by the human brain. The researchers then carried out a validation experiment on a small dataset from healthy people, COVID-19 patients, pulmonary patients, and ocular patients.
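
For readers curious what such a tool looks like in code, here is a minimal, purely illustrative sketch of an image-classification pipeline of the kind the preprint describes: a small convolutional network that extracts features from eye-region photos and assigns each image to one of the study's four groups. It is not the authors' actual architecture; the layer sizes, class labels, and image dimensions are assumptions made for the example.

```python
# Illustrative sketch only; not the model from the medRxiv preprint.
import torch
import torch.nn as nn

# The four groups described in the study (labels are placeholders)
CLASSES = ["healthy", "covid19", "other_pulmonary", "ocular_disease"]

class EyeRegionClassifier(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        # Convolutional feature extractor: quantifies local patterns in the eye region
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        # Classification head: maps the extracted features to one of the four groups
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# Example forward pass on five eye photos (the study took five per volunteer)
model = EyeRegionClassifier()
photos = torch.randn(5, 3, 224, 224)          # five RGB images, 224x224 pixels
probabilities = model(photos).softmax(dim=1)  # per-image class probabilities
```

In a real screening system, the per-image predictions for a person's five photos would presumably be combined into a single call for that person.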

Of 24 people with confirmed coronavirus infections, the tool correctly diagnosed 23, Fu tells The Scientist. And the algorithm accurately identified 30 out of 30 uninfected individuals.
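
In the standard language of diagnostic testing, those validation figures correspond to the sensitivity and specificity computed below; the numbers come straight from the article, while the metric names are the conventional definitions rather than terms from the preprint itself.

```python
# Quick arithmetic on the reported validation results
true_positives, total_infected = 23, 24      # infected volunteers correctly flagged
true_negatives, total_uninfected = 30, 30    # uninfected volunteers correctly cleared

sensitivity = true_positives / total_infected    # ~0.958: share of infections caught
specificity = true_negatives / total_uninfected  # 1.0: share of healthy people correctly cleared
print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")
```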

Coronavirus infections, not just those caused by SARS-CoV-2, have long had associations with the eye, causing inflammation of the transparent membrane that covers the inside of the eyelid and whites of the eyeball, a condition called conjunctivitis, or pink eye. The eyes also offer a route to infection for respiratory viruses, including coronaviruses.

Human coronavirus NL63, which causes common cold symptoms, was first identified in 2004 in a baby with bronchiolitis and conjunctivitis. Subsequent studies showed that a minority of children infected with this coronavirus suffer from this eye condition.

Although conjunctivitis remains a potential symptom of coronavirus infections, less than 5 percent of COVID-19 patients actually present with eye symptoms, notes Daniel Ting, an ophthalmologist at the Singapore National Eye Centre who has published on this topic and on deep learning in ophthalmology. “If you look to develop an AI system to detect COVID-19 based on [limited numbers of] eye images, I think the performance is not going to be great,” especially given the low prevalence of eye symptoms. He also doubts the algorithm’s performance because “a lot of eye manifestations could be due to reasons other than COVID-19.”

Ting cautions that the sample size of 303 patients and 136 healthy individuals in the Shanghai study is too small to draw strong conclusions. “To develop a good deep learning system to automatically detect some unique features from any medical imaging requires more patients,” he says. “In order to increase the reliability of this study, the sample size would need to be multiplied by at least ten times, so, thousands of patients.”

Fu has started down this road, increasing the number of participants and broadening the types of subjects. “We are now doing more double-blind tests in the hospitals, with patients, some with eye diseases,” he says. The group also plans to introduce an online screening platform that uses the algorithm to screen for COVID-19.

“As an ophthalmologist it would be very surprising if there is a distinct COVID viral conjunctivitis pattern as opposed to other similar forms of viral conjunctivitis,” ophthalmologist Alastair Denniston, the director of the Health Data Research Hub for Eye Health in Birmingham, UK, writes in an email to The Scientist. “This is unlike building an algorithm for conditions which are biologically more distinct like macular degeneration,” he writes.

He notes that if there were a unique pattern evident in COVID-19 cases, “then the comparison for training and testing should be against cases that look similar,” such as non–COVID-19 viral conjunctivitis or other causes of a red eye associated with colds caused by adenovirus or rhinovirus. He also faults the paper for not providing “the necessary description to really critique the science in terms of how they built and (tried to) validate the model.”

Denniston recently reviewed more than 20,000 AI studies on detecting disease from medical imaging and found that less than 1 percent were sufficiently robust in design and reporting for independent reviewers to have high confidence in their claims. This led him to convene a group of experts to define international standards for the design and reporting of clinical trials of AI systems. These standards were published this month in Nature Medicine, The BMJ, and Lancet Digital Health and are supported by leading medical journals.

The Shanghai study has some potentially controversial applications, even if the AI works. Their algorithm could be used in public places, Fu says, though this would raise data privacy concerns in many countries. “In China, for example, we have a lot of high-resolution cameras everywhere,” he notes. “In airports or at train stations, we could use these surveillance cameras to check people’s eyes.” The program would be most accurate if people looked directly at the camera, but Fu says “as long as our camera can clearly watch the eye region it would be good enough.”

Screening the public without express consent using this algorithm would be ruled out of bounds in some parts of the world. “In Europe, this would be highly problematic and most likely illegal, in violation of the EU Charter of Fundamental Rights and general data protection legislation,” says computer scientist Barry O’Sullivan of University College Cork in Ireland, who is an expert in AI. Gathering health data and biometric data in Europe requires consent.

O’Sullivan echoes the concern that the paper falls short on detail regarding its methodology. “It is an interesting hypothesis,” he says. But, as currently written, it isn’t ready for publication in a machine learning journal, he concludes.

https://www.the-scientist.com/news-opinion/algorithm-spots-covid-19-cases-from-eye-images-preprint-67950?utm_campaign=TS_COVID_2020&utm_medium=email&_hsmi=95982719&_hsenc=p2ANqtz-9Yy1B2Xmi9R6CwrN9ytEJhx3fVqUcuwGY-VPFb8WfDvmsP-YpW1o88w_FFE4c2gEC3FaXCS8EcsQJE9dcmxX3h1iub7A&utm_content=95982719&utm_source=hs_email

Too much or too little sleep may increase dementia risk

By Brian P. Dunleavy

Getting too much or too little sleep may increase the risk for cognitive decline, or dementia, in older adults, according to a study published Monday by JAMA Network Open.

In an analysis of the sleep habits of more than 20,000 English and Chinese adults age 48 to 75, people who slept for fewer than four hours or more than 10 hours per day showed evidence of declines in cognitive function, including memory and language comprehension, researchers said.

“This study is an observational study and cannot demonstrate a causal relationship,” study co-author Yanjun Ma told UPI, so the findings don’t necessarily prove that lack of sleep or excessive sleep causes a decline in cognitive function.

Observational studies assess only the association between an exposure, in this case sleep, and an outcome in study participants, without modifying that exposure to compare differences.

It’s possible that diminished or excessive sleep is an early sign of cognitive decline or dementia, as opposed to a risk factor, researchers said.

“Future mechanism studies, as well as intervention studies examining the association between sleep duration and cognitive decline are required,” said Ma, of the Peking University Clinical Research Institute in China.

As many as 6 million Americans have some form of dementia, and changes in sleep patterns are common, according to the Alzheimer’s Association.

To date, research has shown that sleep disturbances can result from cognitive impairment, while animal studies have found links between lack of sleep and increased levels of brain proteins that are thought to be markers of Alzheimer’s disease, said Dr. Yue Leng, who authored a commentary on the study findings.

Leng is an assistant professor of psychiatry at the University of California-San Francisco.

For their research, Ma and colleagues analyzed data on sleep behaviors and cognitive function in 20,065 adults from the English Longitudinal Study of Aging and the China Health and Retirement Longitudinal Study, and tracked them for about eight years, on average.

In addition to finding higher levels of cognitive decline among those who slept fewer than four or more than 10 hours per day, the researchers also observed that people with these sleep habits had “faster cognitive decline” than those who slept seven to nine hours per day, Ma said.

“It’s usually believed that sleep deprivation might lead to cognitive decline, but it’s unclear why too much sleep might be bad for cognitive health,” Leng added. “Older adults should pay more attention to their sleep habits, as these might have implications for their cognitive health.”

https://www.upi.com/Health_News/2020/09/21/Too-much-or-too-little-sleep-may-increase-dementia-risk/5341600697569/

Record-Breaking Whale Stays Underwater for 3 Hours and 42 Minutes

By George Dvorsky

Marine biologists are astonished after a Cuvier’s beaked whale held its breath for nearly four hours during a deep dive. The unexpected observation shows there’s much to learn about these medium-sized whales.

Scientists from Duke University and the Cascadia Research Collective recorded the unbelievable dive during field observations off the coast of Cape Hatteras, North Carolina, in 2017. In the first of two epic dives, the Cuvier’s beaked whale, wearing tag ZcTag066, stayed underwater for nearly three hours. A week later, the whale outdid itself, holding its breath for a bewildering three hours and 42 minutes.

“We didn’t believe it at first, because these are mammals after all, and any mammal spending that long underwater just seemed incredible,” Nicola Quick, the lead author of the new study and a biologist at Duke University, said in an email.

The record-breaking observations occurred in the midst of a five-year survey, in which Quick and her colleagues were measuring the time it takes Cuvier’s beaked whales (Ziphius cavirostris) to perform their deep foraging dives. During these dives, the whales venture to depths exceeding 9,800 feet (3,000 meters) and hunt squid and deep-sea fish. Unfortunately, the two recordings of ZcTag066 had to be excluded from the researchers’ primary data set “because they were recorded 17 and 24 days after a known [one-hour] exposure to a Navy mid-frequency active sonar signal,” as the authors wrote in the study, adding that these two extreme dives “are perhaps more indicative of the true limits of the diving behaviour of this species.” It’s possible the exposure to sonar might have altered the whale’s normal diving habits, but the researchers don’t know.

Going into the study, the scientists had estimated a maximum length of 33 minutes for the deep dives, after which the whales need to resurface and gulp some precious atmospheric oxygen or switch to “anaerobic respiration,” in the parlance of the researchers. The team conducted field observations to test this assumption and to measure the length of time it takes for these toothed whales to recover once at the surface. Details of their work were published today in the Journal of Experimental Biology.

Cuvier’s beaked whales are elusive and skittish, having developed fascinating strategies to avoid predators, namely orcas. Thus, it was a challenge for the team to place their satellite-linked tags onto the whales.

“Because the animals spend so little time at the surface, we needed calm seas and experienced observers to look for them,” said Quick in a press release, adding that the “average period they spend at the surface is about two minutes, so getting a tag on [them] takes a dedicated crew and a manoeuvrable vessel.”

The researchers managed to tag 23 individuals, with field observations ongoing from 2014 to 2018. In total, the scientists recorded more than 3,600 foraging dives, the median duration of which was clocked at 59 minutes. The shortest dives lasted just 33 minutes, but the longest dive (excluding ZcTag066’s) was recorded at 2 hours and 13 minutes.

With this data in hand, the researchers had to revise their models. They revisited the breath-holding patterns and abilities of other aquatic mammals, which led to a new estimate of 77.7 minutes. This still fell considerably short of their field observations, as 5% of dives exceeded this apparent limit.
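
As a rough illustration of how that 5 percent figure comes out of tagged dive records, the sketch below counts dives that exceed the revised estimate; the dive durations are invented for the example, and only the 77.7-minute limit comes from the study.

```python
# Hypothetical dive durations in minutes; only the limit below is from the study
dive_minutes = [33, 47, 59, 62, 78, 81, 95, 133]
aerobic_limit = 77.7  # revised estimate of the aerobic dive limit

exceeding = sum(duration > aerobic_limit for duration in dive_minutes)
print(f"{100 * exceeding / len(dive_minutes):.1f}% of dives exceed {aerobic_limit} minutes")
```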

Clearly, the scientists are missing something about these whales and the unique abilities that allow for their extended stays beneath the water. This sad fact was driven home even further when the team analyzed the whales’ recovery time, that is, the time spent on the surface after a long foraging dive in preparation for a subsequent dive.

It stands to reason that, after a super-long dive, a Cuvier’s beaked whale might want to chill on the surface for a bit to replenish its oxygen supply and rest its weary muscles. Weirdly, this assumption did not jibe with the field observations, as no clear pattern emerged from the data. For example, a whale that dove for 2 hours needed just 20 minutes of rest before it went back for more, while another whale, after diving for 78 minutes, stayed on the surface for 4 hours before foraging again. The new study raises more questions than it answers.

We asked Quick how it’s possible for these mammals to stay underwater for so long.

“These animals are really adapted to diving, so they have lots of myoglobin in their muscles, which helps them to hold more oxygen in their bodies,” she replied. “They are also able to reduce their energy expenditure by being streamlined to dive, and we think reducing their metabolic rate. It’s likely they have many other adaptations as well that we still don’t fully understand, such as being able to reduce their heart rates and restrict the movement of blood flow to tissues.”

As to why some of the dives lasted so long, the authors said the whales may have been enjoying their time in areas rich in food or reacting to a perceived threat, such as a noise disturbance (U.S. Navy, we’re looking at you).

An encouraging aspect of this study is how much there is still to learn about these aquatic mammals. Clearly, it’s a case of biology exceeding our expectations, which can only be described as exciting.

https://gizmodo.com/record-breaking-whale-stays-underwater-for-mind-bending-1845155205

Mediterranean diet helps offset the health impacts of obesity

By Chrissy Sexton

The Mediterranean diet helps to counter the health impacts of obesity, according to a new study from Uppsala University in Sweden.

In 2015, four million deaths were attributed to excessive weight and more than two-thirds of those deaths were caused by cardiovascular disease (CVD).

“Despite the increasing prevalence of obesity, the rates of CVD-related death continue to decrease in Western societies, a trend not explained by medical treatment alone,” wrote the study authors. “These observations suggest that other factors might modify the higher risk of CVD associated with higher body mass. Potentially, one such factor is diet.”

The Mediterranean diet centers mainly around plant-based foods such as vegetables, fruits, herbs, nuts, beans, and whole grains. The diet also includes moderate amounts of dairy, poultry, eggs, and seafood, while red meat is only eaten occasionally.

A team led by Karl Michaëlsson set out to investigate how a Mediterranean-style diet among individuals with a higher body mass index (BMI) may affect all-cause mortality, with a particular focus on fatal cardiovascular events.

The study was focused on data from more than 79,000 Swedish adults enrolled in the Swedish Mammography Cohort and Cohort of Swedish Men.

Adherence to a Mediterranean-like diet (mMED) was assessed on a scale of 0 to 8, based on intake of fruits and vegetables, legumes, nuts, high-fiber grains, fish, red meat, and olive oil.
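
As a rough sketch of how a 0-to-8 adherence score of this kind is commonly constructed, the code below awards one point per food group: beneficial items score a point when intake is at or above the cohort median, and red meat scores a point when intake is at or below it. The component list follows the article (which names seven food groups for an eight-point scale, so the sketch tops out at 7); the cut-offs and the exact scoring rules used in the Swedish cohorts are assumptions for illustration only.

```python
# Illustrative Mediterranean-diet adherence score; not the study's exact definition
BENEFICIAL = ["fruits_vegetables", "legumes", "nuts", "high_fiber_grains", "fish", "olive_oil"]
DETRIMENTAL = ["red_meat"]

def mmed_score(intake: dict, cohort_median: dict) -> int:
    """Return an adherence score; higher means closer to a Mediterranean-like diet."""
    score = 0
    for item in BENEFICIAL:
        score += intake[item] >= cohort_median[item]   # point for eating enough of it
    for item in DETRIMENTAL:
        score += intake[item] <= cohort_median[item]   # point for keeping intake low
    return score
```

A participant scoring near the top of this range would fall into the "high mMED" group discussed below.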

Over 21 years of follow-up, more than 30,000 participants died. The researchers found that individuals classified as overweight with high mMED had the lowest risk of all-cause mortality. Obese individuals who had high mMED did not have a higher mortality risk compared with those in the healthy weight group with the same diet.

By contrast, individuals with a healthy weight but low mMED had higher mortality rates compared to people in the same weight range who regularly adhered to a Mediterranean-style diet.

The findings were very similar among 12,000 participants who died from cardiovascular disease. The researchers determined that CVD mortality associated with high BMI was reduced by adherence to a Mediterranean diet, although it was not fully countered. Furthermore, lower BMI did not help offset the elevated CVD mortality risk associated with a low mMED.

“These results indicate that adherence to healthy diets such as a Mediterranean-like diet may be a more appropriate focus than avoidance of obesity for the prevention of overall mortality,” wrote the study authors. “Nonetheless, a healthy diet may not completely counter higher CVD mortality related with obesity.”

The research is published in the journal PLOS Medicine.


Why do we sleep? The answer may change right before we turn 3.

By Nicoletta Lanese

Humans spend about a third of our lives sleeping, and scientists have long debated why slumber takes up such a huge slice of our time. Now, a new study hints that our main reason for sleeping starts off as one thing, then changes at a surprisingly specific age.

Two leading theories as to why we sleep focus on the brain: One theory says that the brain uses sleep to reorganize the connections between its cells, building electrical networks that support our memory and ability to learn; the other theory says that the brain needs time to clean up the metabolic waste that accumulates throughout the day. Neuroscientists have quibbled over which of these functions is the main reason for sleep, but the new study reveals that the answer may be different for babies and adults.

In the study, published Sep. 18 in the journal Science Advances, researchers use a mathematical model to show that infants spend most of their sleeping hours in rapid eye movement (REM) sleep, while their brains rapidly build new connections between cells and grow ever larger. Then, just before toddlers reach age 2-and-a-half, their amount of REM sleep dips dramatically as the brain switches into maintenance mode, mostly using sleep time for cleaning and repair.

“It was definitely shocking to us that this transition was so sharp,” from growth mode to maintenance mode, senior author Van Savage, a professor of ecology and evolutionary biology and of computational medicine at the University of California, Los Angeles and the Santa Fe Institute, told Live Science in an email. The researchers also collected data in other mammals — namely rabbits, rats and guinea pigs — and found that their sleep might undergo a similar transformation; however, it’s too soon to tell whether these patterns are consistent across many species.

That said, “I think in actuality, it may not be really so sharp” a transition, said Leila Tarokh, a neuroscientist and Group Leader at the University Hospital of Child and Adolescent Psychiatry and Psychotherapy at the University of Bern, who was not involved in the study. The pace of brain development varies widely between individuals, and the researchers had fairly “sparse” data points between the ages of 2 and 3, she said. If they studied individuals through time as they aged, they might find that the transition is less sudden and more smooth, or the age of transition may vary between individuals, she said.

An emerging hypothesis

In a previous study, published in 2007 in the journal Proceedings of the National Academy of Sciences, Savage and theoretical physicist Geoffrey West found that an animal’s brain size and brain metabolic rate accurately predict the amount of time the animal sleeps — more so than the animal’s overall body size. In general, big animals with big brains and low brain metabolic rates sleep less than small animals with the opposite features.

This rule holds up across different species and between members of the same species; for instance, mice sleep more than elephants, and newborn babies sleep more than adult humans. However, knowing that sleep time decreases as brains get bigger, the authors wondered how quickly that change occurs in different animals, and whether that relates to the function of sleep over time.

To begin answering these questions, the researchers pooled existing data on how much humans sleep, compiling several hundred data points from newborn babies and children up to age 15. They also gathered data on brain size and metabolic rate, the density of connections between brain cells, body size and metabolic rate, and the ratio of time spent in REM sleep versus non-REM sleep at different ages; the researchers drew these data points from more than 60 studies, overall.

Babies sleep about twice as much as adults, and they spend a larger proportion of their sleep time in REM, but there’s been a long-standing question as to what function that serves, Tarokh noted.

The study authors built a mathematical model to track all these shifting data points through time and see what patterns emerged between them. They found that the metabolic rate of the brain was high during infancy when the organ was building many new connections between cells, and this in turn correlated with more time spent in REM sleep. They concluded that the long hours of REM in infancy support rapid remodeling in the brain, as new networks form and babies pick up new skills. Then, between age 2 and 3, “the connections are not changing nearly as quickly,” and the amount of time spent in REM diminishes, Savage said.

At this time, the metabolic rate of cells in the cerebral cortex — the wrinkled surface of the brain — also changes. In infancy, the metabolic rate is proportional to the number of existing connections between brain cells plus the energy needed to fashion new connections in the network. As the rate of construction slows, the relative metabolic rate slows in turn.
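
Written schematically (the notation here is mine and simply restates the relationship described above, not the paper's exact formulation), the idea is that the brain's metabolic rate splits into a maintenance term and a construction term:

\[ B(t) \;\propto\; a\,N(t) \;+\; b\,\frac{dN}{dt} \]

where \(N(t)\) is the number of connections at age \(t\), \(a\) is the energetic cost of maintaining an existing connection, and \(b\) the cost of forming a new one. As \(dN/dt\) falls off after the first few years of life, the construction term shrinks and the relative metabolic rate declines with it.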

“In the first few years of life, you see that the brain is making tons of new connections … it’s blossoming, and that’s why we see all those skills coming on,” Tarokh said. Developmental psychologists refer to this as a “critical period” of neuroplasticity — the ability of the brain to forge new connections between its cells. “It’s not that plasticity goes away” after that critical period, but the construction of new connections slows significantly, as the new mathematical model suggests, Tarokh said. At the same time, the ratio of non-REM to REM sleep increases, supporting the idea that non-REM is more important to brain maintenance than neuroplasticity.

Looking forward, the authors plan to apply their mathematical model of sleep to other animals, to see whether a similar switch from reorganization to repair occurs early in development, Savage said.

“Humans are known to be unusual in the amount of brain development that occurs after birth,” lead author Junyu Cao, an assistant professor in the Department of Information, Risk, and Operations Management at The University of Texas at Austin, told Live Science in an email. (Cao played a key role in compiling data and performing computations for the report.) “Therefore, it is conceivable that the phase transition described here for humans may occur earlier in other species, possibly even before birth.”

In terms of human sleep, Tarokh noted that different patterns of electrical activity, known as oscillations, occur in REM versus non-REM sleep; future studies could reveal whether and how particular oscillations shape the brain as we age, given that the amount of time spent in REM changes, she said. Theoretically, disruptions in these patterns could contribute to developmental disorders that emerge in infancy and early childhood, she added — but again, that’s just a hypothesis.

https://www.livescience.com/why-we-sleep-brain-study.html?utm_source=Selligent&utm_medium=email&utm_campaign=9160&utm_content=LVS_newsletter+&utm_term=3675605&m_i=8UY_jNTynHe_3zcTUUgbNBfsu5EMTsCU43mgi8tOnteS3vPmqAlJk16Q6TSxIHJi1tgAdgnm2Gm4GezgFd85bVdOj8L2hG9inkdIOF888c

Playing video games in childhood improves working memory years later

By Chrissy Sexton

Playing video games as a child leads to long-lasting cognitive benefits, according to new research from the Universitat Oberta de Catalunya (UOC). The study suggests that gaming improves working memory and concentration.

Previous studies have shown that gaming improves attention, enhances visual-spatial skills, and causes structural changes in the brain – even increasing the size of some regions. The current study is the first to show that video games promote positive cognitive changes that can take place years after people stop playing them.

“People who were avid gamers before adolescence, despite no longer playing, performed better with the working memory tasks, which require mentally holding and manipulating information to get a result,” said study lead author Dr. Marc Palaus.

The research was focused on 27 people between the ages of 18 and 40, with and without prior video gaming experience.

The experts analyzed cognitive skills, including working memory, at three points during the study period: before training the volunteers to play Nintendo’s Super Mario 64, at the end of the training, and fifteen days later.

The findings revealed that participants who had not played video games in childhood did not benefit from improvements in processing and inhibiting irrelevant stimuli. As expected, these individuals were initially slower than those who had played games as children.

“People who played regularly as children performed better from the outset in processing 3D objects, although these differences were mitigated after the period of training in video gaming, when both groups showed similar levels,” said Dr. Palaus.

The experts also performed 10 sessions of a non-invasive brain stimulation technique known as transcranial magnetic stimulation on the participants.

“It uses magnetic waves which, when applied to the surface of the skull, are able to produce electrical currents in underlying neural populations and modify their activity,” explained Palaus.

The researchers theorized that combining video gaming with this type of stimulation could improve cognitive performance, but that was not the case.

“We aimed to achieve lasting changes. Under normal circumstances, the effects of this stimulation can last from milliseconds to tens of minutes. We wanted to achieve improved performance of certain brain functions that lasted longer than this.”

The game used in the study was a 3D platformer, but there are many types of video games that can influence cognitive functions. According to Dr. Palaus, what most video games have in common is that they involve elements that make people want to continue playing, and that they gradually get harder and present a constant challenge.

“These two things are enough to make it an attractive and motivating activity, which, in turn, requires constant and intense use of our brain’s resources,” said Dr. Palaus. “Video games are a perfect recipe for strengthening our cognitive skills, almost without our noticing.”

The study is published in the journal Frontiers in Human Neuroscience.


Man dies from eating bags of black licorice

By MARILYNN MARCHIONE

A Massachusetts construction worker’s love of black licorice wound up costing him his life. Eating a bag and a half every day for a few weeks threw his nutrients out of whack and caused the 54-year-old man’s heart to stop, doctors reported Wednesday.

“Even a small amount of licorice you eat can increase your blood pressure a little bit,” said Dr. Neel Butala, a cardiologist at Massachusetts General Hospital who described the case in the New England Journal of Medicine.

The problem is glycyrrhizic acid, found in black licorice and in many other foods and dietary supplements containing licorice root extract. It can cause dangerously low potassium and imbalances in other minerals called electrolytes.

Eating as little as 2 ounces of black licorice a day for two weeks could cause a heart rhythm problem, especially for folks over 40, the U.S. Food and Drug Administration warns.

“It’s more than licorice sticks. It could be jelly beans, licorice teas, a lot of things over the counter. Even some beers, like Belgian beers, have this compound in it,” as do some chewing tobaccos, said Dr. Robert Eckel, a University of Colorado cardiologist and former American Heart Association president. He had no role in the Massachusetts man’s care.

The death was clearly an extreme case. The man had switched from red, fruit-flavored twists to the black licorice version of the candy a few weeks before his death last year. He collapsed while having lunch at a fast-food restaurant. Doctors found he had dangerously low potassium, which led to heart rhythm and other problems. Emergency responders did CPR and he revived but died the next day.

The FDA permits foods to contain up to 3.1% glycyrrhizic acid, but many candies and other licorice products don’t reveal how much of it is contained per ounce, Butala said. Doctors have reported the case to the FDA in the hope of raising awareness of the risk.

Jeff Beckman, a spokesman for the Hershey Company, which makes the popular Twizzlers licorice twists, said in an email that “all of our products are safe to eat and formulated in full compliance with FDA regulations,” and that all foods, including candy, “should be enjoyed in moderation.”

https://abcnews.go.com/Health/wireStory/candy-man-dies-eating-bags-black-licorice-73203407

New RNA-Based Tool Could Assess Preeclampsia Risk


Transcripts circulating in the blood provide real-time information about maternal, fetal, and placental health.

by Amanda Heidt

Preeclampsia, a potentially fatal complication that affects roughly 5 percent of pregnancies worldwide, can only be diagnosed after the onset of symptoms such as high blood pressure, so treatment is always reactive. “The next really big need is better methods to diagnose or predict risk of pregnancy complications such as preeclampsia,” says Fiona Kaper, a senior director of scientific research at the biotech company Illumina.

To identify possible biomarkers of the condition, Kaper and her colleagues drew blood from 40 pregnant women with early-onset severe preeclampsia and 73 unaffected expecting mothers. Circulating in the blood of each mom-to-be is her own RNA, as well as transcripts from the placenta and the fetus. Studying these circulating RNAs (cRNAs), the team identified 30 maternal, fetal, or placental genes with altered expression patterns in women with preeclampsia compared with controls. A machine learning algorithm also identified 49 genes with altered expression suspected of being linked to preeclampsia, including 12 that overlapped with the earlier list.
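
The overlap between the two gene lists is easy to picture; in the toy example below, only the counts (30 genes, 49 genes, 12 in common) come from the article, and the gene names are placeholders.

```python
# Toy example of the overlap between the two candidate-gene lists
differential_genes = {f"gene_{i}" for i in range(30)}   # 30 genes from the expression comparison
model_genes = {f"gene_{i}" for i in range(18, 67)}      # 49 genes flagged by the algorithm
overlap = differential_genes & model_genes
print(len(overlap))  # 12 genes appear on both lists
```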

To test the ability of the 49 suspect genes to predict preeclampsia, the researchers used them to classify an independent cohort of two dozen women, half with early-onset preeclampsia and half without signs of the condition. The model predicted which women had preeclampsia with 85 percent to 89 percent accuracy.

While large-scale, prospective studies are still needed, cRNA screening represents a step toward earlier preemptive diagnosis, says Kathryn Gray, an obstetrician at Brigham and Women’s Hospital who was not involved in the study. She notes that researchers have been doing something similar in detecting circulating tumor DNA for cancer screening. “It’s really exciting that we’re applying some of these . . . strategies that have been used in cancer to pregnancy. We’re always a bit behind in women’s health and pregnancy in applying the most cutting-edge technologies.”

S. Munchel et al., “Circulating transcripts in maternal blood reflect a molecular signature of early-onset preeclampsia,” Sci Transl Med, 12:eaaz0131, 2020.

https://www.the-scientist.com/the-literature/new-rna-based-tool-could-assess-preeclampsia-risk-67873?utm_campaign=TS_DAILY%20NEWSLETTER_2020&utm_medium=email&_hsmi=95863075&_hsenc=p2ANqtz-8H-ikdjnvKrmnzwtrGHlOIath9Qs78m–DSqudO6tO-Y6Y2DAvu65i9JT3SBSgMACaMx4xfNiVpw5StKx8sw1URGXMeg&utm_content=95863075&utm_source=hs_email

DNA data shows not all Vikings were Scandinavian

In the public imagination, the Vikings were closely related clans of Scandinavians who marauded their way across Europe, but new genetic analysis paints a more complicated picture.

For the last six years, researchers in Britain and Denmark have been sequencing and analyzing DNA from more than 400 Viking skeletons recovered from dig sites across Europe and Greenland.

The data, published Wednesday in the journal Nature, suggests Vikings were more genetically diverse than researchers thought.

“We have this image of well-connected Vikings mixing with each other, trading and going on raiding parties to fight kings across Europe, because this is what we see on television and read in books — but genetically we have shown for the first time that it wasn’t that kind of world,” lead researcher Eske Willerslev said in a news release.

“This study changes the perception of who a Viking actually was — no one could have predicted these significant gene flows into Scandinavia from Southern Europe and Asia happened before and during the Viking Age,” said Willerslev, a professor of evolutionary genetics at Cambridge University.

The so-called Viking Age begins with the earliest record of a Viking raid, dated to 800 A.D. The age lasted through the 1050s. During that time, Vikings raided monasteries and coastal cities, but also engaged in less violent activities, trading fur, tusks and seal fat.

Researchers knew the Vikings altered the political and economic landscape of Europe. In the 11th century, a Viking, Cnut the Great, ascended to the throne of the North Sea Empire, comprising Denmark, England and Norway. But until now, researchers weren’t really sure what the Vikings looked like, genetically speaking.

“We found genetic differences between different Viking populations within Scandinavia which shows Viking groups in the region were far more isolated than previously believed,” said Willerslev, director of the Lundbeck Foundation GeoGenetics Center at the University of Copenhagen.

“Our research even debunks the modern image of Vikings with blonde hair as many had brown hair and were influenced by genetic influx from the outside of Scandinavia,” he said.

The DNA recovered from Viking burial sites showed raiding parties from what’s now Norway traveled to Ireland, Scotland, Iceland and Greenland, while groups from what’s now Sweden traveled to Baltic countries.

“We discovered that a Viking raiding party expedition included close family members as we discovered four brothers in one boat burial in Estonia who died the same day,” said study co-author Ashot Margaryan.

“The rest of the occupants of the boat were genetically similar suggesting that they all likely came from a small town or village somewhere in Sweden,” said Margaryan, an assistant professor of evolutionary genomics at the University of Copenhagen.

Researchers also found evidence that local people in Scotland, Celtic-speaking people known as Picts, adopted Viking identities and were buried as Vikings, but never genetically mixed with Scandinavians.

The DNA sequencing efforts showed Viking populations in Scandinavia continued to receive genetic inflows from throughout Europe during the Viking Age.

“Individuals with two genetically British parents who had Viking burials were found in Orkney [Scotland] and Norway,” said Daniel Lawson, lead author from the University of Bristol in Britain. “This is a different side of the cultural relationship from Viking raiding and pillaging.”

https://www.upi.com/Science_News/2020/09/16/DNA-data-shows-not-all-Vikings-were-Scandinavian/9231600264027/

Thanks to Mr. C for bringing this to the It’s Interesting community.

The True Origins of Gold in Our Universe May Have Just Changed, Again

By MICHELLE STARR

When humanity finally detected the collision between two neutron stars in 2017, we confirmed a long-held theory – in the energetic fires of these incredible explosions, elements heavier than iron are forged.

And so, we thought we had an answer to the question of how these elements – including gold – propagated throughout the Universe.

But a new analysis has revealed a problem. According to new galactic chemical evolution models, neutron star collisions don’t even come close to producing the abundances of heavy elements found in the Milky Way galaxy today.

“Neutron star mergers did not produce enough heavy elements in the early life of the Universe, and they still don’t now, 14 billion years later,” said astrophysicist Amanda Karakas of Monash University and the ARC Centre of Excellence for All Sky Astrophysics in 3 Dimensions (ASTRO 3D) in Australia.

“The Universe didn’t make them fast enough to account for their presence in very ancient stars, and, overall, there are simply not enough collisions going on to account for the abundance of these elements around today.”

Stars are the forges that produce most of the elements in the Universe. In the early Universe, after the primordial quark soup cooled enough to coalesce into matter, it formed hydrogen and helium – still the two most abundant elements in the Universe.

The first stars formed as gravity pulled together clumps of these materials. In the nuclear fusion furnaces of their cores, these stars forged hydrogen into helium; then helium into carbon; and so on, fusing heavier and heavier elements as they run out of lighter ones until iron is produced.

Iron itself can fuse, but it consumes huge amounts of energy – more than such fusion produces – so an iron core is the end point.

“We can think of stars as giant pressure cookers where new elements are created,” Karakas said. “The reactions that make these elements also provide the energy that keeps stars shining bright for billions of years. As stars age, they produce heavier and heavier elements as their insides heat up.”

To create elements heavier than iron – such as gold, silver, thorium and uranium – the rapid neutron-capture process, or r-process, is required. This can take place in really energetic explosions, which generate a series of nuclear reactions in which atomic nuclei collide with neutrons to synthesise elements heavier than iron.

But it needs to happen really quickly, so that radioactive decay doesn’t have time to occur before more neutrons are added to the nucleus.

We know now that the kilonova explosion generated by a neutron star collision is an energetic-enough environment for the r-process to take place. That’s not under dispute. But, in order to produce the quantities of these heavier elements we observe, we’d need a minimum frequency of neutron star collisions.

To figure out the sources of these elements, the researchers constructed galactic chemical evolution models for all stable elements from carbon to uranium, using the most up-to-date astrophysical observations and chemical abundances in the Milky Way available. They included theoretical nucleosynthesis yields and event rates.

They laid out their work in a periodic table that shows the origins of the elements they modelled. Among their findings, the frequency of neutron star collisions came up short, from the early Universe to now. Instead, they believe that a type of supernova could be responsible.

These are called magnetorotational supernovae, and they occur when the core of a massive, fast-spinning star with a strong magnetic field collapses. These are also thought to be energetic enough for the r-process to take place. If a small percentage of supernovae of stars between 25 and 50 solar masses are magnetorotational, that could make up the difference.

“Even the most optimistic estimates of neutron star collision frequency simply can’t account for the sheer abundance of these elements in the Universe,” said Karakas. “This was a surprise. It looks like spinning supernovae with strong magnetic fields are the real source of most of these elements.”

Previous research has found a type of supernova called a collapsar supernova can also produce heavy elements. This is when a rapidly rotating star over 30 solar masses goes supernova before collapsing down into a black hole. These are thought to be much rarer than neutron star collisions, but they could be a contributor – it matches neatly with the team’s other findings.

They found that stars less massive than about eight solar masses produce carbon, nitrogen, fluorine, and about half of all the elements heavier than iron. Stars more massive than eight solar masses produce most of the oxygen and calcium needed for life, as well as most of the rest of the elements between carbon and iron.

“Apart from hydrogen, there is no single element that can be formed only by one type of star,” explained astrophysicist Chiaki Kobayashi of the University of Hertfordshire in the UK.

“Half of carbon is produced from dying low-mass stars, but the other half comes from supernovae. And half the iron comes from normal supernovae of massive stars, but the other half needs another form, known as Type Ia supernovae. These are produced in binary systems of low mass stars.”

This doesn’t necessarily mean that the estimated 0.3 percent of Earth’s gold and platinum traced back to a neutron star collision 4.6 billion years ago has a different origin story. It’s just not necessarily the whole story.

But we’ve only been detecting gravitational waves for five years. It could be, as our equipment and techniques improve, that we find neutron star collisions are much more frequent than we think they are at this current time.

Curiously, the researchers’ models also turned out more silver than observed, and less gold. That suggests something needs to be tweaked. Perhaps it’s the calculations. Or perhaps there are some aspects of stellar nucleosynthesis that we are yet to understand.

The research has been published in The Astrophysical Journal.

https://www.sciencealert.com/neutron-star-collisions-may-not-be-making-much-gold-after-all