Hardy water bears survive bullet impacts—up to a point

Water bear (Macrobiotus sapiens) in moss: color-enhanced scanning electron micrograph (SEM) of a water bear in its active state. Water bears (or tardigrades) are tiny invertebrates that live in aquatic and semi-aquatic habitats such as lichen and damp moss. They require water to obtain oxygen by gas exchange. In dry conditions, they can enter a cryptobiotic state of desiccation, known as a tun, in which they can survive for up to a decade. This species, found in moss samples from Croatia, feeds on plant and animal cells. Water bears live throughout the world, including in regions of extreme temperature, such as hot springs, and extreme pressure, such as the deep sea. They can also survive high levels of radiation and the vacuum of space.

By Jonathan O’Callaghan

They can survive temperatures close to absolute zero. They can withstand heat beyond the boiling point of water. They can shrug off the vacuum of space and doses of radiation that would be lethal to humans. Now, researchers have subjected tardigrades, microscopic creatures affectionately known as water bears, to impacts as fast as a flying bullet. And the animals survive them, too—but only up to a point. The test places new limits on their ability to survive impacts in space—and potentially seed life on other planets.

The research was inspired by a 2019 Israeli mission called Beresheet, which attempted to land on the Moon. The probe infamously included tardigrades on board that mission managers had not disclosed to the public, and the lander crashed with its passengers in tow, raising concerns about contamination. “I was very curious,” says Alejandra Traspas, a Ph.D. student at Queen Mary University of London who led the study. “I wanted to know if they were alive.”

Traspas and her supervisor, Mark Burchell, a planetary scientist at the University of Kent, wanted to find out whether tardigrades could survive such an impact—and they wanted to conduct their experiment ethically. So after feeding about 20 tardigrades moss and mineral water, they put them into hibernation, a so-called “tun” state in which their metabolism decreases to 0.1% of their normal activity, by freezing them for 48 hours. 

They then placed two to four at a time in a hollow nylon bullet and fired them at increasing speeds using a two-stage light gas gun, a tool used in physics experiments that can achieve muzzle velocities far higher than any conventional gun. Shooting the bullets into a sand target several meters away, the researchers found the creatures could survive impacts up to about 900 meters per second (roughly 3,240 kilometers per hour), and momentary shock pressures up to 1.14 gigapascals (GPa), they report this month in Astrobiology. “Above [those speeds], they just mush,” Traspas says.
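The reported limits can be treated as a simple pair of thresholds. The sketch below converts the survival speed between units and checks hypothetical impacts against both reported limits; the figures come from the article, but the helper functions are my own illustration, not from the study.

```python
# Illustrative only: the 900 m/s and 1.14 GPa limits are as reported in the
# article; the function names are hypothetical.

def ms_to_kmh(speed_ms: float) -> float:
    """Convert metres per second to kilometres per hour."""
    return speed_ms * 3.6

THRESHOLD_MS = 900    # reported tardigrade survival limit, m/s
THRESHOLD_GPA = 1.14  # reported peak shock-pressure limit, GPa

def survivable(impact_ms: float, shock_gpa: float) -> bool:
    """True if an impact stays under both reported limits."""
    return impact_ms <= THRESHOLD_MS and shock_gpa <= THRESHOLD_GPA

print(ms_to_kmh(900))            # 3240.0 km/h
print(survivable(800, 1.0))      # True: below both limits
print(survivable(11_000, 40.0))  # False: typical Earth meteorite entry speed
```

Note that both conditions must hold: the Beresheet crash was below the speed limit but, per Traspas, well above the pressure limit.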

The results suggest the tardigrades on Beresheet were unlikely to survive. Although the lander is thought to have crashed at a few hundred meters per second, the shock pressure its metal frame generated hitting the surface would have been “well above” 1.14 GPa, Traspas says. “We can confirm they didn’t survive.”

The research also places new limits on a theory known as panspermia, which suggests some forms of life could move between worlds, as stowaways on meteorites kicked up after an asteroid strikes a planet or moon. Eventually, the meteorite could impact another planet—along with its living cargo.

Charles Cockell, an astrobiologist at the University of Edinburgh who was not involved in the study, says the research shows how unlikely panspermia is. “What this paper is showing is that complex multicellular animals cannot be easily transferred,” he says. “In other words, Earth is a biogeographical island with respect to animals. They’re trapped, like a flightless bird on an island.”

Traspas, however, says it shows panspermia “is hard,” but not impossible. Meteorites typically strike Earth at speeds of more than 11 kilometers per second, and Mars at 8 kilometers per second or more, well above the threshold for tardigrades to survive. However, some parts of a meteorite impacting Earth or Mars would experience lower shock pressures that a tardigrade could live through, Traspas says.

Objects strike the Moon at still lower speeds. When impacts on Earth send bits of rock and debris hurtling toward the Moon, about 40% of that material could travel at speeds low enough for any tardigrades to survive, Traspas and Burchell say, theoretically allowing them to jump from our planet to the Moon. A similar passage, they add, could take place from Mars to its moon, Phobos. And other life might have an even better chance of surviving; compared with water bears, some microbes can survive even faster impacts of up to 5000 meters per second, according to previous research.

The new experiment also has implications for our ability to detect life on icy moons in the outer Solar System. Saturn’s moon Enceladus, for example, ejects plumes of water into space from a subsurface ocean that could support life, as might Jupiter’s moon Europa. If the findings of the new study apply to potential life trapped in the plumes, a spacecraft orbiting Enceladus—at relatively low speeds of hundreds of meters per second—might sample and detect existing life without killing it.

No such orbiting mission is currently planned for Enceladus or Europa—upcoming NASA and European flyby missions will swoosh by the latter at high speeds of several kilometers per second. But perhaps one day far in the future an orbiter might be in the cards, with an ability to detect life at gentler speeds. “If you collect it and it died on impact, how do you know whether it’s been dead for millions of years?” asks Anna Butterworth, a planetary scientist at the University of California, Berkeley, who has studied plume impacts on spacecraft. “If you collect microscopic life and it’s moving around, you can say it’s alive.”


Brain-Computer Interface User Types 90 Characters Per Minute with Mind

The experimental system, developed and tested in just one patient so far, relies on brain signals associated with handwriting to achieve the fastest communication yet seen with BCI.

A brain-implant system trained to decode the neural signals for handwriting from a paralyzed man enabled a computer to type up to 90 characters per minute with 94 percent accuracy, researchers report yesterday (May 12) in Nature. The study’s authors say this brain-computer interface (BCI) is a considerable improvement over other experimental devices aimed at facilitating communication for people who cannot speak or move, but many steps remain before it might be used clinically.

“There are so many aspects of [the study] that are great,” says Emily Oby, who works on BCIs at the University of Pittsburgh and was not involved in the work. “It’s a really good demonstration of human BCI that is working towards clinical viability,” and also contributes to understanding why the handwriting-based system seems to work better than BCIs based on translating the neural signals for more straightforward physical motions such as pointing at letters on a display.

The study came out of a long-term clinical trial called BrainGate2 in which participants who are paralyzed have sensors implanted in the motor cortex of their brains and work with researchers who aim to use the sensors’ data to develop BCIs. “Because of the animal model heritage and the history of the [BCI] field, a lot of the early stuff is focused on this point-and-click typing method where you move a cursor on a screen, and you type on keys individually,” explains Frank Willett, a member of the Neural Prosthetics Translational Laboratory (NPTL) at Stanford University and a Howard Hughes Medical Institute research specialist. “We’re interested in kind of pushing the boundaries and looking at other ways to let people communicate.”

Willett and his colleagues worked with a BrainGate2 participant nicknamed T5 who has a spinal injury, is able to talk, and has a sensor in an area of the brain known as the hand knob that is associated with hand movement. In several sessions, they asked T5 to pretend he was holding a pen and writing hundreds of sentences they showed him on a screen. They then used the activity detected by T5’s sensor to train a neural network to identify the letters T5 was writing, and tested the program’s ability to generate text in real time based on brain signals generated as he imagined writing new sentences. 

An algorithm interpreted patterns of electrical signals from T5’s brain as he imagined writing letters.

The researchers report that the trained network enabled T5 to “type” at a speed of up to 90 characters per minute and had 94.1 percent accuracy in deciphering the letters he wrote. That’s a considerable improvement on a previous BCI the group developed that was based on having participants control a computer mouse with their brain signals and click on letters, which achieved about 40 characters per minute. In fact, the authors write, to their knowledge it’s the fastest typing rate for any BCI so far.

Speed is critical for people who need BCIs to communicate, notes Oby, because “the faster and more efficiently that they can communicate the better, in terms of increasing their quality of life, and just making interactions more easy and smooth and less stressful.”

To see what accounts for this superior performance, the authors analyzed the neural patterns corresponding to letters and to the straight reaching movements used in the point-and-click BCI. They found that the patterns for the letters are more distinct from one another, making them easier for a neural network to decipher. They also devised their own 26-letter alphabet, replete with curvy lines, that their simulations indicate would enable an even more accurate BCI by eschewing letters that are written similarly to one another.
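The intuition that more distinct neural patterns are easier to decode can be illustrated with a toy simulation. In the sketch below, synthetic feature vectors and a nearest-centroid decoder stand in for real neural recordings and the study’s network; the data are invented, and only the qualitative point (larger separation between class patterns yields higher decoding accuracy) reflects the finding.

```python
# Toy illustration, not the study's method: simulated "neural" patterns for
# 26 characters, decoded by nearest-centroid matching.
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_features, n_trials = 26, 50, 40

def accuracy(separation: float) -> float:
    """Decoding accuracy when class means are `separation` apart (in noise units)."""
    # One mean pattern per character; spread scales with `separation`.
    means = rng.normal(0.0, separation, size=(n_classes, n_features))
    # Simulated trials: class mean plus unit Gaussian noise.
    X = means[:, None, :] + rng.normal(0.0, 1.0, size=(n_classes, n_trials, n_features))
    # Decode each trial as the nearest class mean.
    d = np.linalg.norm(X[:, :, None, :] - means[None, None, :, :], axis=-1)
    predicted = d.argmin(axis=-1)
    truth = np.arange(n_classes)[:, None]
    return float((predicted == truth).mean())

# Nearly overlapping patterns decode at close to chance; well-separated
# patterns decode far more reliably.
print(accuracy(0.1) < accuracy(0.5))  # True
```

This mirrors why the handwriting signals outperformed point-and-click ones: the letter patterns sit farther apart relative to the noise.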

“[It] makes a lot of sense . . . that having more complex movement dynamics can really help improve the communication rate, the accuracy of the decoding,” says Edward Chang, a neurosurgeon at the University of California, San Francisco, who has worked informally with the NPTL group but was not involved in the current study. “They’re really exploiting a new dimension of features that help make the signals more discriminable.” 

There are several improvements that would be needed to make the BCI ready for clinical use. Those include tweaks to the brain implant itself, such as making it smaller and capable of wireless signal transmission, says study coauthor Jaimie Henderson, a neurosurgeon in the NPTL who consults for the BCI company Neuralink and is on the medical advisory board for Enspire, a company exploring deep-brain stimulation for stroke recovery. In addition, in the study the researchers needed to regularly calibrate the BCI to account for minute shifts in the positions of the sensors that alter what neural activity they pick up; ideally, Henderson and Willett say, this process, as well as the initial training of the neural network, would be automated.  

Henderson, Willett, and senior author Krishna Shenoy, another NPTL member and a Howard Hughes Medical Institute investigator who consults for or serves on the advisory boards of several BCI-related companies, have filed a patent application for the neural decoding method they used and are talking with companies about the possibility of licensing it, Shenoy says. Ultimately, Willett and Henderson say, they’re interested in exploring neural signals for speech as a way to enable even faster communication than with handwriting. The rate of speech is about 150–200 words per minute, Henderson notes, and decoding it is an interesting scientific endeavor because it’s uniquely human and because it’s not fully understood how speech is produced in the brain. “We feel like that’s a very rich area of exploration, and so one of our big goals over the next five to ten years is to really tackle the problem of understanding speech and decoding it into both text and spoken word.” 

F.R. Willett et al., “High-performance brain-to-text communication via handwriting,” Nature, 593:249–54, 2021.

What’s the Right Amount of Sleep for a Healthy Heart?

There’s a “sweet spot” for the amount of sleep you should get to reduce your risk of heart attack and stroke, new research shows.

Folks who get six to seven hours of sleep a night — no more, no less — have the lowest chance of dying from a heart attack or stroke, according to new findings.

Getting less, or more, sleep than that ideal window increases your risk of heart-related death by about 45%, researchers found.

This trend remained true even after they accounted for other known risk factors for heart disease or stroke, including age, high blood pressure, diabetes, smoking, BMI (body mass index) and high cholesterol levels.

“Even then, sleep came out to be an independent risk factor,” said lead researcher Dr. Kartik Gupta, a resident in internal medicine at Henry Ford Hospital in Detroit.

For the study, Gupta and his colleagues analyzed data from more than 14,000 participants in the federally funded U.S. National Health and Nutrition Examination Survey between 2005 and 2010. As part of the survey, these folks were asked how long they usually slept.

Researchers tracked participants for an average of 7.5 years to see if they died from heart attack, heart failure or stroke. They also assessed their heart health risk scores as well as their blood levels of C-reactive protein (CRP), which increases when there’s inflammation in your body. High CRP levels have been associated with heart disease.

The research team found a U-shaped relationship between heart risk and sleep duration, with risk at its lowest among people who got between six and seven hours of sleep on average.
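A U-shaped relationship like this is what lets researchers identify a “sweet spot”: risk falls, bottoms out, then rises again. The sketch below fits a parabola to invented data points; these numbers are NOT the study’s, and only the qualitative shape (lowest risk near six to seven hours) mirrors the reported finding.

```python
# Hypothetical data for illustration only — not values from the study.
import numpy as np

hours = np.array([4, 5, 6, 7, 8, 9, 10], dtype=float)
relative_risk = np.array([1.5, 1.2, 1.0, 1.0, 1.2, 1.4, 1.6])  # invented

# Fit risk = a*h^2 + b*h + c; the parabola's vertex -b/(2a) is the minimum.
a, b, c = np.polyfit(hours, relative_risk, deg=2)
sweet_spot = -b / (2 * a)
print(round(sweet_spot, 1))  # 6.7 — between six and seven hours for this toy data
```

The same vertex calculation applies whatever the fitted units, which is why a U-shaped curve always implies a single optimum.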

A lack of sleep already has been linked to poor heart health, said Dr. Martha Gulati, editor-in-chief of CardioSmart.org, the American College of Cardiology’s educational site for patients.

“We have a lot of data related to less sleep,” said Gulati, a cardiologist. She noted that a number of key heart risk factors — blood pressure, glucose tolerance, diabetes and inflammation — are exacerbated by too little sleep.

There’s not as much evidence regarding those who slumber too long and their heart risk, however, Gulati and Gupta said.

Gupta and his colleagues found one possible explanation in their research. Based on patients’ levels of CRP, inflammation accounted for about 14% of heart-related deaths among short sleepers and 13% among long sleepers, versus just 11% among those who got the optimal six to seven hours of sleep.

“Patients who sleep for six to seven hours have the least CRP, so this inflammation might be driving increased cardiovascular risk,” Gupta said.

It might be that people who sleep longer than seven hours are just getting lousy sleep, and so have to doze longer, Gulati said. Poor quality sleep could be driving the increased risk among late snoozers.

“You wonder if somebody is sleeping longer because they just didn’t get a good night’s sleep,” Gulati said. “I always say there’s good sleep and there’s bad sleep. You might be in bed for eight hours, but is it good quality sleep?”

Here are some tips for improving your sleep, courtesy of Harvard Medical School:

  • Avoid caffeine and nicotine for four to six hours before bedtime.
  • Keep your bedroom dark, quiet and cool to promote better sleep.
  • Establish a relaxing routine an hour or so before bed.
  • Don’t try to force yourself to sleep — if you aren’t asleep within about 20 minutes, get up and do something relaxing for a bit until you feel sleepy.
  • Eat dinner several hours before bedtime and avoid foods that can upset your stomach.
  • Exercise earlier in the day, at least three hours before bed.

“In the medical community we know it’s important to sleep, but we still don’t treat it like something we should be asking about routinely,” Gulati said. “I wish I could say doctors were good enough at asking about sleep. I think it should be like a vital sign.”

The findings will be presented virtually May 15 at the annual meeting of the American College of Cardiology. Findings presented at medical meetings are considered preliminary until published in a peer-reviewed journal.

More information

The U.S. Centers for Disease Control and Prevention offers more sleep basics.

SOURCES: Kartik Gupta, MD, internist, Henry Ford Hospital, Detroit; Martha Gulati, MD, editor-in-chief, CardioSmart.org; online presentation, American College of Cardiology virtual annual meeting, May 15, 2021


Children’s brain development appears to be affected by their mother’s depressive symptoms in early pregnancy

A new study has found evidence of a link between prenatal maternal depressive symptoms and alterations in early brain development. The findings have been published in the journal Psychiatry Research: Neuroimaging.

“Child behavioral and emotional development as well as adult mental and physical health might be shaped by maternal depressive symptoms during pregnancy,” said researcher Henriette Acosta of the University of Turku. “The underlying biological mechanisms are not yet well understood and could involve alterations in fetal brain development.”

The study examined neuroimaging data from 28 children, who were scanned using magnetic resonance imaging when they were 4 years old. The children’s mothers had completed multiple assessments of anxiety and depressive symptoms during and after their pregnancy.

Acosta and her colleagues were particularly interested in a brain region known as the amygdala, which has been implicated in psychiatric disorders such as depression, post-traumatic stress disorder, schizophrenia and autism spectrum disorder.

After controlling for maternal anxiety, the researchers found that the children tended to have smaller right amygdalar volumes when their mothers experienced more depressive symptoms during pregnancy. Postnatal depressive symptoms, however, were not associated with amygdalar volumes.

“Higher maternal depressive symptoms during early and late pregnancy were associated with smaller subcortical brain volumes in 4-year-olds, which were more pronounced in boys than girls. The affected brain area, the amygdala, plays a prominent role in emotion processing and emotional memory and is implicated in several psychiatric disorders,” Acosta told PsyPost.

“The study results suggest that maternal depressive symptoms as early as in the prenatal period alter early brain development and might thus influence the offspring’s vulnerability to develop a mental disorder over the lifespan.”

The researchers controlled for a number of factors besides anxiety, including childhood maltreatment, maternal education, maternal age, prenatal medication, and maternal substance use. But the study — like all research — includes some limitations.

“A major caveat is the unknown role of underlying genetic effects that the child inherits from the mother and could impact child’s brain developmental trajectory as well as their vulnerability to stress and depression,” Acosta explained. “Moreover, the sample size of this study was rather small. Hence, the here reported findings should be addressed in future studies with larger sample sizes and genetically informed designs.”

Nevertheless, the study indicates that prenatal depression could have long-lasting effects on offspring health.

“The findings of this study support the notion that pregnancy constitutes a vulnerable period of an individual’s development and that the protection of the expectant mother from adversity should be a primary concern of society,” Acosta said.

The study, “Prenatal maternal depressive symptoms are associated with smaller amygdalar volumes of four-year-old children,” was published October 30, 2020.


Biggest genetic study of supercentenarians reveals clues to healthy aging

A whole-genome study discovered a number of genetic characteristics unique to those who live well past 100

By Rich Haridy

In the most detailed genomic study ever conducted of individuals over the age of 100 years, researchers have homed in on several particular genetic characteristics that seem to confer protection from age-related diseases. Gene variants improving DNA repair processes were particularly prominent in this cohort of supercentenarians.

If you eat well, exercise frequently and avoid those detrimental vices, you can reasonably hope to live a long and healthy life. Of course, many age-related diseases seem almost inevitable, whether they catch up with you in your 80s or your 90s. But some people show a propensity for extreme longevity, living healthily well past the age of 100.

Research has shown those who live beyond the age of 100 tend to present extraordinarily healthy signs of aging. They are less likely to have been hospitalized in earlier life and have seemed to avoid many age-related conditions most people battle in their 80s or 90s, such as heart disease or neurodegeneration.

This new study presents a comprehensive investigation of 81 semi-supercentenarians (aged over 105) and supercentenarians (aged over 110). The researchers also matched this cohort against a group of healthy, geographically matched subjects aged in their late 60s. The goal was to genetically distinguish those generally healthy people in their late 60s from those extremely healthy supercentenarians.

Five particular genetic changes were commonly detected in the supercentenarian cohort, concentrated around two genes called STK17A and COA1.

STK17A is known to be involved in DNA damage response processes. As we age, the body’s DNA repair mechanisms become less effective. Accumulated DNA damage is known to be responsible for some signs of aging, so increased expression of STK17A can favor healthy aging by preserving DNA repair processes in old age.

Reduced expression of COA1 in the supercentenarians was also detected. This gene plays a role in communications between a cell’s nucleus and mitochondria.

“Previous studies showed that DNA repair is one of the mechanisms allowing an extended lifespan across species,” explains senior author on the new study, Cristina Giuliani. “We showed that this is true also within humans, and data suggest that the natural diversity in people reaching the last decades of life are, in part, linked to genetic variability that gives semi-supercentenarians the peculiar capability of efficiently managing cellular damage during their life course.”

The researchers also found the supercentenarians displayed an unexpectedly low level of somatic gene mutations, which are the mutations we all generally accumulate as we grow older. It is unclear why these older subjects have avoided the age-related exponential increase usually seen with these kinds of mutations.

“Our results suggest that DNA repair mechanisms and a low burden of mutations in specific genes are two central mechanisms that have protected people who have reached extreme longevity from age-related diseases,” says Claudio Franceschi, another senior author on the study.

The new research was published in the journal eLife.


Head-injury risk higher for female soccer players, massive survey finds

Data on the rates and causes of concussion in US high-school athletes reveal striking differences between the sexes.

by Katharine Sanderson

Female soccer players are twice as likely to suffer concussion as their male counterparts, a study of more than 80,000 teenage players across US high schools has found.

Researchers analysed survey data from around 43,000 male and 39,000 female players from schools in Michigan over 3 academic years. A striking difference emerged between the sexes in their likelihood of having a sports-related head injury, with the girls’ chance of concussion 1.88 times higher than the boys’, according to the findings published on 27 April in JAMA Network Open [1].

Scientists already suspected that head injuries were more common, and required longer recovery times, in female athletes. But concrete data were lacking, says neuropathologist Willie Stewart at the University of Glasgow, UK, who led the study. “We’re doing so little research in female athletes,” he says. Such a large volume of data on sports injuries, collected by the Michigan High School Athletic Association, offered an opportunity to investigate whether female athletes really are at higher risk of concussion (see ‘Concussion risk’).

Concussion risk: a survey of roughly 80,000 US high-school soccer players found that girls are nearly twice as likely as boys to suffer a concussion.

“There were indeed differences between male and female athletes,” says Stewart. How the high-school players sustained their injuries also differed significantly between male and female adolescents: the boys’ most common way of becoming concussed was through bashing into another player, with almost half of all concussions reported happening in this way. Girls were most likely to be concussed after colliding with another object, such as the ball or one of the goalposts. Boys were also more likely to be removed from play immediately after a suspected head injury than were girls.

The different mechanism for head injuries in girls is an important finding, Stewart says. “It might be one reason girls with concussion were not being picked up on the field so regularly,” he adds. Concussion-management systems currently in use — from how potential head injuries are spotted during a match, to how athletes are treated and how quickly they return to play — are almost exclusively dictated by research on male athletes, says Stewart. “Rather than the current, male-informed, one-size-fits-all approach to concussion management, there might need to be consideration of sex-specific approaches,” he says. This could include restrictions on heading footballs, or having more medically trained personnel present during female matches.

Liz Williams, who researches biomechanics and head injuries at Swansea University, UK, conducted a large international study into female rugby players and their experiences of injury in 2020. Stewart’s findings don’t surprise her. “We’re all finding the same thing, females are more predisposed to brain injury than males,” she says, “and the incidence is likely higher, in my opinion, than what is being reported.”

doi: https://doi.org/10.1038/d41586-021-01184-8


  1. Bretzin, A. C. et al. JAMA Netw. Open 4, e218191 (2021).

Four distinct variants of Alzheimer’s identified in brain imaging study

Researchers have detected four types of Alzheimer’s by tracking different patterns of tau protein accumulation in the brains of patients.

By Rich Haridy

A new international study has found four distinct patterns of toxic protein spread in the brains of patients with Alzheimer’s disease. The findings indicate these patterns correspond with particular symptoms, and the researchers hypothesize these four variants could respond to different treatments.

The research focused on the accumulation and spread of a toxic protein in the brain called tau. Alongside amyloid beta, another protein known to be implicated in neurodegeneration, the spread of tau has been associated with the cognitive decline seen in Alzheimer’s.

Positron emission tomography (PET) imaging was used to monitor levels of tau, and patterns of spread, in the brains of over 1,000 subjects. The cohort spanned the spectrum of Alzheimer’s patients, from those yet to display symptoms of cognitive decline to those in advanced stages of dementia.

“In contrast to how we have so far interpreted the spread of tau in the brain, these findings indicate that tau pathology in the brain varies according to at least four distinct patterns,” says Jacob Vogel, lead author on the new study. “This would suggest that Alzheimer’s is an even more heterogeneous disease than previously thought.”

The four variants of tau spread clearly corresponded with symptomatic experiences. They were also fairly evenly distributed across the cohort, meaning all four were common and there is likely no single dominant type of Alzheimer’s disease.

Variant one was the most prevalent, detected in 33 percent of cases. This pattern of tau spread was primarily located in the temporal lobe and influenced memory.

Variant two, on the other hand, displayed greater tau spread in other parts of the cerebral cortex. Around 18 percent of cases showed this kind of spread and it manifested in difficulties with executive functions such as self-regulation and focus.

Variant three, the second-most prevalent subtype, was noted in 30 percent of cases. It showed distinct tau accumulations in the visual cortex. Symptoms of this variant included difficulties distinguishing distance, shapes, contours and general orientation.

The final variant described in the study saw asymmetric spread of tau across the left hemisphere of the brain. This mostly influenced language skills and was detected in 19 percent of cases.

The four different patterns of tau spread detected in the new study

“Because different regions of the brain are affected differently in the four subtypes of Alzheimer’s, patients develop different symptoms and also prognoses,” notes Oskar Hansson, from Lund University and corresponding author on the study. “This knowledge is important for doctors who assess patients with Alzheimer’s, and it also makes us wonder whether the four subtypes might respond differently to different treatments.”

This is not the first research to divide Alzheimer’s disease into different subtypes. Currently the disease is only classified as either early-onset Alzheimer’s or late-onset Alzheimer’s, and a milestone 2018 study presented six different disease categories based on specific cognitive and genomic characteristics.

A more recent brain tissue study divided the disease into three different molecular subtypes. However, until now it has been difficult to translate these findings into a potential diagnostic tool.

A strength of the new study is that it takes an accessible brain imaging tool and uses it to categorize tau accumulations alongside symptomatic presentation in a large number of patients. The researchers cautiously note that follow-up work is needed to validate these patterns over longer periods of time, but it seems increasingly clear that Alzheimer’s is a much more diverse disease than previously assumed.

“We now have reason to reevaluate the concept of typical Alzheimer’s, and in the long run also the methods we use to assess the progression of the disease,” says Vogel.

The new study was published in the journal Nature Medicine.


New Barrett’s esophagus monitoring method could aid in easier and more precise prognoses

Sandy Markowitz

A new technique for sampling and testing cells from Barrett’s esophagus (BE) patients could result in earlier and easier identification of patients whose disease has progressed toward cancer or whose disease is at high risk of progressing toward cancer, according to a collaborative study by investigators at Case Western Reserve University and Johns Hopkins Kimmel Cancer Center (JHKCC).

Published in the journal Gastroenterology, the findings show the combination of esophageal “brushing” with a massively parallel sequencing method can provide an accurate assessment of the stages of BE in patients and detect specific chromosomal alterations, including the presence of esophageal adenocarcinoma (EAC).

This combined approach aims to provide a practical and sensitive molecular-based method that could improve how doctors detect early progression of BE toward cancer and also assess the risks for such progression among patients already diagnosed with early-stage BE.

“The tests that we have for detecting disease progression in patients with BE are inadequate, as shown by BE patients who develop cancer while under medical surveillance,” said Amitabh Chak, senior and corresponding author on the study and a professor of medicine at the School of Medicine and gastroenterologist at the University Hospitals Digestive Health Institute. “We also lack accurate means to recognize new BE patients who are at highest risk to progress toward cancer, and who need more intense surveillance.”

“Our findings provide the technical means and conceptual basis for a new molecular-based approach that could become key to the clinical management of this disease,” said Sanford Markowitz, co-corresponding author, the Ingalls Professor of Cancer Genetics and Medicine and Distinguished University Professor at the Case Western Reserve School of Medicine and Case Comprehensive Cancer Center (Case CCC), and an oncologist at UH Seidman Cancer Center.

Associated with chronic gastroesophageal reflux disease, BE usually emerges from damage to the lining of the esophagus after repeated exposure to acid and contents from the stomach.

BE is the precursor lesion to esophageal cancer, and while most BE cases do not progress to cancer, those in whom cancer develops face an overall five-year survival rate below 20%.  

The challenges in caring for BE patients are therefore to detect small areas within the BE in which progression toward cancer has occurred and to identify new BE patients in whom the risk of such progression is particularly high. This study reported a new molecular-based approach that addresses both of these needs.

New way to monitor BE

The effectiveness of the new approach comes from the respective convenience and effectiveness of its two parts.

First, esophageal brushings can sample a more extensive region of the esophagus than conventionally employed biopsies—even when multiple biopsies are performed. Second, massively parallel sequencing can detect chromosomal changes indicative of disease progression even in rare cells present in the mixture collected by brushings. The sequencing technology, called RealSeqS, is similar to that which JHKCC investigators developed for use in blood tests for cancer, except the Case Western Reserve and JHKCC collaborative team applied it to esophageal brushings.

“We reasoned that RealSeqS could be effective when applied to esophageal brushings, because the underlying challenge is the same as in blood samples, that of detecting DNA from rare abnormal cells among the large number of normal cells also present,” Markowitz said. More studies will be needed with larger cohorts to refine the approach, he said.

Current BE testing and monitoring methods rely on endoscopic surveillance and testing of abnormal tissue to monitor BE for progression and to detect esophageal cancer, but this approach depends on sampling with random biopsies, which is inherently imprecise.

“This new method shows promise to make the monitoring of BE more efficient and effective,” said Chak. “Currently, some patients can progress to advanced cancer even though they are under surveillance. Most patients are not at risk for progressing, yet because we cannot tell who is not at risk for progressing, we survey everyone—so we over-survey patients. We are seeking to change this.”

In the study, esophageal brushings were obtained from patients without BE; with early-stage BE, known as non-dysplastic BE (NDBE); with earliest-stage progression, known as low-grade dysplasia (LGD); with further progression, known as high-grade dysplasia (HGD); or with full progression to EAC.

Testing esophageal brushing samples with RealSeqS enabled the researchers to develop molecular classifiers, based on detecting progression-associated chromosomal alterations contributed by rare cells in the BE, that accurately discriminate between patients with non-dysplastic BE (NDBE) and those with precancerous cells (dysplasia) or cancer. Moreover, the investigators identified a subset of 7% of NDBE patients who already showed a molecular signature of progression and who are likely to be at high risk of developing clinically evident progressive disease.

The study is a collaboration between JHKCC and the Case CCC’s GI SPORE (Gastrointestinal Specialized Program of Research Excellence) and BETRNet (Barrett’s Esophagus Translational Research Network) programs led respectively by Markowitz, Chak and Joseph Willis, a pathology professor at the School of Medicine and pathology vice-chair for translational research at UH.

Co-authors of the paper “Massively Parallel Sequencing of Esophageal Brushings Enables an Aneuploidy-Based Classification of Patients with Barrett’s Esophagus,” include joint first authors Christopher Douville of the Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins University (JHU) and Helen R. Moinova from Case Western Reserve; joint senior authors Chetan Bettegowda of JHU and Willis and Chak of CWRU/UH; additional authors include Prashanthi N. Thota of Cleveland Clinic; Nicholas J. Shaheen of the University of North Carolina School of Medicine; Prasad G. Iyer of Mayo Clinic; Jean S. Wang of Washington University School of Medicine in St. Louis; John Dumot and Ashley Faulx of UH; and Kenneth W. Kinzler, Marcia Irene Canto, Nickolas Papadopoulos and Bert Vogelstein of JHU.

Scientists Reconstruct Stone Age Bear Genome From Cave Soil Samples

Scientists reconstructed ancient DNA from samples found in soil for the first time, in a development that is set to significantly advance the study of evolution. Their findings are published in the journal Current Biology.

The findings, which have been described as the “moon landings of genomics,” mean that researchers will no longer have to rely solely on finding and testing ancient fossils of bone or teeth to determine genetic ancestry. 

A team of scientists, led by Professor Eske Willerslev of the University of Cambridge’s Department of Zoology, recreated the genomes of animals, plants, and bacteria from microscopic fragments of DNA found in the Chiquihuite Cave in the Astillero Mountains of North-Western Mexico.

Their work marks the first time environmental DNA has been sequenced from soil and sediment, and it included the ancient DNA profile of a Stone Age American black bear.

By sampling feces and urine droplets from an ancestor of the American black bear, the scientists recreated the entire genetic code of two species of the animal — the Stone Age American black bear, and a short-faced bear called Arctodus simus that died out 12,000 years ago. 

Soil samples used to reconstruct genomes for the first time

By reconstructing DNA from highly fragmented samples found in soil, the researchers opened up a whole host of possibilities for future investigations into ancient settlements.

“When an animal or a human urinates or defecates, cells from the organism are also excreted. We can detect the DNA fragments from these cells in the soil samples and have now used these to reconstruct genomes for the first time,” Professor Willerslev explained in a Cambridge University press release. “We have shown that hair, urine, and feces all provide genetic material which, in the right conditions, can survive for much longer than 10,000 years.”

Willerslev said that the new findings could lead to whole new areas of investigation into climate change and the evolution of species, as fossils are no longer a requirement. The team of researchers explained that tests could now reveal never-before-detected insights into a large number of Stone Age settlements worldwide.

“Imagine the stories those traces could tell,” Willerslev said. “It’s a little insane — but also fascinating — to think that, back in the Stone Age, these bears urinated and defecated in the Chiquihuite Cave and left us the traces we’re able to analyze today.”

Discounting recent false claims that Neuralink has the technology to build Jurassic Park, this is one of the most impressive recent developments in genomics and is likely to lead to a host of new findings about our past and the evolution of life on our planet.


Gene That Could Help Prevent or Delay Onset of Alzheimer’s Disease Identified

The protein encoded by the gene ABCC1 has the ability to break down amyloid plaques in the brain that are a characteristic hallmark of Alzheimer’s disease, suggests research from the Translational Genomics Research Institute in Phoenix.

The researchers involved in the study believe that increasing expression of this gene could not only delay, but may even actively prevent the neurodegenerative disease from developing in the first place.

“Much work remains toward developing a drug that slows the development of, or prevents, Alzheimer’s disease, but our findings suggest that targeting ABCC1 offers a promising path that could eventually lead to effective therapeutics,” said Wayne Jepsen, a researcher at the Translational Genomics Research Institute (TGen), and the lead author of the study describing the work that is published in the journal Biology Open.

Alzheimer’s disease is one of the most common neurodegenerative dementias and is estimated to affect 44 million people around the world. In the U.S. alone, there are an estimated 5.5 million people with the condition, the majority of whom are 65 years and older.

Attempts to develop effective drugs or therapies to prevent or treat Alzheimer’s have largely failed, and the only current options for those diagnosed with the condition are moderately effective treatments that reduce symptom severity, such as acetylcholinesterase inhibitors.

The adenosine triphosphate-binding cassette subfamily C member 1 (ABCC1) protein was previously shown to transport amyloid beta across the blood-brain barrier to the periphery in a mouse model of Alzheimer’s. Other animal studies have also shown that activating expression of the ABCC1 protein can help reduce amyloid plaque build-up in the brain.

In this study, Jepsen and colleagues tested the impact of ABCC1 on human brain cell lines engineered to overexpress amyloid precursor protein.

They confirmed previous studies showing that ABCC1 transports amyloid beta – the substance that forms amyloid plaques in the brain – out of the cell cytoplasm, but they also showed that overexpression of the protein can actively reduce levels of amyloid beta.

ABCC1 seems to do this by increasing the number of amyloid precursor protein molecules that are cut by an enzyme known as an alpha-secretase. Precursor molecules cleaved this way do not go on to form plaques in the way that those cut by a beta-secretase do.

“Compounds that can dramatically increase ABCC1 transport activity, or that can increase ABCC1 expression, may prove to be viable drugs for the treatment or prevention of Alzheimer’s disease by not only increasing clearance of amyloid beta from the brain, but also by reducing the amount of amyloid beta that is produced,” said Matt Huentelman, Ph.D., TGen Professor of Neurogenomics, and the study’s senior author.

“Interestingly, due to the historical focus in cancer research on finding ABCC1 inhibiting drugs, we are betting that there are already drugs out there that are known to have an opposite ABCC1-activating effect, and our data suggest that such drugs should be examined for anti-Alzheimer’s disease activity,” he concluded.