A healthy lifestyle can lower dementia risk, even among those with a family history of cognitive decline, according to a study presented Thursday during an American Heart Association conference held virtually because of the COVID-19 pandemic.
This includes eating a healthy diet, exercising regularly, not smoking or drinking alcohol to excess and maintaining good sleep habits and a healthy body weight, the researchers said during the Epidemiology, Prevention, Lifestyle and Cardiometabolic Health Conference.
Adults ages 50 to 73 who embrace at least three of the behaviors can reduce their dementia risk by 30%, the data showed.
Those with a family history of dementia who followed at least three of the behaviors had a 25% to 35% reduced risk for the condition compared to those who followed two or fewer.
“When dementia runs in a family, both genetics and non-genetic factors, such as dietary patterns, physical activity and smoking status, affect an individual’s overall risk,” study co-author Angelique Brellenthin said in a press release.
However, the findings suggest “there may be opportunities for reducing risk by addressing those non-genetic factors,” said Brellenthin, an assistant professor of kinesiology at Iowa State University in Ames.
Having a close relative with dementia, such as a parent or sibling, can increase a person’s risk for the disease by nearly 75% compared to those with no family history, according to the Alzheimer’s Association.
Older age and high blood pressure, high cholesterol, Type 2 diabetes and depression also can increase a person’s risk for the condition, Brellenthin and her colleagues said.
For this study, the researchers analyzed health information on more than 302,000 adults ages 50 to 73 years who were free of dementia at the beginning of the study and filled out questionnaires about family health history and lifestyle habits.
Participants were given one point for each of six healthy lifestyle behaviors they followed.
These included eating a healthy diet with more fruits and vegetables and less processed meat and refined grains; meeting national exercise guidelines by engaging in 150 or more minutes of moderate to vigorous physical activity each week; and sleeping 6 to 9 hours each day.
They also received one point for drinking alcohol in moderation, not smoking and not being obese.
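The study's six-point tally can be sketched as a simple score. This is an illustrative sketch only, not the researchers' actual code, and the field names below are hypothetical:

```python
# Hypothetical sketch of the study's six-point lifestyle score.
# Field names are illustrative; the actual study instruments differ.
HEALTHY_BEHAVIORS = (
    "healthy_diet",         # more fruits/vegetables, less processed meat
    "meets_exercise_goal",  # 150+ minutes moderate-to-vigorous activity/week
    "sleeps_6_to_9_hours",
    "moderate_alcohol",
    "non_smoker",
    "not_obese",
)

def lifestyle_score(participant: dict) -> int:
    """One point per healthy behavior followed (0 to 6)."""
    return sum(1 for b in HEALTHY_BEHAVIORS if participant.get(b, False))

example = {"healthy_diet": True, "non_smoker": True, "sleeps_6_to_9_hours": True}
print(lifestyle_score(example))  # 3 of 6 behaviors
```

Per the findings described above, a score of three or more was the threshold associated with lower dementia risk.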
Participants’ health then was monitored for an average of about eight years.
Nearly 1,700, or 0.6%, of the participants developed dementia during that period, the data showed. Those with a family history of dementia had a nearly 70% higher risk for dementia compared to those who did not.
Following all six healthy lifestyle behaviors reduced participants’ risk for dementia by nearly one-half, compared to those who followed two or fewer healthy behaviors.
The results suggest that starting with small changes, such as engaging in at least three or more healthy lifestyle behaviors, can significantly lower a person’s risk for dementia, according to the researchers.
“This study provides important evidence that a healthy lifestyle can have a positive impact on brain health,” American Heart Association president Dr. Mitchell S.V. Elkind said in a press release.
“It should be reassuring and inspiring to people to know that following just a few healthy behaviors can delay cognitive decline, prevent dementia and preserve brain health,” said Elkind, a professor of neurology and epidemiology at Columbia University in New York City who was not part of Brellenthin’s study.
When considering time, it’s easy to quickly get lost in the complexity of the topic. Time is all around us — it’s ever-present and is the basis of how we record life on Earth. It’s the constant that keeps the world, the solar system and even the universe ticking.
Civilizations have risen and fallen, stars have been born and extinguished, and our one method of keeping track of every event in the universe and on Earth has been comparing them to the present day with the regular passing of time. But is it really a constant? Is time really as simple as a movement from one second to the next?
Some 13.8 billion years ago, the universe was born, and since then time has flown by to the present day, overseeing the creation of galaxies and the expansion of space. But when it comes to comparing time, it’s daunting to realize just how little of it we have actually experienced.
Earth might be 4.5 billion years old, but modern humans have inhabited the planet for around 300,000 years — that's just 0.002% of the age of the universe. Feeling small and insignificant yet? It gets worse. We have experienced so little time on Earth that in astronomical terms we're entirely negligible.
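The arithmetic behind that figure is easy to check, using the values given in the text:

```python
universe_age_years = 13.8e9    # age of the universe, from the text
human_history_years = 300_000  # rough span of modern humans

fraction = human_history_years / universe_age_years
print(f"{fraction:.6%}")  # about 0.002%
```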
In the 17th century, physicist Isaac Newton saw time as an arrow fired from a bow, traveling in a direct, straight line and never deviating from its path. To Newton, one second on Earth was the same length of time as that same second on Mars, Jupiter or in deep space. He believed that absolute motion could not be detected, meaning that nothing in the universe, not even light, had a constant speed. If the speed of light could vary, he reasoned, then time must be constant, ticking from one second to the next with no difference in length between any two seconds. It's an easy idea to accept: every day has roughly 24 hours; you don't get one day with 26 hours and another with 23.
However, in 1905, Albert Einstein asserted that the speed of light doesn’t vary, but rather it is a constant, traveling at roughly 186,282 miles per second (299,792 kilometers per second). He postulated that time was more like a river, ebbing and flowing depending on the effects of gravity and space-time. Time would speed up and slow down around cosmological bodies with differing masses and velocities, and therefore one second on Earth was not the same length of time everywhere in the universe.
This posed a problem. If the speed of light was really a constant, then there had to be some variable that altered over large distances across the universe. With the universe expanding and planets and galaxies moving on a humongous scale, something had to give to allow for these small fluctuations. And this variable had to be time.
Ultimately, it was Einstein's theory that was not only accepted but proven to be entirely accurate. In October 1971, physicist J.C. Hafele and astronomer Richard Keating set out to test its validity. To do this, they flew four caesium atomic clocks on planes around the world, eastward and then westward.
According to Einstein's theory, when compared with ground-based atomic clocks — in this instance at the U.S. Naval Observatory in Washington, D.C. — Hafele and Keating's airborne clocks would be about 40 nanoseconds slower after their eastward trip, and about 275 nanoseconds faster after traveling west, owing to the combined effects of the planes' speed and altitude, according to their 1972 study in the journal Science. Incredibly, the clocks did indeed register a difference when traveling east and west around the world — about 59 nanoseconds slower and 273 nanoseconds faster, respectively, when compared with the U.S. Naval Observatory. This proved that Einstein was correct, specifically with his theory of time dilation, and that time did indeed fluctuate throughout the universe.
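The direction of the Hafele-Keating effect can be sketched with the standard first-order time-dilation formulas. The sketch below assumes an idealized equatorial route at constant speed and altitude (the real flights were neither), so the numbers it prints differ from the published 40 and 275 nanosecond predictions; only the signs and rough magnitudes are the point:

```python
# Rough sketch of the Hafele-Keating prediction. All flight parameters
# (equatorial route, constant speed and altitude, trip durations) are
# assumptions, so results only approximate the published figures.
C = 2.998e8          # speed of light, m/s
G = 9.81             # surface gravity, m/s^2
R = 6.371e6          # Earth radius, m
OMEGA = 7.292e-5     # Earth's rotation rate, rad/s
ALTITUDE = 9_000.0   # assumed cruise altitude, m
AIRSPEED = 250.0     # assumed cruise speed relative to the ground, m/s

def clock_offset_ns(direction: int, flight_seconds: float) -> float:
    """Predicted flying-clock offset vs. a ground clock, in nanoseconds.
    direction = +1 for eastward, -1 for westward."""
    v_ground = R * OMEGA  # ground clock's speed in Earth-centered inertial frame
    v_plane = v_ground + direction * AIRSPEED
    kinematic = (v_ground**2 - v_plane**2) / (2 * C**2)  # faster clocks run slower
    gravitational = G * ALTITUDE / C**2                  # higher clocks run faster
    return (kinematic + gravitational) * flight_seconds * 1e9

east = clock_offset_ns(+1, 41.2 * 3600)  # eastward trip, ~41 hours
west = clock_offset_ns(-1, 48.6 * 3600)  # westward trip, ~49 hours
print(round(east), round(west))  # eastward clock loses time, westward gains
```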
Newton and Einstein did agree on one thing, though — that time moves forward. So far, there is no evidence of anything in the universe that is able to dodge time and move forwards and backward at will. Everything ultimately moves forward in time, be it at a regular pace or slightly warped if approaching the speed of light. But why does time tick forward? Scientists aren’t certain, but they have several theories to explain time’s one-track “mind.” One of these brings in the laws of thermodynamics, specifically the second law. This states that everything in the universe wants to move from low to high entropy, or from uniformity to disorder, beginning with simplicity at the Big Bang and moving to the almost random arrangement of galaxies and their inhabitants in the present day. This is known as the “arrow of time,” or sometimes “time’s arrow,” likely coined by British astronomer Arthur Eddington in 1928, analytic philosopher Huw Price said at Séminaire Poincaré in 2006.
Eddington suggested that time was not symmetrical: "If as we follow the arrow, we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases, the arrow points towards the past," he wrote in "The Nature of the Physical World" in 1928. For example, if you were to observe a star in near uniformity, but later saw it explode as a supernova and become a scattered nebula, you would know that time had moved forward from uniformity to chaos.
Another theory suggests that the passage of time is due to the expansion of the universe. As the universe expands, it pulls time with it, as space and time are linked as one; but this would mean that if the universe were to reach a theoretical limit of expansion and begin to contract, then time would reverse — a slight paradox for scientists and astronomers. Would time really move backward, with everything coming back to an era of simplicity and ending with a Big Crunch? It’s unlikely we will be around to find out, but scientists can postulate on what might happen.
It’s incredible to think of the progress humanity has made in our understanding of time over the past century. From ancient time-telling sundials to modern atomic clocks, we can even track the passing of a second more closely than ever before. Time remains a complex topic, but thanks to scientific visionaries, we are getting closer to unlocking the secrets of this not-so-constant universal constant.
The importance of Einstein’s theory of special relativity
Einstein’s theory of special relativity relies on one key fact: The speed of light is the same no matter how you look at it. To put this into practice, imagine you are traveling in a car at 20 mph (32 km/h), and you drive past a friend who is standing still. As you pass them, you throw a ball out in front of the car at 10 mph (16 km/h).
To your friend, the ball’s speed combines with that of the car, and so appears to be traveling at 30 mph (48 km/h). Relative to you, however, the ball travels at only 10 mph, as you are already traveling at 20 mph.
Now imagine the same scenario, but this time you pass your stationary friend while traveling at half the speed of light. Through some imaginary contraption, your friend can observe you as you travel past. This time you shine a beam of light out of the car windscreen.
In our previous calculation we added together the speed of the ball and the car to find out what your friend saw, so in this instance, does your friend see the beam of light traveling at one-and-a-half times the speed of light?
According to Einstein, the answer is no. The speed of light always remains constant, and nothing can travel faster than it. On this occasion, both you and your friend observe the beam traveling at light's universally agreed value of roughly 186,282 miles per second. This is the theory of special relativity, and it's very important when talking about time.
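The rule behind Einstein's answer is the relativistic velocity-addition formula, u' = (u + v) / (1 + uv/c²). The article doesn't give it explicitly, but a short sketch shows how it reproduces both scenarios above:

```python
C = 299_792_458.0  # speed of light in m/s (exact, by definition of the meter)

def combine(u: float, v: float) -> float:
    """Relativistic velocity addition: the speed a stationary observer sees
    for an object moving at u relative to a frame that itself moves at v."""
    return (u + v) / (1 + u * v / C**2)

# Everyday speeds: the result is indistinguishable from simple addition.
mph = 0.44704  # meters per second per mph
print(combine(10 * mph, 20 * mph) / mph)  # ~30 mph, as the friend expects

# A light beam shone from a car moving at half the speed of light
# still travels at exactly c for both observers.
print(combine(C, 0.5 * C) == C)
```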
It was once thought that space and time were separate, and that the universe was merely an assortment of cosmic bodies arranged in three dimensions. Einstein, however, introduced the concept of a fourth dimension — time — that meant that space and time were inextricably linked. The general theory of relativity suggests that space-time expands and contracts depending on the momentum and mass of nearby matter. The theory was sound, but all that was needed was proof.
That proof came courtesy of NASA’s Gravity Probe B, which demonstrated that space and time were indeed linked. Four gyroscopes were pointed in the direction of a distant star, and if gravity did not have an effect on space and time, they would remain locked in the same position. However, scientists clearly observed a “frame-dragging” effect due to the gravity of Earth, which meant the gyroscopes were pulled very slightly out of position. This seems to prove that the fabric of space itself can be altered, and if space and time are linked, then time itself can be stretched and contracted by gravity.
How long is a second?
There are two main ways of measuring time: dynamic and atomic time. The former relies on the motion of celestial bodies, including Earth, to keep track of time, whether it's the rotation of a distant spinning star such as a pulsar, the motion of a star across our night sky or the rotation of Earth. However, a spinning star aside (and those can be hard to observe), these methods are not always entirely accurate.
The old definition of a second was based on the rotation of Earth. Because the sun takes one day to rise in the east, set in the west and rise again, a day was almost arbitrarily divided into 24 hours, an hour into 60 minutes and a minute into 60 seconds. However, Earth doesn't rotate uniformly. Its rotation slows at a rate of about 30 seconds every 10,000 years due to factors such as tidal friction. Scientists have devised ways to account for the changing speed of Earth's rotation, introducing "leap seconds," but for the most accurate time you have to go even smaller.
Atomic time relies on the energy transition within an atom of a certain element, commonly caesium. By defining a second using the number of these transitions, time can be measured with an accuracy that loses only a tiny fraction of a second over a million years. A second is now defined as 9,192,631,770 transitions within a caesium atom, Scientific American reported.
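A quick sanity check on that definition, with an illustrative cycle count:

```python
CS_HZ = 9_192_631_770  # caesium hyperfine transition cycles per SI second

cycle = 1 / CS_HZ      # duration of a single caesium cycle, in seconds
print(f"{cycle:.3e}")  # roughly 1.088e-10 s per cycle

# An atomic clock effectively counts cycles and divides;
# here the count is an illustrative one minute's worth.
counts = 9_192_631_770 * 60
print(counts / CS_HZ)  # 60.0 seconds counted
```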
Atomic clocks: The most accurate track of time
The most accurate clock in the universe would probably be a rotating star like a pulsar, but on Earth atomic clocks provide the most accurate track of time. The entire GPS system in orbit around Earth uses atomic clocks to accurately track positions and relay data to the planet, while entire scientific centers are set up to calculate the most accurate measure of time — usually by measuring transitions within a caesium atom.
While most atomic clocks rely on magnetic fields, modern clocks use lasers to track and detect energy transitions within caesium atoms and keep a more precise measure of time. Although caesium clocks currently keep time around the world, strontium clocks promise twice the accuracy, while an experimental design based on charged mercury atoms could reduce discrepancies even further, to less than 1 second lost or gained in 400 million years.
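"Less than 1 second in 400 million years" corresponds to a tiny fractional error, which a quick calculation makes concrete:

```python
SECONDS_PER_YEAR = 365.25 * 86_400  # Julian year in seconds

drift_seconds = 1.0   # at most one second gained or lost...
span_years = 400e6    # ...over 400 million years

fractional_error = drift_seconds / (span_years * SECONDS_PER_YEAR)
print(f"{fractional_error:.1e}")  # on the order of 1e-17
```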
But for urban dwellers in big cities like Tokyo, Japan, smaller homes on small plots of land are the norm to start with, not the exception. In creating a new home for a couple in their 40s, Tokyo-based Unemori Architects managed to make the most of the tiny 280-square-foot plot of land by building up vertically and doing some strategic rearrangement of the home’s spatial volumes to bring in more sunlight and ventilation.
“In Tokyo, tiny plots of land are the standard,” the architects explain. “Houses in the city have to be compact and cleverly structured. With House Tokyo, we reacted to the challenge by designing the house as stacked, interlinked cubes with a very open floor plan.”
In stacking and manipulating the volumes, which are wrapped with corrugated galvanized steel, the home feels less hemmed in by the neighboring buildings. In addition, the new multipurpose outdoor terrace that has been created on top of one of the volumes helps to compensate for the absence of a backyard in this small house, which is located in a densely packed urban neighborhood. The clients’ busy urban lifestyle means that they are also out of the house often, making the most of what this cosmopolitan city has to offer.
Most importantly, the differential stacking produces gaps that allow for the diverse placement of windows, which is beneficial in many ways, says Unemori:
“The small gap between the neighboring houses brings a view to the sky, wind circulation, and of course, sunlight.”
A large kitchen and dining area occupies the main level, with a lounging area integrated here too: a sofa suspended from the platform above faces a television screen mounted on the far wall. There is plenty of storage in the long row of cabinets, some of which stretch over the entry hall, bridging the two spaces.
Thanks to the interplay of volumes, the ceiling extends far upward, creating a greater sense of spaciousness. In addition, heating and cooling are made more efficient by a ventilation duct that directs warm air from the upper area back down to the living zones during the winter. Conversely, during the summer, one can flip a switch to vent the warm air outside, so that the air conditioner operates more efficiently.
Below the main level is the bedroom, which is tucked away in the half-basement. Here it is darker and quieter—perfect for a bedroom. As it is equipped with two sliding door entries, the space here can also be potentially divided into two separate rooms, to accommodate the clients’ wishes that they might move out someday and have their house rented out to tenants instead.
In either of the two hallways leading out of the bedroom’s doorways, there is a small washroom and toilet, and a separate shower room, in addition to various storage spaces and a washing machine tucked away underneath the bent metal stairs.
With so little land to work with, the architects’ intriguing design strategy has allowed them to create a series of unique spaces and interior views that are ultimately connected together enough to create a unified whole that feels big, despite its small size. Ultimately, it’ll be creative strategies like this that will help to make the small house typology more appealing and livable for a wider audience.
They can survive temperatures close to absolute zero. They can withstand heat beyond the boiling point of water. They can shrug off the vacuum of space and doses of radiation that would be lethal to humans. Now, researchers have subjected tardigrades, microscopic creatures affectionately known as water bears, to impacts as fast as a flying bullet. And the animals survive them, too—but only up to a point. The test places new limits on their ability to survive impacts in space—and potentially seed life on other planets.
The research was inspired by a 2019 Israeli mission called Beresheet, which attempted to land on the Moon. The probe infamously included tardigrades on board that mission managers had not disclosed to the public, and the lander crashed with its passengers in tow, raising concerns about contamination. “I was very curious,” says Alejandra Traspas, a Ph.D. student at Queen Mary University of London who led the study. “I wanted to know if they were alive.”
Traspas and her supervisor, Mark Burchell, a planetary scientist at the University of Kent, wanted to find out whether tardigrades could survive such an impact—and they wanted to conduct their experiment ethically. So after feeding about 20 tardigrades moss and mineral water, they put them into hibernation, a so-called “tun” state in which their metabolism decreases to 0.1% of their normal activity, by freezing them for 48 hours.
They then placed two to four at a time in a hollow nylon bullet and fired them at increasing speeds using a two-stage light gas gun, a tool in physics experiments that can achieve muzzle velocities far higher than any conventional gun. Shooting the bullets into a sand target several meters away, the researchers found the creatures could survive impacts up to about 900 meters per second (about 3,240 kilometers per hour), and momentary shock pressures up to a limit of 1.14 gigapascals (GPa), they report this month in Astrobiology. “Above [those speeds], they just mush,” Traspas says.
The results suggest the tardigrades on Beresheet were unlikely to survive. Although the lander is thought to have crashed at a few hundred meters per second, the shock pressure its metal frame generated hitting the surface would have been “well above” 1.14 GPa, Traspas says. “We can confirm they didn’t survive.”
The research also places new limits on a theory known as panspermia, which suggests some forms of life could move between worlds, as stowaways on meteorites kicked up after an asteroid strikes a planet or moon. Eventually, the meteorite could impact another planet—along with its living cargo.
Charles Cockell, an astrobiologist at the University of Edinburgh who was not involved in the study, says the research shows how unlikely panspermia is. “What this paper is showing is that complex multicellular animals cannot be easily transferred,” he says. “In other words, Earth is a biogeographical island with respect to animals. They’re trapped, like a flightless bird on an island.”
Traspas, however, says it shows panspermia “is hard,” but not impossible. Meteorite impacts on Earth typically arrive at speeds of more than 11 kilometers per second. On Mars, they collide at least at 8 kilometers per second. These speeds are well above the threshold for tardigrades to survive. However, some parts of a meteorite impacting Earth or Mars would experience lower shock pressures that a tardigrade could live through, Traspas says.
Objects strike the Moon at still lower speeds. When impacts on Earth send bits of rock and debris hurtling toward the Moon, about 40% of that material could travel at speeds low enough for any tardigrades to survive, Traspas and Burchell say, theoretically allowing them to jump from our planet to the Moon. A similar passage, they add, could take place from Mars to its moon, Phobos. And other life might have an even better chance of surviving; compared with water bears, some microbes can survive even faster impacts of up to 5000 meters per second, according to previous research.
The new experiment also has implications for our ability to detect life on icy moons in the outer Solar System. Saturn’s moon Enceladus, for example, ejects plumes of water into space from a subsurface ocean that could support life, as might Jupiter’s moon Europa. If the findings of the new study apply to potential life trapped in the plumes, a spacecraft orbiting Enceladus—at relatively low speeds of hundreds of meters per second—might sample and detect existing life without killing it.
No such orbiting mission is currently planned for Enceladus or Europa—upcoming NASA and European flyby missions will swoosh by the latter at high speeds of several kilometers per second. But perhaps one day far in the future an orbiter might be in the cards, with an ability to detect life at gentler speeds. “If you collect it and it died on impact, how do you know whether it’s been dead for millions of years?” asks Anna Butterworth, a planetary scientist at the University of California, Berkeley, who has studied plume impacts on spacecraft. “If you collect microscopic life and it’s moving around, you can say it’s alive.”
A brain-implant system trained to decode the neural signals for handwriting from a paralyzed man enabled a computer to type up to 90 characters per minute with 94 percent accuracy, researchers reported yesterday (May 12) in Nature. The study’s authors say this brain-computer interface (BCI) is a considerable improvement over other experimental devices aimed at facilitating communication for people who cannot speak or move, but many steps remain before it might be used clinically.
“There are so many aspects of [the study] that are great,” says Emily Oby, who works on BCIs at the University of Pittsburgh and was not involved in the work. “It’s a really good demonstration of human BCI that is working towards clinical viability,” and also contributes to understanding why the handwriting-based system seems to work better than BCIs based on translating the neural signals for more straightforward physical motions such as pointing at letters on a display.
The study came out of a long-term clinical trial called BrainGate2 in which participants who are paralyzed have sensors implanted in the motor cortex of their brains and work with researchers who aim to use the sensors’ data to develop BCIs. “Because of the animal model heritage and the history of the [BCI] field, a lot of the early stuff is focused on this point-and-click typing method where you move a cursor on a screen, and you type on keys individually,” explains Frank Willett, a member of the Neural Prosthetics Translational Laboratory (NPTL) at Stanford University and a Howard Hughes Medical Institute research specialist. “We’re interested in kind of pushing the boundaries and looking at other ways to let people communicate.”
Willett and his colleagues worked with a BrainGate2 participant nicknamed T5 who has a spinal injury, is able to talk, and has a sensor in an area of the brain known as the hand knob that is associated with hand movement. In several sessions, they asked T5 to pretend he was holding a pen and writing hundreds of sentences they showed him on a screen. They then used the activity detected by T5’s sensor to train a neural network to identify the letters T5 was writing, and tested the program’s ability to generate text in real time based on brain signals generated as he imagined writing new sentences.
An algorithm interpreted patterns of electrical signals from T5’s brain as he imagined writing letters.
The researchers report that the trained network enabled T5 to “type” at a speed of up to 90 characters per minute and had 94.1 percent accuracy in deciphering the letters he wrote. That’s a considerable improvement on a previous BCI the group developed that was based on having participants control a computer mouse with their brain signals and click on letters, which achieved about 40 characters per minute. In fact, the authors write, to their knowledge it’s the fastest typing rate for any BCI so far.
Speed is critical for people who need BCIs to communicate, notes Oby, because “the faster and more efficiently that they can communicate the better, in terms of increasing their quality of life, and just making interactions more easy and smooth and less stressful.”
To see what accounts for this superior performance, the authors analyzed the neural patterns corresponding to letters and to the straight reaching movements used in the point-and-click BCI. They found that the patterns for the letters are more distinct from one another, making them easier for a neural network to decipher. They also devised their own 26-letter alphabet, replete with curvy lines, that their simulations indicate would enable an even more accurate BCI by eschewing letters that are written similarly to one another.
“[It] makes a lot of sense . . . that having more complex movement dynamics can really help improve the communication rate, the accuracy of the decoding,” says Edward Chang, a neurosurgeon at the University of California, San Francisco, who has worked informally with the NPTL group but was not involved in the current study. “They’re really exploiting a new dimension of features that help make the signals more discriminable.”
There are several improvements that would be needed to make the BCI ready for clinical use. Those include tweaks to the brain implant itself, such as making it smaller and capable of wireless signal transmission, says study coauthor Jaimie Henderson, a neurosurgeon in the NPTL who consults for the BCI company Neuralink and is on the medical advisory board for Enspire, a company exploring deep-brain stimulation for stroke recovery. In addition, in the study the researchers needed to regularly calibrate the BCI to account for minute shifts in the positions of the sensors that alter what neural activity they pick up; ideally, Henderson and Willett say, this process, as well as the initial training of the neural network, would be automated.
Henderson, Willett, and senior author Krishna Shenoy, another NPTL member and a Howard Hughes Medical Institute investigator who consults for or serves on the advisory boards of several BCI-related companies, have filed a patent application for the neural decoding method they used and are talking with companies about the possibility of licensing it, Shenoy says. Ultimately, Willett and Henderson say, they’re interested in exploring neural signals for speech as a way to enable even faster communication than with handwriting. The rate of speech is about 150–200 words per minute, Henderson notes, and decoding it is an interesting scientific endeavor because it’s uniquely human and because it’s not fully understood how speech is produced in the brain. “We feel like that’s a very rich area of exploration, and so one of our big goals over the next five to ten years is to really tackle the problem of understanding speech and decoding it into both text and spoken word.”
F.R. Willett et al., “High-performance brain-to-text communication via handwriting,” Nature, 593:249–54, 2021.
There’s a “sweet spot” for the amount of sleep you should get to reduce your risk of heart attack and stroke, new research shows.
Folks who get six to seven hours of sleep a night — no more, no less — have the lowest chance of dying from a heart attack or stroke, according to new findings.
Sleeping less or more than that ideal window increases your risk of heart-related death by about 45%, researchers found.
This trend remained true even after they accounted for other known risk factors for heart disease or stroke, including age, high blood pressure, diabetes, smoking, BMI (body mass index) and high cholesterol levels.
“Even then, sleep came out to be an independent risk factor,” said lead researcher Dr. Kartik Gupta, a resident in internal medicine at Henry Ford Hospital in Detroit.
For the study, Gupta and his colleagues analyzed data from more than 14,000 participants in the federally funded U.S. National Health and Nutrition Examination Survey between 2005 and 2010. As part of the survey, these folks were asked how long they usually slept.
Researchers tracked participants for an average of 7.5 years to see if they died from heart attack, heart failure or stroke. They also assessed their heart health risk scores as well as their blood levels of C-reactive protein (CRP), which increases when there’s inflammation in your body. High CRP levels have been associated with heart disease.
The research team found a U-shaped relationship between heart risk and sleep duration, with risk at its lowest among people who got between six and seven hours of sleep on average.
A lack of sleep already has been linked to poor heart health, said Dr. Martha Gulati, editor-in-chief of CardioSmart.org, the American College of Cardiology’s educational site for patients.
“We have a lot of data related to less sleep,” said Gulati, a cardiologist. She noted that a number of key heart risk factors — blood pressure, glucose tolerance, diabetes and inflammation — are exacerbated by too little sleep.
There’s not as much evidence regarding those who slumber too long and their heart risk, however, Gulati and Gupta said.
Gupta and his colleagues found one possible explanation in their research. Based on patients’ levels of CRP, inflammation accounted for about 14% of heart-related deaths among short sleepers and 13% among long sleepers, versus just 11% of folks who got the optimal six to seven hours of sleep.
“Patients who sleep for six to seven hours have the least CRP, so this inflammation might be driving increased cardiovascular risk,” Gupta said.
It might be that people who sleep longer than seven hours are just getting lousy sleep, and so have to doze longer, Gulati said. Poor quality sleep could be driving the increased risk among late snoozers.
“You wonder if somebody is sleeping longer because they just didn’t get a good night’s sleep,” Gulati said. “I always say there’s good sleep and there’s bad sleep. You might be in bed for eight hours, but is it good quality sleep?”
Here are some tips for improving your sleep, courtesy of Harvard Medical School:
Avoid caffeine and nicotine for four to six hours before bedtime.
Keep your bedroom dark, quiet and cool to promote better sleep.
Establish a relaxing routine an hour or so before bed.
Don’t try to force yourself to sleep — if you aren’t asleep within about 20 minutes, get up and do something relaxing for a bit until you feel sleepy.
Eat dinner several hours before bedtime and avoid foods that can upset your stomach.
Exercise earlier in the day, at least three hours before bed.
“In the medical community we know it’s important to sleep, but we still don’t treat it like something we should be asking about routinely,” Gulati said. “I wish I could say doctors were good enough at asking about sleep. I think it should be like a vital sign.”
The findings will be presented virtually May 15 at the annual meeting of the American College of Cardiology. Findings presented at medical meetings are considered preliminary until published in a peer-reviewed journal.
The U.S. Centers for Disease Control and Prevention offers more sleep basics.
SOURCES: Kartik Gupta, MD, internist, Henry Ford Hospital, Detroit; Martha Gulati, MD, editor-in-chief, CardioSmart.org; online presentation, American College of Cardiology virtual annual meeting, May 15, 2021
A new study has found evidence of a link between prenatal maternal depressive symptoms and alterations in early brain development. The findings have been published in the journal Psychiatry Research: Neuroimaging.
“Child behavioral and emotional development as well as adult mental and physical health might be shaped by maternal depressive symptoms during pregnancy,” said researcher Henriette Acosta of the University of Turku. “The underlying biological mechanisms are not yet well understood and could involve alterations in fetal brain development.”
The study examined neuroimaging data from 28 children, who were scanned using magnetic resonance imaging when they were 4 years old. The children’s mothers had completed multiple assessments of anxiety and depressive symptoms during and after their pregnancy.
Acosta and her colleagues were particularly interested in a brain region known as the amygdala, which has been implicated in psychiatric disorders such as depression, post-traumatic stress disorder, schizophrenia and autism spectrum disorder.
After controlling for maternal anxiety, the researchers found that the children tended to have smaller right amygdalar volumes when their mothers experienced more depressive symptoms during pregnancy. Postnatal depressive symptoms, however, were not associated with amygdalar volumes.
“Higher maternal depressive symptoms during early and late pregnancy were associated with smaller subcortical brain volumes in 4-year-olds, which were more pronounced in boys than girls. The affected brain area, the amygdala, plays a prominent role in emotion processing and emotional memory and is implicated in several psychiatric disorders,” Acosta told PsyPost.
“The study results suggest that maternal depressive symptoms as early as in the prenatal period alter early brain development and might thus influence the offspring’s vulnerability to develop a mental disorder over the lifespan.”
The researchers controlled for a number of factors besides anxiety, including childhood maltreatment, maternal education, maternal age, prenatal medication, and maternal substance use. But the study — like all research — includes some limitations.
“A major caveat is the unknown role of underlying genetic effects that the child inherits from the mother and could impact child’s brain developmental trajectory as well as their vulnerability to stress and depression,” Acosta explained. “Moreover, the sample size of this study was rather small. Hence, the here reported findings should be addressed in future studies with larger sample sizes and genetically informed designs.”
Nevertheless, the study indicates that prenatal depression could have long-lasting effects on offspring health.
“The findings of this study support the notion that pregnancy constitutes a vulnerable period of an individual’s development and that the protection of the expectant mother from adversity should be a primary concern of society,” Acosta said.
In the most detailed genomic study ever conducted of individuals over the age of 100 years, researchers have homed in on several particular genetic characteristics that seem to confer protection from age-related diseases. Gene variants improving DNA repair processes were particularly prominent in this cohort of supercentenarians.
If you eat well, exercise frequently and avoid those detrimental vices, you can reasonably hope to live a long and healthy life. Of course, many age-related diseases seem almost inevitable, whether they catch up with you in your 80s or your 90s. But some people show a propensity for extreme longevity, living healthily well past the age of 100.
Research has shown those who live beyond the age of 100 tend to present extraordinarily healthy signs of aging. They are less likely to have been hospitalized earlier in life and seem to have avoided many of the age-related conditions most people battle in their 80s or 90s, such as heart disease or neurodegeneration.
This new study presents a comprehensive investigation of 81 semi-supercentenarians (aged over 105) and supercentenarians (aged over 110). The researchers also matched this cohort against a group of healthy, geographically matched subjects aged in their late 60s. The goal was to genetically distinguish those generally healthy people in their late 60s from those extremely healthy supercentenarians.
Five particular genetic changes were commonly detected in the supercentenarian cohort, concentrated around two genes called STK17A and COA1.
STK17A is known to be involved in DNA damage response processes. As we age, the body’s DNA repair mechanisms become less effective. Accumulated DNA damage is known to be responsible for some signs of aging, so increased expression of STK17A can favor healthy aging by preserving DNA repair processes in old age.
Reduced expression of COA1 in the supercentenarians was also detected. This gene plays a role in communications between a cell’s nucleus and mitochondria.
“Previous studies showed that DNA repair is one of the mechanisms allowing an extended lifespan across species,” explains senior author on the new study, Cristina Giuliani. “We showed that this is true also within humans, and data suggest that the natural diversity in people reaching the last decades of life are, in part, linked to genetic variability that gives semi-supercentenarians the peculiar capability of efficiently managing cellular damage during their life course.”
The researchers also found the supercentenarians displayed an unexpectedly low level of somatic gene mutations, which are the mutations we all generally accumulate as we grow older. It is unclear why these older subjects have avoided the age-related exponential increase usually seen with these kinds of mutations.
“Our results suggest that DNA repair mechanisms and a low burden of mutations in specific genes are two central mechanisms that have protected people who have reached extreme longevity from age-related diseases,” says Claudio Franceschi, another senior author on the study.
The new research was published in the journal eLife.
Female soccer players are twice as likely to suffer concussion as their male counterparts, a study of more than 80,000 teenage players across US high schools has found.
Researchers analysed survey data from around 43,000 male and 39,000 female players from schools in Michigan over three academic years. A striking difference emerged between the sexes in their likelihood of having a sports-related head injury, with the girls’ chance of concussion 1.88 times higher than the boys’, according to the findings published on 27 April in JAMA Network Open1.
Scientists already suspected that head injuries were more common, and required longer recovery times, in female athletes. But concrete data were lacking, says neuropathologist Willie Stewart at the University of Glasgow, UK, who led the study. “We’re doing so little research in female athletes,” he says. Such a large volume of data on sports injuries, collected by the Michigan High School Athletic Association, offered an opportunity to investigate whether female athletes really are at higher risk of concussion (see ‘Concussion risk’).
“There were indeed differences between male and female athletes,” says Stewart. How the high-school players sustained their injuries also differed significantly between male and female adolescents: the boys’ most common way of becoming concussed was through bashing into another player, with almost half of all concussions reported happening in this way. Girls were most likely to be concussed after colliding with another object, such as the ball or one of the goalposts. Boys were also more likely to be removed from play immediately after a suspected head injury than were girls.
The different mechanism for head injuries in girls is an important finding, Stewart says. “It might be one reason girls with concussion were not being picked up on the field so regularly,” he adds. Concussion-management systems currently in use — from how potential head injuries are spotted during a match, to how athletes are treated and how quickly they return to play — are almost exclusively dictated by research on male athletes, says Stewart. “Rather than the current, male-informed, one-size-fits-all approach to concussion management, there might need to be consideration of sex-specific approaches,” he says. This could include restrictions on heading footballs, or having more medically trained personnel present during female matches.
Liz Williams, who researches biomechanics and head injuries at Swansea University, UK, conducted a large international study into female rugby players and their experiences of injury in 2020. Stewart’s findings don’t surprise her. “We’re all finding the same thing, females are more predisposed to brain injury than males,” she says, “and the incidence is likely higher, in my opinion, than what is being reported.”
doi: https://doi.org/10.1038/d41586-021-01184-8
1. Bretzin, A. C. et al. JAMA Netw. Open 4, e218191 (2021).
Researchers have detected four types of Alzheimer’s by tracking different patterns of tau protein accumulation in the brains of patients.
By Rich Haridy
A new international study has found four distinct patterns of toxic protein spread in the brains of patients with Alzheimer’s disease. The findings indicate these patterns correspond with particular symptoms, and the researchers hypothesize these four variants could respond to different treatments.
The research focused on the accumulation and spread of a toxic protein in the brain called tau. Alongside amyloid beta, another protein known to be implicated in neurodegeneration, the spread of tau has been associated with the cognitive decline seen in Alzheimer’s.
Positron emission tomography (PET) imaging was used to monitor levels of tau, and patterns of spread, in the brains of over 1,000 subjects. The cohort spanned the spectrum of Alzheimer’s patients, from those yet to display symptoms of cognitive decline to those in advanced stages of dementia.
“In contrast to how we have so far interpreted the spread of tau in the brain, these findings indicate that tau pathology in the brain varies according to at least four distinct patterns,” says Jacob Vogel, lead author on the new study. “This would suggest that Alzheimer’s is an even more heterogeneous disease than previously thought.”
The four variants of tau spread clearly corresponded with symptomatic experiences. They were also fairly evenly distributed across the cohort, meaning all were common and there is likely no single dominant type of Alzheimer’s disease.
Variant one was the most prevalent, detected in 33 percent of cases. This pattern of tau spread was primarily located in the temporal lobe and influenced memory.
Variant two, on the other hand, displayed greater tau spread in other parts of the cerebral cortex. Around 18 percent of cases showed this kind of spread, and it manifested as difficulties with executive functions such as self-regulation and focus.
Variant three, the second-most prevalent subtype, was noted in 30 percent of cases. It showed distinct tau accumulations in the visual cortex. Symptoms of this variant included difficulties distinguishing distance, shapes, contours and general orientation.
The final variant described in the study saw asymmetric spread of tau across the left hemisphere of the brain. This mostly influenced language skills and was detected in 19 percent of cases.
“Because different regions of the brain are affected differently in the four subtypes of Alzheimer’s, patients develop different symptoms and also prognoses,” notes Oskar Hansson, from Lund University and corresponding author on the study. “This knowledge is important for doctors who assess patients with Alzheimer’s, and it also makes us wonder whether the four subtypes might respond differently to different treatments.”
This is not the first research to divide Alzheimer’s disease into different subtypes. Currently, the disease is classified only as early-onset or late-onset Alzheimer’s, but a milestone 2018 study presented six different disease categories based on specific cognitive and genomic characteristics.
A more recent brain tissue study divided the disease into three different molecular subtypes. However, until now it has been difficult translating these findings into a potential diagnostic tool.
A strength of this new study is the way it takes an accessible brain imaging tool and uses it to categorize tau accumulations alongside symptomatic presentation in a large number of patients. The researchers cautiously note follow-up work is needed to validate these patterns over longer periods of time, but it seems increasingly clear that Alzheimer’s is a much more diverse disease than previously assumed.
“We now have reason to reevaluate the concept of typical Alzheimer’s, and in the long run also the methods we use to assess the progression of the disease,” says Vogel.