Coffee may help protect you against Parkinson’s

By Nancy Clanton, The Atlanta Journal-Constitution

Studies have shown drinking coffee might protect against the development of Parkinson’s disease in people who have no genetic risk factors for the disease. A new study, however, suggests coffee might protect people who have genetic risk factors, too.

“These results are promising and encourage future research exploring caffeine and caffeine-related therapies to lessen the chance that people with this gene develop Parkinson’s,” study author Grace Crotty, of Massachusetts General Hospital in Boston and a member of the American Academy of Neurology, told Medical Dialogues. “It’s also possible that caffeine levels in the blood could be used as a biomarker to help identify which people with this gene will develop the disease, assuming caffeine levels remain relatively stable.”

This recent study, published in the journal Neurology, looked at a genetic mutation that increases the risk of Parkinson’s; the mutation is in a gene called LRRK2, for leucine-rich repeat kinase 2.

The Boston study compared 188 people with Parkinson’s to 180 people without it. Both groups were composed of people who had the LRRK2 gene mutation and those who didn’t. In addition, 212 research subjects filled out a survey about how much caffeine they consumed each day.

The researchers then looked at not only the amount of caffeine in the participants’ blood, but also other chemicals that are produced as caffeine is metabolized in the body.

“Among people carrying the LRRK2 gene mutation, those who had Parkinson’s had a 76% lower concentration of caffeine in their blood than those who did not have Parkinson’s,” Medical Dialogues wrote. “People with Parkinson’s with a normal copy of the gene had a 31% lower concentration of caffeine in their blood than non-carriers without Parkinson’s.

“Carriers of the gene mutation who had Parkinson’s also had lower consumption of caffeine in their diet. The gene carriers with Parkinson’s consumed 41% less caffeine per day than the people who did not have Parkinson’s, both with and without the gene mutation.”

The Boston study examined participants at only one point in time, Crotty noted, so it isn’t helpful in understanding what effect, if any, caffeine has over time on the risk of Parkinson’s. The study also doesn’t prove caffeine causes a lower risk of Parkinson’s; it only shows an association, she pointed out.

Sleep apnea severity tied to greater buildup of Alzheimer’s brain plaques

Stephen Robinson, Ph.D.

In a research first, Alzheimer’s-like amyloid plaques have been found in the brains of people with clinically verified obstructive sleep apnea, according to the results of a small study published last week in the journal Sleep.

Sleep apnea and Alzheimer’s are thought to be related, but the reason for the connection remains unclear, the researchers said. In the new study, plaques were found to develop in the same place (the hippocampus) and spread in the same way in the brains of people with obstructive sleep apnea as they do in Alzheimer’s. In addition, the severity of sleep apnea was linked with greater plaque build-up.

Notably, the use of continuous positive airway pressure (the standard treatment for moderate to severe sleep apnea) made no difference in the amount of plaques found, reported Stephen Robinson, Ph.D., of RMIT University, Melbourne, Australia.

The study participants had no clinical symptoms of dementia before they died. This suggests that they may have been in an early pre-dementia stage of disease, the authors concluded.

“While some people may have had mild cognitive impairment or undiagnosed dementia, none had symptoms that were strong enough for an official diagnosis, even though some had a density of plaques and tangles that were sufficiently high to qualify as Alzheimer’s disease,” Robinson said.

The authors hope to follow up with a larger clinical study, and plan to further analyze the current samples to better understand how the participants’ brains had changed.


Horses help in the development of emotionally well-adjusted teenagers, study finds

Photo by Philippe Oursel

Interacting with horses is great for the development of emotionally well-adjusted adolescents, the findings of a new study show.

The differences between adolescents involved with horses and those without such contact were found to be quite profound in some areas.

For their study, Imre Zoltán Pelyva and his fellow researchers focused on a group of healthy students, aged 14–18, without special educational needs or problems.

Those with contact with horses attended 10 agricultural secondary schools in Hungary. They all took part in a four-year equine program. These students had no diagnosed physical or psychological difficulties.

Within the curriculum, they spent two days — 9 to 13 hours each week — with horses. They fed and groomed the horses, cleaned the stable, and worked with the horses on the lunge, from the saddle, and also undertook carriage driving.

The control group comprised students from the same schools who studied non-horse-related agricultural or food industry vocations, such as gardening, animal husbandry, meat processing or baking.

They did not take part in any activities involving horses.

All the students — there were 525 in all — underwent evaluations at the beginning and at the end of their studies. Central to this was a recognised questionnaire assessing their emotional and behavioral problems and psychological disturbances.

The results between the equine students and the control group were then compared.

The study team, writing in the International Journal of Environmental Research and Public Health, found that the equine students had fewer emotional and behavioral problems, and their prosocial behavior was about four times better than that of the control group.

Prosocial behavior is social behavior that benefits other people or society as a whole, such as helping, sharing, donating, co-operating, and volunteering.

The study team, from the University of Pécs and the University of Szeged in Hungary, characterized the differences as remarkable.

“Our results indicate that students of equine-related vocations are more helpful and empathetic, and have fewer behavior problems than those studying other vocations.”

Equine students were assessed as having fewer behavior problems upon admission to their school (all of them had regular contact with horses before). Notably, however, their behavior problems declined more markedly over the four years than those of the other group.

Discussing their findings, the study team said the fact that favorable characteristics were already present when equine students were admitted to the institutions might suggest that adolescents with stronger social skills are attracted to horses.

“On the other hand, the fact that the decline of behavior problems is more remarkable in the equine group than in the control group suggests that equine-assisted activities might play a role in strengthening these skills.”

Their analysis showed that equine-related activities were a significant factor leading to these favorable behavior traits.

“It is important to mention that these beneficial effects of equine-assisted activities are mostly based on the students’ understanding of and susceptibility to equine communication.

“The mere presence of a horse is less likely to be effective if the equine professional present does not give meaning to the horse’s behavior.

“Students have to learn to treat the horses as subjects and not as objects in order to get involved and become receptive to positive influence within the interaction.

“At the same time, this knowledge (that is, understanding equine communication and behavior) is also essential just to be able to work safely and effectively with these animals.

“This means that no therapeutic goals are needed to teach students to pay attention to and respect horses — it is the basis of all equine interactions in professional environments.”

That, they said, is why the standard school environments, without any therapeutic element, could produce such results.

“We strongly believe that the relationship humans build with horses shows them a way to build trust, acceptance, and understanding toward humans, as well.

“Our results suggest that young people who learn to listen to and take care of the horse can transfer this knowledge to intraspecies communication and behavior, as well.

“Equine students’ prosocial behavior is four times better than that of non-equine students. This result is remarkable and supports the idea that being around horses improves students’ social competences.”

Adolescence, they said, is a difficult period in life, and young people have to cope with many difficulties during these years.

“They need help to understand and find their place in the world, or to just generally get around successfully. The lucky ones get enough support from their family and friends, others — a very limited number — get professional help with more serious problems.

“Our study showed that with a little care and attention, normal school programs can improve competencies that are useful in life.

“If horses can be used to help adolescents and there are schools with horses and adolescents, why not exploit the possibility? With a little investment, gains might be great.”

The results indicate that equine-assisted activities have a protective effect on the behavior of adolescents, they said.

“These results also show that equine vocational schools or programs have — to the best of our knowledge — so far unidentified potential to help adolescents with behavior problems, or possibly to prevent their development.”

The full study team comprised Pelyva, Etelka Szovák and Ákos Levente Tóth, all with the University of Pécs; and Réka Kresák, with the University of Szeged.

Pelyva, I.Z.; Kresák, R.; Szovák, E.; Tóth, Á.L. How Equine-Assisted Activities Affect the Prosocial Behavior of Adolescents. Int. J. Environ. Res. Public Health 2020, 17, 2967.


A man covered his face with tattoos and turned his eyes black. He says it cost him his kindergarten teaching job.

Helaine says he loves being a primary school teacher.

A schoolteacher whose body, face and tongue are covered in tattoos and who has had the whites of his eyes surgically turned black said he was prevented from teaching at a French kindergarten after a parent complained he scared their child.

But the teacher, Sylvain Helaine, 35, still teaches children from the age of six up, and said that, after an initial shock when they see him for the first time, his pupils see past his appearance.

“All of my students and their parents were always cool with me because basically they knew me,” said Helaine, who estimated he has spent around 460 hours under the tattooists’ needle.

“It’s only when people see me from far away that they can assume the worst.”

He said last year he was teaching kindergarten at the Docteur Morere Elementary School in Palaiseau, a suburb of Paris, when the parents of a three-year-old child complained to educational authorities. They said their son, who was not taught by Helaine, had nightmares after seeing him.

A couple of months later the school authorities informed him he would no longer teach kindergarten children, he said. “I think the decision they took was quite sad,” said Helaine.

A spokesman for the local education authority said an agreement was reached with Helaine to move him away from teaching kindergarten. Pupils under six “could be frightened by his appearance”, the spokesman said.

Despite the setbacks, Helaine said he would stick with his chosen career. “I’m a primary school teacher … I love my job.”

He said he started getting tattoos at the age of 27 when, while teaching at a private school in London, he had an “existential crisis”. Since then, he said, “Getting tattoos is my passion.”

He said he hoped to show his pupils that they should accept people who are different from the norm. “Maybe when they are adults they will be less racist and less homophobic and more open-minded,” he said.

Thanks to Mr. C for bringing this to the It’s Interesting community.

Algorithm Spots COVID-19 Cases from Eye Images

A small study shows artificial intelligence can pick out individuals with coronavirus infections, but ophthalmologists and AI experts say the approach is far from proven to be capable of distinguishing infections with SARS-CoV-2 from other ills.

by Anthony King

Scientists describe a potential screening method for COVID-19 based on eye images analyzed by artificial intelligence. Scanning a set of images from several hundred individuals with and without COVID-19, the tool accurately diagnosed coronavirus infections more than 90 percent of the time, the developers reported in a preprint posted to medRxiv September 10.

“Our model is quite fast,” Yanwei Fu, a computer scientist at Fudan University in Shanghai, China, who led the study, tells The Scientist. “In less than a second it can check results.”

Currently, screening for coronavirus infection involves CT imaging of the lungs or analyzing samples from the nose or throat, both of which take time and require professional effort. A system based on a few images of the eyes that could triage or even diagnose people would save on both costs and time, says Fu. But the investigation by Fu’s team is preliminary and both ophthalmologists and AI specialists say they’d want to see much more information on the technique—and its performance—before being convinced it could work.

Volunteers at the Shanghai Public Health Clinical Centre, affiliated with Fudan University, each had five photos of their eyes taken using common CCD or CMOS cameras. Of 303 patients, 104 had COVID-19, 131 had other pulmonary conditions, and 68 had eye diseases. A neural network tool extracted and quantified features from different regions of the eye, and an algorithm recognized the ocular characteristics of each disease. A neural network is a layered set of algorithms that learns patterns from data in a way loosely inspired by the human brain. The researchers then carried out a validation experiment on a small dataset from healthy people, COVID-19 patients, pulmonary patients, and ocular patients.

Of 24 people with confirmed coronavirus infections, the tool correctly diagnosed 23, Fu tells The Scientist. And the algorithm accurately identified 30 out of 30 uninfected individuals.
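Those validation counts imply the two standard screening metrics. A quick back-of-the-envelope sketch in Python (the function names are ours for illustration; the counts are the small validation set reported above):

```python
def sensitivity(true_positives, total_infected):
    """Fraction of infected individuals the tool correctly flags."""
    return true_positives / total_infected

def specificity(true_negatives, total_uninfected):
    """Fraction of uninfected individuals the tool correctly clears."""
    return true_negatives / total_uninfected

# Reported validation set: 23 of 24 infected flagged, 30 of 30 uninfected cleared.
sens = sensitivity(23, 24)   # ~0.958
spec = specificity(30, 30)   # 1.0
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
# prints: sensitivity = 95.8%, specificity = 100.0%
```

With only 54 validation subjects, the confidence intervals around these figures are wide, which is one reason the experts quoted below want a much larger sample.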

Coronavirus infections, not just those caused by SARS-CoV-2, have long had associations with the eye, causing inflammation of the transparent membrane that covers the inside of the eyelid and whites of the eyeball, a condition called conjunctivitis, or pink eye. The eyes also offer a route to infection for respiratory viruses, including coronaviruses.

Human coronavirus NL63, which causes common cold symptoms, was first identified in 2004 in a baby with bronchiolitis and conjunctivitis. Subsequent studies showed that a minority of children infected with this coronavirus suffer from this eye condition.

Although conjunctivitis remains a potential symptom of coronavirus infections, less than 5 percent of COVID-19 patients actually present with eye symptoms, notes Daniel Ting, ophthalmologist at the Singapore National Eye Centre, who has published on this topic and deep learning in ophthalmology. “If you look to develop an AI system to detect COVID-19 based on [limited numbers of] eye images, I think the performance is not going to be great,” especially given the low prevalence of eye symptoms. He doubts the performance of the algorithm also because “a lot of eye manifestations could be due to reasons other than COVID-19.”

Ting cautions that the sample size of 303 patients and 136 healthy individuals in the Shanghai study is too small to draw strong conclusions. “To develop a good deep learning system to automatically detect some unique features from any medical imaging requires more patients,” he says. “In order to increase the reliability of this study, the sample size would need to be multiplied by at least ten times, so, thousands of patients.”

Fu has started down this road, increasing the number of participants and broadening the types of subjects. “We are now doing more double-blind tests in the hospitals, with patients, some with eye diseases,” he says. The group also plans to introduce an online screening platform that uses the algorithm to screen for COVID-19.

“As an ophthalmologist it would be very surprising if there is a distinct COVID viral conjunctivitis pattern as opposed to other similar forms of viral conjunctivitis,” ophthalmologist Alastair Denniston, the director of the Health Data Research Hub for Eye Health in Birmingham, UK, writes in an email to The Scientist. “This is unlike building an algorithm for conditions which are biologically more distinct like macular degeneration,” he writes.

He notes that if there were a unique pattern evident in COVID-19 cases, “then the comparison for training and testing should be against cases that look similar,” such as non–COVID-19 viral conjunctivitis or other causes of a red eye associated with colds caused by adenovirus or rhinovirus. He also faults the paper for not providing “the necessary description to really critique the science in terms of how they built and (tried to) validate the model.”

Denniston recently reviewed more than 20,000 AI studies on detecting disease from medical imaging, but found that fewer than 1 percent were sufficiently robust in their design and reporting for independent reviewers to have high confidence in their claims. This led him to convene a group of experts to define international standards for the design and reporting of clinical trials of AI systems. These standards were published this month in Nature Medicine, The BMJ, and Lancet Digital Health and are supported by leading medical journals.

The Shanghai study has some potentially controversial applications, even if the AI works. Their algorithm could be used in public places, Fu says, though this would raise data privacy concerns in many countries. “In China, for example, we have a lot of high-resolution cameras everywhere,” he notes. “In airports or at train stations, we could use these surveillance cameras to check people’s eyes.” The program would be most accurate if people looked directly at the camera, but Fu says “as long as our camera can clearly watch the eye region it would be good enough.”

Screening the public without express consent using this algorithm would be ruled out of bounds in some parts of the world. “In Europe, this would be highly problematic and most likely illegal, in violation of the EU Charter of Fundamental Rights and general data protection legislation,” says computer scientist Barry O’Sullivan of University College Cork in Ireland, who is an expert in AI. The gathering of health data and biometric data in Europe requires consent.

O’Sullivan echoes the concern that the paper falls short on detail regarding its methodology. “It is an interesting hypothesis,” he says. But, as currently written, it isn’t ready for publication in a machine learning journal, he concludes.

Too much or too little sleep may increase dementia risk

By Brian P. Dunleavy

Getting too much or too little sleep may increase the risk for cognitive decline, or dementia, in older adults, according to a study published Monday by JAMA Network Open.

In an analysis of the sleep habits of more than 20,000 English and Chinese adults age 48 to 75, people who slept for fewer than four hours or more than 10 hours per day showed evidence of declines in cognitive function, including memory and language comprehension, researchers said.

“This study is an observational study and cannot demonstrate a causal relationship,” study co-author Yanjun Ma told UPI, so the findings don’t necessarily prove that lack of sleep or excessive sleep causes a decline in cognitive function.

Observational studies assess the effect of an exposure (in this case, sleep) on participants as it naturally occurs, without intervening to modify it and compare outcomes.

It’s possible that diminished or excessive sleep is an early sign of cognitive decline or dementia, as opposed to a risk factor, researchers said.

“Future mechanism studies, as well as intervention studies examining the association between sleep duration and cognitive decline are required,” said Ma, of the Peking University Clinical Research Institute in China.

As many as 6 million Americans have some form of dementia, and changes in sleep patterns are common, according to the Alzheimer’s Association.

To date, research has shown that sleep disturbances can result from cognitive impairment, while animal studies have found links between lack of sleep and increased levels of brain proteins that are thought to be signs for Alzheimer’s disease, said Dr. Yue Leng, who authored a commentary on the study findings.

Leng is an assistant professor of psychiatry at the University of California-San Francisco.

For their research, Ma and colleagues analyzed data on sleep behaviors and cognitive function in 20,065 adults from the English Longitudinal Study of Aging and the China Health and Retirement Longitudinal Study, and tracked them for about eight years, on average.

In addition to finding higher levels of cognitive decline among those who slept fewer than four or more than 10 hours per day, the researchers also observed that people with these sleep habits had “faster cognitive decline” than those who slept seven to nine hours per day, Ma said.

“It’s usually believed that sleep deprivation might lead to cognitive decline, but it’s unclear why too much sleep might be bad for cognitive health,” Leng added. “Older adults should pay more attention to their sleep habits, as these might have implications for their cognitive health.”

Record-Breaking Whale Stays Underwater for 3 Hours and 42 Minutes

By George Dvorsky

Marine biologists are astonished after a Cuvier’s beaked whale held its breath for nearly four hours during a deep dive. The unexpected observation shows there’s much to learn about these medium-sized whales.

Scientists from Duke University and the Cascadia Research Collective recorded the unbelievable dive during field observations off the coast of Cape Hatteras, North Carolina, in 2017. In the first of two epic dives, the Cuvier’s beaked whale, wearing tag ZcTag066, stayed underwater for nearly three hours. A week later, the whale outdid itself, holding its breath for a bewildering three hours and 42 minutes.

“We didn’t believe it at first, because these are mammals after all, and any mammal spending that long underwater just seemed incredible,” Nicola Quick, the lead author of the new study and a biologist at Duke University, said in an email.

The record-breaking observations occurred in the midst of a five-year survey, in which Quick and her colleagues were measuring the time it takes Cuvier’s beaked whales (Ziphius cavirostris) to perform their deep foraging dives. During these dives, the whales venture to depths exceeding 9,800 feet (3,000 meters) and hunt squid and deep-sea fish. Unfortunately, the two recordings of ZcTag066 had to be excluded from the researchers’ primary data set “because they were recorded 17 and 24 days after a known [one-hour] exposure to a Navy mid-frequency active sonar signal,” as the authors wrote in the study, adding that these two extreme dives “are perhaps more indicative of the true limits of the diving behaviour of this species.” It’s possible the exposure to sonar might have altered the whale’s normal diving habits, but the researchers don’t know.

Going into the study, the scientists had estimated a maximum length of 33 minutes for the deep dives, after which time the whales need to resurface and gulp some precious atmospheric oxygen, or resume “anaerobic respiration,” in the parlance of the researchers. The team conducted field observations to test this assumption and to measure the length of time it takes for these toothed whales to recover once at the surface. Details of their work were published today in the Journal of Experimental Biology.

Cuvier’s beaked whales are elusive and skittish, having developed fascinating strategies to avoid predators, namely orcas. Thus, it was a challenge for the team to place their satellite-linked tags onto the whales.

“Because the animals spend so little time at the surface, we needed calm seas and experienced observers to look for them,” said Quick in a press release, adding that the “average period they spend at the surface is about two minutes, so getting a tag on [them] takes a dedicated crew and a manoeuvrable vessel.”

The researchers managed to tag 23 individuals, with field observations ongoing from 2014 to 2018. In total, the scientists recorded more than 3,600 foraging dives, the median duration of which was clocked at 59 minutes. The shortest dives lasted just 33 minutes, but the longest dive (excluding ZcTag066’s) was recorded at 2 hours and 13 minutes.

With this data in hand, the researchers had to revise their models. They revisited the breath-holding patterns and abilities of other aquatic mammals, which led to a new estimate of 77.7 minutes. This still fell considerably short of their field observations, as 5% of dives exceeded this apparent limit.

Clearly, the scientists are missing something about these whales and the unique abilities that allow for their extended stays beneath the water. This sad fact was driven home even further when the team analyzed the whales’ recovery time, that is, the time spent on the surface after a long foraging dive in preparation for a subsequent dive.

It stands to reason that, after a super-long dive, a Cuvier’s beaked whale might want to chill on the surface for a bit to replenish its oxygen supply and rest its weary muscles. Weirdly, this assumption did not jibe with the field observations, as no clear pattern emerged from the data. For example, a whale that dove for 2 hours needed just 20 minutes of rest before it went back for more, while another whale, after diving for 78 minutes, stayed on the surface for 4 hours before foraging again. The new study raises more questions than it answers.

We asked Quick how it’s possible for these mammals to stay underwater for so long.

“These animals are really adapted to diving, so they have lots of myoglobin in their muscles, which helps them to hold more oxygen in their bodies,” she replied. “They are also able to reduce their energy expenditure by being streamlined to dive, and we think reducing their metabolic rate. It’s likely they have many other adaptations as well that we still don’t fully understand, such as being able to reduce their heart rates and restrict the movement of blood flow to tissues.”

As to why some of the dives lasted so long, the authors said the whales may have been enjoying their time in areas rich in food or reacting to a perceived threat, such as a noise disturbance (U.S. Navy, we’re looking at you).

An encouraging aspect of this study is how much there is still to learn about these aquatic mammals. Clearly, it’s a case of biology exceeding our expectations, which can only be described as exciting.

Mediterranean diet helps offset the health impacts of obesity

By Chrissy Sexton

The Mediterranean diet helps to counter the health impacts of obesity, according to a new study from Uppsala University in Sweden.

In 2015, four million deaths were attributed to excessive weight and more than two-thirds of those deaths were caused by cardiovascular disease (CVD).

“Despite the increasing prevalence of obesity, the rates of CVD-related death continue to decrease in Western societies, a trend not explained by medical treatment alone,” wrote the study authors. “These observations suggest that other factors might modify the higher risk of CVD associated with higher body mass. Potentially, one such factor is diet.”

The Mediterranean diet centers mainly around plant-based foods such as vegetables, fruits, herbs, nuts, beans, and whole grains. The diet also includes moderate amounts of dairy, poultry, eggs, and seafood, while red meat is only eaten occasionally.

A team led by Karl Michaëlsson set out to investigate how a Mediterranean-style diet among individuals with a higher body mass index (BMI) may affect all-cause mortality, with a particular focus on fatal cardiovascular events.

The study was focused on data from more than 79,000 Swedish adults enrolled in the Swedish Mammography Cohort and Cohort of Swedish Men.

Adherence to a Mediterranean-like diet (mMED) was assessed on a scale of 0 to 8, based on intake of fruits and vegetables, legumes, nuts, high-fiber grains, fish, red meat, and olive oil.
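A score like this is typically tallied component by component. The toy sketch below illustrates the general approach only: the article lists seven components for a 0–8 scale, so at least one component is unstated, and the cut-offs and serving units here are hypothetical, not the study’s.

```python
def mmed_score(intake, medians):
    """Toy Mediterranean-diet adherence score.

    One point for each beneficial component consumed at or above the
    cohort median, plus one point for red meat below the median.
    Component names and cut-offs are illustrative assumptions.
    """
    beneficial = ["fruits_vegetables", "legumes", "nuts",
                  "high_fiber_grains", "fish", "olive_oil"]
    score = sum(1 for c in beneficial if intake[c] >= medians[c])
    if intake["red_meat"] < medians["red_meat"]:
        score += 1
    return score  # 0..7 in this sketch; the study used a 0-8 scale

# Example: hypothetical servings per day vs. hypothetical cohort medians.
medians = {"fruits_vegetables": 3, "legumes": 0.5, "nuts": 0.5,
           "high_fiber_grains": 2, "fish": 0.4, "olive_oil": 1,
           "red_meat": 1}
intake = {"fruits_vegetables": 4, "legumes": 1, "nuts": 1,
          "high_fiber_grains": 3, "fish": 0.5, "olive_oil": 1.5,
          "red_meat": 0.3}
print(mmed_score(intake, medians))  # high adherence: prints 7
```

Scores toward the top of the range would count as “high mMED” in the analysis below.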

Over 21 years of follow-up, more than 30,000 participants died. The researchers found that individuals classified as overweight with high mMED had the lowest risk of all-cause mortality. Obese individuals who had high mMED did not have a higher mortality risk compared with those in the healthy weight group with the same diet.

By contrast, individuals with a healthy weight but low mMED had higher mortality rates compared to people in the same weight range who regularly adhered to a Mediterranean-style diet.

The findings were very similar among 12,000 participants who died from cardiovascular disease. The researchers determined that CVD mortality associated with high BMI was reduced by adherence to a Mediterranean diet, although it was not fully countered. Furthermore, lower BMI did not help offset the elevated CVD mortality risk associated with a low mMED.

“These results indicate that adherence to healthy diets such as a Mediterranean-like diet may be a more appropriate focus than avoidance of obesity for the prevention of overall mortality,” wrote the study authors. “Nonetheless, a healthy diet may not completely counter higher CVD mortality related with obesity.”

The research is published in the journal PLOS Medicine.


Why do we sleep? The answer may change right before we turn 3.

By Nicoletta Lanese

Humans spend about a third of our lives sleeping, and scientists have long debated why slumber takes up such a huge slice of our time. Now, a new study hints that our main reason for sleeping starts off as one thing, then changes at a surprisingly specific age.

Two leading theories as to why we sleep focus on the brain: One theory says that the brain uses sleep to reorganize the connections between its cells, building electrical networks that support our memory and ability to learn; the other theory says that the brain needs time to clean up the metabolic waste that accumulates throughout the day. Neuroscientists have quibbled over which of these functions is the main reason for sleep, but the new study reveals that the answer may be different for babies and adults.

In the study, published Sep. 18 in the journal Science Advances, researchers used a mathematical model to show that infants spend most of their sleeping hours in “deep sleep,” also known as rapid eye movement (REM) sleep, while their brains rapidly build new connections between cells and grow ever larger. Then, just before toddlers reach age 2-and-a-half, their amount of REM sleep dips dramatically as the brain switches into maintenance mode, mostly using sleep time for cleaning and repair.

“It was definitely shocking to us that this transition was so sharp,” from growth mode to maintenance mode, senior author Van Savage, a professor of ecology and evolutionary biology and of computational medicine at the University of California, Los Angeles and the Santa Fe Institute, told Live Science in an email. The researchers also collected data in other mammals — namely rabbits, rats and guinea pigs — and found that their sleep might undergo a similar transformation; however, it’s too soon to tell whether these patterns are consistent across many species.

That said, “I think in actuality, it may not be really so sharp” a transition, said Leila Tarokh, a neuroscientist and Group Leader at the University Hospital of Child and Adolescent Psychiatry and Psychotherapy at the University of Bern, who was not involved in the study. The pace of brain development varies widely between individuals, and the researchers had fairly “sparse” data points between the ages of 2 and 3, she said. If they studied individuals through time as they aged, they might find that the transition is less sudden and more smooth, or the age of transition may vary between individuals, she said.

An emerging hypothesis

In a previous study, published in 2007 in the journal Proceedings of the National Academy of Sciences, Savage and theoretical physicist Geoffrey West found that an animal’s brain size and brain metabolic rate accurately predict the amount of time the animal sleeps — more so than the animal’s overall body size. In general, big animals with big brains and low brain metabolic rates sleep less than small animals with the opposite features.

This rule holds up across different species and between members of the same species; for instance, mice sleep more than elephants, and newborn babies sleep more than adult humans. However, knowing that sleep time decreases as brains get bigger, the authors wondered how quickly that change occurs in different animals, and whether that relates to the function of sleep over time.

To begin answering these questions, the researchers pooled existing data on how much humans sleep, compiling several hundred data points from newborn babies and children up to age 15. They also gathered data on brain size and metabolic rate, the density of connections between brain cells, body size and metabolic rate, and the ratio of time spent in REM sleep versus non-REM sleep at different ages; the researchers drew these data points from more than 60 studies, overall.

Babies sleep about twice as much as adults, and they spend a larger proportion of their sleep time in REM, but there’s been a long-standing question as to what function that serves, Tarokh noted.

The study authors built a mathematical model to track all these shifting data points through time and see what patterns emerged between them. They found that the metabolic rate of the brain was high during infancy when the organ was building many new connections between cells, and this in turn correlated with more time spent in REM sleep. They concluded that the long hours of REM in infancy support rapid remodeling in the brain, as new networks form and babies pick up new skills. Then, between age 2 and 3, “the connections are not changing nearly as quickly,” and the amount of time spent in REM diminishes, Savage said.

At this time, the metabolic rate of cells in the cerebral cortex — the wrinkled surface of the brain — also changes. In infancy, the metabolic rate is proportional to the number of existing connections between brain cells plus the energy needed to fashion new connections in the network. As the rate of construction slows, the relative metabolic rate slows in turn.

“In the first few years of life, you see that the brain is making tons of new connections … it’s blossoming, and that’s why we see all those skills coming on,” Tarokh said. Developmental psychologists refer to this as a “critical period” of neuroplasticity — the ability of the brain to forge new connections between its cells. “It’s not that plasticity goes away” after that critical period, but the construction of new connections slows significantly, as the new mathematical model suggests, Tarokh said. At the same time, the ratio of non-REM to REM sleep increases, supporting the idea that non-REM is more important to brain maintenance than neuroplasticity.

Looking forward, the authors plan to apply their mathematical model of sleep to other animals, to see whether a similar switch from reorganization to repair occurs early in development, Savage said.

“Humans are known to be unusual in the amount of brain development that occurs after birth,” lead author Junyu Cao, an assistant professor in the Department of Information, Risk, and Operations Management at The University of Texas at Austin, told Live Science in an email. (Cao played a key role in compiling data and performing computations for the report.) “Therefore, it is conceivable that the phase transition described here for humans may occur earlier in other species, possibly even before birth.”

In terms of human sleep, Tarokh noted that different patterns of electrical activity, known as oscillations, occur in REM versus non-REM sleep; future studies could reveal whether and how particular oscillations shape the brain as we age, given that the amount of time spent in REM changes, she said. Theoretically, disruptions in these patterns could contribute to developmental disorders that emerge in infancy and early childhood, she added — but again, that’s just a hypothesis.

Playing video games in childhood improves working memory years later

By Chrissy Sexton

Playing video games as a child leads to long-lasting cognitive benefits, according to new research from the Universitat Oberta de Catalunya (UOC). The study suggests that gaming improves working memory and concentration.

Previous studies have shown that gaming improves attention, enhances visual-spatial skills, and causes structural changes in the brain – even increasing the size of some regions. The current study is the first to show that video games promote positive cognitive changes that can take place years after people stop playing them.

“People who were avid gamers before adolescence, despite no longer playing, performed better with the working memory tasks, which require mentally holding and manipulating information to get a result,” said study lead author Dr. Marc Palaus.

The research focused on 27 people between the ages of 18 and 40, some with and some without prior video gaming experience.

The experts analyzed cognitive skills, including working memory, at three points during the study period: before training the volunteers to play Nintendo’s Super Mario 64, at the end of the training, and fifteen days later.

The findings revealed that participants who had not played video games in childhood did not benefit from improvements in processing and inhibiting irrelevant stimuli. As expected, these individuals were initially slower than those who had played games as children.

“People who played regularly as children performed better from the outset in processing 3D objects, although these differences were mitigated after the period of training in video gaming, when both groups showed similar levels,” said Dr. Palaus.

The experts also performed 10 sessions of a non-invasive brain stimulation technique known as transcranial magnetic stimulation on the individuals.

“It uses magnetic waves which, when applied to the surface of the skull, are able to produce electrical currents in underlying neural populations and modify their activity,” explained Palaus.

The researchers theorized that combining video gaming with this type of stimulation could improve cognitive performance, but that was not the case.

“We aimed to achieve lasting changes. Under normal circumstances, the effects of this stimulation can last from milliseconds to tens of minutes. We wanted to achieve improved performance of certain brain functions that lasted longer than this.”

The game used in the study was a 3D platformer, but there are many types of video games that can influence cognitive functions. According to Dr. Palaus, what most video games have in common is that they involve elements that make people want to continue playing, and that they gradually get harder and present a constant challenge.

“These two things are enough to make it an attractive and motivating activity, which, in turn, requires constant and intense use of our brain’s resources,” said Dr. Palaus. “Video games are a perfect recipe for strengthening our cognitive skills, almost without our noticing.”

The study is published in the journal Frontiers in Human Neuroscience.