Posts Tagged ‘dementia’

By Emily Underwood

One of the thorniest debates in neuroscience is whether people can make new neurons after their brains stop developing in adolescence—a process known as neurogenesis. Now, a new study finds that even people long past middle age can make fresh brain cells, and that past studies that failed to spot these newcomers may have used flawed methods.

The work “provides clear, definitive evidence that neurogenesis persists throughout life,” says Paul Frankland, a neuroscientist at the Hospital for Sick Children in Toronto, Canada. “For me, this puts the issue to bed.”

Researchers have long hoped that neurogenesis could help treat brain disorders like depression and Alzheimer’s disease. But last year, a study in Nature reported that the process peters out by adolescence, contradicting previous work that had found newborn neurons in older people using a variety of methods. The finding was deflating for neuroscientists like Frankland, who studies adult neurogenesis in the rodent hippocampus, a brain region involved in learning and memory. It “raised questions about the relevance of our work,” he says.

But there may have been problems with some of this earlier research. Last year’s Nature study, for example, looked for new neurons in 59 samples of human brain tissue, some of which came from brain banks where samples are often immersed in the fixative paraformaldehyde for months or even years. Over time, paraformaldehyde forms bonds between the components that make up neurons, turning the cells into a gel, says neuroscientist María Llorens-Martín of the Severo Ochoa Molecular Biology Center in Madrid. This makes it difficult for fluorescent antibodies to bind to the doublecortin (DCX) protein, which many scientists consider the “gold standard” marker of immature neurons, she says.

The number of cells that test positive for DCX in brain tissue declines sharply after just 48 hours in a paraformaldehyde bath, Llorens-Martín and her colleagues report today in Nature Medicine. After 6 months, detecting new neurons “is almost impossible,” she says.

When the researchers used a shorter fixation time—24 hours—to preserve donated brain tissue from 13 deceased adults, ranging in age from 43 to 87, they found tens of thousands of DCX-positive cells in the dentate gyrus, a curled sliver of tissue within the hippocampus that encodes memories of events. Under a microscope, the neurons had hallmarks of youth, Llorens-Martín says: smooth and plump, with simple, undeveloped branches.

In the sample from the youngest donor, who died at 43, the team found roughly 42,000 immature neurons per square millimeter of brain tissue. From the youngest to oldest donors, the number of apparent new neurons decreased by 30%—a trend that fits with previous studies in humans showing that adult neurogenesis declines with age. The team also showed that people with Alzheimer’s disease had 30% fewer immature neurons than healthy donors of the same age, and the more advanced the dementia, the fewer such cells.
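The reported percentages translate into rough counts. A minimal arithmetic sketch (only the 42,000-per-square-millimetre baseline and the 30% decline are taken from the study; the derived count for the oldest donors is illustrative):

```python
baseline = 42_000   # immature neurons per square millimetre in the youngest (43-year-old) donor
decline_pct = 30    # reported decline from the youngest to the oldest donors

# Integer arithmetic keeps the illustrative count exact.
oldest_estimate = baseline * (100 - decline_pct) // 100
print(oldest_estimate)  # 29400
```

The same 30% figure applies separately to the disease comparison: Alzheimer’s donors had about 70% of the immature-neuron count of age-matched healthy donors.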

Some scientists remain skeptical, including the authors of last year’s Nature paper. “While this study contains valuable data, we did not find the evidence for ongoing production of new neurons in the adult human hippocampus convincing,” says Shawn Sorrells, a neuroscientist at the University of Pittsburgh in Pennsylvania who co-authored the 2018 paper. One critique hinges on the DCX stain, which Sorrells says isn’t an adequate measure of young neurons because the DCX protein is also expressed in mature cells. That suggests the “new” neurons the team found were actually present since childhood, he says. The new study also found no evidence of pools of stem cells that could supply fresh neurons, he notes. What’s more, Sorrells says two of the brain samples he and his colleagues looked at were only fixed for 5 hours, yet they still couldn’t find evidence of young neurons in the hippocampus.

Llorens-Martín says her team used multiple other proteins associated with neuronal development to confirm that the DCX-positive cells were actually young, and were “very strict” in their criteria for identifying young neurons.

Heather Cameron, a neuroscientist at the National Institute of Mental Health in Bethesda, Maryland, remains persuaded by the new work. Based on the “beauty of the data” in the new study, “I think we can all move forward pretty confidently in the knowledge that what we see in animals will be applicable in humans,” she says. “Will this settle the debate? I’m not sure. Should it? Yes.”

https://www.sciencemag.org/news/2019/03/new-neurons-life-old-people-can-still-make-fresh-brain-cells-study-finds?utm_campaign=news_daily_2019-03-25&et_rid=17036503&et_cid=2734364

Having a parent with Alzheimer’s disease has been known to raise a person’s risk of developing the disease, but new research published in Neurology suggests that having second- and third-degree relatives who have had Alzheimer’s may also increase risk.

“Family history is an important indicator of risk for Alzheimer’s disease, but most research focuses on dementia in immediate family members, so our study sought to look at the bigger family picture,” said Lisa A. Cannon-Albright, PhD, University of Utah School of Medicine, Salt Lake City, Utah. “We found that having a broader view of family history may help better predict risk. These results potentially could lead to better diagnoses and help patients and their families in making health-related decisions.”

For the study, researchers looked at the Utah Population Database, which includes the genealogy of Utah pioneers from the 1800s and their descendants up to the present day. The database is linked to Utah death certificates, which show causes of death and, in a majority of cases, contributing causes of death.

In that database, researchers analysed data from over 270,800 people who had at least 3 generations of genealogy connected to the original Utah pioneers, including genealogy data for both parents, all 4 grandparents, and at least 6 of 8 great-grandparents. Of those, 4,436 had a death certificate that indicated Alzheimer’s disease as a cause of death.

Results showed that people with 1 first-degree relative with Alzheimer’s disease (18,494 people) had a 73% increased risk of developing the disease. Of this group of people, 590 developed Alzheimer’s disease; the researchers would have expected this group to have 341 cases.

People with 2 first-degree relatives were 4 times more likely to develop the disease; those with 3 were 2.5 times more likely; and those with 4 were nearly 15 times more likely to develop Alzheimer’s disease.

Of the 21 people in the study with 4 first-degree relatives with Alzheimer’s, 6 had the disease. The researchers would have expected only 0.4 people to develop the disease.

Those with 1 first-degree relative and 1 second-degree relative had a 21 times greater risk. Examples of this would be a parent and one grandparent with the disease, or a parent and one aunt or uncle. There were 25 people in this category in the study; 4 of them had the disease when researchers would have expected 0.2 cases.

Those whose only affected relatives were third-degree relatives, with 3 such relatives having Alzheimer’s disease, had a 43% greater risk of developing the disease. An example of this would be two great-grandparents with the disease, along with one great-uncle, but no affected parents or grandparents. Of the 5,320 people in this category, 148 had the disease when researchers would have expected 103.
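Each risk figure above is simply the ratio of observed to expected case counts. A minimal sketch of that arithmetic, using the counts as printed (the expected counts are rounded in the article, so the last ratio lands at 44% rather than the published 43%):

```python
def relative_risk(observed, expected):
    """Ratio of observed to expected case counts; 1.0 means no excess risk."""
    return observed / expected

# One first-degree relative: 590 observed vs 341 expected cases.
print(f"{relative_risk(590, 341) - 1:.0%}")   # 73%, matching the reported 73% increased risk

# Four first-degree relatives: 6 observed vs 0.4 expected cases.
print(f"{relative_risk(6, 0.4):.0f}x")        # 15x, the "nearly 15 times" figure

# Three third-degree relatives only: 148 observed vs 103 expected cases.
print(f"{relative_risk(148, 103) - 1:.0%}")   # 44%, vs the published 43% from unrounded counts
```

The small mismatches against the published figures come from the article reporting expected counts to one decimal place or less.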

“More and more, people are increasingly seeking an estimate of their own genetic risk for Alzheimer’s disease,” said Dr. Cannon-Albright. “Our findings indicate the importance of clinicians taking a person’s full family history that extends beyond their immediate family members.”

She noted that among all of the study participants, 3% had a family history that doubled their risk of Alzheimer’s disease, and a little over one-half of a percent had a family history that increased their risk by ≥3 times that of a person without a family history of the disease.

One limitation of the study is that not every individual who died from Alzheimer’s disease had a death certificate listing it as a cause of death. Dr. Cannon-Albright said death certificates are known to underestimate the prevalence of the disease.

“There are still many unknowns about why a person develops Alzheimer’s disease,” she said. “A family history of the disease is not the only possible cause. There may be environmental causes, genetic causes, or both. There is still much more research needed before we can give people a more accurate prediction of their risk of the disease.”

Reference:
https://n.neurology.org/content/early/2019/03/13/WNL.0000000000007231

https://dgnews.docguide.com/having-great-grandparents-cousins-alzheimer-s-linked-higher-risk?overlay=2&nl_ref=newsletter&pk_campaign=newsletter&nl_eventid=20119

Clumps of harmful proteins that interfere with brain functions have been partially cleared in mice using nothing but light and sound.

Research led by MIT has found that strobe lights and a low-pitched buzz can be used to recreate brain waves lost in the disease, which in turn remove plaque and improve cognitive function in mice engineered to display Alzheimer’s-like behaviour.

It’s a little like using light and sound to trigger their own brain waves to help fight the disease.

This technique hasn’t been clinically trialled in humans as yet, so it’s too soon to get excited – brain waves are known to work differently in humans and mice.

But, if replicated, these early results hint at a possible cheap and drug-free way to treat the common form of dementia.

So how does it work?

Building on a previous study, which showed that flashing light into the eyes of engineered mice 40 times a second treated their version of Alzheimer’s disease, researchers added sound of a similar frequency and found it dramatically improved the results.

“When we combine visual and auditory stimulation for a week, we see the engagement of the prefrontal cortex and a very dramatic reduction of amyloid,” says Li-Huei Tsai, one of the researchers from MIT’s Picower Institute for Learning and Memory.

It’s not the first study to investigate the role sound can play in clearing the brain of the tangles and clumps of tau and amyloid proteins at least partially responsible for the disease.

Previous studies showed bursts of ultrasound make blood vessels leaky enough to allow powerful treatments to slip into the brain, while also encouraging the nervous system’s waste-removal experts, microglia, to pick up the pace.

Several years ago, Tsai discovered light flickering at a frequency of about 40 flashes a second had similar benefits in mice engineered to build up amyloid in their brain’s nerve cells.

“The result was so mind-boggling and so robust, it took a while for the idea to sink in, but we knew we needed to work out a way of trying out the same thing in humans,” Tsai told Helen Thomson at Nature at the time.

The only problem was this effect was confined to visual parts of the brain, missing key areas that contribute to the formation and retrieval of memory.

While the method’s practical applications looked a little limited, the results pointed to a way oscillations could help the brain recover from the grip of Alzheimer’s disease.

As our brain’s neurons transmit signals, they also generate rhythmic waves of electrical activity that help keep remote regions in sync – so-called ‘brain waves’.

One such set of oscillations are defined as gamma-frequencies, rippling across the brain at around 30 to 90 waves per second. These brain waves are most active when we’re paying close attention, searching our memories in order to make sense of what’s going on.

Tsai’s previous study had suggested these gamma waves are impeded in individuals with Alzheimer’s, and might play a pivotal role in the pathology itself.

Light was just one way to trick the parts of the brain into humming in the key of gamma. Sounds can also manage this in other areas.

Instead of the high-pitched scream of ultrasound, Tsai used a much lower droning noise of just 40 Hertz, a sound only just high enough for humans to hear.

Exposing their mouse subjects to just one hour of this monotonous buzz every day for a week led to a significant drop in the amount of amyloid build-up in the auditory regions, while also stimulating those microglial cells and blood vessels.

“What we have demonstrated here is that we can use a totally different sensory modality to induce gamma oscillations in the brain,” says Tsai.

As an added bonus, it also helped clear the nearby hippocampus – an important section associated with memory.

The effects weren’t just evident in the test subjects’ brain chemistry. Functionally, mice exposed to the treatment performed better in a range of cognitive tasks.

Adding the light therapy from the previous study saw an even more dramatic effect, clearing plaques in a number of areas across the brain, including in the prefrontal cortex. Those trash-clearing microglia also went to town.

“These microglia just pile on top of one another around the plaques,” says Tsai.

Discovering new mechanisms in the way nervous systems clear waste and synchronise activity is a huge step forward in the development of treatments for all kinds of neurological disorders.

Translating discoveries like this to human brains will take more work, especially given potential differences in how gamma waves appear in mouse and human Alzheimer’s brains.

So far, early safety testing suggests the process has no clear side effects.

This research was published in Cell.

https://www.sciencealert.com/astonishing-new-study-treats-alzheimer-s-in-mice-with-a-light-and-sound-show

A team from the Department of Psychological Medicine and Department of Biochemistry at the Yong Loo Lin School of Medicine at the National University of Singapore (NUS) has found that seniors who consume more than two standard portions of mushrooms weekly may have 50 per cent reduced odds of having mild cognitive impairment (MCI).

A portion was defined as three quarters of a cup of cooked mushrooms, with an average weight of around 150 grams; two portions would be equivalent to approximately half a plate. While the portion sizes act as a guideline, the study suggested that even one small portion of mushrooms a week may still help reduce the chances of MCI.

“This correlation is surprising and encouraging. It seems that a commonly available single ingredient could have a dramatic effect on cognitive decline,” said Assistant Professor Lei Feng, who is from the NUS Department of Psychological Medicine, and the lead author of this work.

The six-year study, which was conducted from 2011 to 2017, collected data from more than 600 Chinese seniors over the age of 60 living in Singapore. The research was carried out with support from the Life Sciences Institute and the Mind Science Centre at NUS, as well as the Singapore Ministry of Health’s National Medical Research Council. The results were published online in the Journal of Alzheimer’s Disease on 12 March 2019.

Determining MCI in seniors

MCI is typically viewed as the stage between the cognitive decline of normal ageing and the more serious decline of dementia. Seniors afflicted with MCI often display some form of memory loss or forgetfulness and may also show deficits in other cognitive functions such as language, attention and visuospatial abilities. However, the changes can be subtle, as they do not experience the disabling cognitive deficits that affect everyday life activities, which are characteristic of Alzheimer’s and other forms of dementia.

“People with MCI are still able to carry out their normal daily activities. So, what we had to determine in this study is whether these seniors had poorer performance on standard neuropsychological tests than other people of the same age and education background,” explained Asst Prof Feng. “Neuropsychological tests are specifically designed tasks that can measure various aspects of a person’s cognitive abilities. In fact, some of the tests we used in this study are adapted from a commonly used IQ test battery, the Wechsler Adult Intelligence Scale (WAIS).”

As such, the researchers conducted extensive interviews and tests with the senior citizens to determine an accurate diagnosis. “The interview takes into account demographic information, medical history, psychological factors, and dietary habits. A nurse will measure blood pressure, weight, height, handgrip, and walking speed. They will also do a simple screen test on cognition, depression, anxiety,” said Asst Prof Feng.

After this, a two-hour standard neuropsychological assessment was performed, along with a dementia rating. The overall results of these tests were discussed in depth with expert psychiatrists involved in the study to get a diagnostic consensus.

Mushrooms and cognitive impairment

Six commonly consumed mushrooms in Singapore were referenced in the study. They were golden, oyster, shiitake and white button mushrooms, as well as dried and canned mushrooms. However, it is likely that other mushrooms not referenced would also have beneficial effects.

The researchers believe the reason for the reduced prevalence of MCI in mushroom eaters may be down to a specific compound found in almost all varieties. “We’re very interested in a compound called ergothioneine (ET),” said Dr. Irwin Cheah, Senior Research Fellow at the NUS Department of Biochemistry. “ET is a unique antioxidant and anti-inflammatory which humans are unable to synthesise on their own. But it can be obtained from dietary sources, one of the main ones being mushrooms.”

An earlier study by the team on elderly Singaporeans revealed that plasma levels of ET in participants with MCI were significantly lower than age-matched healthy individuals. The work, which was published in the journal Biochemical and Biophysical Research Communications in 2016, led to the belief that a deficiency in ET may be a risk factor for neurodegeneration, and increasing ET intake through mushroom consumption might possibly promote cognitive health.

Other compounds contained within mushrooms may also help decrease the risk of cognitive decline. Certain hericenones, erinacines, scabronines and dictyophorines may promote the synthesis of nerve growth factors. Bioactive compounds in mushrooms may also protect the brain from neurodegeneration by inhibiting the production of beta-amyloid and phosphorylated tau, and by inhibiting acetylcholinesterase.

Next steps

The potential next stage of research for the team is to perform a randomised controlled trial with the pure compound of ET and other plant-based ingredients, such as L-theanine and catechins from tea leaves, to determine the efficacy of such phytonutrients in delaying cognitive decline. Such interventional studies will lead to more robust conclusions about causal relationships. In addition, Asst Prof Feng and his team hope to identify other dietary factors that could be associated with healthy brain ageing and reduced risk of age-related conditions in the future.

https://medicalxpress.com/news/2019-03-mushrooms-cognitive-decline.html

By Carl Zimmer

In 2014 John Cryan, a professor at University College Cork in Ireland, attended a meeting in California about Alzheimer’s disease. He wasn’t an expert on dementia. Instead, he studied the microbiome, the trillions of microbes inside the healthy human body.

Dr. Cryan and other scientists were beginning to find hints that these microbes could influence the brain and behavior. Perhaps, he told the scientific gathering, the microbiome has a role in the development of Alzheimer’s disease.

The idea was not well received. “I’ve never given a talk to so many people who didn’t believe what I was saying,” Dr. Cryan recalled.

A lot has changed since then: Research continues to turn up remarkable links between the microbiome and the brain. Scientists are finding evidence that the microbiome may play a role not just in Alzheimer’s disease, but also in Parkinson’s disease, depression, schizophrenia, autism and other conditions.

For some neuroscientists, new studies have changed the way they think about the brain.

One of the skeptics at that Alzheimer’s meeting was Sangram Sisodia, a neurobiologist at the University of Chicago. He wasn’t swayed by Dr. Cryan’s talk, but later he decided to put the idea to a simple test.

“It was just on a lark,” said Dr. Sisodia. “We had no idea how it would turn out.”

He and his colleagues gave antibiotics to mice prone to develop a version of Alzheimer’s disease, in order to kill off much of the gut bacteria in the mice. Later, when the scientists inspected the animals’ brains, they found far fewer of the protein clumps linked to dementia.

Just a little disruption of the microbiome was enough to produce this effect. Young mice given antibiotics for a week had fewer clumps in their brains when they grew old, too.

“I never imagined it would be such a striking result,” Dr. Sisodia said. “For someone with a background in molecular biology and neuroscience, this is like going into outer space.”

Following a string of similar experiments, he now suspects that just a few species in the gut — perhaps even one — influence the course of Alzheimer’s disease, perhaps by releasing a chemical that alters how immune cells work in the brain.

He hasn’t found those microbes, let alone that chemical. But “there’s something in there,” he said. “And we have to figure out what it is.”

‘It was considered crazy’

Scientists have long known that microbes live inside us. In 1683, the Dutch scientist Antonie van Leeuwenhoek put plaque from his teeth under a microscope and discovered tiny creatures swimming about.

But the microbiome has stubbornly resisted scientific discovery. For generations, microbiologists only studied the species that they could grow in the lab. Most of our interior occupants can’t survive in petri dishes.

In the early 2000s, however, the science of the microbiome took a sudden leap forward when researchers figured out how to sequence DNA from these microbes. Researchers initially used this new technology to examine how the microbiome influences parts of our bodies rife with bacteria, such as the gut and the skin.

Few of them gave much thought to the brain — there didn’t seem to be much point. The brain is shielded from microbial invasion by the so-called blood-brain barrier. Normally, only small molecules pass through.

“As recently as 2011, it was considered crazy to look for associations between the microbiome and behavior,” said Rob Knight, a microbiologist at the University of California, San Diego.

He and his colleagues discovered some of the earliest hints of these links. Investigators took stool from mice with a genetic mutation that caused them to eat a lot and put on weight. They transferred the stool to mice that had been raised germ-free — that is, entirely without gut microbiomes — since birth.

After receiving this so-called fecal transplant, the germ-free mice got hungry, too, and put on weight.

Altering appetite isn’t the only thing that the microbiome can do to the brain, it turns out. Dr. Cryan and his colleagues, for example, have found that mice without microbiomes become loners, preferring to stay away from fellow rodents.

The scientists eventually discovered changes in the brains of these antisocial mice. One region, called the amygdala, is important for processing social emotions. In germ-free mice, the neurons in the amygdala make unusual sets of proteins, changing the connections they make with other cells.

Studies of humans revealed some surprising patterns, too. Children with autism have unusual patterns of microbial species in their stool. Differences in the gut bacteria of people with a host of other brain-based conditions also have been reported.

But none of these associations proves cause and effect. Finding an unusual microbiome in people with Alzheimer’s doesn’t mean that the bacteria drive the disease. It could be the reverse: People with Alzheimer’s disease often change their eating habits, for example, and that switch might favor different species of gut microbes.

Fecal transplants can help pin down these links. In his research on Alzheimer’s, Dr. Sisodia and his colleagues transferred stool from ordinary mice into the mice they had treated with antibiotics. Once their microbiomes were restored, the antibiotic-treated mice started developing protein clumps again.

“We’re extremely confident that it’s the bacteria that’s driving this,” he said. Other researchers have taken these experiments a step further by using human fecal transplants.

If you hold a mouse by its tail, it normally wriggles in an effort to escape. If you give it a fecal transplant from humans with major depression, you get a completely different result: The mice give up sooner, simply hanging motionless.

As intriguing as this sort of research can be, it has a major limitation. Because researchers are transferring hundreds of bacterial species at once, the experiments can’t reveal which in particular are responsible for changing the brain.

Now researchers are pinpointing individual strains that seem to have an effect.

To study autism, Dr. Mauro Costa-Mattioli and his colleagues at the Baylor College of Medicine in Houston investigated different kinds of mice, each of which displays some symptoms of autism. A mutation in a gene called SHANK3 can cause mice to groom themselves repetitively and avoid contact with other mice, for example.

In another mouse strain, Dr. Costa-Mattioli found that feeding mothers a high-fat diet makes it more likely their pups will behave this way.


Coloured positron emission tomography (PET, centre) and computed tomography (CT, left) scans of the brain of a 62-year-old woman with Alzheimer’s disease.

By Pam Belluck

In dementia research, so many paths have led nowhere that any glimmer of optimism is noteworthy.

So some experts are heralding the results of a large new study, which found that people with hypertension who received intensive treatment to lower their blood pressure were less likely than those receiving standard blood pressure treatment to develop minor memory and thinking problems that often progress to dementia.

The study, published Monday in JAMA, is the first large, randomized clinical trial to find something that can help many older people reduce their risk of mild cognitive impairment — an early stage of faltering function and memory that is a frequent precursor to Alzheimer’s disease and other dementias.

The results apply only to those age 50 or older who have elevated blood pressure and who do not have diabetes or a history of stroke. But that’s a condition affecting a lot of people — more than 75 percent of people over 65 have hypertension, the study said. So millions might eventually benefit by reducing not only their risk of heart problems but of cognitive decline, too.

“It’s kind of remarkable that they found something,” said Dr. Kristine Yaffe, a professor of psychiatry and neurology at University of California San Francisco, who was not involved in the research. “I think it actually is very exciting because it tells us that by improving vascular health in a comprehensive way, we could actually have an effect on brain health.”

The research was part of a large cardiovascular study called Sprint, begun in 2010 and involving more than 9,000 racially and ethnically diverse people at 102 sites in the United States. The participants had hypertension, defined as a systolic blood pressure (the top number) from 130 to 180, without diabetes or a history of stroke.

These were people who could care for themselves, were able to walk and get themselves to doctors’ appointments, said the principal investigator, Dr. Jeff D. Williamson, chief of geriatric medicine and gerontology at Wake Forest School of Medicine.

The primary goal of the Sprint study was to see if people treated intensively enough that their blood pressure dropped below 120 would do better than people receiving standard treatment, which brought their blood pressure just under 140. They did — so much so that in 2015 the trial was stopped, because the intensively treated participants had such a significantly lower risk of cardiovascular events and death that it would have been unethical not to inform the standard group of the benefit of further lowering their blood pressure.

But the cognitive arm of the study, called Sprint Mind, continued to follow the participants for three more years even though they were no longer monitored for whether they continued with intensive blood pressure treatment. About 8,500 participants received at least one cognitive assessment.

The primary outcome researchers measured was whether patients developed “probable dementia.” Fewer patients did so in the group whose blood pressure was lowered to 120. But the difference — 149 people in the intensive-treatment group versus 176 people in the standard-treatment group — was not enough to be statistically significant.

But in the secondary outcome — developing mild cognitive impairment or MCI — results did show a statistically significant difference. In the intensive group, 287 people developed it, compared to 353 people in the standard group, giving the intensive treatment group a 19 percent lower risk of mild cognitive impairment, Dr. Williamson said.
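The 19 percent figure follows from those two case counts. A minimal sketch, assuming roughly equal-sized treatment groups (SPRINT randomized participants evenly), so case counts can stand in for rates:

```python
mci_intensive = 287  # MCI cases in the intensive-treatment group
mci_standard = 353   # MCI cases in the standard-treatment group

# Relative risk reduction: one minus the ratio of case counts,
# valid here because the two groups are about the same size.
rrr = 1 - mci_intensive / mci_standard
print(f"{rrr:.0%}")  # 19%
```

With unequal group sizes, the counts would first have to be divided by the number of participants in each group.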

Because dementia often develops over many years, Dr. Williamson said he believes that following the patients for longer would yield enough cases to definitively show whether intensive blood pressure treatment helps prevent dementia too. To find out, the Alzheimer’s Association said Monday it would fund two more years of the study.

“Sprint Mind 2.0 and the work leading up to it offers genuine, concrete hope,” Maria C. Carrillo, the association’s chief science officer, said in a statement. “MCI is a known risk factor for dementia, and everyone who experiences dementia passes through MCI. When you prevent new cases of MCI, you are preventing new cases of dementia.”

Dr. Yaffe said the study had several limitations and left many questions unanswered. It’s unclear how it applies to people with diabetes or other conditions that often accompany high blood pressure. And she said she would like to see data on the participants older than 80, since some studies have suggested that in people that age, hypertension might protect against dementia.

The researchers did not specify which type of medication people took, although Dr. Williamson said they plan to analyze by type to see if any of the drugs produced a stronger cognitive benefit. Side effects of the intensive treatment stopped being monitored after the main trial ended, but Dr. Williamson said the biggest negative effect was dehydration.

Dr. Williamson said the trial has changed how he treats patients, offering those with blood pressure over 130 the intensive treatment. “I’ll tell them it will give you a 19 percent lower chance of developing early memory loss,” he said.

Dr. Yaffe is more cautious about changing her approach. “I don’t think we’re ready to roll it out,” she said. “It’s not like I’m going to see a patient and say ‘Oh my gosh your blood pressure is 140; we need to go to 120.’ We really need to understand much more about how this might differ by your age, by the side effects, by maybe what else you have.”

Still, she said, “I do think the take-home message is that blood pressure and other measures of vascular health have a role in cognitive health. And nothing else has worked.”

- According to the results of a study published in Nature, gaming could possibly increase the volume of gray matter in the brain.
- Researchers recently studied the insular cortex regions of frequent gamers and those who didn’t play video games as regularly.
- The study found a correlation between playing action video games and increased gray matter volume in the brain.

Do you ever feel you could do with polishing up on your cognitive skills?

Well, according to the results of a study published in Nature, gaming could possibly be the way forward.

Researchers from the University of Electronic Science and Technology of China and Macquarie University in Sydney, Australia, joined forces, and recently found a correlation between playing action video games and increased gray matter volume in the brain.

How video games stimulate the gray matter in your brain

The focus of the team’s research was on the insular cortex, a part of the cerebral cortex folded deep in the brain that has been the subject of very few studies to date.

It’s thought that a large part of linguistic processing takes place in this region of the brain, and that other processes relating to taste and smell, compassion and empathy, and interpersonal experiences are also managed here.

The study looked at 27 regular video game players described in the study as “Action Video Game experts” as well as 30 amateurs who played less frequently and didn’t perform as well in games.

The participants in the “expert” group were all recognised participants in regional or national championships of League of Legends and Dota 2. Using an MRI scanner, the scientists took detailed pictures of the participants’ insular cortices.

“By comparing AVG experts and amateurs, we found that AVG experts had enhanced functional connectivity and gray matter volume in insular subregions,” wrote the research team.

Gaming actually promotes networking within the brain

The gray matter in your brain is part of your central nervous system and essentially controls all your brain’s functions.

It follows that better connectivity in this region will lead to faster thought processes and correspondingly higher intelligence.

If you want to improve your cognitive performance, you don’t necessarily have to resort to hours of video games; sports and art-based recreation are just two among many activities that promote connectivity in the brain.

However, it does mean that those who still like to sit in front of their console from time to time no longer need to feel guilty about time spent in front of a screen — after all, it is exercise, just for the brain.

https://www.businessinsider.com/video-games-may-increase-your-brains-gray-matter-2018-12