New research suggests that people with ‘O’ blood type may have a decreased risk of cognitive decline

A pioneering study by researchers at the University of Sheffield has revealed that blood type plays a role in the development of the nervous system and may affect the risk of developing cognitive decline.

The research, carried out in collaboration with the IRCCS San Camillo Hospital Foundation in Venice, shows that people with an ‘O’ blood type have more grey matter in their brains than those with ‘A’, ‘B’ or ‘AB’ blood types, which may help protect against diseases such as Alzheimer’s.

Research fellow Matteo De Marco and Professor Annalena Venneri, from the University’s Department of Neuroscience, made the discovery after analysing the results of 189 Magnetic Resonance Imaging (MRI) scans from healthy volunteers.

The researchers calculated the volumes of grey matter within the brain and explored the differences between different blood types.
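The article does not detail the analysis pipeline, but the general recipe — summing a segmented grey-matter probability map per subject and comparing groups — is standard in this kind of work. Below is a minimal sketch in Python; the filenames, group lists, and the whole-brain (rather than voxel-wise) summary are illustrative assumptions, since the published study used a voxel-based morphometry approach rather than this global measure.

```python
# Hedged sketch: compare total grey-matter volume between blood-type groups.
# Assumes each volunteer already has a grey-matter probability map from a
# standard segmentation tool (e.g. SPM or FSL). Filenames are hypothetical.
import nibabel as nib
import numpy as np
from scipy import stats

def grey_matter_volume_ml(gm_map_path: str) -> float:
    """Total grey-matter volume (millilitres) from a probability map."""
    img = nib.load(gm_map_path)
    voxel_vol_mm3 = float(np.prod(img.header.get_zooms()[:3]))  # mm^3 per voxel
    return img.get_fdata().sum() * voxel_vol_mm3 / 1000.0       # mm^3 -> ml

# Hypothetical per-group file lists, one segmented map per volunteer.
type_o_maps = ["sub-O01_gm.nii.gz", "sub-O02_gm.nii.gz"]   # 'O' group
non_o_maps  = ["sub-A01_gm.nii.gz", "sub-B01_gm.nii.gz"]   # 'A'/'B'/'AB' group

type_o = [grey_matter_volume_ml(p) for p in type_o_maps]
non_o  = [grey_matter_volume_ml(p) for p in non_o_maps]

# Welch's t-test: does mean grey-matter volume differ between the groups?
t, p = stats.ttest_ind(type_o, non_o, equal_var=False)
print(f"O vs non-O grey matter: t = {t:.2f}, p = {p:.4f}")
```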

The results, published in Brain Research Bulletin, show that individuals with an ‘O’ blood type have more grey matter in the posterior portion of the cerebellum.

In comparison, those with ‘A’, ‘B’ or ‘AB’ blood types had smaller grey matter volumes in temporal and limbic regions of the brain, including the left hippocampus, which is one of the earliest parts of the brain to be damaged by Alzheimer’s disease.

These findings indicate that smaller volumes of grey matter are associated with non-‘O’ blood types.

A reduction in grey matter volume is normally seen in the brain as we age, so this difference between blood types may intensify later in life as a consequence of ageing.

“The findings seem to indicate that people who have an ‘O’ blood type are more protected against the diseases in which volumetric reduction is seen in the temporal and mediotemporal regions of the brain, as with Alzheimer’s disease, for instance,” said Matteo De Marco.

“However, additional tests and further research are required, as other biological mechanisms might be involved.”

Professor Annalena Venneri added: “What we know today is that a significant difference in volumes exists, and our findings confirm established clinical observations. In all likelihood the biology of blood types influences the development of the nervous system. We now have to understand how and why this occurs.”

More information: “‘O’ blood type is associated with larger grey-matter volumes in the cerebellum,” Brain Research Bulletin, Volume 116, July 2015, Pages 1-6, ISSN 0361-9230, dx.doi.org/10.1016/j.brainresbull.2015.05.005

Scientists Have Figured Out How to Recover Forgotten Memories Still Lurking in the Brain


All might not be lost. Researchers recently announced a discovery that could eventually have significant implications for helping people with severe amnesia or Alzheimer’s disease.

The research tackles a hotly debated question: does memory loss caused by damaged brain cells mean that the memories can no longer be stored, or is access to those memories merely inhibited in some way?

In the new research, scientists from MIT found that the latter is most likely the case, demonstrating how lost memories could be recovered using a technology known as optogenetics, in which, as a news release about the study put it, “proteins are added to neurons to allow them to be activated with light.”

“The majority of researchers have favored the storage theory, but we have shown in this paper that this majority theory is probably wrong,” Susumu Tonegawa, a professor in MIT’s biology department and director of the RIKEN-MIT Center at the Picower Institute for Learning and Memory, said in a statement. “Amnesia is a problem of retrieval impairment.”

First, the scientists demonstrated how “memory engram cells” — brain cells that trigger a memory upon experiencing a related sight or smell, for example — could be strengthened in mice.

The researchers then gave the mice anisomycin, which blocked protein synthesis in neurons, after they had formed a new memory. In doing so, the researchers prevented the engram cells from strengthening.

A day later, the scientists tried to trigger the memory in the mice, but saw no activation that would indicate the animals were remembering it.

“So even though the engram cells are there, without protein synthesis those cell synapses are not strengthened, and the memory is lost,” Tonegawa explained of this part of the research.

Earlier in the work, the team had developed a clever technique to selectively label the neurons representing what is known as a memory engram – in other words, the brain cells involved in forming a specific memory. They did this by genetically engineering mice to carry extra genes in all their neurons. As a result, when neurons fire while a memory is formed, they produce red proteins visible under a microscope, allowing the researchers to tell which cells were part of the engram. The mice also carried an inserted gene that made the neurons fire when illuminated by blue light.

After the researchers induced amnesia, they used optogenetic tools on the mice and witnessed the animals experiencing full recollection.

“If you test memory recall with natural recall triggers in an anisomycin-treated animal, it will be amnesiac, you cannot induce memory recall. But if you go directly to the putative engram-bearing cells and activate them with light, you can restore the memory,” Tonegawa said.

With this discovery, the researchers wrote in the study published this week in the journal Science that they believe a “specific pattern of connectivity of engram cells may be crucial for memory information storage and that strengthened synapses in these cells critically contribute to the memory retrieval process.”

James Bisby, a neuroscientist at University College London, told New Scientist that it’s “not surprising that they could trigger the memories, but it is a cool way to do it.”

http://www.newscientist.com/article/dn27618-lost-memories-recovered-in-mice-with-a-flash-of-light.html

Thanks to Steven Weihing for bringing this to the It’s Interesting community.

New research suggests that memories may not be stored by synaptic connections between nerve cells

New research suggests that memories may not be stored by synaptic connections between neurons in the brain, but rather synapses may allow the expression of memories that are stored elsewhere in the neuron.

The revolutionary study by academics at the University of California, Los Angeles, suggests for the first time that memories are not stored in synapses, as previously thought. It is synapses, the connections between brain cells, that are destroyed by Alzheimer’s.

The breakthrough, reported in the highly regarded online journal eLife, could mean that it becomes possible to restore lost memories.

“Long-term memory is not stored at the synapse,” said David Glanzman, the study’s co-author and professor of integrative biology and physiology and of neurobiology at UCLA. “That’s a radical idea, but that’s where the evidence leads. The nervous system appears to be able to regenerate lost synaptic connections. If you can restore the synaptic connections, the memory will come back. It won’t be easy, but I believe it’s possible.”

Professor Glanzman’s team studied the marine snail Aplysia to understand the animal’s learning and memory functions. Glanzman was particularly interested in the Aplysia’s defensive reactions and the sensory and motor neurons responsible for its withdrawal response.

“If you train an animal on a task, inhibit its ability to produce proteins immediately after training, and then test it 24 hours later, the animal doesn’t remember the training,” said Prof. Glanzman. “However, if you train an animal, wait 24 hours, and then inject a protein synthesis inhibitor in its brain, the animal shows perfectly good memory 24 hours later. In other words, once memories are formed, if you temporarily disrupt protein synthesis, it doesn’t affect long-term memory. That’s true in the Aplysia and in humans’ brains.”

As part of the test, snails whose memories appeared to have been erased earlier in the experiment were given a number of electric shocks, which in themselves would not usually produce long-term memories. The team found that the memories they thought had been completely erased returned, suggesting that synaptic connections that had previously been lost were restored.

“That suggests that the memory is not in the synapses but somewhere else,” said Glanzman. “We think it’s in the nucleus of the neurons. We haven’t proved that, though.”

He added that the research could be a major breakthrough for Alzheimer’s sufferers: even though the disease destroys synapses in the brain, the memories themselves might not necessarily be destroyed.

“As long as the neurons are still alive, the memory will still be there, which means you may be able to recover some of the lost memories in the early stages of Alzheimer’s,” said Prof Glanzman.

http://www.telegraph.co.uk/news/science/11307411/Cure-for-memory-loss-could-be-on-the-horizon.html

Research on Acceptance and Commitment Therapy (ACT) suggests that self-compassion may be more important than self-esteem

Few concepts in popular psychology have gotten more attention over the last few decades than self-esteem and its importance in life success and long-term mental health. Of course, much of this discussion has focused on young people, and how families, parents, teachers, coaches, and mentors can provide the proper psychological environment to help them grow into functional, mature, mentally stable adults.

Research shows that low self-esteem correlates with poorer mental health outcomes across the board, increased likelihood of suicide attempts, and difficulty developing supportive social relationships. Research also shows that trying to raise low self-esteem artificially comes with its own set of problems, including tendencies toward narcissism, antisocial behavior, and avoiding challenging activities that may threaten one’s self-concept.

This division in the research has led to a division amongst psychologists about how important self-esteem is, whether or not it’s useful to help people improve their self-esteem, and what the best practices are for accomplishing that.

In one camp, you have people who believe improving self-esteem is of paramount importance. On the other side of the fence are those who feel the whole concept of self-esteem is overrated and that it’s more critical to develop realistic perceptions about oneself.

But what if we’ve been asking the wrong questions all along? What if the self-esteem discussion is like the proverbial finger pointing at the moon?

New research is suggesting this may indeed be the case, and that a new concept — self-compassion — could be vastly more important than self-esteem when it comes to long-term mental health and success.

Why the Self-Esteem Model Is Flawed

The root problem with the self-esteem model comes down to some fundamental realities about language and cognition that Acceptance and Commitment Therapy (ACT, pronounced all as one word) was designed to address.

The way psychologists classically treat issues with self-esteem is by having clients track their internal dialog — especially their negative self-talk — and then employ a number of tactics to counter those negative statements with more positive (or at least more realistic) ones. Others attempt to stop the thoughts, distract themselves from them, or self-soothe.

Put bluntly, these techniques don’t work very well. The ACT research community has shown this over and over again. There are many reasons that techniques like distraction and thought stopping tend not to work — too many to go into all of them here. For a full discussion, see the books Acceptance and Commitment Therapy or Get Out of Your Mind and Into Your Life. For the purposes of our discussion here, we will look at one aspect of this: How fighting a thought increases its believability.

Imagine a young person has the thought, “There is something wrong with me.” The classic rhetoric of self-esteem forces this person to take the thought seriously. After all, they have likely been taught that good self-esteem is important and essential for success in life. If they fight against the thought by countering it, however, the thought is confirmed: the thought itself is something wrong with them that has to change. Every time they struggle against it, the noose gets tighter as the thought is reconfirmed. The more they fight the thought, the more power they give it.

This is a classic example of why in ACT we say, “If you are not willing to have it, you do.”

The simple fact is, we can’t always prevent young people from experiencing insecurity and low self-esteem. Heck, we can’t eliminate those feelings in ourselves. All people feel inadequate or imperfect at times. And in an ever-evolving, ever-more complex world, there is simply no way we can protect our young people from events that threaten their self-esteem — events like social rejection, family problems, personal failures, and others.

What we can do is help young people to respond to those difficult situations and to self-doubt with self-compassion. And a couple of interesting studies that were recently published show that this may indeed offer a more useful way forward not only for young people, but for all of us.

What Is Self-Compassion?

Before we look at the studies, let’s take a moment to define self-compassion.

Dr. Kristin Neff, one of the premier researchers in this area, defines self-compassion as consisting of three key components during times of personal suffering and failure:
1. Treating oneself kindly.
2. Recognizing one’s struggles as part of the shared human experience.
3. Holding one’s painful thoughts and feelings in mindful awareness.

Given this context, the negativity or positivity of your thoughts isn’t what’s important. It’s how you respond to those thoughts that matters. Going back to the example above — “There is something wrong with me” — instead of fighting against that thought or trying to distract yourself from it, you could notice this thought without getting attached to it (become mindful), understand that it is common to all humans and part of our shared experience as people, and then treat yourself kindly instead of beating yourself up.

Does this approach really work better than simply improving self-esteem?

It seems it does.

A just-published longitudinal study that followed 2,448 ninth graders for a year found that low self-esteem had little effect on mental health in those who had the highest levels of self-compassion. In other words, even if they had negative thoughts, those thoughts had minimal impact on their sense of well-being over time compared with peers who lacked self-compassion skills.

This suggests that teaching kids who suffer from self-esteem issues to be more self-compassionate may have more benefit than simply trying to improve their self-esteem.

The question is: How do we do that?

As it turns out, this is exactly where ACT excels.

Using ACT to Enhance Self-Compassion

Enhancing self-compassion has been shown not only to mitigate problems with self-esteem but also to affect other conditions, including traumatic stress. Knowing this, Jamie Yadavaia decided to test in his doctoral project whether self-compassion could be enhanced using ACT.

The results were promising.

A group of 78 students aged 18 or older was randomized into one of two groups. The first group was placed in a “waitlist condition,” meaning they received no treatment. The other group received six hours of ACT training.

As anticipated, the ACT intervention led to substantial increases in self-compassion over the waitlist control, both post-treatment and two months after the intervention. In this group self-compassion increased 106 percent — an effect size comparable to those of far longer treatments previously published. Not only that, but the ACT treatment also reduced general psychological distress, depression, anxiety, and stress.

At the heart of all these changes was psychological flexibility; this skill seemed to be the key mediating factor across the board, which makes sense. After all, learning how to become less attached to your thoughts, hold them in mindful awareness, and respond to them with a broader repertoire of skills — like self-kindness, for example — has not only been posited in the self-compassion literature as a core feature of mental health, but has been shown time and again in the ACT research to be essential for it.

Taken together these studies have an important lesson to teach all of us.

It’s time for us to put down the idea that we have to think well of ourselves at all times to be mature, successful, functional, mentally healthy individuals. Indeed, this toxic idea can foster a kind of narcissistic ego-based self-story that is bound to blow up on us. Instead of increasing self-esteem content, what we need to do is increase self-compassion as the context of all we do. That deflates ego-based self-stories, as we humbly accept our place as one amongst our fellow human beings, mindfully acknowledging that we all have self-doubt, we all suffer, we all fail from time to time, but none of that means we can’t live a life of meaning, purpose, and compassion for ourselves and others.

http://www.huffingtonpost.com/steven-c-hayes-phd/is-selfcompassion-more-im_b_6316320.html

Old as time: What we can learn from past attempts to treat aging

Erika Check Hayden
Nature Medicine 20, 1362–1364 (2014). doi:10.1038/nm1214-1362. Published online 04 December 2014.

In 1889, the pioneering endocrinologist Charles Edouard Brown-Séquard told Parisian doctors that he had reinvigorated himself by injecting an extract made from dog and guinea pig testicles. Thousands of physicians began administering the extract—sold as “Elixir of Life”—to their patients. Though other researchers looked derisively on his salesmanship, his was among the early investigations that led to the eventual discovery of hormones.

The quest to end aging, rife with bizarre and doomed therapies, is perhaps as old as humanity itself. And even though researchers today have more sophisticated tools for studying aging, the hunt for drugs to prevent human decay has still seen many false leads.

Now, the field hopes to improve its track record with the entrance of two new players, Calico, which launched in September 2013, and Human Longevity, which entered the stage six months later. South San Francisco–based Calico, founded by Google with an initial commitment of at least $250 million, boasts an all-star slate of biotechnology industry leaders such as Genentech alums Art Levinson and Hal Barron and aging researchers David Botstein and Cynthia Kenyon. Human Longevity was founded by genome pioneer Craig Venter and hopes to use a big data approach to combat age-related disease.

The involvement of high-profile names from outside the aging field—and the deep pockets of a funder like Google—have inspired optimism among longevity researchers. “For Google to say, ‘This is something I’m putting a lot of money into,’ is a boost for the field,” says Stephanie Lederman, executive director of the New York–based American Federation for Aging Research, which funds aging research. “There’s a tremendous amount of excitement.”

The lift was badly needed; in August 2013, a major funder of antiaging research, the Maryland-based Ellison Medical Foundation, founded by billionaire Larry Ellison, had said it would no longer sponsor aging research. But so far, neither Calico nor Human Longevity has progressed enough to know whether they will be able to turn around the field’s losing track record, and the obstacles they face are formidable, say veterans of antiaging research.

“We’ve made inroads over the past 20 years or so,” says molecular biologist Leonard Guarente of the Massachusetts Institute of Technology in Cambridge, who has founded and advised high-profile companies in the space. “But I think there’s a long way to go.”

Pathway to success?

Calico appears to be taking the approach that worked for Barron and Levinson at Genentech, the pioneering biotechnology company that has become among the more successful drug companies in the world by making targeted medicines—largely engineered proteins—that disrupt disease pathways in conditions such as cancer. The hallmark of Genentech’s approach has been to dissect the pathways involved in disease and then target them with biotechnology drugs. This past September, Calico announced an alliance with AbbVie, the drug development firm spun out of Abbott Laboratories in 2013. In that deal, Calico and AbbVie said they would jointly spend up to $1.5 billion to develop drugs for age-related diseases including neurodegenerative disorders and cancer.

Such an approach represents one way to attack aging: targeting the diseases that become more prevalent as people grow older, on the argument that treating such diseases is itself treating aging. The opposing view is to see aging as an inherently pathological program that, if switched off or reprogrammed, could be halted. But because regulators don’t consider the progression of life itself a disease, the semantic debate is moot to drug companies: they can only get drugs approved by targeting diseases that become more common with age, such as cancer, diabetes and neurodegenerative disorders.

Calico is already homing in on disease targets. In another September announcement, the company revealed one of its first development areas: drugs related to a class of compounds called P7C3s, which appear to protect nerve cells in the brain from dying by activating an enzyme called nicotinamide phosphoribosyltransferase that inhibits cell death. The P7C3 compounds, discovered in 2010 by researchers at the University of Texas Southwestern in Dallas, have been tested in numerous models of neurodegenerative diseases associated with aging, including Alzheimer’s disease and Parkinson’s disease.

The AbbVie and P7C3 deals signal that Calico may focus on a traditional drug development strategy aimed at developing drugs that affect molecular players in the aging process in animal models. That approach makes sense to many who have been in the field for a long time, who say there is still much to learn about the molecular biology of aging: “The way Calico has said they are approaching this is the right way, which is to understand some fundamental aspects of the aging process and see how intervening in them affects that process,” says George Vlasuk, the chief executive of Cambridge, Massachusetts–based Navitor Pharmaceuticals and former head of the now defunct antiaging company Sirtris Pharmaceuticals.

But so far that approach has been difficult to translate successfully into interventions that delay aging or prevent age-related disease. For the most part, the biology of aging has been worked out in animal models; Kenyon’s foundational discoveries, for instance, were made in Caenorhabditis elegans roundworms. But the legion of companies that have failed to commercialize these discoveries is large, and some in the field now think that further progress can be made only by studying human aging. Screening for drugs that affect lifespan in model organisms such as yeast and nematodes is a gamble, says physician Nir Barzilai of the Albert Einstein College of Medicine in New York, who leads a large study of human centenarians. “I’m not sure those are going to be so important.”

Human focus

Craig Venter is squarely in the camp of those who believe the focus must shift towards humans. His Human Longevity is taking a big data dive into human aging, planning to sequence the genes of up to 100,000 people per year and analyze a slew of phenotypic data about them, including their protein profiles, the microbial content of their bodies and digitized imagery of their bodies. “We’re trying to get as much information as we can about humans so that we can find the components in the human genome that are predictive of those features,” Venter told Nature Medicine. “The model organism approach has largely failed. There’s only one model for humans, and that’s humans.”

Venter has a point, according to Judith Campisi, a cell and molecular biologist at the Buck Institute for Age Research in Novato, California. “We now have lots of targets, so I think there’s room for optimism,” she says. “But we’re still swimming in a sea of ignorance about how all these pathways and targets are integrated and how we can intervene in them safely.”

Michael West, CEO of the California-based regenerative medicine company BioTime, knows this well. In 1990, West founded a company, Geron, with $40 million from Silicon Valley venture capitalists such as Menlo Park, California–based Kleiner Perkins, dedicated to activating an enzyme called telomerase to forestall human aging. Telomerase activity, discovered in 1984, extends telomeres—the ends of chromosomes, thought to function as timekeepers of the age of a cell. But researchers soon found that human cancer cells have overactive telomerase, and it’s now thought that telomerase serves a highly useful function as a defense against unchecked cell growth that could lead to cancer [1]. Geron has shifted its telomerase strategy to blocking telomerase to fight cancer; it no longer works on longevity. “The focus on aging was abandoned,” West says.

Other companies, however, carried forward with the search for drugs against aging, inspired by a 1982 finding that mutating some genes in roundworms could enable them to live longer [2]. For example, one mutant lived for an average of 40% to 60% longer than normal, and at warm temperatures more than doubled its maximum life expectancy from 22 to 46.2 days. It was the first demonstration that aging was not an inevitable process. The work triggered a flurry of activity to find genes linked to aging and use them in interventions to stave off age-related disease.

Companies rooted in this strategy include Elixir Pharmaceuticals, cofounded in 1999 by Guarente and Kenyon, and Sirtris, established in 2004 by one of Guarente’s former students, David Sinclair. Kenyon had discovered genes in nematodes that extended life; with Guarente, she hoped to make drugs that could do this in humans. Guarente and Sinclair founded different companies, but both were interested in a pathway discovered at MIT that, they believed, acted similarly to a drastic treatment, called calorie restriction, long known to extend the lives of rats. If the rats were fed 40% fewer calories than normal, they could live up to 20–40% longer than the average rat. Guarente’s lab discovered that boosting the dose of genes called sirtuins could prolong the lives of roundworms [3], and Sinclair published similar evidence in yeast. They thought that sirtuins worked through the same pathway as calorie restriction and that this same pathway was targeted by a naturally occurring compound called resveratrol found in red grapes and red wine. Both companies began looking for chemicals similar to resveratrol that, they predicted, might ultimately cure aging.

Sirtuin setbacks

UK-based GlaxoSmithKline bought Sirtris for $720 million in 2008, a move seen as an important endorsement of that “calorie restriction mimetic” strategy. But other researchers were not able to reproduce some of Sinclair’s key studies [4]—for instance, those showing that resveratrol exerted its antiaging effects through sirtuins. It was also later found that the kind of diet fed to lab mice could affect whether or not sirtuins extended their lifespans; those eating a very high-fat diet seemed to benefit [5], but it wasn’t clear that this was the most relevant model for human beings. Similar arguments about diet composition have yielded conflicting results for calorie restriction studies in monkeys and have raised the question of whether animal models of caloric restriction that appear to find a benefit are really just proving that bringing fat animals down to normal weight helps keep them disease-free, thus extending lifespan.

Last year, GSK closed Sirtris, absorbed its drug development work and laid off some of Sirtris’s 60 employees. Elixir shut down some time after 2010, having burned through $82 million in venture capital.

The Sirtris experience underscored the unpredictability of aging research. Since the field does not agree on biological readouts of aging, such as altered signaling of certain pathways or expression of particular molecules that serve as proximate measurements of the aging process, the only way to do these studies was to follow animals until they died in order to record their lifespan.

The US National Institute on Aging stepped in, organizing a 1999 meeting that led to the Interventions Testing Program, aimed at bringing some order to the field. The program would systematically run experiments of candidate life extension treatments in mice at three separate sites. The hope was that the studies, which began in 2004, would help identify candidate life extension interventions that most deserved to be taken forward.

Already, most researchers agree, the program has succeeded in building more consensus around some drugs. One of the winners from the program so far, for instance, has been rapamycin, a relatively old drug given to kidney transplant recipients and some patients with cancer. In 2009, the drug was shown to extend the lives of genetically diverse mice [7]. (Resveratrol failed to prolong mouse lifespan in these same studies.) It was also shown to work in much older mice—the equivalent of about 60-year-old people—than had been studied in previous experiments, a situation that researchers say is much more relevant to the way antiaging drugs would be used in human patients. “You’re not going to give these drugs to teenagers,” says Matthew Kaeberlein of the University of Washington in Seattle. “You’ll probably want to give them to people who are certainly post-reproductive, and perhaps in their 60s and 70s.”

Strong signals

Rapamycin suffers from some of the same issues as previous failed antiaging treatments. It’s an old, unpatentable drug, like resveratrol, and has side effects such as a diabetes-like syndrome when given to transplant patients, who continue to take the drug for life after their surgeries. The side effects are worrying for a potential medicine that might be given over years to delay aging. But the signal from the rapamycin studies in mice is so strong that it’s now seen as one of the most promising leads in aging research, despite these problems. Navitor, for instance, is looking for compounds that influence the mTOR (short for ‘mammalian target of rapamycin’) pathway, through which rapamycin seems to extend lifespan. The pathway has the potential to influence a wide range of diseases, including neurodegenerative, autoimmune, metabolic and rare diseases and cancer. That’s been enough to entice investors to put $23.5 million into the company in a financing round in June. By targeting a specific branch of the mTOR pathway, Navitor hopes to elicit the benefits of rapamycin without its side effects.

Vlasuk says that companies like his now focus on treating age-related diseases rather than trumpeting the potential to cure aging itself and all associated maladies. “I’m acutely aware that I don’t want to be caught up in the same hype cycle that Sirtris was at one time,” Vlasuk says.

The field is also maturing in other ways. For instance, there’s a growing realization that the people who wish to take life-extension drugs will be old rather than young, but that it might be difficult to reverse age-related pathology once it has already set in.

Meanwhile, young researchers are taking the field in new directions. In May, three groups published results of experiments in which they transferred blood or blood products from young to old mice. They showed that the technique can rejuvenate muscle and neurons and reverse age-related cognitive decline. A batch of companies is now forming to translate the finding into people; one, privately funded Alkahest, has begun enrolling patients into a study that will test whether blood donated from young adults and infused into patients with mild to moderate Alzheimer’s disease can improve their symptoms. Importantly, says regenerative biologist Amy Wagers of the Harvard Stem Cell Institute in Cambridge, Massachusetts, one of the pioneers of this approach, it seems to reverse some signs of age-related disease: “This notion that you can do some good even after pathology begins means it’s much more likely that we can come to a place where we can support people with more healthy aging,” she says.

The challenge of clinical trials for aging-related illnesses is familiar to the brains behind the newest antiaging companies. One solution could be to prove that an intervention prevents the sick from becoming sicker. It’s long been suspected, for instance, that the diabetes drug metformin has antiaging properties, but it has potential side effects because it inhibits glucose production by the liver, so it can’t be given to healthy people. This year, however, UK-based researchers reported in a large retrospective study that patients with diabetes taking metformin lived a small but significantly longer time than both diabetics taking another class of drugs and healthy people who were not taking metformin.

Barzilai has been impressed enough by these and other findings to try to round up funding for an international clinical trial to test whether metformin or some other drug improves health of the elderly by delaying the onset of a second disease in those who begin taking it when they are newly diagnosed with diabetes. He argues that second diseases, which can include cancer, become much more likely once a patient has been diagnosed with a first. Preventing the onset of a second disease is a way of extending longevity, he argues, by reducing the disease burden in any one patient. “Let’s show that we can delay aging and delay the onset of a second disease,” he says. “If we can do that, we can make FDA [the US Food and Drug Administration] change its review process and look at better potential drugs that delay aging.”

The challenges of testing treatments in patients with age-related diseases, such as Alzheimer’s, are formidable. Hal Barron knows this well; he presided over a failed Genentech trial of an antibody called crenezumab, which was designed to alleviate symptoms of mild to moderate Alzheimer’s disease. Still, that hasn’t deterred him or Levinson from going all in on neurodegenerative diseases with Calico.

“Art Levinson is one of the smartest guys around in terms of his perception of what drug discovery can do,” Vlasuk says. “His involvement in Calico and the group that he’s assembling there and the backing that Google has provided for this has really opened a lot of people’s eyes.” The question now is what Levinson, Venter and others are seeing—and whether it will be enough to lead aging research to finally fulfill its potential.

Adoptees’ ‘lost language’ from infancy triggers brain response


Chinese children are lined up in Tiananmen Square in 2003 for photos with the overseas families adopting them. The children in the new study were adopted from China at an average age of 12.8 months and raised in French-speaking families.

You may not recall any memories from the first year of life, but if you were exposed to a different language at the time, your brain will still respond to it at some level, a new study suggests.

Brain scans show that children adopted from China as babies into families that don’t speak Chinese still unconsciously recognize Chinese sounds as language more than a decade later.

“It was amazing to see evidence that such an early experience continued to have a lasting effect,” said Lara Pierce, lead author of the study just published in the journal Proceedings of the National Academy of Sciences, in an email to CBC News.

The adopted children, who were raised in French-speaking Quebec families, had no conscious memory of hearing Chinese.

“If you actually test these people in Chinese, they don’t actually know it,” said Denise Klein, a researcher at McGill University’s Montreal Neurological Institute who co-authored the paper.

But their brains responded to Chinese language sounds the same way as those of bilingual children raised in Chinese-speaking families.


Children exposed to Chinese as babies display similar brain activation patterns as children with continued exposure to Chinese when hearing Chinese words, fMRI scans show.

“In essence, their pattern still looks like people who’ve been exposed to Chinese all their lives.”

Pierce, a PhD candidate in psychology at McGill University, working with Klein and other collaborators, scanned the brains of 48 girls aged nine to 17. Each participant lay inside a functional magnetic resonance imaging machine while she listened to pairs of three-syllable phrases. The phrases contained either:

■Sounds and tones from Mandarin, the official Chinese dialect.
■Hummed versions of the same tones but no actual words.

Participants were asked to tell if the last syllables of each pair were the same or different. The imaging machine measured what parts of the brain were active as the participants were thinking.

“Everybody can do the task — it’s not a difficult task to do,” Klein said. But the sounds are processed differently by people who recognize Chinese words — in that case, they activate the part of the brain that processes language.
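The article does not spell out the analysis pipeline, but a contrast of this kind — real Mandarin speech versus hummed tones — is commonly estimated with a standard first-level GLM. As a rough illustration only, here is a minimal sketch using nilearn; the file paths, repetition time, event timings and condition names are all invented for the example.

```python
# Illustrative sketch of a first-level fMRI contrast for a task like this:
# Mandarin phrases vs. hummed tones. All paths, timings and names below are
# hypothetical; the study's actual pipeline is not described in the article.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Hypothetical event table: when each condition was presented, in seconds.
events = pd.DataFrame({
    "onset":      [0.0, 12.0, 24.0, 36.0],
    "duration":   [6.0, 6.0, 6.0, 6.0],
    "trial_type": ["mandarin", "hummed", "mandarin", "hummed"],
})

# Fit a standard GLM to one (hypothetical) preprocessed BOLD run.
model = FirstLevelModel(t_r=2.0, hrf_model="glover")
model = model.fit("sub-01_task-tones_bold.nii.gz", events=events)

# Voxels responding more to real speech than to matched hummed tones.
z_map = model.compute_contrast("mandarin - hummed", output_type="z_score")
z_map.to_filename("sub-01_mandarin_gt_hummed_zmap.nii.gz")
```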

Klein said the 21 children adopted from China who participated in the study might have been expected to show patterns similar to those of the 11 monolingual French-speaking children. After all, the adoptees left China at an average age of 12.8 months, an age when most children can only say a few words. On average, those children had not heard Chinese in more than 12 years.

The fact that their brains still recognized Chinese provides some insight into the importance of language learning during the first year of life, Klein suggested.

Effect on ‘relearning’ language not known

But Klein noted that the study is a preliminary one and the researchers don’t yet know what the results mean.

For example, would adopted children exposed to Chinese in infancy have an easier time relearning Chinese later, compared with monolingual French-speaking children who were learning it for the first time?

Pierce said studies trying to figure that out have had mixed results, but she hopes the findings in this study could generate better ways to tackle that question.

She is also interested in whether the traces of the lost language affect how the brain responds to other languages or other kinds of learning. Being able to speak multiple languages has already been shown to have different effects on the way the brain processes languages and other kinds of information.

http://www.cbc.ca/news/technology/adoptees-lost-language-from-infancy-triggers-brain-response-1.2838001

Scientists publish new evidence that awareness may persist several minutes after clinical death, which was previously thought impossible


The largest ever medical study into near-death and out-of-body experiences has discovered that some awareness may continue even after the brain has shut down.

Scientists at the University of Southampton spent four years examining more than 2,000 people who suffered cardiac arrest at 15 hospitals in the UK, US and Austria. They found that of 360 people who had been revived after experiencing cardiac arrest, about 40 percent had some sort of “awareness” during the period when they were “clinically dead.”

One man’s memory of what he saw “after death” was spot-on in describing what actually happened during his resuscitation. The 57-year-old recalled leaving his body and watching his resuscitation from the corner of the room. He reported hearing two beeps come from a machine that went off every three minutes — indicating that his conscious experience during the time he had no heartbeat lasted for around three minutes. According to the researchers, that suggests the man’s brain may not have shut down completely, even after his heart stopped.

“This is paradoxical, since the brain typically ceases functioning within 20-30 seconds of the heart stopping and doesn’t resume again until the heart has been restarted,” study co-author Dr. Sam Parnia, a professor of medicine at Stony Brook University and former research fellow at Southampton University, said in a written statement.

Parnia added that it’s possible even more patients in the study had mental activity following cardiac arrest but were unable to remember events during the episode as a result of brain injury or the use of sedative drugs.

“We know the brain can’t function when the heart has stopped beating,” added Parnia, who led the study.

“But in this case, conscious awareness appears to have continued for up to three minutes when the heart wasn’t beating, even though the brain typically shuts down within 20 to 30 seconds after the heart has stopped.”

Although many could not recall specific details, some themes emerged. One in five said they had felt an unusual sense of peace, while nearly one third said time had slowed down or speeded up.

Some recalled seeing a bright light and others recounted feelings of fear, drowning or being dragged through deep water.

Dr Parnia believes many more people may have experiences when they are close to death but drugs or sedatives used in resuscitation may stop them remembering.

“Estimates have suggested that millions of people have had vivid experiences in relation to death but the scientific evidence has been ambiguous at best.

“Many people have assumed that these were hallucinations or illusions but they do seem to have corresponded to actual events.

“These experiences warrant further investigation.”

Dr David Wilde, a research psychologist at Nottingham Trent University, is currently compiling data on out-of-body experiences in an attempt to discover a pattern that links each episode.

“There is some good evidence here that these experiences are happening after people have medically died.

“We just don’t know what is going on. We are still in the dark about what happens when you die.”

The study was published in the journal Resuscitation.

http://www.resuscitationjournal.com/article/S0300-9572(14)00739-4/abstract

Parenting Rewires the Male Brain

By Elizabeth Norton

Cultures around the world have long assumed that women are hardwired to be mothers. But a new study suggests that caring for children awakens a parenting network in the brain—even turning on some of the same circuits in men as it does in women. The research implies that the neural underpinnings of the so-called maternal instinct aren’t unique to women, or activated solely by hormones, but can be developed by anyone who chooses to be a parent.

“This is the first study to look at the way dads’ brains change with child care experience,” says Kevin Pelphrey, a neuroscientist at Yale University who was not involved with the study. “What we thought of as a purely maternal circuit can also be turned on just by being a parent—which is neat, given the way our culture is changing with respect to shared responsibility and marriage equality.”

The findings come from an investigation of two types of households in Israel: traditional families consisting of a biological mother and father, in which the mother assumed most of the caregiving duties, though the fathers were very involved; and homosexual male couples, one of whom was the biological father, who’d had the child with the help of surrogate mothers. The two-father couples had taken the babies home shortly after birth and shared caregiving responsibilities equally. All participants in the study were first-time parents.

Researchers led by Ruth Feldman, a psychologist and neuroscientist at Bar-Ilan University in Ramat Gan, Israel, visited with the families in their homes, videotaping each parent with the child and then the parents and children alone. The team, which included collaborators at the Tel Aviv Sourasky Medical Center in Israel, also took saliva samples from all parents before and after the videotaped sessions to measure oxytocin—a hormone that’s released at times of intimacy and affection and is widely considered the “trust hormone.” Within a week of the home visit, the participants underwent functional magnetic resonance imaging scanning to determine how their brains reacted to the videotapes of themselves with their infants.

The mothers, their husbands, and the homosexual father-father couples all showed the activation of what the researchers term a “parenting network” that incorporated two linked but separate pathways in the brain. One circuit encompasses evolutionarily ancient structures such as the amygdala, insula, and nucleus accumbens, which handle strong emotions, attention, vigilance, and reward. The other pathway turns up in response to learning and experience and includes parts of the prefrontal cortex and an area called the superior temporal sulcus.

In the mothers, activation was stronger in the amygdala-centered network, whereas the heterosexual fathers showed more activity in the network that’s more experience-dependent. At first glance, Feldman says, the finding would seem to suggest that mothers are more wired up to nurture, protect, and possibly worry about their children. The fathers, in contrast, might have to develop these traits through tending, communicating, and learning from their babies what various sounds mean and what the child needs.

“It’s as if the father’s amygdala can shut off when there’s a woman around,” Feldman observes. It could be assumed, she says, that this circuitry is activated only by the rush of hormones during conception, pregnancy, and childbirth.

But the brains of the homosexual couples, in which each partner was a primary caregiver, told a different story. All of these men showed activity that mirrored that of the mothers, with much higher activation in the amygdala-based network, the team reports online today in the Proceedings of the National Academy of Sciences.

This finding argues strongly that the experience of hands-on parenting, with no mother anywhere in the picture, can configure a caregiver’s brain in the same way that pregnancy and childbirth do, Feldman says.

She adds that in the heterosexual fathers, the activation of the amygdala-based network was proportional to the amount of time they spent with the baby, though the activity wasn’t as high as in the mothers or in the two-father couples.

Feldman does not believe that the brain activity of the primary-caregiving fathers differed because they were gay. Previous imaging studies, she notes, show no difference in brain activation when homosexual and heterosexual participants viewed pictures of their loved ones.

Future studies, Pelphrey says, might focus more closely on this question. “But it’s clear that we’re all born with the circuitry to help us be sensitive caregivers, and the network can be turned up through parenting.”

http://news.sciencemag.org/brain-behavior/2014/05/parenting-rewires-male-brain

Possible link between cynicism and risk of dementia

Cynics are three times more likely to develop dementia than those who have faith in humanity, a study has shown.

Believing that others are motivated by selfishness, or that they lie to get what they want, appears to radically increase the risk of cognitive decline in later life.

It could mean that grumpy old men and women should be screened more closely for diseases such as Alzheimer’s. Cynicism has previously been linked to health problems such as heart disease, but this is the first time it has been associated with dementia.

“These results add to the evidence that people’s view on life and personality may have an impact on their health,” said Dr. Anna-Maija Tolppanen, the lead researcher at the University of Eastern Finland, whose study is published online in the journal Neurology.

Academics asked nearly 1,500 people with an average age of 71 to fill out a questionnaire to measure their levels of cynicism.

They were asked how much they agreed with statements such as “I think most people would lie to get ahead”, “it is safer to trust nobody” and “most people will use somewhat unfair reasons to gain profit or an advantage rather than lose it.”

Those taking part were monitored for eight years, during which time 46 of them were diagnosed with dementia. The academics discovered that those who had scored highly for cynicism were three times more likely to have developed dementia than those with low scores.

Researchers adjusted the results for other factors that could affect the risk of dementia, such as high blood pressure, high cholesterol and smoking.

Of the 164 people with high levels of cynicism, 14 people developed dementia, compared with nine of the 212 people with low levels of cynicism.
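Those raw counts allow a quick back-of-envelope check (a few illustrative lines of Python, not from the study): the unadjusted risk ratio works out to roughly two, so the threefold figure quoted above presumably reflects the adjusted analysis described in the preceding paragraph.

```python
# Back-of-envelope check on the raw counts reported above (illustrative only).
high_cases, high_total = 14, 164   # high-cynicism group
low_cases, low_total = 9, 212      # low-cynicism group

risk_high = high_cases / high_total   # ~0.085
risk_low = low_cases / low_total      # ~0.042
print(f"Unadjusted risk ratio: {risk_high / risk_low:.2f}")  # ~2.01
# The ~3x figure reported by the study is the estimate after adjusting for
# factors such as high blood pressure, high cholesterol and smoking.
```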

One in three people over 65 will develop a form of dementia. Of the 800,000 people in the U.K. who have the condition, more than half have Alzheimer’s disease. It is estimated that 1.7 million Britons will suffer from dementia by 2051.

Responding to the study findings, charities cautioned that the early symptoms of Alzheimer’s and dementia could make people more cynical about life.

Dr. Doug Brown, of the Alzheimer’s Society, said: “While this research attempts to make a link between higher levels of cynical distrust and risk of dementia, there were far too few people in this study who actually developed dementia to be able to draw any firm conclusions.”

http://news.nationalpost.com/2014/05/28/being-a-cynic-linked-to-tripled-risk-of-developing-dementia-finland-study-suggests/

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Study Finds Pedophiles’ Brains Wired to Find Children Attractive

Pedophiles’ brains are “abnormally tuned” to find young children attractive, according to a new study published this week. The research, led by Jorge Ponseti at Germany’s University of Kiel, means that it may be possible to diagnose pedophiles in the future before they are able to offend.

The study, published in the scientific journal Biology Letters, found that pedophiles have the same neurological reaction to images of those they find attractive as people with ordinary sexual predilections do, but that all the relevant cerebral areas become engaged when they see children, as opposed to fellow adults. The occipital areas, prefrontal cortex, putamen, and nucleus caudatus become engaged whenever a person finds another attractive, but the subject of this desire is inverted for pedophiles.

While studies into the cognitive wiring of sex offenders have long been a source of debate, this latest research offers some fairly conclusive proof that there is a neural pattern behind their behavior.

The paper explains: “The human brain contains networks that are tuned to face processing, and these networks appear to activate different processing streams of the reproductive domain selectively: nurturing processing in the case of child faces and sexual processing in the case of sexually preferred adult faces. This implies that the brain extracts age-related face cues of the preferred sex that inform appropriate response selection in the reproductive domains: nurturing in the case of child faces and mating in the case of adult faces.”

Usually children’s faces elicit feelings of caregiving from both sexes, whereas those of adults provide stimuli in choosing a mate. But among pedophiles, this trend is skewed, with sexual, as opposed to nurturing, emotions burgeoning.

The study analyzed the MRI scans of 56 male participants, a group that included 13 homosexual pedophiles and 11 heterosexual pedophiles, exposing them to “high arousing” images of men, women, boys, and girls. Participants then ranked each photo for attractiveness, leading researchers to their conclusion that the brain network of pedophiles is activated by sexual immaturity.

“The critical new finding is that face processing is also tuned to face cues revealing the developmental stage that is sexually preferred,” the paper reads.

Dr. James Cantor, associate professor at the University of Toronto’s Faculty of Medicine, said he was “delighted” by the study’s results. “I have previously described pedophilia as a ‘cross-wiring’ of sexual and nurturing instincts, and this data neatly verifies that interpretation.”

Cantor has undertaken extensive research into the area, previously finding that pedophiles are more likely to be left-handed, 2.3 cm shorter than the average male, and 10 to 15 IQ points lower than the norm.

He continued: “This [new] study is definitely a step in the right direction, and I hope other researchers repeat this kind of work. There still exist many contradictions among scientists’ observations, especially in identifying exactly which areas of the brain are the most central to pedophilia. Because financial support for these kinds of studies is quite small, these studies have been quite small, permitting them to achieve only incremental progress. Truly definitive studies about what in the brain causes pedophilia, what might detect it, and what might prevent it require much more significant support.”

Ponseti said that he hoped to investigate this area further by examining whether the findings could be replicated when images of children’s faces are the only ones used. This could lead to gauging a person’s predisposition to pedophilia far more simply than any means currently in place. “We could start to look at the onset of pedophilia, which is probably in puberty at about 12 or 14 years [old],” he told The Independent.

While Cantor is right to note the study’s modest size, the research is significant in providing scope for future practical testing that could reduce the number of pedophilic crimes committed. Tests that gauge a person’s tendency toward sexual attraction to underage children could inform rehabilitative care and necessary precautions, helping to safeguard children and ensure that those at risk of committing a crime of this kind are not able to do so.

http://www.thedailybeast.com/articles/2014/05/23/study-finds-pedophiles-brains-wired-to-find-children-attractive.html#