You are surprisingly likely to have a living doppelganger

By Zaria Gorvett

It’s on your passport. It’s how criminals are identified in a line-up. It’s how you’re recognised by old friends on the street, even after years apart. Your face: it’s so tangled up with your identity, soon it may be all you need to unlock your smartphone, access your office or buy a house.

Underpinning it all is the assurance that your looks are unique. And then, one day your illusions are smashed.

“I was the last one on the plane and there was someone in my seat, so I asked the guy to move. He turned around and he had my face,” says Neil Douglas, who was on his way to a wedding in Ireland when it happened.

“The whole plane looked at us and laughed. And that’s when I took the selfie.” The uncanny events continued when Douglas arrived at his hotel, only to find the same double at the check-in desk. Later their paths crossed again at a bar and they accepted that the universe wanted them to have a drink. He woke up the next morning with a hangover and an Argentinian radio show on the phone – the picture had gone viral.

Folk wisdom has it that everyone has a doppelganger; somewhere out there, there’s a perfect duplicate of you, with your mother’s eyes, your father’s nose and that annoying mole you’ve always meant to have removed. The notion has gripped the popular imagination for millennia – it was the subject of one of the oldest known works of literature – inspiring the work of poets and scaring queens to death.

But is there any truth in it? We live on a planet of over seven billion people, so surely someone else is bound to have been born with your face? It’s a silly question with serious implications – and the answer is more complicated than you might think.

In fact, until recently no one had even tried to find out. Then last year Teghan Lucas set out to test the risk of mistaking an innocent double for a killer.

Armed with a public collection of photographs of U.S. military personnel and the help of colleagues from the University of Adelaide, Teghan painstakingly analysed the faces of nearly four thousand individuals, measuring the distances between key features such as the eyes and ears. Next she calculated the probability that two people’s faces would match.

What she found was good news for the criminal justice system, but likely to disappoint anyone pining for their long-lost double: the chances of sharing just eight dimensions with someone else are less than one in a trillion. Even with 7.4 billion people on the planet, there’s only a one in 135 chance that a single pair of doppelgangers exists. “Before you could always be questioned in a court of law, saying ‘well what if someone else just looks like him?’ Now we can say it’s extremely unlikely,” says Teghan.

The results can be explained by the famed infinite monkey theorem: sit a monkey in front of a typewriter for long enough and eventually it will surely write the Complete Works of William Shakespeare by randomly hitting, biting and jumping up and down on the keys.

It’s a mathematical certainty, but reversing the problem reveals just how staggeringly long the monkey would have to toil. Ignoring grammar, the monkey has a one in 26 chance of correctly typing the first letter of Macbeth. So far, so good. But already by the second letter the chance has shrunk to one in 676 (26 x 26) and by the end of the fourth line (22 letters) it’s one in 13 quintillion. When you multiply probabilities together, the chances of something actually happening disappear very, very quickly.
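To see how fast the numbers run away, here is the same arithmetic as a minimal Python sketch; the 26-key alphabet and the letter counts are the article’s, everything else is illustrative:

```python
from fractions import Fraction

KEYS = 26  # letters only, ignoring spacing, case and punctuation

def chance_of_typing(n_letters: int) -> Fraction:
    """Probability that a random typist gets the first n letters right."""
    return Fraction(1, KEYS) ** n_letters

print(chance_of_typing(1))   # 1/26
print(chance_of_typing(2))   # 1/676
print(chance_of_typing(22))  # 1 in 26**22, about 1.3 x 10**31 -- consistent
                             # with the "13 quintillion" above if quintillion
                             # is read on the British long scale (10**30)
```

Each extra letter multiplies the denominator by another 26, which is why the odds collapse so quickly.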

Besides, the wide array of human guises is undoubtedly down to more than eight traits. Far from everyone having a long-lost “twin”, in Teghan’s view it’s more likely no one does.

But that’s not quite the end of the story. The study relied on exact measurements; if your doppelganger’s ears are 59 mm but yours are 60, your likeness wouldn’t count. In any case, you probably won’t remember the last time you clocked an uncanny resemblance based on the length of someone’s ears.

There may be another way – and it all comes down to what you mean by a doppelganger. “It depends whether we mean ‘lookalike to a human’ or ‘lookalike to facial recognition software’,” says David Aldous, a statistician at U.C. Berkeley.

Francois Brunelle, who has photographed over 200 pairs of doubles for his project I’m not a look-alike, agrees. “For me it’s when you see someone and you think it’s the other person. It’s the way of being, the sum of the parts.” When seen apart, his subjects looked like perfect clones. “When you get them together and you see them side by side, sometimes you feel that they are not the same at all.”

If fine details aren’t important, suddenly the possibility of having a lookalike looks a lot more realistic. But is this really true? To find out, first we need to get to grips with what’s going on when we recognise a familiar face.

Take the illusion of Bill Clinton and Al Gore that circulated on the internet around the time of their re-election in 1996. It features a seemingly unremarkable picture of the two men standing side by side. On closer inspection, you can see that Gore’s “internal” facial features – his eyes, nose and mouth – have been replaced by Clinton’s. Even without his own features, Al Gore looks completely normal, because his underlying facial structure is intact.

It’s a striking demonstration of the way faces are stored in the brain: more like a map than an image. When you bump into a friend on the street, the brain immediately sets to work recognising their features – such as hairline and skin tone – individually, like recognising Italy by its shape alone. But what if they’ve just had a haircut? Or they’re wearing makeup?

To ensure they can be recognised in any context, the brain employs an area known as the fusiform gyrus to tie all the pieces together. If you compare it to finding a country on a map, this is like checking it has a border with France and a coast. This holistic ‘sum of the parts’ perception is thought to make recognising friends a lot more accurate than it would be if their features were assessed in isolation. Crucially, it also fudges the importance of some of the subtler details.

“Most people concentrate on superficial characteristics such as hair-line, hair style, eyebrows,” says Nick Fieller, a statistician involved in The Computer-Aided Facial Recognition Project. Other research has shown we look to the eyes, mouth and nose, in that order.

Then it’s just a matter of working out the probability that someone else will have all the same versions as you. “There are only so many genes in the world which specify the shape of the face and millions of people, so it’s bound to happen,” says Winrich Freiwald, who studies face perception at Rockefeller University. “For somebody with an ‘average’ face it’s comparatively easy to find good matches,” says Fieller.

Let’s assume our man has short blonde hair, brown eyes, a fleshy nose (like Prince Philip, the Duke of Edinburgh), a round face and a full beard. Research into the prevalence of these features is hard to come by, but he’s off to a promising start: 55% of the global population has brown eyes.

Meanwhile more than one in ten people have round faces, according to research funded by a cosmetics company. Then there’s his nose. A study of photographs taken in Europe and Israel identified the ‘fleshy’ type as the most prevalent (24.2%). In the author’s view these are also the least attractive.

Finally – how much hair is there out there? If you thought this was too frivolous for serious investigation, you’d be wrong: among 24,300 people surveyed at a Florida theme park, 82% of men had hair shorter than shoulder-length. Natural blondes, however, constitute just 2%. In the UK – the ‘beard capital’ of the world – most men have some form of facial hair and nearly one in six has a full beard.

A simple calculation (male x brown eyes x blonde x round face x fleshy nose x short hair x full beard) reveals the probability of a person possessing all these features is just over one in 100,000 (a probability of about 0.0000102, or 0.00102%).
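Running the same multiplication with the rounded figures quoted above (and assuming, as the calculation implicitly does, that the traits occur independently) gives a number of the same order; the article’s exact inputs aren’t published here, so the stand-in values below land near one in 55,000 rather than one in 100,000:

```python
# Rounded stand-ins for the prevalence figures quoted in the text; the
# article's exact inputs aren't given, and independence between traits
# is assumed throughout.
prevalence = {
    "male": 0.50,
    "brown eyes": 0.55,
    "natural blonde": 0.02,
    "round face": 0.10,   # "more than one in ten"
    "fleshy nose": 0.242,
    "short hair": 0.82,
    "full beard": 1 / 6,  # "nearly one in six"
}

p = 1.0
for share in prevalence.values():
    p *= share

print(f"combined probability: {p:.2e}")                       # ~1.8e-05
print(f"expected matches in 7.4bn people: {p * 7.4e9:,.0f}")
```

The independence assumption is doing a lot of work here: traits are correlated (natural blondes have brown eyes less often than the product implies), so the true figure could sit some way from either estimate.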

That would give our guy no less than 74,000 potential doppelgangers. Of course many of these prevalence rates aren’t global, so this is very imprecise. But judging by the number of celebrity look-alikes out there, it might not be far off. “After the picture went viral I think there was a small army of us at some point,” says Douglas.

So what’s the probability that everyone has a duplicate roaming the earth? The simplest way to guess would be to estimate the number of possible faces and compare it to the number of people alive today.

You might expect that even if there are 7.4 billion different faces out there, with 7.4 billion people on the planet there’s clearly one for everyone. But there’s a catch. You’d actually need close to 150 billion people for that to be statistically likely. The discrepancy is down to a statistical quirk known as the coupon collector’s problem. Let’s say there are 50 coupons in a jar and each time you draw one it’s put back in. How many would you need to draw before it’s likely you’ve chosen each coupon at least once?

It takes very little time to collect the first few coupons. The trouble is finding the last few: on average drawing the last one takes about 50 draws on its own, so to collect all 50 you need about 225. It’s possible that most people have a doppelganger – but everyone? “There’s a big difference between being lucky sometimes and being lucky always,” says Aldous.
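The coupon numbers follow from a standard result: with n equally likely coupons, the expected number of draws to collect them all is n times the n-th harmonic number. A quick sketch:

```python
def expected_draws(n: int) -> float:
    """Expected draws to see all n equally likely coupons: n * H_n."""
    return n * sum(1 / k for k in range(1, n + 1))

print(expected_draws(50))  # ~224.96, the "about 225" quoted above
# Once you hold k - 1 distinct coupons, the next new one takes
# n / (n - k + 1) draws on average, so the last of 50 costs 50 draws
# all by itself. For n = 7.4 billion "faces", n * (ln n + 0.5772)
# gives a total of the same order as the ~150 billion quoted above.
```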

No one has any good idea what the first number is. Indeed, it may never be possible to say definitively, since the perception of facial resemblance is subjective. Some people have trouble recognising themselves in photos, while others rarely forget a face. And how we perceive similarity is heavily influenced by familiarity. “Some doubles when they get together, they say ‘No I don’t see it. Really, I don’t.’ It’s so obvious to everyone else; it’s a little crazy to hear that,” says Brunelle.

Even so, Fieller thinks there’s a good chance. “I think most people have somebody who is a facial lookalike unless they have a truly exceptional and unusual face,” he says. Freiwald agrees. “I think in the digital age which we are entering, at some point we will know because there will be pictures of almost everyone online,” he says.

Why are we so interested anyway? “If you meet someone that looks like you, you have an instant bond because you share something,” says Brunelle, who has received interest from thousands of people searching for their lookalikes, especially from China – a fact he puts down to the one-child policy. Research has shown we judge similar-looking people to be more trustworthy and attractive – a factor thought to contribute to our voting choices.

It may stem back to our deep evolutionary past, when facial resemblance was a useful indicator of kinship. In today’s globalised world, this is misguided. “It is entirely possible for two people with similar facial features to have DNA that is no more similar than that of two random people,” says Lavinia Paternoster, a geneticist at the University of Bristol.

And before you go fantasising about doing a temporary life-swap with your ‘twin’, there’s no guarantee you’ll have anything in common physically either. “Well I’m 5’7 and he’s 6’3… so it’s mainly in the face,” says Douglas.

http://www.bbc.com/future/story/20160712-you-are-surprisingly-likely-to-have-a-living-doppelganger

Will machines one day control our decisions?

New research suggests it’s possible to detect when our brain is making a decision and nudge it to make the healthier choice.

In recording moment-to-moment deliberations by macaque monkeys over which option is likely to yield the most fruit juice, scientists have captured the dynamics of decision-making down to millisecond changes in neurons in the brain’s orbitofrontal cortex.

“If we can measure a decision in real time, we can potentially also manipulate it,” says senior author Jonathan Wallis, a neuroscientist and professor of psychology at the University of California, Berkeley. “For example, a device could be created that detects when an addict is about to choose a drug and instead bias their brain activity towards a healthier choice.”

Located behind the eyes, the orbitofrontal cortex plays a key role in decision-making and, when damaged, can lead to poor choices and impulsivity.

While previous studies have linked activity in the orbitofrontal cortex to making final decisions, this is the first to track the neural changes that occur during deliberations between different options.

“We can now see a decision unfold in real time and make predictions about choices,” Wallis says.

Measuring the signals from electrodes implanted in the monkeys’ brains, researchers tracked the primates’ neural activity as they weighed the pros and cons of images that delivered different amounts of juice.

A computational algorithm tracked the monkeys’ orbitofrontal activity as they looked from one image to another, determining which picture would yield the greater reward. The shifting brain patterns enabled researchers to predict which image the monkey would settle on.

For the experiment, they presented a monkey with a series of four different images of abstract shapes, each of which delivered to the monkey a different amount of juice. They used a pattern-recognition algorithm known as linear discriminant analysis to identify, from the pattern of neural activity, which picture the monkey was looking at.
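Linear discriminant analysis is a standard off-the-shelf classifier, so the decoding step can be sketched in a few lines; the firing-rate data below are synthetic stand-ins for illustration, not the study’s recordings:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_trials, n_neurons, n_images = 400, 60, 4

# Pretend each image evokes its own mean firing-rate pattern across neurons.
image_means = rng.normal(0.0, 1.0, size=(n_images, n_neurons))
labels = rng.integers(0, n_images, size=n_trials)   # image shown on each trial
rates = image_means[labels] + rng.normal(0.0, 1.0, size=(n_trials, n_neurons))

decoder = LinearDiscriminantAnalysis()
decoder.fit(rates[:300], labels[:300])              # training trials
print(decoder.score(rates[300:], labels[300:]))     # held-out decoding accuracy
```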

Next, they presented the monkey with two of those same images, and watched the neural patterns switch back and forth to the point where the researchers could predict which image the monkey would choose based on the length of time that the monkey stared at the picture.

The more the monkey needed to think about the options, particularly when there was not much difference between the amounts of juice offered, the more the neural patterns would switch back and forth.

“Now that we can see when the brain is considering a particular choice, we could potentially use that signal to electrically stimulate the neural circuits involved in the decision and change the final choice,” Wallis says.

Erin Rich, a researcher at the Helen Wills Neuroscience Institute, is lead author of the study published in the journal Nature Neuroscience. The National Institute on Drug Abuse and the National Institute of Mental Health funded the work.

http://www.futurity.org/brains-decisions-1181542/

Brain activity differs between men and women when cooperating


When it comes to social behavior, there are clear differences between men and women, and a new study suggests cooperation with others is no exception.

Written by Honor Whiteman

Published in the journal Scientific Reports, the study reveals that men and women show significant differences in brain activity when working with others in order to complete a task.

The research team – co-led by Joseph Baker, Ph.D., a postdoctoral fellow at Stanford University School of Medicine – says the findings may shed light on the evolutionary differences in cooperation between men and women.

Additionally, they could help inform new strategies to enhance cooperation, which could prove useful for people with disorders that affect social behavior, such as autism.

This latest study is not the first to identify sex differences in cooperation – defined as “a situation in which people work together to do something.”

For example, previous research has shown that a pair of men tend to cooperate better than a pair of women. In mixed-sex pairs, however, women tend to cooperate better than men.

While a number of theories have been put forward to explain these differences, Baker and colleagues note that there is limited data on the neurological processes at play.


The cooperation task

To further investigate, the team enrolled 222 participants – of whom 110 were female – and assigned each of them a partner.

Each pair was made up of either two males, two females, or one male and one female.

The pairs were required to engage in a cooperation task, in which the two partners sat in front of computers opposite one another. Each partner could see the other, but they were instructed not to talk.

Each individual was instructed to press a button when a circle on their computer screen changed color; their goal was to try and press the button at the same time as their partner.

The pairs were given 40 tries to get the timing of their button presses as close to each other as possible, and after each try, they were told which partner had pressed the button first.

During the task, the researchers recorded the brain activity of each participant simultaneously using hyperscanning and functional near-infrared spectroscopy (fNIRS).

“We developed this test because it was simple, and you could easily record responses,” notes senior study author Dr. Allan Reiss, professor of psychiatry and behavioral sciences and psychology at Stanford.

No ‘interbrain coherence’ when opposite-sex pairs cooperate

Overall, the team found that male-male pairs timed their button pushes more closely than female-female pairs did.

From the brain imaging results, however, the researchers noticed that both partners in each of the same-sex pairs had highly synchronized brain activity during the task – representing greater “interbrain coherence.”

“Within same-sex pairs, increased coherence was correlated with better performance on the cooperation task,” says Baker. “However, the location of coherence differed between male-male and female-female pairs.”
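The study’s exact pipeline isn’t spelled out here, but interbrain coherence is generally some frequency-domain measure of how correlated two people’s recordings are. A minimal illustration on synthetic fNIRS-like signals, assuming magnitude-squared coherence as the metric:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs = 10.0                                  # Hz, a typical fNIRS sampling rate
t = np.arange(0, 120, 1 / fs)              # two minutes of data

shared = np.sin(2 * np.pi * 0.1 * t)       # slow rhythm common to both partners
sig_a = shared + rng.normal(0, 1, t.size)  # partner A: shared rhythm + noise
sig_b = shared + rng.normal(0, 1, t.size)  # partner B: shared rhythm + noise

f, cxy = coherence(sig_a, sig_b, fs=fs, nperseg=256)
print(f"coherence near 0.1 Hz: {cxy[np.argmin(np.abs(f - 0.1))]:.2f}")
```

Two partners whose signals carry a common task-locked rhythm show high coherence at that frequency; two unrelated signals hover near zero.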

Interestingly, the cooperation performance of male-female pairs was just as good as that of male-male pairs, though opposite-sex pairs showed no evidence of interbrain coherence.

“It’s not that either males or females are better at cooperating or can’t cooperate with each other. Rather, there’s just a difference in how they’re cooperating.” – Dr. Allan Reiss

Baker cautions that their study is “pretty exploratory,” noting that it does not look at all forms of cooperation.

What is more, the researchers did not assess activity in all regions of participants’ brains, and they note that it is possible interbrain coherence in opposite-sex pairs arose in these unmeasured areas.

Still, they believe their findings may help researchers learn more about how cooperation has evolved differently between men and women, and they may even lead to new ways to boost cooperation, which could have clinical implications.

“There are people with disorders like autism who have problems with social cognition,” says Baker. “We’re absolutely hoping to learn enough information so that we might be able to design more effective therapies for them.”

http://www.medicalnewstoday.com/articles/310879.php

The interesting way that your brain makes space to build new and stronger connections so you can learn more

There’s an old saying in neuroscience: neurons that fire together wire together. This means the more you run a neuro-circuit in your brain, the stronger that circuit becomes. This is why, to quote another old saw, practice makes perfect. The more you practice piano, or speaking a language, or juggling, the stronger those circuits get.
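The slogan can be boiled down to a toy Hebbian update, in which a connection strengthens whenever its two neurons happen to fire at the same time; this is purely illustrative, not a model of real plasticity:

```python
import random

random.seed(0)
learning_rate = 0.01
weight = 0.1  # starting strength of the synapse

for _ in range(1000):                # 1,000 "practice" moments
    pre = random.random() < 0.5      # input neuron fires
    post = random.random() < 0.5     # output neuron fires
    if pre and post:                 # firing together...
        weight += learning_rate * (1.0 - weight)  # ...wires together (saturating)

print(f"synaptic weight after practice: {weight:.2f}")  # drifts toward 1.0
```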

For years this has been the focus for learning new things. But as it turns out, the ability to learn is about more than building and strengthening neural connections. Even more important is our ability to break down the old ones. It’s called “synaptic pruning.” Here’s how it works.

Imagine your brain is a garden, except instead of growing flowers, fruits, and vegetables, you grow synaptic connections between neurons. These are the connections that neurotransmitters like dopamine, serotonin, and others travel across.

“Glial cells” are the gardeners of your brain. Some act to speed up signals between certain neurons; others are the waste removers, pulling up weeds, killing pests, raking up dead leaves. Your brain’s pruning gardeners are called “microglial cells.” They prune your synaptic connections. The question is, how do they know which ones to prune?

Researchers are just starting to unravel this mystery, but what they do know is the synaptic connections that get used less get marked by a protein, C1q (as well as others). When the microglial cells detect that mark, they bond to the protein and destroy—or prune—the synapse.

This is how your brain makes the physical space for you to build new and stronger connections so you can learn more.

Have you ever felt like your brain is full? Maybe when starting a new job, or deep in a project. You’re not sleeping enough, even though you’re constantly taking in new information. Well, in a way, your brain actually is full.

When you learn lots of new things, your brain builds connections, but they’re inefficient, ad hoc connections. Your brain needs to prune a lot of those connections away and build more streamlined, efficient pathways. It does that when we sleep.

Your brain cleans itself out when you sleep—your brain cells shrink by up to 60% to create space for your glial gardeners to come in, take away the waste and prune the synapses.

Have you ever woken up from a good night’s rest and been able to think clearly and quickly? That’s because all the pruning and streamlining of pathways that took place overnight has left you with lots of room to take in and synthesize new information—in other words, to learn.

This is the same reason naps are so beneficial to your cognitive abilities. A 10- or 20-minute nap gives your microglial gardeners the chance to come in, clear away some unused connections, and leave space to grow new ones.

Thinking with a sleep-deprived brain is like hacking your way through a dense jungle with a machete. It’s overgrown, slow-going, exhausting. The paths overlap, and light can’t get through. Thinking on a well-rested brain is like wandering happily through Central Park; the paths are clear and connect to one another at distinct spots, the trees are in place, you can see far ahead of you. It’s invigorating.

And in fact, you actually have some control over what your brain decides to delete while you sleep. It’s the synaptic connections you don’t use that get marked for recycling. The ones you do use are the ones that get watered and oxygenated. So be mindful of what you’re thinking about.

If you spend too much time reading theories about the end of Game of Thrones and very little on your job, guess which synapses are going to get marked for recycling?

If you’re in a fight with someone at work and devote your time to thinking about how to get even with them, and not about that big project, you’re going to wind up a synaptic superstar at revenge plots but a poor innovator.

To take advantage of your brain’s natural gardening system, simply think about the things that are important to you. Your gardeners will strengthen those connections and prune the ones that you care about less. It’s how you help the garden of your brain flower.

http://www.fastcompany.com/3059634/your-most-productive-self/your-brain-has-a-delete-button-heres-how-to-use-it

Viral and Bacterial Links to the Brain’s Decline


Herpes simplex viruses pass through the outer protein coat of a nucleus, magnified 40,000 times. Dr. Ruth Itzhaki’s research published in 1997 revealed a potential link between the presence of HSV-1 (one specific variety of Herpes simplex) and the onset of Alzheimer’s in 60 percent of the cases studied. However, she has only been able to study a low number of cases, since the work has received only a cursory nod from the greater research world and little funding.

By Ed Cara

As recently as the 1970s, doctors stubbornly treated complaints of festering open sores in the stomach as a failing of diet or an inability to manage stress. Though we had long accepted the basic premise of Louis Pasteur’s germ theory—that fleeting short bursts of disease and death are often caused by microscopic beings that could be stopped by sanitary food, water and specially crafted drugs—many researchers ardently resisted the idea that they could also trigger more complicated, chronic illnesses.

When it came to ulcers, no one believed that any microorganisms could endure in the acidic cauldron of our digestive system. It took the gumshoe work of Australian doctors and medical researchers Barry Marshall and Robin Warren in the 1980s to debunk that belief and discover the specific bug responsible for most chronic stomach ulcers, Helicobacter pylori. Marshall even went so far as to swallow the germ to prove the link was real and, obviously, became sick soon after. Thankfully, his self-sacrifice was eventually validated when he and Warren were awarded a Nobel Prize in 2005.

But while modern medicine has grown comfortable with the idea that even chronic physical ailments can be sparked by the living infinitesimal, there is an even bolder, more controversial proposition from a growing number of researchers. It’s the idea that certain germs, bugs and microbes can lie hidden in the body for decades, all the while slowly damaging our brains, even to the point of dementia, depression and schizophrenia.

In January 2016, a team led by Shawn Gale, an associate professor in psychology at Brigham Young University, looked at the infection history of 5,662 young to middle-aged adults alongside the results of tests intended to measure cognition. Gale’s rogues’ gallery included both parasites (the roundworm and Toxoplasma gondii) and viruses (the hepatitis clan, cytomegalovirus, and herpes simplex virus Types 1 and 2). The team created an index of infectious disease — the more bugs a participant had been exposed to, the higher the person’s index score. It turned out that those with a higher score were more likely to have worse learning and memory skills, as well as slower information-processing speed than those with a lower score, even after controlling for other factors, like age, sex and financial status.

Aside from their shared ability to stay rooted inside us, the ways these pathogens might influence our noggins are as varied as their biology is from one another. Some, like T. gondii (often transmitted to humans via contaminated cats and infected dirt), can discreetly infest the brain and cause subtle changes to our brain chemistry, altering levels of neurotransmitters like dopamine while causing no overt signs of disease. Others, like hepatitis C, are suspected of hitching a ride onto infected white blood cells that cross the blood-brain barrier and, once inside, deplete our supply of white brain matter, the myelin-coated axons that help neurons communicate with each other and seem to actively shape how we learn. And still others, like H. pylori, could trigger a low-level but chronic inflammatory response that gradually wears down our body and mind alike.

Gale’s team found only fairly small deficits in cognition connected to infection. But other researchers, like Ruth Itzhaki, professor emeritus of molecular neurobiology at Britain’s University of Manchester, believe microbes may play an outsized role in one of the most devastating neurodegenerative disorders around: Alzheimer’s disease, which afflicted 47 million people worldwide in 2015. Last March, Itzhaki and a globe-spanning group of researchers penned an editorial in the Journal of Alzheimer’s Disease, imploring the scientific community to more seriously pursue a proposed link between Alzheimer’s and particular germs, namely herpes simplex virus Type 1 (HSV-1), Chlamydia pneumoniae and spirochetes—a diverse group of bacteria that include those responsible for syphilis and Lyme disease. The unusually direct plea, for scientists at least, was the culmination of decades of frustration.

“There’s great hostility to the microbial concept amongst certain influential people in the field, and they are the ones who usually determine whether or not one’s research grant application is successful,” says Itzhaki. “The irony is that they never provide scientific objections to the concepts—they just belittle them, so there’s nothing to rebut!”

It’s a frustration Itzhaki knows too well; in 1991, her lab published the first paper finding a clear HSV-1 link to Alzheimer’s. Since then, according to Itzhaki, over 100 published studies, from her lab and elsewhere, have been supportive of the same link. Nevertheless, Itzhaki says, the work has received only a cursory nod from the greater research world and little funding. Out of the $589 million allocated to Alzheimer’s research by the National Institutes of Health in 2015, exactly zero appeared to be spent on studying the proposed HSV-1 connection.

HSV-1 is more often known as the version of herpes that causes cold sores. Nearly all of us carry the virus from infancy; our peripheral nervous system serves as its dormant nesting ground. From there, HSV-1 can reactivate and occasionally cause mild flare-ups of disease, typically when our immune system is overwhelmed due to stress or other infections. Itzhaki’s lab, however, found that by the time we reach our golden years, the virus often migrates to the brain, where it remains capable of resurrecting itself and wreaking a new sort of havoc when opportunity presents, such as when our immune system wavers in old age.

Her team has also discovered the presence of HSV-1 in the telltale plaques—clumps of proteins in the nerve cells of the brain—used to diagnose Alzheimer’s. In mice and cell cultures infected with HSV-1, they’ve found accumulation of two proteins, beta-amyloid and tau, that form the main components of, respectively, plaques and tangles—twisted protein fibers that form inside dying cells and are another defining characteristic of Alzheimer’s. Plaques and tangles, while sometimes found in normal aging brains, have been found to overflow in the brains of deceased Alzheimer’s sufferers; neuroscientists believe these protein accumulations can cause neuron death and tissue loss. Itzhaki speculates that herpes-infected cells may produce the proteins either in an attempt to fend off HSV-1 or because the virus itself commands them to, the proteins somehow being needed to jump-start the virus’s replication.

Itzhaki, Gale and their colleagues emphasize that rather than being the sole cause of memory loss, slower reaction time or depression, viral and bacterial infections are likely just one ingredient in a soup of risk factors. But for Alzheimer’s, HSV-1 could be especially significant. Itzhaki has found that elderly people who carried both HSV-1 in the brain and the e-4 subtype of the APOE gene (responsible for creating a protein that helps transport cholesterol throughout the body) were 12 times more likely to develop Alzheimer’s than people without either.

APOE-e4, already considered a significant risk factor for Alzheimer’s and thought to make us more vulnerable to viral infection, has also been linked to a greater risk of dementia in HIV-infected patients. In a 1997 Lancet paper, Itzhaki’s group concluded that HSV-1 infection, in conjunction with APOE-e4, could account for about 60 percent of the Alzheimer’s cases they studied. Due to limited funds, however, her group was able to study only a relatively low number of cases.

“I think the proposed theory is certainly reasonable given the supporting evidence,” says Iain Campbell, a professor of molecular biology at the University of Sydney. “What is difficult to establish here is actual causality.”

It might be the case that HSV-1 and other suspects aren’t responsible for the emergence of Alzheimer’s but are simply given free rein to worsen its symptoms as the neurodegenerative disorder weakens both the immune and nervous systems. Deciphering the relationship between these latent infections and Alzheimer’s will take more dedicated research, an effort that Itzhaki feels has been stymied by the persistent lack of resources available to her and her like-minded colleagues.

As things stand, though, she believes there is enough evidence to go ahead with treatment trials; for instance, giving Alzheimer’s patients HSV-1-targeted antivirals in hopes of slowing down or stopping the progression of the disease. She and a team of clinicians are trying to obtain a grant for a pilot clinical trial to do just that.

Exasperated as Itzhaki has been, the headwinds against her and those who share her beliefs about the brain are slowly dying down. In some cases, once-derided and obscure scientists studying how infections affect the brain are now getting some financial support. There’s Jaroslav Flegr, for example, who has for decades theorized that T. gondii could alter human behavior and even cause certain forms of schizophrenia. In the wake of increased media attention, Flegr’s volume of work on T. gondii has noticeably stepped up as well. From 2014 to 2015, he co-authored 13 papers on T. gondii, nearly twice the number he published the previous two years; the trend of increased T. gondii papers holds across all of PubMed, the largest database of published biomedical research available. “I have no serious problem with funding of my Toxo research now,” Flegr says.

As of now, though, there have been no ulcer-related Sherlock moments to prove a link between mental dysfunction and latent infections—only indirect correlations clumping together to form a blurry snapshot of a potential crime scene. Which is why Gale and others recommend a wait-and-see approach for the public, even as they acknowledge the potentially vast implications of their research. “I wouldn’t want someone to go out tomorrow and get a whole battery of tests,” he says. “There’s still a lot we need to understand.”

http://www.newsweek.com/viral-bacterial-links-brains-decline-462194

How LSD Makes Your Brain One With The Universe


by Angus Chen

Some users of LSD say one of the most profound parts of the experience is a deep oneness with the universe. The hallucinogenic drug might be causing this by blurring boundaries in the brain, too.

The sensation that the boundaries between yourself and the world around you are dissolving correlates with changes in brain connectivity while on LSD, according to a study published Wednesday in Current Biology. Scientists gave 15 volunteers either a drop of acid or a placebo and slid them into an MRI scanner to monitor brain activity.

After about an hour, when the high begins peaking, the brains of people on acid looked markedly different than those on the placebo. For those on LSD, activity in certain areas of their brain, particularly areas rich in neurons associated with serotonin, ramped up.

Their sensory cortices, which process sensations like sight and touch, became far more connected than usual to the frontoparietal network, which is involved with our sense of self. “The stronger that communication, the stronger the experience of the dissolution [of self],” says Enzo Tagliazucchi, the lead author and a researcher at the Netherlands Institute for Neuroscience.

Tagliazucchi speculates that what’s happening is a confusion of information. Your brain on acid, flooded with signals crisscrossing between these regions, begins muddling the things you see, feel, taste or hear around you with you. This can create the perception that you and, say, the pizza you’re eating are no longer separate entities. You are the pizza and the world beyond the windowsill. You are the church and the tree and the hill.

Albert Hofmann, the discoverer of LSD, described this in his book LSD: My Problem Child. “A portion of the self overflows into the outer world, into objects, which begin to live, to have another, a deeper meaning,” he wrote. He felt the world would be a better place if more people understood this. “What is needed today is a fundamental re-experience of the oneness of all living things.”

The sensation is neurologically similar to synesthesia, Tagliazucchi thinks. “In synesthesia, you mix up sensory modalities. You can feel the color of a sound or smell the sound. This happens in LSD, too,” Tagliazucchi says. “And ego dissolution is a form of synesthesia, but it’s a synesthesia of areas of brain with consciousness of self and the external environment. You lose track of which is which.”

Tagliazucchi and other researchers also measured the volunteers’ brain electrical activity with another device. Our brains normally generate a regular rhythm of electrical activity called the alpha rhythm, which links to our brain’s ability to suppress irrelevant activity. But in a different paper published on Monday in the Proceedings of the National Academy of Sciences, he and several co-authors show that LSD weakens the alpha rhythm. He thinks this weakening could make the hallucinations seem more real.

The idea is intriguing if still somewhat speculative, says Dr. Charles Grob, a psychiatrist at the Harbor-UCLA Medical Center who was not involved with the work. “They may genuinely be on to something. This should really further our understanding of the brain and consciousness.” And, he says, the work highlights hallucinogens’ powerful therapeutic potential.

The altered state of reality that comes with psychedelics might enhance psychotherapy, Grob thinks. “Hallucinogens are a catalyst,” he says. “In well-prepared subjects, you might elicit powerful, altered states of consciousness. [That] has been predictive of positive therapeutic outcomes.”

In recent years, psychedelics have been trickling their way back to psychiatric research. LSD was considered a good candidate for psychiatric treatment until 1966, when it was outlawed and became very difficult to obtain for study. Grob has done work testing the treatment potential of psilocybin, the active compound in hallucinogenic mushrooms.

He imagines a future where psychedelics are commonly used to treat a range of conditions. “[There could] be a peaceful room attractively fixed up with nice paintings, objects to look at, fresh flowers, a chair or recliner for the patient and two therapists in the room,” he muses. “A safe container for that individual as they explore deep inner space, inner terrain.”

Grob believes the right candidate would benefit greatly from LSD or other hallucinogen therapy, though he cautions that bad experiences can still happen for some on the drugs. Those who are at risk for schizophrenia may want to avoid psychedelics, Tagliazucchi says. “There has been evidence saying what could happen is LSD could trigger the disease and turn it into full-fledged schizophrenia,” he says. “There is a lot of debate around this. It’s an open topic.”

Tagliazucchi thinks that this particular ability of psychedelics to evoke a sense of dissolution of self and unity with the external environment has already helped some patients. “Psilocybin has been used to treat anxiety in terminal cancer patients,” he says. “One reason why they felt so good after treatment is the ego dissolution: they become part of something larger, the universe. This led them to a new perspective on their death.”

http://www.npr.org/sections/health-shots/2016/04/13/474071268/how-lsd-makes-your-brain-one-with-the-universe

Scientists discover key brain cells that control eating portion size


While researching the brain’s learning and memory system, scientists at Johns Hopkins say they stumbled upon a new type of nerve cell that seems to control feeding behaviors in mice. The finding, they report, adds significant detail to the way brains tell animals when to stop eating and, if confirmed in humans, could lead to new tools for fighting obesity. Details of the study were published by the journal Science today.

“When the type of brain cell we discovered fires and sends off signals, our laboratory mice stop eating soon after,” says Richard Huganir, Ph.D., director of the Department of Neuroscience at the Johns Hopkins University School of Medicine. “The signals seem to tell the mice they’ve had enough.”

Huganir says his team’s discovery grew out of studies of the proteins that strengthen and weaken the intersections, or synapses, between brain cells. These are an important target of research because synapse strength, particularly among cells in the hippocampus and cortex of the brain, is important in learning and memory.

In a search for details about synapse strength, Huganir and graduate student Olof Lagerlöf, M.D., focused on the enzyme OGT — a biological catalyst involved in many bodily functions, including insulin use and sugar chemistry. The enzyme’s job is to add a molecule called N-acetylglucosamine (GlcNAc), a derivative of glucose, to proteins, a phenomenon first discovered in 1984 by Gerald Hart, Ph.D., director of the Johns Hopkins University School of Medicine’s Department of Biological Chemistry and co-leader of the current study. By adding GlcNAc molecules, OGT alters the proteins’ behavior.

To learn about OGT’s role in the brain, Lagerlöf deleted the gene that codes for it from the primary nerve cells of the hippocampus and cortex in adult mice. Even before he looked directly at the impact of the deletion in the rodents’ brains, Lagerlöf reports, he noticed that the mice doubled in weight in just three weeks. It turned out that fat buildup, not muscle mass, was responsible.

When the team monitored the feeding patterns of the mice, they found that those missing OGT ate the same number of meals — on average, 18 a day — as their normal littermates but tarried over the food longer and ate more calories at each meal. When their food intake was restricted to that of a normal lab diet, they no longer gained extra weight, suggesting that the absence of OGT interfered with the animals’ ability to sense when they were full.

“These mice don’t understand that they’ve had enough food, so they keep eating,” says Lagerlöf.

Because the hippocampus and cortex are not known to directly regulate feeding behaviors in rodents or other mammals, the researchers looked for changes elsewhere in the brain, particularly in the hypothalamus, which is known to control body temperature, feeding, sleep and metabolism. There, they found OGT missing from a small subset of nerve cells within a cluster of neurons called the paraventricular nucleus.

Lagerlöf says these cells already were known to send and receive multiple signals related to appetite and food intake. When he looked for changes in the levels of those factors that might be traced to the absence of OGT, he found that most of them were not affected, and the activity of the appetite signals that many other research groups have focused on didn’t seem to be causing the weight gain, he adds.

Next, the team examined the chemical and biological activity of the OGT-negative cells. By measuring the background electrical activity in nonfiring brain cells, the researchers estimated the number of incoming synapses on the cells and found they had only about a third as many as normal cells.

“That result suggests that, in these cells, OGT helps maintain synapses,” says Huganir. “The number of synapses on these cells was so low that they probably aren’t receiving enough input to fire. In turn, that suggests that these cells are responsible for sending the message to stop eating.”

To verify this idea, the researchers genetically manipulated the cells in the paraventricular nucleus so that they would add blue light-sensitive proteins to their membranes. When they stimulated the cells with a beam of blue light, the cells fired and sent signals to other parts of the brain, and the mice decreased the amount they ate in a day by about 25 percent.

Finally, because glucose is needed to produce GlcNAc, they thought that glucose levels, which increase after meals, might affect the activity of OGT. Indeed, they found that if they added glucose to nerve cells in petri dishes, the level of proteins with the GlcNAc addition increased in proportion to the amount of glucose in the dishes. And when they looked at cells in the paraventricular nucleus of mice that hadn’t eaten in a while, they saw low levels of GlcNAc-decorated proteins.

“There are still many things about this system that we don’t know,” says Lagerlöf, “but we think that glucose works with OGT in these cells to control ‘portion size’ for the mice. We believe we have found a new receiver of information that directly affects brain activity and feeding behavior, and if our findings bear out in other animals, including people, they may advance the search for drugs or other means of controlling appetites.”

http://www.eurekalert.org/pub_releases/2016-03/jhm-pcc031416.php

“Joke Addiction” As A Neurological Symptom

In a new paper, neurologists Elias D. Granadillo and Mario F. Mendez describe two patients in whom brain disorders led to an unusual symptom: “intractable joking.”

Patient #1 was

A 69-year-old right-handed man presented for a neuropsychiatric evaluation because of a 5-year history of compulsive joking… On interview, the patient reported feeling generally joyful, but his compulsive need to make jokes and create humor had become an issue of contention with his wife. He would wake her up in the middle of the night bursting out in laughter, just to tell her about the jokes he had come up with. At the request of his wife, he started writing down these jokes as a way to avoid waking her. As a result, he brought to our office approximately 50 pages filled with his jokes.

Granadillo and Mendez quote some of the patient’s gags:

Q: What is a pill-popping sexual molester guilty of? A: Rape and pillage.
Q: What did the proctologist say to his therapist? A: All day long I am dealing with assholes.

Went to the Department of Motor Vehicles to get my driver’s license. They gave me an eye exam and here is what they said:
ABCDEFG, HIJKMNLOP, QRS, TUV, WXY and Z; now I know my ABC’s, can I have my license please?

The man’s comedic compulsion was attributed to a stroke, which had damaged part of his left caudate nucleus, although an earlier lesion to the right frontal cortex, caused by a subarachnoid hemorrhage, may have contributed to the pathological punning. Granadillo and Mendez say that a series of medications, including antidepressants, had little impact on his “compulsive need to constantly make and tell jokes.”

Patient #2 was a 57-year-old man who had become “a jokester”, a transformation that had occurred gradually, over a three-year period. At the same time, the man became excessively forward and disinhibited, making inappropriate actions and remarks. He eventually lost his job after asking “Who the hell chose this God-awful place?”

The patient constantly told jokes and couldn’t stop laughing at them. However, he did not seem to find other people’s jokes funny at all.

The man’s case, however, came to a sad end. His behavior continued to deteriorate and he developed symptoms of Parkinson’s. He died several years later. The diagnosis was Pick’s disease, a rare form of dementia. A post mortem noted widespread neurodegeneration: “frontotemporal atrophy, severe in the frontal lobes and moderate in the temporal lobes, affecting the right side more than the left”.


The authors say that both of these patients displayed Witzelsucht, a German term literally meaning ‘joke addiction’. Several cases have been reported in the neurological literature, often associated with damage to the right hemisphere of the brain. Witzelsucht should be distinguished from ‘pathological laughter’, in which patients start laughing ‘out of the blue’ and the laughter is incongruent with their “mood and emotional experience.” In Witzelsucht, the laughter is genuine: patients really do find their own jokes funny, although they often fail to appreciate those of others.

Granadillo ED & Mendez MF (2016). Pathological Joking or Witzelsucht Revisited. The Journal of Neuropsychiatry and Clinical Neurosciences. PMID: 26900737

Phantom Eye Patients See and Feel with Missing Eyeballs

by Elizabeth Preston

Amputees often feel eerie sensations from their missing limbs. These “phantom limb” feelings can include pain, itching, tingling, or even a sense of trying to pick something up. Patients who lose an eye may have similar symptoms—with the addition of actual phantoms.

Phantom eye syndrome (PES) had been studied in the past, but University of Liverpool psychologist Laura Hope-Stone and her colleagues recently conducted the largest study of PES specifically in patients who’d lost an eye to cancer.

The researchers sent surveys to 239 patients who’d been treated for uveal melanoma at the Liverpool Ocular Oncology Centre. All of these patients had had one eye surgically removed. Some of their surgeries were only 4 months in the past; others had taken place almost 4 and a half years earlier. Three-quarters of the patients returned the surveys, sharing details about how they were doing in their new monocular lives.

Sixty percent of respondents said they had symptoms of phantom eye syndrome. These symptoms included pain, visual sensations, or the impression of actually seeing with the missing eye.

Patients with visual symptoms most often saw simple shapes and colors. But some people reported more distinct images, “for example, resembling wallpaper, a kaleidoscope, or fireworks, or even specific scenes and people,” the authors write.

Then there were the ghosts.

Some people said they had seen strangers haunting their fields of vision, according to their survey responses.

A survey isn’t a perfect way to measure how common PES is overall. But Hope-Stone says there were enough survey responses to produce helpful data for doctors who treat patients with eye cancer.

“We can now tell whether certain kinds of patients are more likely to have phantom symptoms,” she says. For example, “PES is more common in younger patients, and having pain in the non-existent eye is more likely in patients who are anxious and depressed, although we don’t know why.”

About a fifth of PES patients, understandably, said they were disturbed by their symptoms. A similar number found them “pleasurable,” Hope-Stone says.

Doctors aren’t sure exactly why phantom eye syndrome occurs. Since different patients have different symptoms, Hope-Stone says, “I suspect that…there may be a range of causes.”

For that matter, phantom limbs are still mysterious to doctors too. “Human perception is a complex process,” Hope-Stone explains. Even when our sensory organs are gone—the vision receptors in our eyes, the pain and touch receptors in our hands—the nerves and brain areas that used to talk to those organs keep working just fine. “Interactions between [these systems] may contribute to phantom sensations,” she says, although “the exact mechanisms are unclear.”

Even if they don’t know why it happens, doctors can warn their patients about the kinds of symptoms they’re likely to experience—and the ghosts they might see.

http://blogs.discovermagazine.com/inkfish/2015/06/05/phantom-eye-patients-see-and-feel-with-missing-eyeballs/#.VtM-OfkrIgv

People who exercise at middle age might have bigger brains later on

Poor physical fitness in middle age might be associated with a smaller brain size later on, according to a study published in an online issue of Neurology.

Brains shrink as people age, and the atrophy is related to cognitive decline and an increased risk of dementia, a researcher said; exercise reduces that deterioration and cognitive decline.

In this study, more than 1,500 people at an average age of 40 and without dementia or heart disease took a treadmill test. Twenty years later, they took another test, along with MRI brain scans. The study found those who didn’t perform as well on the treadmill test — a sign of poor fitness — had smaller brains 20 years later.

Among those who performed worse, people who hadn’t developed heart problems and weren’t using medication for blood pressure showed the equivalent of one year of accelerated brain aging; those who had developed heart problems or were using medication showed the equivalent of two years.

Their exercise capacity was measured as the length of time participants could exercise on the treadmill before their heart rate reached a certain level. Researchers measured heart rate and blood pressure responses at an early stage of the treadmill test, which gives a good picture of a person’s fitness level, according to study author Nicole Spartano, a postdoctoral fellow at the Boston University School of Medicine.

Physical fitness is evolving as a significant factor related to cognitive health in older age. A study published in May 2015 found that higher levels of physical fitness in middle-aged adults were associated with larger brain volumes five years later.

This study shows that for people with heart disease, fitness might be particularly important for prevention of brain aging, Spartano said.

“We found that poor physical fitness in midlife was linked to more rapid brain aging two decades later,” she said. “This message may be especially important for people with heart disease or at risk for heart disease, in which we found an even stronger relationship between fitness and brain aging.”

The researchers also found that people with higher blood pressure and heart rate during exercise were more likely to have smaller brain sizes 20 years later. People with poor physical fitness usually have higher blood pressure and heart rate responses to low levels of exercise compared to people who exercise more, Spartano said.

“From other studies, we know that exercise training programs that improve fitness may increase blood flow and oxygen to the brain over the short term,” Spartano said. “Over the course of a lifetime, improved blood flow may have an impact on brain aging and prevent cognitive decline in older age.”

The study suggests promotion of physical fitness during middle age is an important step toward ensuring healthy brain aging.

“The broad message,” Spartano said, “is that health and lifestyle choices that you make throughout your life may have consequences many years later.”

http://www.cnn.com/2016/02/15/health/poor-fitness-smaller-brain/index.html