A 44-year-old man discovers he’s been living without 90% of his brain


A scan of the man missing 90% of his brain.

by Paul Ratner

What we think we know about our brains is nothing compared to what we don’t know. This fact is brought into focus by the medical mystery of a 44-year-old French father of two who found out one day that most of his brain was missing. Instead, his skull is mostly full of fluid, with almost no brain tissue left. He has a lifelong condition known as hydrocephalus, commonly called “water on the brain” or “water head”. It happens when too much cerebrospinal fluid puts pressure on the brain and the brain’s cavities enlarge abnormally.

As Axel Cleeremans, a cognitive psychologist at the Université Libre de Bruxelles who has lectured about this case, told CBC:

“He was living a normal life. He has a family. He works. His IQ was tested at the time of his complaint. This came out to be 84, which is slightly below the normal range … So, this person is not bright — but perfectly, socially apt”.

The complaint Cleeremans refers to is the original reason the man sought help – he had leg pain. Imagine that – you go to your doctor with a leg cramp and get told that you’re living without most of your brain.

The man continues to live a normal life as a family man with a wife and kids, while working as a civil servant – all this with three of his main brain cavities filled only with fluid, and his brainstem and cerebellum squeezed into a small space that they share with a cyst.

What can we learn from this rare case? As Cleeremans points out:

“One of the lessons is that plasticity is probably more pervasive than we thought it was… It is truly incredible that the brain can continue to function, more or less, within the normal range — with probably many fewer neurons than in a typical brain. Second lesson perhaps, if you’re interested in consciousness — that is the manner in which the biological activity of the brain produces awareness… One idea that I’m defending is the idea that awareness depends on the brain’s ability to learn.”

The Frenchman’s story challenges the idea that consciousness arises in only one part of the brain. Some current theories hold that a part of the brain called the thalamus is responsible for our self-awareness. A man living with most of his brain missing does not fit neatly into such hypotheses.

http://bigthink.com/paul-ratner/the-medical-mystery-of-a-man-living-with-90-of-his-brain-missing

Scientists May Have Discovered What Causes Migraines and a Path toward a Cure

by Philip Perry

Those who get migraines know how painful and debilitating they can be. In extreme cases, they can take you out of commission for days. One in seven people suffers from them, making migraines the third most common illness in the world. Symptoms include a pounding headache, sometimes on one side of the head, nausea, vomiting, and sensitivity to light and sound.

A laundry list of causes and triggers has been implicated, including genetics, certain foods, lack of sleep, hormonal changes, neurological issues, and much more. Despite these indicators, medical science has been stumped as to what causes migraines, which has made the development of new therapies difficult. Now, according to a group of scientists at the International Headache Genetics Consortium (IHGC), the cause has most likely been discovered. It all has to do with blood flow: specifically, constriction of blood vessels within the brain may be what causes migraines.

There has been a long-running debate as to whether migraines are caused by a neurological problem or a vascular one – that is, having to do with circulation. This study, published in the journal Nature Genetics, is likely to put the controversy to rest and help researchers develop novel approaches to treating the condition. It drew on 59,674 migraine sufferers and 316,078 controls – people who didn’t get the headaches – from 12 different countries. All participants were part of previous studies in which they had their genomes scanned.


The part of the brain where migraines originate.

Researchers identified 38 genetic loci tied to migraines, 28 of which had never been implicated before. Interestingly, these same loci are associated with other illnesses, nearly all in the realm of vascular disease. Because of this, researchers believe blood vessel problems are at the heart of migraines.
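The statistical test behind this kind of case-control result can be illustrated with a toy calculation. Everything below – the function name, the allele counts, the thresholds – is invented for illustration; none of it comes from the Nature Genetics paper:

```python
import math

def allele_association(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Odds ratio and chi-square p-value for one variant in a
    case-control design (a 2x2 table of allele counts)."""
    # Odds ratio: odds of carrying the risk allele in cases vs controls
    odds_ratio = (case_alt * ctrl_ref) / (case_ref * ctrl_alt)
    # Pearson chi-square statistic for a 2x2 table (1 degree of freedom)
    n = case_alt + case_ref + ctrl_alt + ctrl_ref
    row1, row2 = case_alt + case_ref, ctrl_alt + ctrl_ref
    col1, col2 = case_alt + ctrl_alt, case_ref + ctrl_ref
    chi2 = n * (case_alt * ctrl_ref - case_ref * ctrl_alt) ** 2 / (
        row1 * row2 * col1 * col2)
    # Survival function of a 1-df chi-square via the complementary
    # error function: P(X > x) = erfc(sqrt(x / 2))
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return odds_ratio, p_value

# Invented counts: the risk allele is slightly enriched in cases
or_, p = allele_association(case_alt=12000, case_ref=8000,
                            ctrl_alt=11000, ctrl_ref=9000)
print(f"OR = {or_:.2f}, p = {p:.2e}")
```

As Palotie notes below, each real variant shifts risk only slightly, so an odds ratio barely above 1 needs enormous samples before the p-value clears genome-wide significance – which is why pooling cohorts across 12 countries mattered.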

Aarno Palotie is the leader of the IHGC. He is also associated with the Center for Human Genome Research at Massachusetts General Hospital in Boston and with the Broad Institute of MIT and Harvard. Palotie hailed the discovery and said the IHGC’s approach was necessary to achieving it: “Because all of these variants modify the disease risk only slightly, the effect could only be seen when this large amount of samples became available.” Migraines have been difficult to treat. Symptoms and severity run the spectrum, and drugs effective in some patients have been less potent, or even ineffective, in others. Now researchers have a place to start for developing new drugs, which must somehow target the “regulation of vascular tone.” John-Anker Zwart, another member of the IHGC, hails from Oslo University Hospital in Norway.

Zwart said, “These genetic findings are the first concrete step towards developing personalized, evidence-based treatments for this very complex disease.” He added, “In the future, we hope this information can be utilized in dividing the patients into different genetic susceptibility groups for clinical drug trials, thus increasing the chances of identifying the best possible treatment for each subgroup.”

Previous studies implicated genes expressed in brain tissue, but the researchers here say those studies may not have used enough tissue samples. Another neurological theory was that migraines had something to do with ion channels in the central nervous system (CNS) – an area thought to warrant more study, until now.

The authors of the IHGC study say that the widespread sharing of data played a critical role in this discovery. Palotie said, “We simply can’t overstate the importance of international collaboration when studying genetics of complex, common diseases.” More studies will now be conducted to understand the pathogenesis or development of migraines and what role each gene plays, in order to find entryways suitable for therapeutic intervention.

http://bigthink.com/philip-perry/scientists-discover-the-cause-of-migraines-and-a-path-toward-a-cure

The risk of lasting consequences if our brains don’t get adequate stimulation in our early years

by Bahar Golipour

What is the earliest memory you have?

Most people can’t remember anything that happened to them, or around them, in their toddlerhood. The phenomenon, called childhood amnesia, has long puzzled scientists. Some have argued that we forget because the young brain hasn’t fully developed the ability to store memories. Others argue it is because the fast-growing brain is rewiring itself so much that it overwrites what it has already registered.

New research that appears in Nature Neuroscience this week suggests that those memories are not forgotten. The study shows that when juvenile rats have an experience during this infantile amnesia period, the memory of that experience is not lost. Instead, it is stored as a “latent memory trace” for a long time. If something later reminds them of the original experience, the memory trace reemerges as a full-blown, long-lasting memory.

Taking a (rather huge) leap from rats to humans, this could explain how early-life experiences that you don’t remember still shape your personality; how growing up in a rich environment makes you a smarter person; and how early trauma puts you at higher risk for mental health problems later on.

Scientists don’t know whether we can access those memories. But the new study shows childhood amnesia coincides with a critical time for the brain ― specifically the hippocampus, a seahorse-shaped brain structure crucial for memory and learning. Childhood amnesia corresponds to the time that your brain matures and new experiences fuel the growth of the hippocampus.

In humans, this period occurs before pre-school, likely between the ages 2 and 4. During this time, a child’s brain needs adequate stimulation (mostly from healthy social interactions) so it can better develop the ability to learn.

And not getting enough healthy mental activation during this period may impede the development of the brain’s learning and memory centers in a way that cannot be compensated for later.

“What our findings tell us is that children’s brains need to get enough and healthy activation even before they enter pre-school,” said study leader Cristina Alberini, a professor at New York University’s Center for Neural Science. “Without this, the neurological system runs the risk of not properly developing learning and memory functions.”

The findings may illustrate one mechanism that could in part explain scientific research that shows poverty can shrink children’s brains.

Extensive research spanning decades has shown that low socioeconomic status is linked to problems with cognitive abilities, higher risk for mental health issues and poorer performance in school. In recent years, psychologists and neuroscientists have found that the brain’s anatomy may look different in poor children. Poverty is linked to smaller brain surface area, smaller volume of the white matter connecting brain areas, and a smaller hippocampus. And a 2015 study found that differences in brain development explain up to 20 percent of the academic performance gap between children from high- and low-income families.

Critical Periods

For the brain, the first few years of life set the stage for the rest of life.

Even though the nervous system keeps some of its ability to rewire throughout life, several biochemical events that shape its core structure happen only at certain times. During these critical periods of the developmental stages, the brain is acutely sensitive to new sights, sounds, experiences and external stimulation.

Critical periods are best studied in the visual system. In the 1960s, scientists David Hubel and Torsten Wiesel showed that if they closed one eye of a kitten from birth for just a few months, its brain never learned to see properly through that eye. Neurons in the visual areas of the brain lost their ability to respond to the deprived eye. Adult cats treated the same way don’t show this effect, which demonstrates the importance of critical periods in brain development for proper functioning. This finding was part of the pioneering work that earned Hubel and Wiesel the 1981 Nobel Prize in Physiology or Medicine.

In the new study in rats, the team shows that a similar critical period may be happening to the hippocampus.

Alberini and her colleagues took a close look at what exactly happens in the brains of rats in their first 17 days of life (roughly equivalent to the first three years of a human’s life). They created a memory of a negative experience for the rodents: every time the animals entered a specific corner of their cage, they received a mildly painful shock to the foot. Young rats, like kids, aren’t great at remembering things that happened to them during their infantile amnesia period. So although they avoided that corner right after the shock, they returned to it just a day later. In contrast, a group of older rats retained the memory and avoided the spot for a long time.

However, the younger rats had actually kept a trace of the memory. A reminder (such as another foot shock in another corner) was enough to resurrect it and make the animals avoid the first corner of the cage.

Researchers found a cascade of biochemical events in the young rats’ brains that are typically seen in developmental critical periods.

“We were excited to see the same type of mechanism in the hippocampus,” Alberini told The Huffington Post.

The Learning Brain And Its Mysteries

Just as the kittens’ brains needed light from the eyes to learn to see, the hippocampus may need novel experiences to learn to form memories.

“Early in life, while the brain cannot efficiently form long-term memories, it is ‘learning’ how to do so, making it possible to establish the abilities to memorize long-term,” Alberini said. “However, the brain needs stimulation through learning so that it can get in the practice of memory formation―without these experiences, the ability of the neurological system to learn will be impaired.”

This does not mean that you should put your kids in pre-pre-school, Alberini told HuffPost. Rather, it highlights the importance of healthy social interaction, especially with parents, and growing up in an environment rich in stimulation. Most kids in developed countries are already benefiting from this, she said.

But what does this all mean for children who grow up exposed to low levels of environmental stimulation, something more likely in poor families? Does it explain why poverty is linked to smaller brains? Alberini thinks many other factors likely contribute to the link between poverty and the brain. But it is possible, she said, that low stimulation during the development of the hippocampus also plays a part.

Psychologist Seth Pollak of the University of Wisconsin–Madison, who has found that children raised in poverty show differences in hippocampal development, agrees.

Pollak believes the findings of the new study represent “an extremely plausible link between early childhood adversity and later problems.”

“We must always be cautious about generalizing studies of rodents to understanding human children,” Pollak added. “But the nonhuman animal studies, such as this one, provide testable hypotheses about specific mechanisms underlying human behavior.”

Although the link between poverty and cognitive performance has been repeatedly seen in numerous studies, scientists don’t have a good handle on how exactly many related factors unfold inside the developing brain, said Elizabeth Sowell, a researcher from the Children’s Hospital Los Angeles. Studies like this one provide “a lot of food for thought,” she added.

http://www.huffingtonpost.com.au/2016/07/24/the-things-you-dont-remember-shape-who-you-are/

Having a socially interactive job may help protect against Alzheimer’s disease

By Patrick Foster

Lawyers, teachers and doctors have a better chance of fighting off the effects of Alzheimer’s disease, because of the complex nature of their jobs, scientists reported this week.

Researchers found that people whose jobs combined complex thinking with social engagement – such as social workers and engineers – were better protected against the onset of Alzheimer’s than those in manual work.

The study came as another report suggested that people with a poor diet could protect themselves against cognitive decline by adopting a mentally stimulating lifestyle.

Both pieces of research, presented at the Alzheimer’s Association International Conference in Toronto, examined the impact of complex thinking on the onset of the disease.

In the first study, carried out by scientists at the Alzheimer’s Disease Research Centre in Wisconsin, researchers examined white matter hyperintensities (WMHs) – white spots that appear on brain scans and are associated with Alzheimer’s – in 284 late-middle-aged patients considered at risk of developing the disease.

They found that people who worked primarily with other people, as opposed to with “things or data”, were less likely to be affected by brain damage indicated by WMHs.

While lawyers, social workers, teachers and doctors were best protected, those who enjoyed the least protection included shelf-stackers, machine operators and labourers.

Elizabeth Boots, a researcher on the project, said: “These findings indicate that participants with higher occupational complexity are able to withstand pathology associated with Alzheimer’s and cerebrovascular disease and perform at a similar cognitive level as their peers.

“This association is primarily driven by work with people, rather than data or things. These analyses underscore the importance of social engagement in the work setting for building resilience to Alzheimer’s disease.”

The second study, carried out by Baycrest Health Sciences, in Toronto, examined the diet of 351 older adults.

Researchers found that those who had a traditional Western diet of red and processed meat, white bread, potatoes and sweets were more likely to experience cognitive decline.

However, those who adhered to such a diet but who had a mentally stimulating lifestyle enjoyed some protection from such decline.

Dr Matthew Parrott, a member of the team, said: “Our results show the role higher educational attainment, mentally stimulating work and social engagement can play in protecting your brain from cognitive decline, counteracting some negative effects of an unhealthy diet.

“This adds to the growing body of evidence showing how various lifestyle factors may combine to increase or protect against vulnerability to Alzheimer’s disease.”

Other research put forward at the conference included a study showing that digital brain-training exercises can help stave off Alzheimer’s, and another paper suggesting that some newly identified genes may also increase resilience to the disease.

Maria C. Carrillo, the chief science officer at the Alzheimer’s Association, said: “These new data add to a growing body of research that suggests more stimulating lifestyles, including more complex work environments with other people, are associated with better cognitive outcomes in later life.

“As each new study emerges, we further understand just how powerful cognitive reserve can be in protecting the brain from disease. Formal education and complex occupation could potentially do more than just slow cognitive decline – they may actually help compensate for the cognitive damage done by bad diet and small vessel disease in the brain.

“It is becoming increasingly clear that in addition to searching for pharmacological treatments, we need to address lifestyle factors to better treat and ultimately prevent Alzheimer’s and other dementias.”

http://www.telegraph.co.uk/news/2016/07/24/stressful-job-it-might-help-you-fight-off-alzheimers/

New study suggests that brief hyperthermia relieves depression

Whole-body hyperthermia is a promising antidepressant modality that works quickly and offers prolonged benefit, according to a study recently published online in JAMA Psychiatry.

Researchers came to that conclusion after conducting a double-blind study that randomized 30 adults with major depressive disorder to either a single session of active whole-body hyperthermia or a sham treatment that mimicked all aspects of whole-body hyperthermia except its intense heat.

The sham condition was included to strengthen the study design.

“A prior open trial found that a single session of whole-body hyperthermia reduced depressive symptoms,” researchers wrote. “However, the lack of a placebo control raises the possibility that the observed antidepressant effects resulted not from hyperthermia per se, but from nonspecific aspects of the intervention.”

Among participants randomized to sham treatment in the new study, more than 70% believed they had received whole-body hyperthermia, researchers reported, suggesting the placebo was convincing.

When researchers looked at participants’ scores on the Hamilton Depression Rating Scale throughout the 6-week period following the session, they found participants who received active whole-body hyperthermia had significantly reduced scores compared to participants who received sham treatment. Adverse events were mild.

Psych Congress Steering Committee member Charles L. Raison, MD, discussed the findings prior to their publication during a session at last year’s U.S. Psychiatric and Mental Health Congress in San Diego.

“Like ketamine, like scopolamine, and other rapid treatments for depression that are of intense interest in psychiatry, hyperthermia shows the same effect,” he said. “It doesn’t take a week or two to work. People feel better very, very quickly, and the effects appear to persist for an extended period of time.”

– Jolynn Tumolo

References

Janssen CW, Lowry CA, Mehl MR, et al. Whole-body hyperthermia for the treatment of major depressive disorder: a randomized clinical trial. JAMA Psychiatry. 2016 May 12. [Epub ahead of print].

Lebano L. New data support whole body hyperthermia for rapid treatment of major depression. Psych Congress Network. 2015 Sept. 10.

http://www.psychcongress.com/article/hyperthermia-provides-significant-rapid-relief-depression-study-suggests-27981

How the eyes betray your thoughts

By Mo Costandi

It’s sometimes said that the eyes are windows into the soul, revealing deep emotions that we might otherwise want to hide. The eyes not only reflect what is happening in the brain but may also influence how we remember things and make decisions.

Our eyes are constantly moving, and while some of those movements are under conscious control, many of them occur subconsciously. When we read, for instance, we make a series of very quick eye movements called saccades, fixating briefly on one word after another. When we enter a room, we make larger sweeping saccades as we gaze around. Then there are the small, involuntary eye movements we make as we walk, which compensate for the movement of our head and stabilise our view of the world. And, of course, our eyes dart around during the ‘rapid eye movement’ (REM) phase of sleep.

What is now becoming clear is that some of our eye movements may actually reveal our thought process.

Research published last year shows that pupil dilation is linked to the degree of uncertainty during decision-making: if somebody is less sure about their decision, they feel heightened arousal, which causes the pupils to dilate. This change in the eye may also reveal what a decision-maker is about to say: one group of researchers, for example, found that watching for dilation made it possible to predict when a cautious person used to saying ‘no’ was about to make the tricky decision to say ‘yes’.

Watching the eyes can even help predict what number a person has in mind. Tobias Loetscher and his colleagues at the University of Zurich recruited 12 volunteers and tracked their eye movements while they reeled off a list of 40 numbers.

They found that the direction and size of the participants’ eye movements accurately predicted whether the number they were about to say was bigger or smaller than the previous one – and by how much. Each volunteer’s gaze shifted up and to the right just before they said a bigger number, and down and to the left before a smaller one. The bigger the shift from one side to the other, the bigger the difference between the numbers.

This suggests that we somehow link abstract number representations in the brain with movement in space. But the study does not tell us which comes first: whether thinking of a particular number causes changes in eye position, or whether the eye position influences our mental activity. In 2013, researchers in Sweden published evidence that it’s the latter that may be at work: eye movements may actually facilitate memory retrieval.

They recruited 24 students and asked each one to carefully examine a series of objects displayed to them in one corner of a computer screen. The participants were then told to listen to a series of statements about some of the objects they had seen, such as “The car was facing to the left” and asked to indicate as quickly as possible if each was true or false. Some participants were allowed to let their eyes roam about freely; others were asked to fix their gaze on a cross at the centre of the screen, or the corner where the object had appeared, for example.

The researchers found that those who were allowed to move their eyes spontaneously during recall performed significantly better than those who fixed on the cross. Interestingly, though, participants who were told to fix their gaze in the corner of the screen in which objects had appeared earlier performed better than those told to fix their gaze in another corner. This suggests that the more closely the participants’ eye movements during information encoding corresponded with those that occurred during retrieval of the information, the better they were at remembering the objects. Perhaps that’s because eye movements help us to recall the spatial relationships between objects in the environment at the time of encoding.

These eye movements can occur unconsciously. “When people are looking at scenes they have encountered before, their eyes are frequently drawn to information they have already seen, even when they have no conscious memory of it,” says Roger Johansson, a psychologist at Lund University who led the study.

Watching eye movements can also be used to nudge people’s decisions. One recent study showed – maybe worryingly – that eye-tracking can be exploited to influence the moral decisions we take.

Researchers asked participants complex moral questions such as “Can murder ever be justified?” and then displayed, on a computer screen, alternative answers (“sometimes justifiable” or “never justifiable”). By tracking the participants’ eye movements, and removing the two answer options immediately after a participant had spent a certain amount of time gazing at one of the two options, the researchers found that they could nudge the participants to provide that particular option as their answer.
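The gaze-contingent procedure described above can be sketched in a few lines of code. Everything here – the dwell threshold, the sampling rate, and the simulated gaze stream – is a hypothetical illustration, not the researchers’ actual software:

```python
import random

def nudge_decision(gaze_stream, dwell_threshold=750, sample_ms=50):
    """Gaze-contingent interruption: accumulate dwell time per answer
    option and interrupt as soon as one option's cumulative dwell
    crosses the threshold, prompting for the answer at that moment."""
    dwell = {"sometimes justifiable": 0, "never justifiable": 0}
    for target in gaze_stream:          # one gaze sample per tick
        if target in dwell:
            dwell[target] += sample_ms
            if dwell[target] >= dwell_threshold:
                return target           # interrupt: ask for the answer now
    return None                         # stream ended without a nudge

# Simulated gaze: mostly on one option, sometimes on the other,
# occasionally elsewhere on the screen
random.seed(1)
stream = random.choices(
    ["sometimes justifiable", "never justifiable", "elsewhere"],
    weights=[6, 3, 1], k=200)
print(nudge_decision(stream))
```

The key design point is that the system never argues for an option; it only chooses *when* to demand the answer, timed to coincide with the participant’s own attention lingering on one choice.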

“We didn’t give them any more information,” says neuroscientist Daniel Richardson of University College London, senior author of the study. “We simply waited for their own decision-making processes to unfold and interrupted them at exactly the right point. We made them change their minds just by controlling when they made the decision.”

Richardson adds that successful salespeople may have some insight into this, and use it to be more persuasive with clients. “We think of persuasive people as good talkers, but maybe they’re also observing the decision-making process,” he says. “Maybe good salespeople can spot the exact moment you’re wavering towards a certain choice, and then offer you a discount or change their pitch.”

The ubiquity of eye-tracking apps for smartphones and other hand-held devices raises the possibility of altering people’s decision-making process remotely. “If you’re shopping online,” says Richardson, “they might bias your decision by offering free shipping at the moment you shift your gaze to a particular product.”

Thus, eye movements can both reflect and influence higher mental functions such as memory and decision-making, and betray our thoughts, beliefs, and desires. This knowledge may give us ways of improving our mental functions – but it also leaves us vulnerable to subtle manipulation by other people.

“The eyes are like a window into our thought processes, and we just don’t appreciate how much information might be leaking out of them,” says Richardson. “They could potentially reveal things that a person might want to suppress, such as implicit racial bias.”

“I can see eye-tracking apps being used for, say, supportive technologies that figure out what phone function you need and then help out,” he adds, “but if they’re left on all the time they could be used to track all sorts of other things. This would provide much richer information, and raises the possibility of unwittingly sharing our thoughts with others.”

http://www.bbc.com/future/story/20150521-how-the-eyes-betray-your-thoughts

People who need very little sleep

By Helen Thomson

What would you do if you had 60 days of extra free time a year? Ask Abby Ross, a retired psychologist from Miami, Florida, and a “short-sleeper”. She needs only four hours of sleep a night, so she has a lot of spare time to fill while the rest of the world is in the land of nod.

“It’s wonderful to have so many hours in my day – I feel like I can live two lives,” she says.

Short-sleepers like Ross never feel lethargic, nor do they ever sleep in. They wake early – normally around four or five o’clock – raring to get on with their day. Margaret Thatcher may have been one – she famously said she needed just four hours a night – whereas Mariah Carey claims she needs 15.

What makes some people fantastically efficient sleepers, while others spend half their day snoozing? And can we change our sleeping pattern to make it more efficient?

In 2009, a woman came into Ying-Hui Fu’s lab at the University of California, San Francisco, complaining that she always woke up too early. At first, Fu thought the woman was an extreme morning lark – a person who goes to bed early and wakes early. However, the woman explained that she actually went to bed around midnight and woke at 4am feeling completely alert. It was the same for several members of her family, she said.

Fu and her colleagues compared the genome of different family members. They discovered a tiny mutation in a gene called DEC2 that was present in those who were short-sleepers, but not in members of the family who had normal length sleep, nor in 250 unrelated volunteers.

When the team bred mice to express this same mutation, the rodents also slept less but performed just as well as regular mice when given physical and cognitive tasks.

Getting too little sleep normally has a significant impact on health, quality of life and life expectancy. It can cause depression and weight gain, and put you at greater risk of stroke and diabetes. “Sleep is so important, if you sleep well you can avoid many diseases, even dementia,” says Fu. “If you deprive someone of just two hours sleep a day, their cognitive functions become significantly impaired almost immediately.”

But why sleep is so important is still a bit of a mystery. The general consensus is that the brain needs sleep to do some housekeeping and general maintenance, since it doesn’t get much downtime during the day. While we sleep, the brain can repair cellular damage, remove toxins that accumulate during the day, boost flagging energy supplies and lay down memories.

“Clearly people with the DEC2 mutation can do the same cleaning up process in a shorter period of time – they are just more efficient than the rest of us at sleeping,” says Fu. “But how are they doing that? That’s the key question.”

Since discovering the DEC2 mutation, a lot of people have come forward claiming to only sleep a few hours a day, says Fu. Most of these had insomnia, she says. “We’re not focusing on those people who have sleeping issues that make them sleep less, we wanted to focus on people who sleep for a few hours and feel great.”

A positive outlook is common among all of the short-sleepers that Fu has studied. “Anecdotally,” she says, “they are all very energetic, very optimistic. It’s very common for them to feel like they want to cram as much into life as they can, but we’re not sure how or whether this is related to their mutations.”

Ross would seem to fit that mould. “I always feel great when I wake up,” she says. She has been living on four to five hours sleep every day for as long as she can remember.

“Those hours in the morning – around five o’clock – are just fabulous. It’s so peaceful and quiet and you can get so much done. I wish more shops were open at that time, but I can shop online, or I can read – oh there’s so much to read in this world! Or I can go out and exercise before anyone else is up, or talk to people in other time zones.”

Her short sleeping patterns allowed her to complete university in two and a half years, as well as affording her time to learn lots of new skills. For example, just three weeks after giving birth to her first son, Ross decided to use one of her early mornings to attempt to run around the block. It took her 10 minutes. The following day she did it again, running a little further. She slowly increased the time she ran, finally completing not one, but 37 marathons – one a month over three years – plus several ultramarathons. “I can get up and do my exercise before anyone else is up and then it’s done, out of the way,” she says.

As a child, Ross remembers spending very early mornings with her dad, another short-sleeper. “Our early mornings gave us such a special time together,” she says. Now, if she ever oversleeps – which she says has only happened a handful of times – her husband thinks she’s dead. “I just don’t lie in; I’d feel terrible if I did,” she says.

Fu has subsequently sequenced the genomes of several other families who fit the criteria of short-sleepers. They’re only just beginning to understand the gene mutations that lead to this talent, but in principle, she says, it might one day be possible to enable short sleeping in others.

Until then, are there any shortcuts to a more efficient night’s sleep for the rest of us? Neil Stanley, an independent sleep consultant, says yes: “The most effective way to improve your sleep is to fix your wake-up time in the morning.”

Stanley says that when your body gets used to the time it needs to wake up, it can use the time it has to sleep as efficiently as possible. “Studies show that your body prepares to wake up one and a half hours prior to actually waking up. Your body craves regularity, so if you chop and change your sleep pattern, your body hasn’t got a clue when it should prepare to wake up or not.”

You could also do yourself a favour by ignoring society’s views on sleep, he says. “There’s this social view that short sleeping is a good thing and should be encouraged – we’re always hauling out the example of Margaret Thatcher and top CEOs who don’t need much sleep. In fact, the amount of sleep you need is genetically determined as much as your height or shoe size. Some people need very little sleep, others need 11 or 12 hours to feel their best.”

Stanley says that a lot of people with sleep issues actually don’t have any problem sleeping, instead they have an expectation that they need to sleep for a certain amount of time. “If we could all figure out what kind of sleeper we are, and live our life accordingly, that would make a huge difference to our quality of life,” he says.

http://www.bbc.com/future/story/20150706-the-woman-who-barely-sleeps

Mystery of what sleep does to our brains may finally be solved

By Clare Wilson

It is one of life’s great enigmas: why do we sleep? Now we have the best evidence yet of what sleep is for – allowing housekeeping processes to take place that stop our brains becoming overloaded with new memories.

All animals studied so far have been found to sleep, but the reason for their slumber has eluded us. When lab rats are deprived of sleep, they die within a month, and when people go for a few days without sleeping, they start to hallucinate and may have epileptic seizures.

One idea is that sleep helps us consolidate new memories, as people do better in tests if they get a chance to sleep after learning. We know that, while awake, fresh memories are recorded by reinforcing connections between brain cells, but the memory processes that take place while we sleep have remained unclear.

Support is growing for a theory that sleep evolved so that connections in the brain can be pruned down during slumber, making room for fresh memories to form the next day. “Sleep is the price we pay for learning,” says Giulio Tononi of the University of Wisconsin-Madison, who developed the idea.

Now we have the most direct evidence yet that he’s right. Tononi’s team measured the size of these connections or synapses in brain slices taken from mice. The synapses in samples taken at the end of a period of sleep were 18 per cent smaller than those in samples taken from before sleep, showing that the synapses between neurons are weakened during slumber.

A good night’s sleep

Tononi announced these findings at the Federation of European Neuroscience Societies meeting in Copenhagen, Denmark, last week. “The data was very solid and well documented,” says Maiken Nedergaard of the University of Rochester, who attended the conference.

“It’s an extremely elegant idea,” says Vladyslav Vyazovskiy of the University of Oxford.

If the housekeeping theory is right, it would explain why, when we miss a night’s sleep, the next day we find it harder to concentrate and learn new information – we may have less capacity to encode new experiences. The finding suggests that, as well as it being important to get a good night’s sleep after learning something, we should also try to sleep well the night before.

It could also explain why, if our sleep is interrupted, we feel less refreshed the next day. There is some indirect evidence that deep, slow-wave sleep is best for pruning back synapses, and it takes time for our brains to reach this level of unconsciousness.

Waking refreshed

Previous evidence has also supported the housekeeping theory. For instance, EEG recordings show that the human brain is less electrically responsive at the start of the day – after a good night’s sleep – than at the end, suggesting that the connections may be weaker. And in rats, the levels of a molecule called the AMPA receptor – which is involved in the functioning of synapses – are lower at the start of their wake periods.

The latest brain-slice finding that synapses get smaller is the most direct evidence yet that the housekeeping theory is right, says Vyazovskiy. “Structural evidence is very important,” he says. “That’s much less affected by other confounding factors.”

Protecting what matters

Getting this data was a Herculean task, says Tononi. The team collected tiny chunks of brain tissue, sliced them into ultrathin sections and used these to create 3D models of the tissue in order to identify the synapses. As there were nearly 7,000 synapses, the job took seven researchers four years.

The team did not know which mouse was which until last month, says Tononi, when they broke the identification code, and found their theory stood up.

“People had been working for years to count these things. You start having stress about whether it’s really possible for all these synapses to start getting fatter and then thin again,” says Tononi.

The team also discovered that some synapses seem to be protected – the biggest fifth stayed the same size. It’s as if the brain is preserving its most important memories, says Tononi. “You keep what matters.”

https://www.newscientist.com/article/2096921-mystery-of-what-sleep-does-to-our-brains-may-finally-be-solved/

New discovery on brain chemistry of patients with schizophrenia and their relatives


People with schizophrenia have different levels of the neurotransmitters glutamate and gamma-aminobutyric acid (GABA) than healthy people do, and their healthy relatives also have lower glutamate levels, according to a study published online in Biological Psychiatry.

Using magnetic resonance spectroscopy, researchers discovered reduced levels of glutamate — which promotes the firing of brain cells — in both patients with schizophrenia and healthy relatives. Patients also showed reduced levels of GABA, which inhibits neural firing. Healthy relatives, however, did not.

Researchers are unsure why healthy relatives with altered glutamate do not show symptoms of schizophrenia or how they maintain normal GABA levels despite a predisposition to the illness.

“This finding is what’s most exciting about our study,” said lead investigator Katharine Thakkar, PhD, assistant professor of clinical psychology at Michigan State University, East Lansing. “It hints at what kinds of things have to go wrong for someone to express this vulnerability toward schizophrenia. The study gives us more specific clues into what kinds of systems we want to tackle when we’re developing new treatments for this very devastating illness.”

The study included 21 patients with chronic schizophrenia, 23 healthy relatives of people with schizophrenia (patients who were not themselves part of the study), and 24 healthy nonrelatives who served as controls.

Many experts believe there are multiple risk factors for schizophrenia, including dopamine and glutamate-GABA imbalance. Drugs that regulate dopamine do not work for all patients with schizophrenia. Dr. Thakkar believes magnetic resonance spectroscopy may help clinicians target effective treatments for specific patients.

“There are likely different causes of the different symptoms and possibly different mechanisms of the illness across individuals,” said Dr. Thakkar.

“In the future, as this imaging technique becomes more refined, it could conceivably be used to guide individual treatment recommendations. That is, this technique might indicate that one individual would benefit more from treatment A and another individual would benefit more from treatment B, when these different treatments have different mechanisms of action.”

—Jolynn Tumolo

References

Thakkar KN, Rösler L, Wijnen JP, et al. 7T proton magnetic resonance spectroscopy of GABA, glutamate, and glutamine reveals altered concentrations in schizophrenia patients and healthy siblings [published online ahead of print April 19, 2016]. Biological Psychiatry.
Study uncovers clue to deciphering schizophrenia [press release]. Washington, DC: EurekAlert!; June 7, 2016.

You are surprisingly likely to have a living doppelganger

By Zaria Gorvett

It’s on your passport. It’s how criminals are identified in a line-up. It’s how you’re recognised by old friends on the street, even after years apart. Your face: it’s so tangled up with your identity, soon it may be all you need to unlock your smartphone, access your office or buy a house.

Underpinning it all is the assurance that your looks are unique. And then, one day your illusions are smashed.

“I was the last one on the plane and there was someone in my seat, so I asked the guy to move. He turned around and he had my face,” says Neil Douglas, who was on his way to a wedding in Ireland when it happened.

“The whole plane looked at us and laughed. And that’s when I took the selfie.” The uncanny events continued when Douglas arrived at his hotel, only to find the same double at the check-in desk. Later their paths crossed again at a bar and they accepted that the universe wanted them to have a drink. He woke up the next morning with a hangover and an Argentinian radio show on the phone – the picture had gone viral.

Folk wisdom has it that everyone has a doppelganger; somewhere out there there’s a perfect duplicate of you, with your mother’s eyes, your father’s nose and that annoying mole you’ve always meant to have removed. The notion has gripped the popular imagination for millennia – it was the subject of one of the oldest known works of literature – inspiring the work of poets and scaring queens to death.

But is there any truth in it? We live on a planet of over seven billion people, so surely someone else is bound to have been born with your face? It’s a silly question with serious implications – and the answer is more complicated than you might think.

In fact until recently no one had ever even tried to find out. Then last year Teghan Lucas set out to test the risk of mistaking an innocent double for a killer.

Armed with a public collection of photographs of U.S. military personnel and the help of colleagues from the University of Adelaide, Teghan painstakingly analysed the faces of nearly four thousand individuals, measuring the distances between key features such as the eyes and ears. Next she calculated the probability that two people’s faces would match.

What she found was good news for the criminal justice system, but likely to disappoint anyone pining for their long-lost double: the chances of sharing just eight dimensions with someone else are less than one in a trillion. Even with 7.4 billion people on the planet, that’s only a one in 135 chance that there’s a single pair of doppelgangers. “Before you could always be questioned in a court of law, saying ‘well what if someone else just looks like him?’ Now we can say it’s extremely unlikely,” says Teghan.
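As a rough sanity check – not the study’s actual method – the one-in-135 figure can be connected to a per-pair match probability using the standard birthday-problem (Poisson) approximation, P(at least one pair) ≈ 1 − exp(−pairs × p). The population figure below is taken from the article; the rest is back-of-the-envelope:

```python
import math

# Birthday-problem (Poisson) approximation: an illustration of the scale
# involved, not the method used in the Adelaide study.
population = 7.4e9
pairs = population * (population - 1) / 2  # ~2.7e19 possible pairs

# Back-solve the per-pair match probability implied by a 1-in-135 chance
# that the planet contains at least one pair of exact doppelgangers.
p_at_least_one_pair = 1 / 135
implied_p_match = -math.log(1 - p_at_least_one_pair) / pairs

print(f"possible pairs: {pairs:.3g}")
print(f"implied per-pair match probability: {implied_p_match:.3g}")
```

With ~2.7 × 10^19 possible pairs, even a minuscule per-pair probability would be enough to make a match likely, which is why the per-pair odds have to be so extraordinarily small.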

The results can be explained by the famed infinite monkey problem: sit a monkey in front of a typewriter for long enough and eventually it will surely write the Complete Works of William Shakespeare by randomly hitting, biting and jumping up and down on the keys.

It’s a mathematical certainty, but reversing the problem reveals just how staggeringly long the monkey would have to toil. Ignoring grammar, the monkey has a one in 26 chance of correctly typing the first letter of Macbeth. So far, so good. But already by the second letter the chance has shrunk to one in 676 (26 x 26), and by the end of the fourth line (22 letters) it’s one in 26^22 – roughly a 13 followed by 30 zeros. When you multiply probabilities together, the chances of something actually happening disappear very, very quickly.
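The shrinking odds are easy to compute directly – the chance of randomly typing n specific letters from a 26-letter alphabet (ignoring case, spaces and punctuation) is 1 in 26^n:

```python
# Probability of the monkey typing the first n letters of Macbeth correctly:
# each letter is an independent 1-in-26 event, so the odds are 1 in 26**n.
for n in [1, 2, 22]:
    print(f"{n} letters: 1 in {26**n:,}")
```

The jump from one letter to twenty-two takes the odds from 1 in 26 to roughly 1 in 10^31 – multiplication of probabilities at work.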

Besides, the wide variety of human faces is undoubtedly down to more than eight traits. Far from everyone having a long-lost “twin”, in Teghan’s view it’s more likely no one does.

But that’s not quite the end of the story. The study relied on exact measurements; if your doppelganger’s ears are 59 mm but yours are 60, your likeness wouldn’t count. In any case, you probably won’t remember the last time you clocked an uncanny resemblance based on the length of someone’s ears.

There may be another way – and it all comes down to what you mean by a doppelganger. “It depends whether we mean ‘lookalike to a human’ or ‘lookalike to facial recognition software’,” says David Aldous, a statistician at U.C. Berkeley.

Francois Brunelle, who has photographed over 200 pairs of doubles for his project I’m not a look-alike, agrees. “For me it’s when you see someone and you think it’s the other person. It’s the way of being, the sum of the parts.” When seen apart, his subjects looked like perfect clones. “When you get them together and you see them side by side, sometimes you feel that they are not the same at all.”

If fine details aren’t important, suddenly the possibility of having a lookalike looks a lot more realistic. But is this really true? To find out, first we need to get to grips with what’s going on when we recognise a familiar face.

Take the illusion of Bill Clinton and Al Gore that circulated on the internet around their 1996 re-election. It features a seemingly unremarkable picture of the two men standing side by side. On closer inspection, you can see that Gore’s “internal” facial features – his eyes, nose and mouth – have been replaced by Clinton’s. Yet with his underlying facial structure intact, Al Gore looks completely normal.

It’s a striking demonstration of the way faces are stored in the brain: more like a map than an image. When you bump into a friend on the street, the brain immediately sets to work recognising their features – such as hairline and skin tone – individually, like recognising Italy by its shape alone. But what if they’ve just had a haircut? Or they’re wearing makeup?

To ensure they can be recognised in any context, the brain employs an area known as the fusiform gyrus to tie all the pieces together. If you compare it to finding a country on a map, this is like checking it has a border with France and a coast. This holistic ‘sum of the parts’ perception is thought to make recognising friends a lot more accurate than it would be if their features were assessed in isolation. Crucially, it also fudges the importance of some of the subtler details.

“Most people concentrate on superficial characteristics such as hair-line, hair style, eyebrows,” says Nick Fieller, a statistician involved in The Computer-Aided Facial Recognition Project. Other research has shown we look to the eyes, mouth and nose, in that order.

Then it’s just a matter of working out the probability that someone else will have all the same versions as you. “There are only so many genes in the world which specify the shape of the face and millions of people, so it’s bound to happen,” says Winrich Freiwald, who studies face perception at Rockefeller University. “For somebody with an ‘average’ face it’s comparatively easy to find good matches,” says Fieller.

Let’s assume our man has short blonde hair, brown eyes, a fleshy nose (like Prince Philip, the Duke of Edinburgh), a round face and a full beard. Research into the prevalence of these features is hard to come by, but he’s off to a promising start: 55% of the global population has brown eyes.

Meanwhile more than one in ten people have round faces, according to research funded by a cosmetics company. Then there’s his nose. A study of photographs taken in Europe and Israel identified the ‘fleshy’ type as the most prevalent (24.2%). In the author’s view these are also the least attractive.

Finally – how much hair is there out there? If you thought this was too frivolous for serious investigation, you’d be wrong: among 24,300 people surveyed at a Florida theme park, 82% of men had hair shorter than shoulder-length. Natural blondes, however, constitute just 2%. In the UK – reputedly the ‘beard capital’ of the world – most men have some form of facial hair and nearly one in six has a full beard.

A simple calculation (male x brown eyes x blonde x round face x fleshy nose x short hair x full beard) reveals the probability of a person possessing all these features is just over one in 100,000 (a probability of about 0.0000102).
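Multiplying the quoted prevalence figures together reproduces the flavour of this calculation. Not all of the author’s exact inputs are given (the beard share is approximated here as one in six, and ‘male’ as one in two), so the result lands in the same ballpark – tens of thousands of lookalikes – rather than matching the article’s figure exactly:

```python
# Back-of-the-envelope: probability of possessing every trait at once,
# assuming the traits are independent. Prevalence values are the ones
# quoted in the text; "male" and "full beard" are approximations.
prevalence = {
    "male": 0.5,
    "brown eyes": 0.55,
    "natural blonde": 0.02,
    "round face": 0.10,
    "fleshy nose": 0.242,
    "short hair": 0.82,
    "full beard": 1 / 6,
}

p = 1.0
for share in prevalence.values():
    p *= share

population = 7.4e9
print(f"probability of all traits: {p:.3g}")
print(f"expected lookalikes worldwide: {p * population:,.0f}")
```

The independence assumption is the weak point: traits like blonde hair and brown eyes are correlated in real populations, which is one reason such estimates are so imprecise.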

That would give our guy no less than 74,000 potential doppelgangers. Of course many of these prevalence rates aren’t global, so this is very imprecise. But judging by the number of celebrity look-alikes out there, it might not be far off. “After the picture went viral I think there was a small army of us at some point,” says Douglas.

So what’s the probability that everyone has a duplicate roaming the earth? The simplest way to guess would be to estimate the number of possible faces and compare it to the number of people alive today.

You might expect that even if there are 7.4 billion different faces out there, with 7.4 billion people on the planet there’s clearly one for everyone. But there’s a catch. You’d actually need close to 150 billion people for that to be statistically likely. The discrepancy is down to a statistical quirk known as the coupon collector’s problem. Let’s say there are 50 coupons in a jar and each time you draw one it’s put back in. How many would you need to draw before it’s likely you’ve chosen each coupon at least once?

It takes very little time to collect the first few coupons. The trouble is finding the last few: on average drawing the last one takes about 50 draws on its own, so to collect all 50 you need about 225. It’s possible that most people have a doppelganger – but everyone? “There’s a big difference between being lucky sometimes and being lucky always,” says Aldous.
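The coupon-collector figures above follow from the classic result that collecting all n equally likely coupons takes, on average, n times the nth harmonic number draws. A short sketch (using the harmonic-sum approximation for very large n) gives about 225 draws for 50 coupons and roughly 170 billion people for 7.4 billion faces – the same order of magnitude as the article’s “close to 150 billion”:

```python
import math

EULER_MASCHERONI = 0.5772156649

def expected_draws(n: int) -> float:
    """Expected draws to collect all n coupons: n * H_n (harmonic number)."""
    if n <= 10**6:
        harmonic = sum(1 / k for k in range(1, n + 1))  # exact sum
    else:
        harmonic = math.log(n) + EULER_MASCHERONI  # large-n approximation
    return n * harmonic

print(f"50 coupons: ~{expected_draws(50):.0f} draws")
print(f"7.4 billion faces: ~{expected_draws(7_400_000_000):.3g} people")
```

The intuition matches the text: the early coupons come quickly, but the expected wait for the very last one is about n draws all by itself.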

No one has any good idea what the first number is. Indeed, it may never be possible to say definitively, since the perception of facial resemblance is subjective. Some people have trouble recognising themselves in photos, while others rarely forget a face. And how we perceive similarity is heavily influenced by familiarity. “Some doubles when they get together, they say ‘No I don’t see it. Really, I don’t.’ It’s so obvious to everyone else; it’s a little crazy to hear that,” says Brunelle.

Even so, Fieller thinks there’s a good chance. “I think most people have somebody who is a facial lookalike unless they have a truly exceptional and unusual face,” he says. Freiwald agrees. “I think in the digital age which we are entering, at some point we will know because there will be pictures of almost everyone online,” he says.

Why are we so interested anyway? “If you meet someone that looks like you, you have an instant bond because you share something,” says Brunelle. He has received interest from thousands of people searching for their lookalikes, especially from China – a fact he puts down to the one-child policy. Research has shown we judge similar-looking people to be more trustworthy and attractive – a factor thought to contribute to our voting choices.

It may stem back to our deep evolutionary past, when facial resemblance was a useful indicator of kinship. In today’s globalised world, this is misguided. “It is entirely possible for two people with similar facial features to have DNA that is no more similar than that of two random people,” says Lavinia Paternoster, a geneticist at the University of Bristol.

And before you go fantasising about doing a temporary life-swap with your ‘twin’, there’s no guarantee you’ll have anything in common physically either. “Well I’m 5’7 and he’s 6’3… so it’s mainly in the face,” says Douglas.

http://www.bbc.com/future/story/20160712-you-are-surprisingly-likely-to-have-a-living-doppelganger