New treatment protocol shows promise of improving cognition in patients with Alzheimer’s disease

Ten patients with early Alzheimer’s disease or its precursors showed improvement in memory after treatment with Metabolic Enhancement for NeuroDegeneration (MEND), a programmatic and personalized therapy protocol.

Researchers described results from the small trial, which used quantitative MRI and neuropsychological testing of participants before and after treatment, in the study published online in Aging.

“The magnitude of the improvement is unprecedented,” researchers wrote, “providing additional objective evidence that this programmatic approach to cognitive decline is highly effective.”

Before starting the program, the 10 participants had well-defined mild cognitive impairment, subjective cognitive impairment, or had been diagnosed with Alzheimer’s disease. Their subsequent treatment consisted of a complex, personalized, 36-point therapeutic program that included comprehensive changes in diet, brain stimulation, exercise, optimization of sleep, specific pharmaceuticals and vitamins, and multiple additional steps that affect brain chemistry.

Researcher Dale Bredesen, MD, a professor at the Buck Institute for Research on Aging and at the Easton Laboratories for Neurodegenerative Disease Research at UCLA, Los Angeles, believes the protocol’s broader-based approach is key to its apparent success in reversing cognitive decline.

“Imagine having a roof with 36 holes in it, and your drug patched one hole very well — the drug may have worked, a single ‘hole’ may have been fixed, but you still have 35 other leaks, and so the underlying process may not be affected much,” Dr. Bredesen said. “We think addressing multiple targets within the molecular network may be additive, or even synergistic, and that such a combinatorial approach may enhance drug candidate performance as well.”

Tests showed some participants “going from abnormal to normal,” Dr. Bredesen said.

In Aging, researchers describe the impact of MEND on all 10 patients, including:
• A 66-year-old man whose neuropsychological testing was compatible with a diagnosis of mild cognitive impairment. After 10 months on the MEND protocol, his hippocampal volume increased from the 17th percentile for his age to the 75th percentile, with an associated absolute increase in volume of nearly 12%.
• A 69-year-old entrepreneur with 11 years of progressive memory loss. After 22 months on the protocol, he showed marked improvements in all categories of neuropsychological testing, with long-term recall increasing from the 3rd to the 84th percentile.
• A 49-year-old woman in the early stages of cognitive decline who, after 9 months on the protocol, no longer showed evidence of cognitive decline on quantitative neuropsychological testing.

Plans for larger studies are under way.

“Even though we see the far-reaching implications of this success,” Dr. Bredesen said, “we also realize that this is a very small study that needs to be replicated in larger numbers at various sites.”

http://www.psychcongress.com/article/mend-protocol-reverses-memory-loss-alzheimer%E2%80%99s-disease-27858

Urinary biomarker of Parkinson’s disease identified

New findings indicate that phosphorylated LRRK2 (leucine-rich repeat kinase 2) protein levels in urine are elevated in patients diagnosed with idiopathic Parkinson’s disease (PD), and that urinary phosphorylated LRRK2 levels correlate with the presence and severity of symptoms such as cognitive impairment in individuals with PD. Researchers affiliated with the University of Alabama at Birmingham published their findings in Neurology and in Movement Disorders (1,2).

The etiology of PD is currently unknown, and its pathogenic mechanisms are not yet fully understood. It is well established, however, that aging is the single most important risk factor. PD is the second most frequent age-related neurodegenerative disorder, and one of its key pathogenic features is slow, progressive neuronal death that is concomitant with cognitive dysfunction. Current therapeutic modalities are inadequate and clinical need is significant. More than 6 million individuals worldwide are diagnosed with PD.

To date, several common genetic variants, or single nucleotide polymorphisms (SNPs), have been identified that influence the risk for disease. For example, polymorphic variants in the LRRK2 gene have previously been validated as genetic factors that confer susceptibility to PD.

Although the gene remains poorly characterized, five different mutations in the gene encoding LRRK2 are considered a common cause of inherited PD (3). One of the five mutations that are causal is the G2019S mutation in the LRRK2 kinase domain, a mutation that significantly increases phosphorylation activity (1,3).

“There are currently no known ways to predict which G2019S mutation carriers will develop PD,” the authors wrote in the Neurology publication. Investigators purified LRRK2 protein from urinary exosomes collected from a total of 76 men. (Exosomes are membrane vesicles of endosomal origin that are secreted by most cells in culture, and are present in most biological fluids such as urine, blood, and saliva.) Then, they compared the ratio of phosphorylated LRRK2 to total LRRK2 in urine exosomes. Results show that “elevated … phosphorylated LRRK2 predicted the risk” for onset of PD in LRRK2 G2019S mutation carriers (1).

In their follow-up study, which was published in Movement Disorders, investigators compared phosphorylated LRRK2 levels in urine samples of 79 individuals diagnosed with PD to those of 79 healthy control participants. Results show that phosphorylated LRRK2 levels were significantly elevated in patients with PD when compared to those of controls. Also, phosphorylated LRRK2 levels correlated with the severity of cognitive impairment in patients with PD (2).
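The biomarker comparison described above comes down to a per-sample ratio of phosphorylated to total LRRK2. A minimal sketch of how such a ratio-based group comparison might look; all signal values below are invented for illustration and are not study data:

```python
# Hypothetical illustration of the ratio-based biomarker described above:
# for each urine sample, take phosphorylated LRRK2 / total LRRK2, then
# compare the group means for PD patients vs. healthy controls.
# All numbers below are made up for the sketch; they are not study data.

def phospho_ratio(phospho_lrrk2, total_lrrk2):
    """Fraction of LRRK2 that is phosphorylated in one sample."""
    if total_lrrk2 <= 0:
        raise ValueError("total LRRK2 must be positive")
    return phospho_lrrk2 / total_lrrk2

def mean(xs):
    return sum(xs) / len(xs)

# (phospho, total) signal intensities per sample -- invented values
pd_samples      = [(42.0, 100.0), (38.0, 90.0), (51.0, 110.0)]
control_samples = [(12.0, 100.0), (15.0, 95.0), (10.0, 105.0)]

pd_ratios      = [phospho_ratio(p, t) for p, t in pd_samples]
control_ratios = [phospho_ratio(p, t) for p, t in control_samples]

print(f"mean PD ratio:      {mean(pd_ratios):.3f}")
print(f"mean control ratio: {mean(control_ratios):.3f}")
# The published comparison asks whether the PD group's ratios are
# significantly elevated; a real analysis would add a statistical test.
```

A real study would of course apply a proper significance test across all samples rather than eyeballing group means.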

“Because few viable biomarkers for PD exist … phosphorylated LRRK2 levels may be a promising candidate for further exploration,” the authors concluded in their publication.

References
1. Fraser KB, Moehle MS, Alcalay RN, et al. Urinary LRRK2 phosphorylation predicts parkinsonian phenotypes in G2019S LRRK2 carriers. Neurology. 2016;86:994-999.
2. Fraser KB, Rawlins AB, Clar RG, et al. Ser(P)-1292 LRRK2 in urinary exosomes is elevated in idiopathic Parkinson’s disease. Mov Disord. 2016. doi: 10.1002/mds.26686.
3. Greggio E, Cookson MR. Leucine-rich repeat kinase 2 mutations and Parkinson’s disease: three questions. ASN Neuro. 2009;1:e00002.

http://www.psychiatryadvisor.com/neurocognitive-disorders/urinary-biomarker-of-parkinson-disease-identified/article/508195/

Will machines one day control our decisions?

New research suggests it’s possible to detect when our brain is making a decision and nudge it to make the healthier choice.

In recording moment-to-moment deliberations by macaque monkeys over which option is likely to yield the most fruit juice, scientists have captured the dynamics of decision-making down to millisecond changes in neurons in the brain’s orbitofrontal cortex.

“If we can measure a decision in real time, we can potentially also manipulate it,” says senior author Jonathan Wallis, a neuroscientist and professor of psychology at the University of California, Berkeley. “For example, a device could be created that detects when an addict is about to choose a drug and instead bias their brain activity towards a healthier choice.”

Located behind the eyes, the orbitofrontal cortex plays a key role in decision-making and, when damaged, can lead to poor choices and impulsivity.

While previous studies have linked activity in the orbitofrontal cortex to making final decisions, this is the first to track the neural changes that occur during deliberations between different options.

“We can now see a decision unfold in real time and make predictions about choices,” Wallis says.

Measuring the signals from electrodes implanted in the monkeys’ brains, researchers tracked the primates’ neural activity as they weighed the pros and cons of images that delivered different amounts of juice.

A computational algorithm tracked the monkeys’ orbitofrontal activity as they looked from one image to another, determining which picture would yield the greater reward. The shifting brain patterns enabled researchers to predict which image the monkey would settle on.

For the experiment, they presented a monkey with a series of four different images of abstract shapes, each of which delivered to the monkey a different amount of juice. They used a pattern-recognition algorithm known as linear discriminant analysis to identify, from the pattern of neural activity, which picture the monkey was looking at.
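The decoding step described above can be sketched as a linear discriminant. The study classified among four images from high-dimensional orbitofrontal recordings; the bare-bones, pure-Python illustration below uses invented 2-D "firing rate" features and only two image classes:

```python
# A minimal sketch of two-class linear discriminant analysis (LDA), the
# pattern-recognition technique named above. The study decoded which of
# four images a monkey was viewing from orbitofrontal activity; here we
# use invented 2-D features and two classes to keep the math visible.

def mean_vec(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def pooled_cov(class_a, class_b, mu_a, mu_b):
    """Pooled within-class 2x2 covariance of the two classes."""
    cov = [[0.0, 0.0], [0.0, 0.0]]
    for rows, mu in ((class_a, mu_a), (class_b, mu_b)):
        for x in rows:
            d = [x[0] - mu[0], x[1] - mu[1]]
            for i in range(2):
                for j in range(2):
                    cov[i][j] += d[i] * d[j]
    n = len(class_a) + len(class_b) - 2
    return [[cov[i][j] / n for j in range(2)] for i in range(2)]

def lda_fit(class_a, class_b):
    """Fisher discriminant direction w and bias, boundary at the midpoint."""
    mu_a, mu_b = mean_vec(class_a), mean_vec(class_b)
    s = pooled_cov(class_a, class_b, mu_a, mu_b)
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    s_inv = [[s[1][1] / det, -s[0][1] / det],
             [-s[1][0] / det, s[0][0] / det]]
    diff = [mu_b[0] - mu_a[0], mu_b[1] - mu_a[1]]
    w = [s_inv[0][0] * diff[0] + s_inv[0][1] * diff[1],
         s_inv[1][0] * diff[0] + s_inv[1][1] * diff[1]]
    mid = [(mu_a[0] + mu_b[0]) / 2, (mu_a[1] + mu_b[1]) / 2]
    bias = -(w[0] * mid[0] + w[1] * mid[1])
    return w, bias

def lda_predict(w, bias, x):
    """Classify a new activity pattern by which side of the boundary it falls on."""
    return "image B" if w[0] * x[0] + w[1] * x[1] + bias > 0 else "image A"

# Invented firing-rate features recorded while viewing each image
image_a_trials = [[1.0, 2.0], [1.2, 1.8], [0.8, 2.2], [1.1, 2.1]]
image_b_trials = [[3.0, 0.5], [2.8, 0.7], [3.2, 0.4], [2.9, 0.6]]

w, bias = lda_fit(image_a_trials, image_b_trials)
print(lda_predict(w, bias, [1.0, 2.0]))  # pattern resembling image-A trials
print(lda_predict(w, bias, [3.0, 0.5]))  # pattern resembling image-B trials
```

In practice such decoders are fit with library implementations over hundreds of recorded dimensions; this sketch only shows the shape of the computation.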

Next, they presented the monkey with two of those same images, and watched the neural patterns switch back and forth to the point where the researchers could predict which image the monkey would choose based on the length of time that the monkey stared at the picture.

The more the monkey needed to think about the options, particularly when there was not much difference between the amounts of juice offered, the more the neural patterns would switch back and forth.

“Now that we can see when the brain is considering a particular choice, we could potentially use that signal to electrically stimulate the neural circuits involved in the decision and change the final choice,” Wallis says.

Erin Rich, a researcher at the Helen Wills Neuroscience Institute, is lead author of the study published in the journal Nature Neuroscience. The National Institute on Drug Abuse and the National Institute of Mental Health funded the work.

http://www.futurity.org/brains-decisions-1181542/

Rare Form of MS May Be Caused by a Single Gene Mutation

A single genetic mutation may increase a person’s risk of developing a rare, severe form of multiple sclerosis (MS) by roughly 60 percent, according to a study published recently in the journal Neuron.

That’s an unusually straightforward result for a complex disease like MS, which has previously been traced to hundreds of mutations, each of which increases the risk of developing the disease only slightly.

“That’s why our finding is unprecedented,” Carles Vilariño-Güell, Ph.D., an assistant professor of medical genetics at The University of British Columbia and one of the paper’s senior authors, told Healthline.

His team found the mutation by combing through a database of Canadians with MS who had donated blood samples as part of the Canadian Collaborative Project on Genetic Susceptibility to MS.

Some of these samples belonged to a family that was disproportionately diagnosed with the disease. Four first cousins and two parents developed MS.

The team isolated a common mutation from their DNA, and looked for that mutation in other individuals in the database.

That’s how they found a second family similarly afflicted. Three first cousins and two parents were diagnosed with MS.

Having so many cases of MS within a family is rare. The disease is not considered truly heritable, although a person’s risk does increase if a parent or sibling has the disease.

The families shared another rare trait. Most had the more severe version of the disease known as primary progressive MS, which makes up 10 to 15 percent of all MS cases.

Treatments for primary progressive MS have so far eluded scientists, although there are promising clinical trials underway of a drug called Ocrelizumab.


Future Research

The study found the mutation only in a handful of people, all of whom were diagnosed with a rare form of the disease.

Therefore, the researchers don’t suggest they have found the genetic basis of MS.

But they do think they’ve discovered a way to study how the disease progresses in the body and what drugs could be developed to slow or even stop it.

Bruce Bebo, Ph.D., vice president of research at the National Multiple Sclerosis Society, agrees.

“Studying the genetics of a very rare form that is inherited can give us clues about pathways involved in MS in the general population,” he told Healthline.

The mutation appears to disable a regulatory gene called NR1H3, which codes for a protein that helps regulate inflammation and lipid metabolism.

The researchers now plan to engineer a similar mutation in mice so they can study the outcome of a disabled NR1H3 gene and test potential new drugs in an animal model.

And because the NR1H3 pathway has already been implicated in diseases like atherosclerosis and heart disease, there are already drugs in clinical trials for safety that could be repurposed for treating MS, Vilariño-Güell said.

“Understanding the genetics of MS could help us get closer to individualizing therapy to people for better outcomes,” Bebo said.

Getting Personal with Treatment

People with a disease like MS, which appears in so many different ways and can be linked to so many different genetic components, could benefit from personalized medicine.

If the mechanism of each disease-causing mutation or group of mutations is pinpointed, scientists could potentially design more effective, targeted treatments rather than the standard one-size-fits-all therapies.

That means tracking down the many different genetic hotspots that are linked to MS.

Overall, genetic predisposition accounts for only about a third of a person’s risk of developing the disease, Bebo said. Within that category, only about half of the genes responsible have been identified.

Researchers don’t know where the other half of that genetic risk comes from, Bebo said, but it makes sense that it would include rare mutations like this one that help explain risk in a small fraction of MS patients.

And there could be many different versions of these mutations.

“Odds are if you look at a different family the genetic risk would probably be something different than this,” Bebo said.

Speeding Through the Genome

The Canadian database has been available since the late 1990s, but only recently has the team had access to exome sequencing, a powerful, efficient tool that makes searching for tiny genetic changes easier.

This technique sequences only the DNA that codes for proteins — leaving the other 98 percent behind. It’s like speed-reading the genome.

Exome sequencing has been particularly helpful for finding so-called “Mendelian” diseases — diseases that can be traced to a single, heritable mutation just like Gregor Mendel’s purple and white pea flowers. Cystic fibrosis and sickle cell anemia are two examples of these diseases.

With this discovery, the researchers say they have found a Mendelian form of MS.

That doesn’t mean the discovery won’t be beneficial for the 85 percent of people diagnosed with relapsing remitting MS. In many of those patients, the disease eventually changes course and becomes progressive.

Whatever is learned about primary progressive MS — a condition that doesn’t respond to treatments for other types of MS — could also potentially help those with secondary progressive MS, the researchers say.

http://www.healthline.com/health-news/form-of-ms-could-be-caused-by-single-genetic-mutation#5

The Reanima Project – Scientists Are Attempting to Reanimate the Brain Dead


Model of the human brain. The Reanima Project aims to regrow parts of the brain stem.

by Philip Perry

Imagine this: your loved one gets into a serious accident. You and your family gather at the hospital. In the ICU, the doctor makes a grim announcement: they’re brain dead. It is highly unlikely they will ever come out of a vegetative state. Today, there is no way past such horror, save for a miracle. But if one biotech firm has its way, doctors may soon be able to regrow the person’s brain, using a new procedure and a host of technologies, which could theoretically restore them to who they were before. Even so, there are many questions and ethical dilemmas surrounding this procedure, and the advancements it may someday thrust upon the world.

The idea originates from nature, as certain fish and amphibians can actually heal whole sections of the brain, brain stem, and other portions of the central nervous system, even after significant injury. Scientists believe they can someday mimic this process in human patients.

The study centers on Bioquark, Inc., a Philadelphia-based company, which has received ethical approval from U.S. and Indian institutional review boards. Bioquark will collaborate with Revita Life Sciences, led by the famed specialist Dr. Himanshu Bansal. The team will run a pilot study of 20 clinically brain dead patients, each having suffered a traumatic brain injury (TBI). The study will take place at Anupam Hospital in India over six weeks; Bioquark is currently recruiting patients.

Known as the “Reanima Project,” the trial will employ several different therapies in combination: stem cells injected into the brain to try to regrow damaged portions; lasers; nerve stimulation techniques, which have been successful in waking patients from comas; and a combination of different peptides. The peptides will be introduced daily through a spinal cord pump, and the stem cells injected every other week. The patients will be evaluated and monitored for months with brain imaging technology and EEG to see whether the brain, particularly the upper spinal cord and lower brain stem region, regenerates. This is the oldest part of the brain, which controls breathing and heartbeat.

The CEO of Bioquark Inc., Dr. Ira Pastor, said in a statement that this was the first step toward the “eventual reversal of death in our lifetime.” He believes they will achieve results within the first couple of months or so. This is the seminal stage, a “proof of concept” study. If you are afraid of the zombie apocalypse, Dr. Pastor says a common-sense protocol, adopted industry-wide, should keep any nasty scenarios from taking place. Then again, every new technology or advanced method is thought ironclad at the outset. He believes this study will show that brain death is recoverable. Dr. Bansal has attempted a similar procedure on two brain dead patients, one in Europe and another in the Persian Gulf. They are currently in a “minimally conscious state,” but may still come out of it.

According to Dr. Bansal, “We are now trying to create a definitive study in 20 subjects and prove that the brain death is reversible. This will open the door for future research and especially for people who lose their dear ones suddenly.” Brain stem death is defined as the loss of such functions as breathing and consciousness. When a person’s brain stem has stopped functioning, there is no chance for recovery, as it stands.

Those on life support deemed brain dead still have active bodies which grow, mature, heal, digest, circulate blood, and excrete waste. A woman can even gestate and deliver a baby in this state. Some new studies suggest that even after brain death, blood flow and limited electrical activity take place inside the brain. But it isn’t enough to repair the damage, nor live without life support.

Dr. Sergei Paylian is the founder, president, and chief science officer of Bioquark Inc. He said that this experiment is important not only for developing our understanding of brain death, but also of the vegetative and minimally conscious states, coma, and even neurodegenerative conditions such as Parkinson’s and Alzheimer’s. Critics caution that, though such damage may not be irreparable, one pilot study is far from demonstrating a complete neurological restoration. It will likely take years or even decades for such a technique to be refined, should it work at all.

Beyond that, advancements in science are always a mixed blessing. The splitting of the atom brought nuclear power, the horrors of Hiroshima and Nagasaki, and generations afterward living under constant fear of nuclear annihilation. The internal combustion engine gave us the modern transportation industry and, with it, climate change. What could reanimating a human brain after such trauma ultimately produce?

One wonders if neurons will grow back exactly as they were, or will the person be a blank slate? The attempt will try to trigger a functional epimorphic event. Epimorphic cells are those that can wipe their memory banks clean and start anew. So is this what will happen with the brain dead, should their brains be neuro-regenerated? Think of the emotional trauma to families who aren’t recognized by a healed loved one, not to mention the trauma to the person themselves. Will adults be like walking babies, needing to relearn everything over again? Will it be like amnesia? There’s no way to tell at this point.

http://bigthink.com/philip-perry/scientists-attempt-to-reanimate-the-brain-dead-what-are-the-implications

Brain activity differs between men and women when cooperating


When it comes to social behavior, there are clear differences between men and women, and a new study suggests cooperation with others is no exception.

Written by Honor Whiteman

Published in the journal Scientific Reports, the study reveals that men and women show significant differences in brain activity when working with others in order to complete a task.

The research team – co-led by Joseph Baker, Ph.D., a postdoctoral fellow at Stanford University School of Medicine – says the findings may shed light on the evolutionary differences in cooperation between men and women.

Additionally, they could help inform new strategies to enhance cooperation, which could prove useful for people with disorders that affect social behavior, such as autism.

This latest study is not the first to identify sex differences in cooperation – defined as “a situation in which people work together to do something.”

For example, previous research has shown that a pair of men tend to cooperate better than a pair of women. In mixed-sex pairs, however, women tend to cooperate better than men.

While a number of theories have been put forward to explain these differences, Baker and colleagues note that there is limited data on the neurological processes at play.


The cooperation task

To further investigate, the team enrolled 222 participants – of whom 110 were female – and assigned each of them a partner.

Each pair was made up of either two males, two females, or one male and one female.

The pairs were required to engage in a cooperation task, in which the partners sat at computers facing one another. Each partner could see the other, but they were instructed not to talk.

Each individual was instructed to press a button when a circle on their computer screen changed color; their goal was to try to press the button at the same time as their partner.

The pairs were given 40 tries to get the timing of their button presses as close to each other as possible, and after each try, they were told which partner had pressed the button first.

During the task, the researchers recorded the brain activity of each participant simultaneously using hyperscanning and functional near-infrared spectroscopy (fNIRS).

“We developed this test because it was simple, and you could easily record responses,” notes senior study author Dr. Allan Reiss, professor of psychiatry and behavioral sciences and psychology at Stanford.

No ‘interbrain coherence’ when opposite-sex pairs cooperate

Overall, the team found that male-male pairs timed their button presses more closely than female-female pairs did.

From the brain imaging results, however, the researchers noticed that both partners in each of the same-sex pairs had highly synchronized brain activity during the task – representing greater “interbrain coherence.”

“Within same-sex pairs, increased coherence was correlated with better performance on the cooperation task,” says Baker. “However, the location of coherence differed between male-male and female-female pairs.”
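The notion of "interbrain coherence" above is a measure of how closely two partners' recorded signals track one another. fNIRS hyperscanning studies typically quantify this with wavelet-based coherence measures; the sketch below substitutes a plain Pearson correlation between two invented activity traces, purely to illustrate the idea of synchronized versus unsynchronized signals:

```python
# Illustration only: Pearson correlation as a crude stand-in for the
# "interbrain coherence" measure described above. All activity traces
# below are invented; real fNIRS analyses use wavelet coherence over
# many channels, not a single correlation coefficient.
import math

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented "brain activity" traces for two partners, sampled over the task
partner_1          = [0.1, 0.4, 0.8, 0.5, 0.2, 0.6, 0.9, 0.3]
partner_2_synced   = [0.2, 0.5, 0.7, 0.6, 0.1, 0.5, 1.0, 0.2]  # tracks partner 1
partner_2_unsynced = [0.9, 0.1, 0.3, 0.8, 0.7, 0.2, 0.1, 0.6]  # does not

print(f"synced pair:   r = {pearson(partner_1, partner_2_synced):.2f}")
print(f"unsynced pair: r = {pearson(partner_1, partner_2_unsynced):.2f}")
```

A high correlation here plays the role of "highly synchronized brain activity"; the study's finding was that where in the brain this synchrony appeared differed between male-male and female-female pairs.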

Interestingly, the cooperation performance of male-female pairs was just as good as that of male-male pairs, though opposite-sex pairs showed no evidence of interbrain coherence.

“It’s not that either males or females are better at cooperating or can’t cooperate with each other. Rather, there’s just a difference in how they’re cooperating.” – Dr. Allan Reiss

Baker cautions that their study is “pretty exploratory,” noting that it does not look at all forms of cooperation.

What is more, the researchers did not assess activity in all regions of participants’ brains, and they note that it is possible interbrain coherence in opposite-sex pairs arose in these unmeasured areas.

Still, they believe their findings may help researchers learn more about how cooperation has evolved differently between men and women, and they may even lead to new ways to boost cooperation, which could have clinical implications.

“There are people with disorders like autism who have problems with social cognition,” says Baker. “We’re absolutely hoping to learn enough information so that we might be able to design more effective therapies for them.”

http://www.medicalnewstoday.com/articles/310879.php

Toxoplasma infection might trigger neurodegenerative disease


Infection with the common parasite Toxoplasma gondii promotes accumulation of a neurotransmitter in the brain called glutamate, triggering neurodegenerative diseases in individuals predisposed to such conditions.

Written by Honor Whiteman

This is the finding of a new study conducted by researchers from the University of California-Riverside (UC-Riverside), recently published in PLOS Pathogens.

T. gondii is a single-celled parasite that can cause a disease known as toxoplasmosis.

Infection with the parasite most commonly occurs through eating undercooked, contaminated meat or drinking contaminated water.

It may also occur through accidentally swallowing the parasite after coming into contact with cat feces – by cleaning a litter tray, for example.

Though more than 60 million people in the United States are believed to be infected with T. gondii, few people become ill from it; a healthy immune system can normally stave it off.

As such, most people who become infected with the parasite are unaware of it.

Those who do become ill from T. gondii infection may experience flu-like symptoms – such as swollen lymph glands or muscle aches – that last for at least a month.

In severe cases, toxoplasmosis can cause damage to the eyes, brain, and other organs, though such complications usually only arise in people with weakened immune systems.

The new study, however, suggests there may be another dark side to T. gondii infection: it may lead to development of neurodegenerative disease in people who are predisposed to it.

To reach their findings, lead author Emma Wilson – an associate professor in the Division of Biomedical Sciences at the UC-Riverside School of Medicine – and colleagues focused on how T. gondii infection in mice affects glutamate production.

How a build-up of glutamate can damage the brain

Glutamate is an amino acid released by nerve cells, or neurons. It is one of the brain’s most abundant excitatory neurotransmitters, aiding communication between neurons.

However, previous studies have shown that too much glutamate may cause harm; a build-up of glutamate is often found in individuals with traumatic brain injury (TBI) and people with certain neurodegenerative diseases, such as multiple sclerosis (MS) and amyotrophic lateral sclerosis (ALS).

The researchers explain that excess glutamate accumulates outside of neurons, and this build-up is regulated by astrocytes – cells in the central nervous system (CNS).

Astrocytes use a glutamate transporter called GLT-1 in an attempt to remove excess glutamate from outside of neurons and convert it into a less harmful substance called glutamine, which cells use for energy.

“When a neuron fires, it releases glutamate into the space between itself and a nearby neuron,” explains Wilson. “The nearby neuron detects this glutamate, which triggers a firing of the neuron. If the glutamate isn’t cleared by GLT-1 then the neurons can’t fire properly the next time and they start to die.”


T. gondii increases glutamate by inhibiting GLT-1

In mice infected with T. gondii, the researchers identified an increase in glutamate levels.

They found that the parasite causes astrocytes to swell, which impairs their ability to regulate glutamate accumulation outside of neurons.

Furthermore, the parasite prevents GLT-1 from being properly expressed, which causes an accumulation of glutamate and misfiring of neurons. This may lead to neuronal death, and ultimately, neurodegenerative disease.

“These results suggest that in contrast to assuming chronic Toxoplasma infection as quiescent and benign, we should be aware of the potential risk to normal neurological pathways and changes in brain chemistry.” – Emma Wilson

Next, the researchers gave the infected mice an antibiotic called ceftriaxone, which has shown benefits in mouse models of ALS and a variety of CNS injuries.

They found the antibiotic increased expression of GLT-1, which led to a reduction in glutamate build-up and restored neuronal function.

Wilson says their study represents the first time that T. gondii has been shown to directly disrupt a key neurotransmitter in the brain.

“More direct and mechanistic research needs to be performed to understand the realities of this very common pathogen,” she adds.

While their findings indicate a link between T. gondii infection and neurodegenerative disease, Wilson says they should not be cause for panic.

“We have been living with this parasite for a long time,” she says. “It does not want to kill its host and lose its home. The best way to prevent infection is to cook your meat and wash your hands and vegetables. And if you are pregnant, don’t change the cat litter.”

The team now plans to further investigate what causes the reduced expression of GLT-1 in T. gondii infection.

http://www.medicalnewstoday.com/articles/310865.php

Computers can now accurately predict future development of schizophrenia based on how a person talks


A new study finds that an automated word analysis correctly identified, within a small group of at-risk youths, every person who went on to have a psychotic episode.

by ADRIENNE LAFRANCE

Although the language of thinking is deliberate—let me think, I have to do some thinking—the actual experience of having thoughts is often passive. Ideas pop up like dandelions; thoughts occur suddenly and escape without warning. People swim in and out of pools of thought in a way that can feel, paradoxically, mindless.

Most of the time, people don’t actively track the way one thought flows into the next. But in psychiatry, much attention is paid to such intricacies of thinking. For instance, disorganized thought, evidenced by disjointed patterns in speech, is considered a hallmark characteristic of schizophrenia. Several studies of at-risk youths have found that doctors are able to guess with impressive accuracy—the best predictive models hover around 79 percent—whether a person will develop psychosis based on tracking that person’s speech patterns in interviews.

A computer, it seems, can do better.

That’s according to researchers at Columbia University, the New York State Psychiatric Institute, and the IBM T. J. Watson Research Center. They used an automated speech-analysis program to correctly differentiate—with 100-percent accuracy—between at-risk young people who developed psychosis over a two-and-a-half-year period and those who did not. The computer model also outperformed other advanced screening technologies, like biomarkers from neuroimaging and EEG recordings of brain activity.

“In our study, we found that minimal semantic coherence—the flow of meaning from one sentence to the next—was characteristic of those young people at risk who later developed psychosis,” said Guillermo Cecchi, a biometaphorical-computing researcher for IBM Research, in an email. “It was not the average. What this means is that over 45 minutes of interviewing, these young people had at least one occasion of a jarring disruption in meaning from one sentence to the next. As an interviewer, if my mind wandered briefly, I might miss it. But a computer would pick it up.”

Researchers used an algorithm to root out such “jarring disruptions” in otherwise ordinary speech. Their semantic analysis measured coherence along with two syntactic markers of speech complexity: the length of a sentence and how many clauses it contained. “When people speak, they can speak in short, simple sentences. Or they can speak in longer, more complex sentences, that have clauses added that further elaborate and describe the main idea,” Cecchi said. “The measures of complexity and coherence are separate and are not correlated with one another. However, simple syntax and semantic incoherence do tend to aggregate together in schizophrenia.”
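The coherence measure Cecchi describes — how much meaning carries over from one sentence to the next, with the minimum over a whole interview as the warning sign — can be sketched with a toy model. This is an illustration only, not the study's method: the published analysis used a trained semantic model built from a large text corpus, whereas the sketch below stands in simple bag-of-words vectors, and all function names here are invented.

```python
# Toy "minimum semantic coherence": represent each sentence as a
# bag-of-words vector, then take the cosine similarity between
# consecutive sentences. The lowest such similarity captures the
# single most jarring transition in a stretch of speech.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def min_coherence(sentences: list[str]) -> float:
    """Lowest similarity between any pair of consecutive sentences."""
    vecs = [Counter(s.lower().split()) for s in sentences]
    return min(cosine(vecs[i], vecs[i + 1]) for i in range(len(vecs) - 1))

# Two transitions: one topically continuous, one abrupt.
coherent = ["the cat sat on the mat", "the cat then left the mat"]
disjoint = ["the cat sat on the mat", "quarterly earnings beat forecasts"]
```

In this toy version the disjoint pair shares no vocabulary, so its coherence collapses to zero; a real semantic model would instead catch shifts in meaning even when individual words overlap.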

Here’s an example of a sentence, provided by Cecchi and revised for patient confidentiality, from one of the study’s participants who later developed psychosis:

I was always into video games. I mean, I don’t feel the urge to do that with this, but it would be fun. You know, so the one block thing is okay. I kind of lied though and I’m nervous about going back.

While the researchers conclude that language processing appears to reveal “subtle, clinically relevant mental-state changes in emergent psychosis,” their work poses several outstanding questions. For one thing, their sample size of 34 patients was tiny. Researchers are planning to attempt to replicate their findings using transcripts from a larger cohort of at-risk youths.

They’re also working to contextualize what their findings might mean more broadly. “We know that thought disorder is an early core feature of schizophrenia evident before psychosis onset,” said Cheryl Corcoran, an assistant professor of clinical psychiatry at Columbia University. “The main question then is: What are the brain mechanisms underlying this abnormality in language? And how might we intervene to address it and possibly improve prognosis? Could we improve the concurrent language problems and function of children and teenagers at risk, and either prevent psychosis or at least modify its course?”

Intervention has long been the goal. And so far it has been an elusive one. Clinicians are already quite good at identifying people who are at increased risk of developing schizophrenia, but taking that one step further and determining which of those people will actually end up having the illness remains a huge challenge.

“Better characterizing a behavioral component of schizophrenia may lead to a clearer understanding of the alterations to neural circuitry underlying the development of these symptoms,” said Gillinder Bedi, an assistant professor of clinical psychology at Columbia University. “If speech analyses could identify those people most likely to develop schizophrenia, this could allow for more targeted preventive treatment before the onset of psychosis, potentially delaying onset or reducing the severity of the symptoms which do develop.”

All this raises another question about the nature of human language. If the way a person speaks can be a window into how that person is thinking, and further, a means of assessing how they’re doing, which mechanisms of language are really most meaningful? It isn’t what you say, the aphorism goes, it’s how you say it. Actually, though, it’s both.

As Cecchi points out, the computer analysis at the center of the study didn’t include any acoustic features like intonation, cadence, volume—all characteristics which could be meaningful in interpreting a person’s pattern of speaking and, by extension, thinking. “There is a deeper limitation, related to our current understanding of language and how to measure the full extent of what is being expressed and communicated when people speak to each other, or write,” Cecchi said. “The discriminative features that we identified are still a very simplified description of language. Finally, while language provides a unique window into the mind, it is still just one aspect of human behavior and cannot fully substitute for a close observation and interaction with the patient.”

http://www.theatlantic.com/technology/archive/2015/08/speech-analysis-schizophrenia-algorithm/402265/

Ingredient in green tea shown to help people with Down’s syndrome

By Agence France-Presse

A chemical in green tea has been shown to improve cognitive ability in people with Down’s syndrome, scientists and doctors said on Tuesday.

In a year-long clinical trial, the treatment led to improved scores on memory and behaviour tests, they reported in a study published in The Lancet Neurology.

The positive impact remained six months after the trial ended.

Brain scans revealed that the compound, called epigallocatechin gallate, altered the way neurons in the brain connect with one another.

“This is the first time that a treatment has shown efficacy in the cognitive improvement of persons with this syndrome,” said Mara Dierssen, senior author of the study and a researcher at the Centre for Genomic Regulation in Barcelona.

While significant, she added in a statement, the results should not be interpreted as a “cure”.

“But it may be a tool to improve these individuals’ quality of life.”

Experts not involved in the study described it as “exciting” and “an important piece of work.”

At the same time, they cautioned, the findings must be validated in additional trials.

Down’s syndrome is the most common genetic form of intellectual disability, and afflicts approximately one in 1,000 people, according to the World Health Organisation.

Also known as trisomy 21, the condition is caused by the presence of an extra, or third, copy of chromosome number 21.

Humans normally have 23 pairs of chromosomes, which together contain up to 25,000 protein-coding genes.

In Down’s syndrome, the extra copy causes some of the genes in chromosome 21 to be “over-expressed”, leading to reduced cognitive abilities and other health problems.

In earlier experiments with mice designed to mimic Down’s, Dierssen had shown that inhibiting one of these genes, DYRK1A, improved function and development in the brain.

But the technique used – gene therapy – was not an option for humans, so the researchers turned to the green tea compound.

In the trials, 84 young adults with Down’s syndrome were split into two groups.

One was given a decaffeinated green tea supplement containing 45 percent epigallocatechin gallate, along with weekly online cognitive training.

The second group had the same training, but ingested a look-alike placebo instead of the supplement.

The subjects took cognitive tests after three, six and 12 months.

There was little-to-no change in most categories, but in a few – the ability to remember patterns, verbal recall, adaptive behaviour – the “green tea” group scored significantly better.

Moreover, they improved over time.

“It’s exciting that an understanding of the genetic neurobiology of Down’s syndrome is leading to the possibility of disorder-specific treatments,” said David Nutt, head of the Centre for Neuropsychopharmacology at Imperial College London, in commenting on the study.

Marie-Claude Potier, a Down’s specialist at the Brain and Spine Institute in Paris, said the results were a “leap forward,” but that safety and efficacy need to be confirmed.

Still, genetics is not everything, cautioned another pair of researchers, even as they recognised the importance of the new study.

“We can no longer afford to view someone with Down’s syndrome solely through the lens of trisomy 21,” noted Fabian Fernandez and Jamie Edgin of the Evelyn F. McKnight Brain Institute at the University of Arizona in a commentary.

It is equally important to “understand each individual in light of their larger genetic and environmental background,” as well as other health problems and access to education, they wrote in The Lancet Neurology.

http://www.telegraph.co.uk/news/2016/06/06/downs-syndrome-can-be-treated-with-green-tea/

Fish can recognize human faces, study shows

by Jamie K. White

Can your pet fish recognize your face? A new study says, Yes, it probably can.

Researchers studying archerfish found the fish can tell a familiar human face from dozens of new faces with surprising accuracy.

This is a big, big deal. It’s the first time fish have demonstrated this ability.

Think about it: All faces have two eyes sitting above a nose and a mouth. And for us to be able to tell them apart, we need to be able to pick up the subtle differences in features.

We’re good at this because we are smart, i.e. we have large and complex brains. Other primates can do this too. Some birds as well.

But a fish? A fish has a tiny brain. And it would have no reason in its evolution to learn how to recognize humans.

So this study, published Tuesday in the journal Scientific Reports, turns all our conventional thinking on its head. It was done by scientists at the University of Oxford in the U.K. and the University of Queensland in Australia.

And, for us, it raises many, many questions:

Does this mean my pet goldfish knows me? Do fish recognize each other? CAN DORY REALLY FIND NEMO?

To find out more, we talked to Dr. Cait Newport, a research fellow in Oxford University’s zoology department and co-author of the study.

What were the scientists trying to figure out?

The scientists wanted to know how well animals with simple brains do with facial recognition. A fish was a good choice. Their brains lack the section that we use for facial recognition. That made them perfect as subjects for an experiment to see if simple brains can perform complex tasks.

What’s an archerfish?

It’s a species of tropical fish. They spit jets of water from their mouth to knock down insects from branches. They’re the sharpshooters of the animal kingdom.

Why did scientists use archerfish?

Archerfish can indicate a choice clearly (the spitting) whereas other fish cannot. “There is no ambiguity in where they are shooting,” Newport said.

How did the experiment work?

Scientists presented the fish with two images of human faces and trained them to choose one by spitting their jets at that picture.

Wait, hold up. How do you ‘train’ an archerfish?

The old, time-tested way. Bribe them. When they spit at the image the scientists wanted them to spit at, they were rewarded with a pellet of food, Newport said.

How long did that take?

In some cases, only a few days. In others, up to two weeks. “Something like 60 to 90 trials,” Newport said.

How many people did it take?

A total of four (really smart) people: Newport and her co-authors Guy Wallis, Yarema Reshitnyk and Ulrike E Siebeck.

What did they do?

They presented the fish with the picture of the face they wanted the fish to learn and a bunch of new faces. Up to 44 new ones. The fish were able to pick the familiar face correctly 81% of the time.

Impressive. And then?

The researchers decided to make things a little harder. They took the pictures and made them black and white and evened out the head shapes. You’d think that would throw the fish for a loop. But no, they were able to pick the familiar face even then — and with more accuracy: 86%!

What will they test next?

They plan to test for other recognitions beyond just faces, Newport said.

Do fish only recognize human faces?

Humans use lots of devices to recognize people, including social cues. “Fish are not doing this,” Newport said. “For them, they are just looking for patterns.” That would seem to answer the question of whether Dory could find Nemo.

Finally, for the big one: Does my pet fish know me?

Possibly.

“There’s something like 30,000 species of fish. A blind fish is not going to be able to do this, sharks are fish and they can see color — so maybe,” Newport said.

Then she shared this observation.

When strangers walk into her lab, the fish “act skittish,” she said.

“When I walk in, they start spitting at me — many cases right in the eye.”

How’s that for accuracy?

http://www.cnn.com/2016/06/07/health/fish-human-face-recognition-study-trnd/