Study: Early humans loved to eat brains


Early members of the human family enjoyed digging into the heads of antelope and wildebeests.

Sets of animal bones recently unearthed in Kenya, believed to be the earliest evidence of hominid hunting, show that early members of the human family enjoyed digging into the heads of antelope and wildebeests, as well as snacking on gazelle meat.

They knew a thing or two about butchery, too, cutting the animals into parts before selecting the meatiest bones.

Scientists also have found a disproportionate number of animal skulls in the area, suggesting our ancestors scavenged the untouched heads from carcasses left behind by big cats after their own meals.

Dents inside the skulls indicate they dug in with stones to get at the delicious, juicy brains inside. According to a study of the findings, this nutrient-rich brain tissue may have helped Homo erectus support larger bodies and bigger brains and travel longer distances.

http://www.usatoday.com/story/news/nation/2013/05/05/evolution-early-humans-ate-brains/2136493/

Cocaine Vaccine Passes Key Testing Hurdle of Preventing Drug from Reaching the Brain – Human Clinical Trials Soon


Researchers at Weill Cornell Medical College have successfully tested their novel anti-cocaine vaccine in primates, bringing them closer to launching human clinical trials. Their study, published online by the journal Neuropsychopharmacology, used a radiological technique to demonstrate that the anti-cocaine vaccine prevented the drug from reaching the brain and producing a dopamine-induced high.

“The vaccine eats up the cocaine in the blood like a little Pac-man before it can reach the brain,” says the study’s lead investigator, Dr. Ronald G. Crystal, chairman of the Department of Genetic Medicine at Weill Cornell Medical College. “We believe this strategy is a win-win for those individuals, among the estimated 1.4 million cocaine users in the United States, who are committed to breaking their addiction to the drug,” he says. “Even if a person who receives the anti-cocaine vaccine falls off the wagon, cocaine will have no effect.”

Dr. Crystal says he expects to begin human testing of the anti-cocaine vaccine within a year.

Cocaine, a small-molecule drug, works to produce feelings of pleasure because it blocks the recycling of dopamine — the so-called “pleasure” neurotransmitter — in two areas of the brain, the putamen in the forebrain and the caudate nucleus in the brain’s center. When dopamine accumulates at the nerve endings, “you get this massive flooding of dopamine and that is the feel good part of the cocaine high,” says Dr. Crystal.

The novel vaccine Dr. Crystal and his colleagues developed combines bits of the common cold virus with a particle that mimics the structure of cocaine. When the vaccine is injected into an animal, its body “sees” the cold virus and mounts an immune response against both the virus and the cocaine impersonator that is hooked to it. “The immune system learns to see cocaine as an intruder,” says Dr. Crystal. “Once immune cells are educated to regard cocaine as the enemy, it produces antibodies, from that moment on, against cocaine the moment the drug enters the body.”

In their first study in animals, the researchers injected billions of their viral concoction into laboratory mice and found that a strong immune response was generated against the vaccine. Also, when the scientists extracted the antibodies produced by the mice and put them in test tubes, the antibodies gobbled up cocaine. They also saw that mice that received both the vaccine and cocaine were much less hyperactive than untreated mice given cocaine.

In this study, the researchers sought to precisely define how effective the anti-cocaine vaccine is in non-human primates, which are biologically closer to humans than mice are. They developed a tool to measure how much cocaine attached to the dopamine transporter, which picks up dopamine in the synapse between neurons and carries it out to be recycled. If cocaine is in the brain, it binds to the transporter, effectively blocking the transporter from ferrying dopamine out of the synapse and keeping the neurotransmitter active to produce a drug high.

In the study, the researchers attached a short-lived isotope tracer to the dopamine transporter. The activity of the tracer could be seen using positron emission tomography (PET). The tool measured how much of the tracer attached to the dopamine transporter in the presence or absence of cocaine.

The PET studies showed no difference in the binding of the tracer to the dopamine transporter in vaccinated compared to unvaccinated animals if these two groups were not given cocaine. But when cocaine was given to the primates, there was a significant drop in activity of the tracer in non-vaccinated animals. That meant that without the vaccine, cocaine displaced the tracer in binding to the dopamine transporter.

Previous research in humans had shown that at least 47 percent of the dopamine transporter had to be occupied by cocaine in order to produce a drug high. The researchers found that, in vaccinated primates, cocaine occupancy of the dopamine transporter was reduced to less than 20 percent.

“This is a direct demonstration in a large animal, using nuclear medicine technology, that we can reduce the amount of cocaine that reaches the brain sufficiently so that it is below the threshold by which you get the high,” says Dr. Crystal.

When the vaccine is studied in humans, the non-toxic dopamine transporter tracer can be used to help study its effectiveness as well, he adds.

The researchers do not know how often the vaccine will need to be administered in humans to maintain its anti-cocaine effect. A single vaccination lasted 13 weeks in mice and seven weeks in non-human primates.

“An anti-cocaine vaccination will require booster shots in humans, but we don’t know yet how often these booster shots will be needed,” says Dr. Crystal. “I believe that for those people who desperately want to break their addiction, a series of vaccinations will help.”

Co-authors of the study include Dr. Anat Maoz, Dr. Martin J. Hicks, Dr. Shankar Vallabhajosula, Michael Synan, Dr. Paresh J. Kothari, Dr. Jonathan P. Dyke, Dr. Douglas J. Ballon, Dr. Stephen M. Kaminsky, Dr. Bishnu P. De and Dr. Jonathan B. Rosenberg from Weill Cornell Medical College; Dr. Diana Martinez from Columbia University; and Dr. George F. Koob and Dr. Kim D. Janda from The Scripps Research Institute.

The study was funded by grants from the National Institute on Drug Abuse (NIDA).

Thanks to Kebmodee and Dr. Rajadhyaksha for bringing this to the attention of the It’s Interesting community.

New study links first-person singular pronouns to relationship problems and higher rates of depression


Researchers in Germany have found that people who frequently use first-person singular words like “I,” “me,” and “myself,” are more likely to be depressed and have more interpersonal problems than people who often say “we” and “us.”

In the study, 103 women and 15 men completed 60- to 90-minute psychotherapeutic interviews about their relationships, their past, and their self-perception. (99 of the subjects were patients at a psychotherapy clinic who had problems ranging from eating disorders to anxiety.) They also filled out questionnaires about depression and their interpersonal behavior.

Then, researchers led by Johannes Zimmermann of Germany’s University of Kassel counted the number of first-person singular (I, me) and first-person plural (we, us) pronouns used in each interview. Subjects who said more first-person singular words scored higher on measures of depression. They also were more likely to show problematic interpersonal behaviors such as attention seeking, inappropriate self-disclosure, and an inability to spend time alone.

By contrast, the participants who used more pronouns like “we” and “us” tended to have what the researchers called a “cold” interpersonal style. But, they explained, the coldness functioned as a positive way to maintain appropriate relationship boundaries while still helping others with their needs.

“Using first-person singular pronouns highlights the self as a distinct entity,” Zimmermann says, “whereas using first-person plural pronouns emphasizes its embeddedness into social relationships.” According to the study authors, the use of more first-person singular pronouns may be part of a strategy to gain more friendly attention from others.

Zimmermann points out that there’s no evidence that using more “I” and “me” words actually causes depression—instead, the speaking habit probably reflects how people see themselves and relate to others, he says.

The study appears in the June 2013 issue of the Journal of Research in Personality.

http://www.popsci.com/science/article/2013-05/people-who-often-say-me-myself-and-i-are-more-depressed?src=SOC&dom=tw

Brain implants: Restoring memory with a microchip


William Gibson’s popular science fiction tale “Johnny Mnemonic” foresaw sensitive information being carried by microchips in the brain by 2021. A team of American neuroscientists could be making this fantasy world a reality. Their motivation is different but the outcome would be somewhat similar. Hailed as one of 2013’s top ten technological breakthroughs by MIT Technology Review, the work by the University of Southern California, North Carolina’s Wake Forest University and other partners has actually spanned a decade.

But the U.S.-wide team now thinks that it will see a memory device being implanted in a small number of human volunteers within two years and available to patients in five to 10 years. They can’t quite contain their excitement. “I never thought I’d see this in my lifetime,” said Ted Berger, professor of biomedical engineering at the University of Southern California in Los Angeles. “I might not benefit from it myself but my kids will.”

Rob Hampson, associate professor of physiology and pharmacology at Wake Forest University, agrees. “We keep pushing forward, every time I put an estimate on it, it gets shorter and shorter.”

The scientists — who bring varied skills to the table, including mathematical modeling and psychiatry — believe they have cracked how long-term memories are made, stored and retrieved and how to replicate this process in brains that are damaged, particularly by stroke or localized injury.

Berger said they record a memory being made, in an undamaged area of the brain, then use that data to predict what a damaged area “downstream” should be doing. Electrodes are then used to stimulate the damaged area to replicate the action of the undamaged cells.

They concentrate on the hippocampus — part of the cerebral cortex which sits deep in the brain — where short-term memories become long-term ones. Berger has looked at how electrical signals travel through neurons there to form those long-term memories and has used his expertise in mathematical modeling to mimic these movements using electronics.

Hampson, whose university has done much of the animal studies, adds: “We support and reinforce the signal in the hippocampus but we are moving forward with the idea that if you can study enough of the inputs and outputs to replace the function of the hippocampus, you can bypass the hippocampus.”

The team’s experiments on rats and monkeys have shown that certain brain functions can be replaced with signals via electrodes. You would think that the work of then creating an implant for people and getting such a thing approved would be a Herculean task, but think again.

For 15 years, people have been having brain implants to provide deep brain stimulation to treat epilepsy and Parkinson’s disease — a reported 80,000 people have now had such devices placed in their brains. So many of the hurdles have already been overcome — particularly the “yuck factor” and the fear factor.

“It’s now commonly accepted that humans will have electrodes put in them — it’s done for epilepsy, deep brain stimulation, (that has made it) easier for investigative research, it’s much more acceptable now than five to 10 years ago,” Hampson says.

Much of the work that remains now is in shrinking down the electronics.

“Right now it’s not a device, it’s a fair amount of equipment,” Hampson says. “We’re probably looking at devices in the five to 10 year range for human patients.”

The ultimate goal in memory research would be to treat Alzheimer’s disease, but unlike stroke or localized brain injury, Alzheimer’s tends to affect many parts of the brain, especially in its later stages, making these implants a less likely option any time soon.

Berger foresees a future, however, where drugs and implants could be used together to treat early dementia. Drugs could be used to enhance the action of cells that surround the most damaged areas, and the team’s memory implant could be used to replace a lot of the lost cells in the center of the damaged area. “I think the best strategy is going to involve both drugs and devices,” he says.

Unfortunately, the team found that its method can’t help patients with advanced dementia.

“When looking at a patient with mild memory loss, there’s probably enough residual signal to work with, but not when there’s significant memory loss,” Hampson said.

Constantine Lyketsos, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine in Baltimore, which is trialing a deep brain stimulator implant for Alzheimer’s patients, was a little skeptical of the other team’s claims.

“The brain has a lot of redundancy; it can function pretty well if it loses one or two parts. But memory involves circuits diffusely dispersed throughout the brain, so it’s hard to envision.” However, he added that it was more likely to be successful in helping victims of stroke or localized brain injury, as indeed its makers are aiming to do.

The UK’s Alzheimer’s Society is cautiously optimistic.

“Finding ways to combat symptoms caused by changes in the brain is an ongoing battle for researchers. An implant like this one is an interesting avenue to explore,” said Doug Brown, director of research and development.

Hampson says the team’s breakthrough is “like the difference between a cane, to help you walk, and a prosthetic limb — it’s two different approaches.”

It will still take time for many people to accept their findings and their claims, he says, but they don’t expect to have a shortage of volunteers stepping forward to try their implant — the project is partly funded by the U.S. military which is looking for help with battlefield injuries.

There are U.S. soldiers coming back from operations with brain trauma and a neurologist at DARPA (the Defense Advanced Research Projects Agency) is asking “what can you do for my boys?” Hampson says.

“That’s what it’s all about.”

http://www.cnn.com/2013/05/07/tech/brain-memory-implants-humans/index.html?iref=allsearch

US suicide rate has risen sharply among middle-aged white men and women


The suicide rate among middle-aged Americans climbed a startling 28 percent in a decade, a period that included the recession and the mortgage crisis, the government reported Thursday. The trend was most pronounced among white men and women in that age group. Their suicide rate jumped 40 percent between 1999 and 2010. But the rates in younger and older people held steady. And there was little change among middle-aged blacks, Hispanics and most other racial and ethnic groups, the report from the Centers for Disease Control and Prevention found.

Why did so many middle-aged whites — that is, those who are 35 to 64 years old — take their own lives?

One theory suggests the recession caused more emotional trauma in whites, who tend not to have the same kind of church support and extended families that blacks and Hispanics do.

The economy was in recession from the end of 2007 until mid-2009. Even well afterward, polls showed most Americans remained worried about weak hiring, a depressed housing market and other problems.

Pat Smith, violence-prevention program coordinator for the Michigan Department of Community Health, said the recession — which hit manufacturing-heavy states particularly hard — may have pushed already-troubled people over the brink. Being unable to find a job or settling for one with lower pay or prestige could add “that final weight to a whole chain of events,” she said.

Another theory notes that white baby boomers have always had higher rates of depression and suicide, and that has held true as they’ve hit middle age. During the 11-year period studied, suicide went from the eighth leading cause of death among middle-aged Americans to the fourth, behind cancer, heart disease and accidents.

“Some of us think we’re facing an upsurge as this generation moves into later life,” said Dr. Eric Caine, a suicide researcher at the University of Rochester.

One more possible contributor is the growing sale and abuse of prescription painkillers over the past decade. Some people commit suicide by overdose. In other cases, abuse of the drugs helps put people in a frame of mind to attempt suicide by other means, said Thomas Simon, one of the authors of the CDC report, which was based on death certificates.

People ages 35 to 64 account for about 57 percent of suicides in the U.S.

The report contained surprising information about how middle-aged people kill themselves: During the period studied, hangings overtook drug overdoses in that age group, becoming the No. 2 manner of suicide. But guns remained far in the lead and were the instrument of death in nearly half of all suicides among the middle-aged in 2010.

The CDC does not collect gun ownership statistics and did not look at the relationship between suicide rates and the prevalence of firearms.

For the entire U.S. population, there were 38,350 suicides in 2010, making it the nation’s 10th leading cause of death, the CDC said. The overall national suicide rate climbed from 12 suicides per 100,000 people in 1999 to 14 per 100,000 in 2010. That was a 15 percent increase.

For the middle-aged, the rate jumped from about 14 per 100,000 to nearly 18 — a 28 percent increase. Among whites in that age group, it spiked from about 16 to 22.
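
As a quick check on that arithmetic, here is a minimal sketch in Python that recomputes the increases from the rounded per-100,000 rates quoted above; the CDC's unrounded rates differ slightly, which is why the results do not land exactly on the reported 15, 28 and 40 percent figures.

```python
# Recompute the percent increases quoted in the article from the rounded
# per-100,000 rates given in the text. The CDC's unrounded rates differ
# slightly, so these results only approximate the reported figures.

def percent_increase(rate_1999, rate_2010):
    """Relative increase from the 1999 rate to the 2010 rate, in percent."""
    return (rate_2010 - rate_1999) / rate_1999 * 100

groups = {
    "all ages":           (12, 14),  # reported as a 15 percent increase
    "ages 35-64":         (14, 18),  # reported as a 28 percent increase
    "whites, ages 35-64": (16, 22),  # reported as a 40 percent increase
}

for group, (r1999, r2010) in groups.items():
    print(f"{group}: {r1999} -> {r2010} per 100,000 "
          f"({percent_increase(r1999, r2010):.0f}% increase)")
```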

Suicide prevention efforts have tended to concentrate on teenagers and the elderly, but research over the past several years has begun to focus on the middle-aged. The new CDC report is being called the first to show how the trend is playing out nationally and to look in depth at the racial and geographic breakdown.

Thirty-nine out of 50 states registered a statistically significant increase in suicide rates among the middle-aged. The West and the South had the highest rates. It’s not clear why, but one factor may be cultural differences in willingness to seek help during tough times, Simon said.

Also, it may be more difficult to find counseling and mental health services in certain places, he added.

Suicides among middle-aged Native Americans and Alaska Natives climbed 65 percent, to 18.5 per 100,000. However, the overall numbers remain very small — 171 such deaths in 2010. And changes in small numbers can look unusually dramatic.

The CDC did not break out suicides of current and former military service members, a tragedy that has been getting increased attention. But a recent Department of Veterans Affairs report concluded that suicides among veterans have been relatively stable in the past decade and that veterans have been a shrinking percentage of suicides nationally.

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

http://bigstory.ap.org/article/us-suicide-rate-rose-sharply-among-middle-aged

Bionic superhumans on the horizon


Around 220,000 people worldwide already walk around with cochlear implants — devices worn around the ear that turn sound waves into electrical impulses shunted directly into the auditory nerve.

Tens of thousands of people have been implanted with deep brain stimulators, devices that send an electrode tunneling several inches in the brain. Deep brain stimulators are used to control Parkinson’s disease, though lately they’ve also been tested — with encouraging results — in use against severe depression and obsessive compulsive disorder.

The most obvious bionics are those that replace limbs. Olympian “Blade Runner” Oscar Pistorius, now awaiting trial for the alleged murder of his girlfriend, made a splash with his Cheetah carbon fiber prostheses. Yet those are a relatively simple technology — a curved piece of slightly springy, super-strong material. In the digital age, we’re seeing more sophisticated limbs.

Consider the thought-controlled bionic leg that Zac Vawter used to climb all 103 floors of Chicago’s Willis Tower. Or the nerve-controlled bionic hand that Iraq war veteran Glen Lehman had attached after the loss of his original hand.

Or the even more sophisticated i-limb Ultra, an artificial hand with five independently articulating artificial fingers. Those limbs don’t just react mechanically to pressure. They actually respond to the thoughts and intentions of their owners, flexing, extending, gripping, and releasing on mental command.

The age when prostheses were largely inert pieces of wood, metal, and plastic is passing. Advances in microprocessors, in techniques to interface digital technology with the human nervous system, and in battery technology to allow prostheses to pack more power with less weight are turning replacement limbs into active parts of the human body.

In some cases, they’re not even part of the body at all. Consider the case of Cathy Hutchinson. In 1997, Cathy had a stroke, leaving her without control of her arms. Hutchinson volunteered for an experimental procedure that could one day help millions of people with partial or complete paralysis. She let researchers implant a small device in the part of her brain responsible for motor control. With that device, she is able to control an external robotic arm by thinking about it.

That, in turn, brings up an interesting question: If the arm isn’t physically attached to her body, how far away could she be and still control it? The answer is at least thousands of miles. In animal studies, scientists have shown that a monkey with a brain implant can control a robot arm 7,000 miles away. The monkey’s mental signals were sent over the internet, from Duke University in North Carolina, to the robot arm in Japan. In this day and age, distance is almost irrelevant.

The 7,000-mile-away prosthetic arm makes an important point: These new prostheses aren’t just going to restore missing human abilities. They’re going to enhance our abilities, giving us powers we never had before, and augmenting other capabilities we have. While the current generation of prostheses is still primitive, we can already see this taking shape when a monkey moves a robotic arm on the other side of the planet just by thinking about it.

Other research is pointing to enhancements to memory and decision making.

The hippocampus is a small, seahorse-shaped part of the brain that’s essential in forming new memories. If it’s damaged — by an injury to the head, for example — people start having difficulty forming new long-term memories. In the most extreme cases, this can lead to the complete inability to form new long-term memories, as in the film Memento. Working to find a way to repair this sort of brain damage, researchers in 2011 created a “hippocampus chip” that can replace damaged brain tissue. When they implanted it in rats with a damaged hippocampus, they found that not only could their chip repair damaged memory — it could improve the rats’ ability to learn new things.

Nor is memory the end of it. Another study, in 2012, demonstrated that we can boost intelligence — at least one sort — in monkeys. Scientists at Wake Forest University implanted specialized brain chips in a set of monkeys and trained those monkeys to perform a picture-matching game. When the implant was activated, it raised their scores by an average of 10 points on a 100-point scale. The implant makes monkeys smarter.

Both of those technologies for boosting memory and intelligence are in very early stages, in small animal studies only, and years (or possibly decades) away from wide use in humans. Still, they make us wonder — what happens when it’s possible to improve on the human body and mind?

The debate has started already, of course. Oscar Pistorius had to fight hard for inclusion in the Olympics. Many objected that his carbon fiber prostheses gave him a competitive advantage. He was able — with the help of doctors and biomedical engineers — to make a compelling case that his Cheetah blades didn’t give him any advantage on the field. But how long will that be true? How long until we have prostheses (not to mention drugs and genetic therapies) that make athletes better in their sports?

But the issue is much, much wider than professional sports. We may care passionately about the integrity of the Olympics or professional cycling or so on, but they only directly affect a very small number of us. In other areas of life — in the workforce in particular — enhancement technology might affect all of us.

When it’s possible to make humans smarter, sharper, and faster, how will that affect us? Will the effect be mostly positive, boosting our productivity and the rate of human innovation? Or will it be just another pressure to compete at work? Who will be able to afford these technologies? Will anyone be able to have their body, and more importantly, their brain upgraded? Or will only the rich have access to these enhancements?

We have a little while to consider these questions, but we ought to start. The technology will sneak its way into our lives, starting with people with disabilities, the injured, and the ill. It’ll improve their lives in ways that are unquestionably good. And then, one day, we’ll wake up and realize that we’re doing more than restoring lost function. We’re enhancing it.

Superhuman technology is on the horizon. Time to start thinking about what that means for us.

http://www.cnn.com/2013/04/24/opinion/bionic-superhumans-ramez-naam/index.html?iid=article_sidebar

Documentary on Sleep Paralysis this May


Stephanie Pappas, LiveScience Senior Writer

When filmmaker Carla MacKinnon started waking up several times a week unable to move, with the sense that a disturbing presence was in the room with her, she didn’t call up her local ghost hunter. She got researching. Now, that research is becoming a short film and multiplatform art project exploring the strange and spooky phenomenon of sleep paralysis. The film, supported by the Wellcome Trust and set to screen at the Royal College of Arts in London, will debut in May.

Sleep paralysis happens when people become conscious while their muscles remain in the ultra-relaxed state that prevents them from acting out their dreams. The experience can be quite terrifying, with many people hallucinating a malevolent presence nearby, or even an attacker suffocating them. Surveys put the share of people who experience sleep paralysis anywhere between about 5 percent and 60 percent of the population. “I was getting quite a lot of sleep paralysis over the summer, quite frequently, and I became quite interested in what was happening, what medically or scientifically, it was all about,” MacKinnon said.

Her questions led her to talk with psychologists and scientists, as well as to people who experience the phenomenon. Myths and legends about sleep paralysis persist all over the globe, from the incubus and succubus (male and female demons, respectively) of European tales to a pink dolphin-turned-nighttime seducer in Brazil. Some of the stories MacKinnon uncovered reveal why these myths are so chilling.

One man told her about his frequent sleep paralysis episodes, during which he’d experience extremely realistic hallucinations of a young child, skipping around the bed and singing nursery rhymes. Sometimes, the child would sit on his pillow and talk to him. One night, the tot asked the man a personal question. When he refused to answer, the child transformed into a “horrendous demon,” MacKinnon said.

For another man, who had the sleep disorder narcolepsy (which can make sleep paralysis more common), his dream world clashed with the real world in a horrifying way. His sleep paralysis episodes typically included hallucinations that someone else was in his house or his room — he’d hear voices or banging around. One night, he awoke in a paralyzed state and saw a figure in his room as usual. “He suddenly realizes something is different,” MacKinnon said. “He suddenly realizes that he is in sleep paralysis, and his eyes are open, but the person who is in the room is in his room in real life.” The figure was no dream demon, but an actual burglar.

Sleep paralysis experiences are almost certainly behind the myths of the incubus and succubus, demons thought to have sex with unsuspecting humans in their sleep. In many cases, MacKinnon said, the science of sleep paralysis explains these myths. The feeling of suffocating or someone pushing down on the chest that often occurs during sleep paralysis may be a result of the automatic breathing pattern people fall into during sleep. When they become conscious while still in this breathing pattern, people may try to bring their breathing under voluntary control, leading to the feeling of suffocating. Add to that the hallucinations that seem to seep in from the dream world, and it’s no surprise that interpretations lend themselves to demons, ghosts or even alien abduction, MacKinnon said.

What’s more, MacKinnon said, sleep paralysis is more likely when your sleep is disrupted in some way — perhaps because you’ve been traveling, you’re too hot or too cold, or you’re sleeping in an unfamiliar or spooky place. Those tendencies may make it more likely that a person will experience sleep paralysis when already vulnerable to thoughts of ghosts and ghouls. “It’s interesting seeing how these scientific narratives and the more psychoanalytical or psychological narratives can support each other rather than conflict,” MacKinnon said.

Since working on the project, MacKinnon has been able to bring her own sleep paralysis episodes under control — or at least learned to calm herself during them. The trick, she said, is to use episodes like a form of research, by paying attention to details like how her hands feel and what position she’s in. This sort of mindfulness tends to make scary hallucinations blink away, she said. “Rationalizing it is incredibly counterintuitive,” she said. “It took me a really long time to stop believing that it was real, because it feels so incredibly real.”

http://www.livescience.com/28325-spooky-film-explores-sleep-paralysis.html

Researchers explore connecting the brain to machines


Behind a locked door in a white-walled basement in a research building in Tempe, Ariz., a monkey sits stone-still in a chair, eyes locked on a computer screen. From his head protrudes a bundle of wires; from his mouth, a plastic tube. As he stares, a picture of a green cursor on the black screen floats toward the corner of a cube. The monkey is moving it with his mind.

The monkey, a rhesus macaque named Oscar, has electrodes implanted in his motor cortex, detecting electrical impulses that indicate mental activity and translating them to the movement of the ball on the screen. The computer isn’t reading his mind, exactly — Oscar’s own brain is doing a lot of the lifting, adapting itself by trial and error to the delicate task of accurately communicating its intentions to the machine. (When Oscar succeeds in controlling the ball as instructed, the tube in his mouth rewards him with a sip of his favorite beverage, Crystal Light.) It’s not technically telekinesis, either, since that would imply that there’s something paranormal about the process. It’s called a “brain-computer interface” (BCI). And it just might represent the future of the relationship between human and machine.

Stephen Helms Tillery’s laboratory at Arizona State University is one of a growing number where researchers are racing to explore the breathtaking potential of BCIs and a related technology, neuroprosthetics. The promise is irresistible: from restoring sight to the blind, to helping the paralyzed walk again, to allowing people suffering from locked-in syndrome to communicate with the outside world. In the past few years, the pace of progress has been accelerating, delivering dazzling headlines seemingly by the week.

At Duke University in 2008, a monkey named Idoya walked on a treadmill, causing a robot in Japan to do the same. Then Miguel Nicolelis stopped the monkey’s treadmill — and the robotic legs kept walking, controlled by Idoya’s brain. At Andrew Schwartz’s lab at the University of Pittsburgh in December 2012, a quadriplegic woman named Jan Scheuermann learned to feed herself chocolate by mentally manipulating a robotic arm. Just last month, Nicolelis’ lab set up what it billed as the first brain-to-brain interface, allowing a rat in North Carolina to make a decision based on sensory data beamed via Internet from the brain of a rat in Brazil.

So far the focus has been on medical applications — restoring standard-issue human functions to people with disabilities. But it’s not hard to imagine the same technologies someday augmenting capacities. If you can make robotic legs walk with your mind, there’s no reason you can’t also make them run faster than any sprinter. If you can control a robotic arm, you can control a robotic crane. If you can play a computer game with your mind, you can, theoretically at least, fly a drone with your mind.

It’s tempting and a bit frightening to imagine that all of this is right around the corner, given how far the field has already come in a short time. Indeed, Nicolelis — the media-savvy scientist behind the “rat telepathy” experiment — is aiming to build a robotic bodysuit that would allow a paralyzed teen to take the first kick of the 2014 World Cup. Yet the same factor that has made the explosion of progress in neuroprosthetics possible could also make future advances harder to come by: the almost unfathomable complexity of the human brain.

From I, Robot to Skynet, we’ve tended to assume that the machines of the future would be guided by artificial intelligence — that our robots would have minds of their own. Over the decades, researchers have made enormous leaps in artificial intelligence (AI), and we may be entering an age of “smart objects” that can learn, adapt to, and even shape our habits and preferences. We have planes that fly themselves, and we’ll soon have cars that do the same. Google has some of the world’s top AI minds working on making our smartphones even smarter, to the point that they can anticipate our needs. But “smart” is not the same as “sentient.” We can train devices to learn specific behaviors, and even out-think humans in certain constrained settings, like a game of Jeopardy. But we’re still nowhere close to building a machine that can pass the Turing test, the benchmark for human-like intelligence. Some experts doubt we ever will.

Philosophy aside, for the time being the smartest machines of all are those that humans can control. The challenge lies in how best to control them. From vacuum tubes to the DOS command line to the Mac to the iPhone, the history of computing has been a progression from lower to higher levels of abstraction. In other words, we’ve been moving from machines that require us to understand and directly manipulate their inner workings to machines that understand how we work and respond readily to our commands. The next step after smartphones may be voice-controlled smart glasses, which can intuit our intentions all the more readily because they see what we see and hear what we hear.

The logical endpoint of this progression would be computers that read our minds, computers we can control without any physical action on our part at all. That sounds impossible. After all, if the human brain is so hard to compute, how can a computer understand what’s going on inside it?

It can’t. But as it turns out, it doesn’t have to — not fully, anyway. What makes brain-computer interfaces possible is an amazing property of the brain called neuroplasticity: the ability of neurons to form new connections in response to fresh stimuli. Our brains are constantly rewiring themselves to allow us to adapt to our environment. So when researchers implant electrodes in a part of the brain that they expect to be active in moving, say, the right arm, it’s not essential that they know in advance exactly which neurons will fire at what rate. When the subject attempts to move the robotic arm and sees that it isn’t quite working as expected, the person — or rat or monkey — will try different configurations of brain activity. Eventually, with time and feedback and training, the brain will hit on a solution that makes use of the electrodes to move the arm.
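
To make that trial-and-error loop concrete, here is a deliberately toy sketch in Python: a fixed random “electrode-to-cursor” mapping stands in for the implant and decoder, and the “brain” side keeps a random tweak to its activity pattern only when the resulting cursor position moves closer to the target. It is only an analogy for the feedback process described above, not a model of real neurons or of any lab’s actual decoder.

```python
# Toy analogy for the feedback-driven adaptation described above: the mapping
# from "neural activity" to cursor position is fixed (like the implanted
# electrodes and decoder), and the activity pattern itself is adjusted by
# trial and error, keeping only changes that move the cursor toward the target.
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 20
decoder = rng.normal(size=(2, n_neurons))   # fixed electrode-to-cursor mapping
target = np.array([1.0, -0.5])              # where the cursor should end up
activity = np.zeros(n_neurons)              # starting firing-rate pattern

def cursor(act):
    """Cursor position produced by a given activity pattern."""
    return decoder @ act

for _ in range(2000):
    candidate = activity + rng.normal(scale=0.05, size=n_neurons)
    # keep the perturbation only if it brings the cursor closer to the target
    if np.linalg.norm(cursor(candidate) - target) < np.linalg.norm(cursor(activity) - target):
        activity = candidate

print("final cursor position:", np.round(cursor(activity), 3))
print("target:               ", np.round(target, 3))
```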

That’s the principle behind such rapid progress in brain-computer interface and neuroprosthetics. Researchers began looking into the possibility of reading signals directly from the brain in the 1970s, and testing on rats began in the early 1990s. The first big breakthrough for humans came in Georgia in 1997, when a scientist named Philip Kennedy used brain implants to allow a “locked in” stroke victim named Johnny Ray to spell out words by moving a cursor with his thoughts. (It took him six exhausting months of training to master the process.) In 2008, when Nicolelis got his monkey at Duke to make robotic legs run a treadmill in Japan, it might have seemed like mind-controlled exoskeletons for humans were just another step or two away. If he succeeds in his plan to have a paralyzed youngster kick a soccer ball at next year’s World Cup, some will pronounce the cyborg revolution in full swing.

Schwartz, the Pittsburgh researcher who helped Jan Scheuermann feed herself chocolate in December, is optimistic that neuroprosthetics will eventually allow paralyzed people to regain some mobility. But he says that full control over an exoskeleton would require a more sophisticated way to extract nuanced information from the brain. Getting a pair of robotic legs to walk is one thing. Getting robotic limbs to do everything human limbs can do may be exponentially more complicated. “The challenge of maintaining balance and staying upright on two feet is a difficult problem, but it can be handled by robotics without a brain. But if you need to move gracefully and with skill, turn and step over obstacles, decide if it’s slippery outside — that does require a brain. If you see someone go up and kick a soccer ball, the essential thing to ask is, ‘OK, what would happen if I moved the soccer ball two inches to the right?'” The idea that simple electrodes could detect things as complex as memory or cognition, which involve the firing of billions of neurons in patterns that scientists can’t yet comprehend, is far-fetched, Schwartz adds.

That’s not the only reason that companies like Apple and Google aren’t yet working on devices that read our minds (as far as we know). Another one is that the devices aren’t portable. And then there’s the little fact that they require brain surgery.

A different class of brain-scanning technology is being touted on the consumer market and in the media as a way for computers to read people’s minds without drilling into their skulls. It’s called electroencephalography, or EEG, and it involves headsets that press electrodes against the scalp. In an impressive 2010 TED Talk, Tan Le of the consumer EEG-headset company Emotiv Lifescience showed how someone can use her company’s EPOC headset to move objects on a computer screen.

Skeptics point out that these devices can detect only the crudest electrical signals from the brain itself, which is well-insulated by the skull and scalp. In many cases, consumer devices that claim to read people’s thoughts are in fact relying largely on physical signals like skin conductivity and tension of the scalp or eyebrow muscles.

Robert Oschler, a robotics enthusiast who develops apps for EEG headsets, believes the more sophisticated consumer headsets like the Emotiv EPOC may be the real deal in terms of filtering out the noise to detect brain waves. Still, he says, there are limits to what even the most advanced, medical-grade EEG devices can divine about our cognition. He’s fond of an analogy that he attributes to Gerwin Schalk, a pioneer in the field of invasive brain implants. The best EEG devices, he says, are “like going to a stadium with a bunch of microphones: You can’t hear what any individual is saying, but maybe you can tell if they’re doing the wave.” With some of the more basic consumer headsets, at this point, “it’s like being in a party in the parking lot outside the same game.”

It’s fairly safe to say that EEG headsets won’t be turning us into cyborgs anytime soon. But it would be a mistake to assume that we can predict today how brain-computer interface technology will evolve. Just last month, a team at Brown University unveiled a prototype of a low-power, wireless neural implant that can transmit signals to a computer over broadband. That could be a major step forward in someday making BCIs practical for everyday use. Meanwhile, researchers at Cornell last week revealed that they were able to use fMRI, a measure of brain activity, to detect which of four people a research subject was thinking about at a given time. Machines today can read our minds in only the most rudimentary ways. But such advances hint that they may be able to detect and respond to more abstract types of mental activity in the always-changing future.

http://www.ydr.com/living/ci_22800493/researchers-explore-connecting-brain-machines

Flip of a single molecular switch makes an old brain young


The flip of a single molecular switch helps create the mature neuronal connections that allow the brain to bridge the gap between adolescent impressionability and adult stability. Now Yale School of Medicine researchers have reversed the process, recreating a youthful brain that facilitated both learning and healing in the adult mouse.

Scientists have long known that the young and old brains are very different. Adolescent brains are more malleable or plastic, which allows them to learn languages more quickly than adults and speeds recovery from brain injuries. The comparative rigidity of the adult brain results in part from the function of a single gene that slows the rapid change in synaptic connections between neurons.

By monitoring the synapses in living mice over weeks and months, Yale researchers have identified the key genetic switch for brain maturation, in a study released March 6 in the journal Neuron. The Nogo Receptor 1 gene is required to suppress high levels of plasticity in the adolescent brain and create the relatively quiescent levels of plasticity in adulthood. In mice without this gene, juvenile levels of brain plasticity persist throughout adulthood. When researchers blocked the function of this gene in old mice, they reset the old brain to adolescent levels of plasticity.

“These are the molecules the brain needs for the transition from adolescence to adulthood,” said Dr. Stephen Strittmatter, Vincent Coates Professor of Neurology, Professor of Neurobiology and senior author of the paper. “It suggests we can turn back the clock in the adult brain and recover from trauma the way kids recover.”

Rehabilitation after brain injuries like strokes requires that patients re-learn tasks such as moving a hand. Researchers found that adult mice lacking Nogo Receptor recovered from injury as quickly as adolescent mice and mastered new, complex motor tasks more quickly than adults with the receptor.

“This raises the potential that manipulating Nogo Receptor in humans might accelerate and magnify rehabilitation after brain injuries like strokes,” said Feras Akbik, a Yale doctoral student who is first author of the study.

Researchers also showed that Nogo Receptor slows loss of memories. Mice without Nogo receptor lost stressful memories more quickly, suggesting that manipulating the receptor could help treat post-traumatic stress disorder.

“We know a lot about the early development of the brain,” Strittmatter said, “But we know amazingly little about what happens in the brain during late adolescence.”

Other Yale authors are Sarah M. Bhagat, Pujan R. Patel and William B.J. Cafferty.

The study was funded by the National Institutes of Health. Strittmatter is scientific founder of Axerion Therapeutics, which is investigating applications of Nogo research to repair spinal cord damage.

http://news.yale.edu/2013/03/06/flip-single-molecular-switch-makes-old-brain-young

Largest psychiatric genetic study in history shows a common genetic basis that underlies 5 types of mental disorders

Structure of the CACNA1C gene product, a calcium channel named Cav1.2, which is one of 4 genes that have now been found to be genetically held in common among schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Groundbreaking work on the role of this protein in anxiety and other forms of behavior related to mental illness has previously been established in the Rajadhyaksha laboratory at Weill Cornell Medical Center.
http://weill.cornell.edu/research/arajadhyaksha/

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3481072/
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3192195/
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3077109/

From the New York Times:
The psychiatric illnesses seem very different — schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Yet they share several genetic glitches that can nudge the brain along a path to mental illness, researchers report. Which disease, if any, develops is thought to depend on other genetic or environmental factors.

Their study, published online Wednesday in the Lancet, was based on an examination of genetic data from more than 60,000 people worldwide. Its authors say it is the largest genetic study yet of psychiatric disorders. The findings strengthen an emerging view of mental illness that aims to make diagnoses based on the genetic aberrations underlying diseases instead of on the disease symptoms.

Two of the aberrations discovered in the new study were in genes used in a major signaling system in the brain, giving clues to processes that might go awry and suggestions of how to treat the diseases.

“What we identified here is probably just the tip of an iceberg,” said Dr. Jordan Smoller, lead author of the paper and a professor of psychiatry at Harvard Medical School and Massachusetts General Hospital. “As these studies grow we expect to find additional genes that might overlap.”

The new study does not mean that the genetics of psychiatric disorders are simple. Researchers say there seem to be hundreds of genes involved and the gene variations discovered in the new study confer only a small risk of psychiatric disease.

Steven McCarroll, director of genetics for the Stanley Center for Psychiatric Research at the Broad Institute of Harvard and M.I.T., said it was significant that the researchers had found common genetic factors that pointed to a specific signaling system.

“It is very important that these were not just random hits on the dartboard of the genome,” said Dr. McCarroll, who was not involved in the new study.

The work began in 2007 when a large group of researchers began investigating genetic data generated by studies in 19 countries and including 33,332 people with psychiatric illnesses and 27,888 people free of the illnesses for comparison. The researchers studied scans of people’s DNA, looking for variations in any of several million places along the long stretch of genetic material containing three billion DNA letters. The question: Did people with psychiatric illnesses tend to have a distinctive DNA pattern in any of those locations?
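
To make the logic of that scan concrete, here is a minimal, hypothetical sketch in Python (toy allele counts, not the consortium's data or pipeline) of the basic case-control comparison applied at a single DNA position: count how often each variant appears in people with and without illness, and ask whether the difference is larger than chance.

```python
# Toy sketch of the basic test behind a genome-wide association scan:
# at one DNA position, compare how often the two variants (alleles) occur
# in people with a psychiatric diagnosis versus unaffected controls.
# The counts below are made up for illustration; the real study repeated a
# test of this kind at several million positions.
from scipy.stats import chi2_contingency

#            allele A  allele B
cases    = [   9100,     5200 ]   # allele counts among affected people
controls = [   7400,     5100 ]   # allele counts among unaffected people

chi2, p_value, dof, expected = chi2_contingency([cases, controls])
print(f"chi-square = {chi2:.1f}, p = {p_value:.2e}")

# A genome-wide scan only treats a position as a hit when p is far below the
# usual 0.05 cutoff (conventionally p < 5e-8), because millions of positions
# are tested at once.
```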

Researchers had already seen some clues of overlapping genetic effects in identical twins. One twin might have schizophrenia while the other had bipolar disorder. About six years ago, around the time the new study began, researchers had examined the genes of a few rare families in which psychiatric disorders seemed especially prevalent. They found a few unusual disruptions of chromosomes that were linked to psychiatric illnesses. But what surprised them was that while one person with the aberration might get one disorder, a relative with the same mutation got a different one.

Jonathan Sebat, chief of the Beyster Center for Molecular Genomics of Neuropsychiatric Diseases at the University of California, San Diego, and one of the discoverers of this effect, said that work on these rare genetic aberrations had opened his eyes. “Two different diagnoses can have the same genetic risk factor,” he said.

In fact, the new paper reports, distinguishing psychiatric diseases by their symptoms has long been difficult. Autism, for example, was once called childhood schizophrenia. It was not until the 1970s that autism was distinguished as a separate disorder.

But Dr. Sebat, who did not work on the new study, said that until now it was not clear whether the rare families he and others had studied were an exception or whether they were pointing to a rule about multiple disorders arising from a single genetic glitch.

“No one had systematically looked at the common variations” in DNA, he said. “We didn’t know if this was particularly true for rare mutations or if it would be true for all genetic risk.” The new study, he said, “shows all genetic risk is of this nature.”

The new study found four DNA regions that conferred a small risk of psychiatric disorders. For two of them, it is not clear what genes are involved or what they do, Dr. Smoller said. The other two, though, involve genes that are part of calcium channels, which are used when neurons send signals in the brain.

“The calcium channel findings suggest that perhaps — and this is a big if — treatments to affect calcium channel functioning might have effects across a range of disorders,” Dr. Smoller said.

There are drugs on the market that block calcium channels — they are used to treat high blood pressure — and researchers had already postulated that they might be useful for bipolar disorder even before the current findings.

One investigator, Dr. Roy Perlis of Massachusetts General Hospital, just completed a small study of a calcium channel blocker in 10 people with bipolar disorder and is about to expand it to a large randomized clinical trial. He also wants to study the drug in people with schizophrenia, in light of the new findings. He cautions, though, that people should not rush out to take a calcium channel blocker on their own.

“We need to be sure it is safe and we need to be sure it works,” Dr. Perlis said.