Parenting Rewires the Male Brain

By Elizabeth Norton

Cultures around the world have long assumed that women are hardwired to be mothers. But a new study suggests that caring for children awakens a parenting network in the brain—even turning on some of the same circuits in men as it does in women. The research implies that the neural underpinnings of the so-called maternal instinct aren’t unique to women, or activated solely by hormones, but can be developed by anyone who chooses to be a parent.

“This is the first study to look at the way dads’ brains change with child care experience,” says Kevin Pelphrey, a neuroscientist at Yale University who was not involved with the study. “What we thought of as a purely maternal circuit can also be turned on just by being a parent—which is neat, given the way our culture is changing with respect to shared responsibility and marriage equality.”

The findings come from an investigation of two types of households in Israel: traditional families consisting of a biological mother and father, in which the mother assumed most of the caregiving duties, though the fathers were very involved; and homosexual male couples in which one partner was the biological father and the child had been born with the help of a surrogate mother. The two-father couples had taken the babies home shortly after birth and shared caregiving responsibilities equally. All participants in the study were first-time parents.

Researchers led by Ruth Feldman, a psychologist and neuroscientist at Bar-Ilan University in Ramat Gan, Israel, visited with the families in their homes, videotaping each parent with the child and then the parents and children alone. The team, which included collaborators at the Tel Aviv Sourasky Medical Center in Israel, also took saliva samples from all parents before and after the videotaped sessions to measure oxytocin—a hormone that’s released at times of intimacy and affection and is widely considered the “trust hormone.” Within a week of the home visit, the participants underwent functional magnetic resonance imaging (fMRI) to determine how their brains reacted to the videotapes of themselves with their infants.

The mothers, their husbands, and the homosexual father-father couples all showed the activation of what the researchers term a “parenting network” that incorporated two linked but separate pathways in the brain. One circuit encompasses evolutionarily ancient structures such as the amygdala, insula, and nucleus accumbens, which handle strong emotions, attention, vigilance, and reward. The other pathway turns up in response to learning and experience and includes parts of the prefrontal cortex and an area called the superior temporal sulcus.

In the mothers, activation was stronger in the amygdala-centered network, whereas the heterosexual fathers showed more activity in the network that’s more experience-dependent. At first glance, Feldman says, the finding would seem to suggest that mothers are more wired up to nurture, protect, and possibly worry about their children. The fathers, in contrast, might have to develop these traits through tending, communicating, and learning from their babies what various sounds mean and what the child needs.

“It’s as if the father’s amygdala can shut off when there’s a woman around,” Feldman observes. It could be assumed, she says, that this circuitry is activated only by the rush of hormones during conception, pregnancy, and childbirth.

But the brains of the homosexual couples, in which each partner was a primary caregiver, told a different story. All of these men showed activity that mirrored that of the mothers, with much higher activation in the amygdala-based network, the team reports online today in the Proceedings of the National Academy of Sciences.

This finding argues strongly that the experience of hands-on parenting, with no mother anywhere in the picture, can configure a caregiver’s brain in the same way that pregnancy and childbirth do, Feldman says.

She adds that in the heterosexual fathers, the activation of the amygdala-based network was proportional to the amount of time they spent with the baby, though the activity wasn’t as high as in the mothers or in the two-father couples.

Feldman does not believe that the brain activity of the primary-caregiving fathers differed because they were gay. Previous imaging studies, she notes, show no difference in brain activation when homosexual and heterosexual participants viewed pictures of their loved ones.

Future studies, Pelphrey says, might focus more closely on this question. “But it’s clear that we’re all born with the circuitry to help us be sensitive caregivers, and the network can be turned up through parenting.”

http://news.sciencemag.org/brain-behavior/2014/05/parenting-rewires-male-brain

New research shows molecular mechanism by which neuronal projections can regenerate after injury

The mechanisms that drive axon regeneration after central nervous system (CNS) injury or disease are proposed to recapitulate, at least in part, the developmental axon growth pathways. This hypothesis is bolstered by a new study by O’Donovan et al. showing that activation of a B-RAF kinase signaling pathway is sufficient to promote robust axon growth not only during development but also after injury.

B-RAF was previously shown to be essential for developmental axon growth, but it was not known whether additional signaling pathways were required. In this study, the authors demonstrate that activation of B-RAF alone is sufficient to promote sensory axon growth during development. Using a conditional B-RAF gain-of-function mouse model, the authors elegantly prove that B-RAF has a cell-autonomous role in the developmental axon growth program. Notably, activated B-RAF promoted overgrowth of embryonic sensory axons projecting centrally in the spinal cord, suggesting that this pathway may normally be quiescent in central axons.

Could activated B-RAF also enhance axon regeneration in the adult central nervous system? The authors found that activated B-RAF not only enabled sensory axon growth into the spinal cord after spinal injury, but also promoted regrowth of axons projecting in the optic nerve. Regeneration in the injured CNS is prevented both by the poor intrinsic regrowth capacity of axons and by inhibitory factors in the tissue environment. Importantly, the B-RAF–activated signaling growth program was insensitive to this repulsive environment.

Interestingly, the authors find that B-RAF synergizes with the PI3-kinase–mTOR pathway, which also functions downstream of growth factors. This opens the possibility that combinatorial approaches that integrate these two pathways may heighten regenerative capacity.

This in vivo study significantly advances the understanding of the role of MAP kinases in axon growth and suggests that reactivation of the B-RAF pathway may be exploited to promote axon regeneration in the injured central nervous system. An exciting future avenue will be to determine the downstream mechanisms controlled by B-RAF.

O’Donovan, K.J., et al. 2014. J. Exp. Med. doi:10.1084/jem.20131780.

http://jem.rupress.org/content/211/5/746.1.long

Scientists have identified the age at which most childhood memories fade and are lost forever

Most adults struggle to recall events from their first few years of life, and now scientists have identified exactly when these childhood memories fade and are lost forever.

A new study into childhood amnesia – the phenomenon whereby early memories are forgotten – has found that it tends to take effect around the age of seven.

The researchers found that most three-year-olds can recall a great deal of what happened to them more than a year earlier, and these memories can persist at ages five and six; by the time children are over seven, however, the memories decline rapidly.

By the age of eight or nine, most children can recall only 35 per cent of their experiences from before the age of three, according to the new findings.

The psychologists behind the research say this is because at around this age the way we form memories begins to change.

They say that before the age of seven children tend to have an immature form of recall where they do not have a sense of time or place in their memories.

In older children, however, the early events they can recall tend to be more adult-like in their content and the way they are formed.

Children also have a far faster rate of forgetting than adults and so the turnover of memories tends to be higher, meaning early memories are less likely to survive.

The findings also help to explain why children can often have vivid memories of events but then have forgotten them just a couple of years later.

Professor Patricia Bauer, a psychologist and associate dean for research at Emory College of Arts and Sciences, who led the study, said: “The paradox of children’s memory competence and adults’ seeming ‘incompetence’ at remembering early childhood events is striking.

“Though forgetting is more rapid in the early childhood years, eventually it slows to adult levels.

“Thus memories that ‘survived’ early childhood have some likelihood of being remembered later in life.”

Professor Bauer and her colleagues studied 83 children over several years for the research, which is published in the scientific journal Memory.

The youngsters first visited the laboratory at the age of three and discussed six unique events from their past, such as family outings, camping holidays, trips to the zoo, first days of school and birthdays.

The children then returned for a second session at ages between five and nine to discuss the same events and were asked to recall the details they had previously remembered.

The researchers found that between the ages of five and seven, the proportion of the memories the children could recall remained between 63 and 72 per cent.

However, the amount of information recalled by the children who were eight and nine years old dropped dramatically, to 35 and 36 per cent respectively.
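
For readers who want to see how such retention figures are derived, here is a minimal sketch with made-up counts (the published study scored recalled details in a more structured way):

```python
# Hypothetical sketch of the retention arithmetic (illustrative numbers
# only; the actual study used structured scoring of recalled details).

def retention_per_cent(details_reported_at_3: int, details_recalled_later: int) -> float:
    """Share of details reported at age three that were recalled later."""
    return 100.0 * details_recalled_later / details_reported_at_3

print(retention_per_cent(40, 27))  # ~67 -- in the range seen at ages five to seven
print(retention_per_cent(40, 14))  # 35 -- the level seen at ages eight and nine
```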

When the researchers looked closely at the kind of details the children were and were not able to remember, they found marked age differences.

The memories of the younger children tended to lack autobiographical detail, such as place and time. Their memories also had less narrative structure, which the researchers believe may leave them prone to a process known as “retrieval-induced forgetting” – whereby the act of remembering causes other information to be forgotten.

As the children got older, however, the memories they recalled from early childhood tended to have these features.

Professor Bauer said: “The fact that the younger children had less-complete narratives relative to the older children likely has consequences for the continued accessibility of early memories beyond the first decade of life.

“We may anticipate that memories that survive into the ninth or tenth year of life, when narrative skills are more developed, would continue to be accessible over time.”

http://www.telegraph.co.uk/science/science-news/10564312/Scientists-pinpoint-age-when-childhood-memories-fade.html

‘Jumping Genes’ Linked to Schizophrenia

Roaming bits of DNA that can relocate and proliferate throughout the genome, called “jumping genes,” may contribute to schizophrenia, a new study suggests. These rogue genetic elements pepper the brain tissue of deceased people with the disorder and multiply in response to stressful events, such as infection during pregnancy, which increase the risk of the disease. The study could help explain how genes and environment work together to produce the complex disorder and may even point to ways of lowering the risk of the disease, researchers say.

Schizophrenia causes hallucinations, delusions, and a host of other cognitive problems, and afflicts roughly 1% of all people. It runs in families—a person whose twin sibling has the disorder, for example, has a roughly 50-50 chance of developing it. Scientists have struggled to define which genes are most important to developing the disease, however; each individual gene associated with the disorder confers only modest risk. Environmental factors such as viral infections before birth have also been shown to increase risk of developing schizophrenia, but how and whether these exposures work together with genes to skew brain development and produce the disease is still unclear, says Tadafumi Kato, a neuroscientist at the RIKEN Brain Science Institute in Wako City, Japan, and a co-author of the new study.

Over the past several years, a new mechanism for genetic mutation has attracted considerable interest from researchers studying neurological disorders, Kato says. Informally called jumping genes, these bits of DNA can replicate and insert themselves into other regions of the genome, where they either lie silent, doing nothing; start churning out their own genetic products; or alter the activity of their neighboring genes. If that sounds potentially dangerous, it is: Such genes are often the culprits behind tumor-causing mutations and have been implicated in several neurological diseases. However, jumping genes also make up nearly half the current human genome, suggesting that humans owe much of our identity to their audacious leaps.

Recent research by neuroscientist Fred Gage and colleagues at the University of California (UC), San Diego, has shown that one of the most common types of jumping gene in people, called L1, is particularly abundant in human stem cells in the brain that ultimately differentiate into neurons and plays an important role in regulating neuronal development and proliferation. Although Gage and colleagues have found that increased L1 is associated with mental disorders such as Rett syndrome, a form of autism, and a neurological motor disease called Louis-Bar syndrome, “no one had looked very carefully” to see if the gene might also contribute to schizophrenia, he says.

To investigate that question, principal investigator Kazuya Iwamoto, a neuroscientist; Kato; and their team at RIKEN took brain tissue from deceased people who had been diagnosed with schizophrenia as well as several other mental disorders, extracted DNA from their neurons, and compared it with that of healthy people. Compared with controls, there was a 1.1-fold increase in L1 in the tissue of people with schizophrenia, as well as slightly less elevated levels in people with other mental disorders such as major depression, the team reports today in Neuron.

Next, the scientists tested whether environmental factors associated with schizophrenia could trigger a comparable increase in L1. They injected pregnant mice with a chemical that simulates viral infection and found that their offspring did, indeed, show higher levels of the gene in their brain tissue. An additional study in infant macaques, which mimicked exposure to a hormone also associated with increased schizophrenia risk, produced similar results. Finally, the group examined human neural stem cells extracted from people with schizophrenia and found that these, too, showed higher levels of L1.

The fact that it is possible to increase the number of copies of L1 in the mouse and macaque brains using established environmental triggers for schizophrenia shows that such genetic mutations in the brain may be preventable if such exposures can be avoided, Kato says. He says he hopes that the “new view” that environmental factors can trigger or deter genetic changes involved in the disease will help remove some of the disorder’s stigma.

Combined with previous studies on other disorders, the new study suggests that L1 genes are indeed more active in the brain of patients with neuropsychiatric diseases, Gage says. He cautions, however, that no one yet knows whether they are actually causing the disease. “Now that we have multiple confirmations of this occurring in humans with different diseases, the next step is to determine if possible what role, if any, they play.”

One tantalizing possibility is that as these restless bits of DNA drift throughout the genomes of human brain cells, they help create the vibrant cognitive diversity that helps humans as a species respond to changing environmental conditions, and produces extraordinary “outliers,” including innovators and geniuses such as Picasso, says UC San Diego neuroscientist Alysson Muotri. The price of such rich diversity may be that mutations contributing to mental disorders such as schizophrenia sometimes emerge. Figuring out what these jumping genes truly do in the human brain is the “next frontier” for understanding complex mental disorders, he says. “This is only the tip of the iceberg.”

Thanks to Dr. Rajadhyaksha for bringing this to the attention of the It’s Interesting community.

http://news.sciencemag.org/biology/2014/01/jumping-genes-linked-schizophrenia

Electric brain stimulation in a specific area discovered to induce a sense of determination

Doctors in the US have induced feelings of intense determination in two men by stimulating a part of their brains with gentle electric currents.

The men were having a routine procedure to locate regions in their brains that caused epileptic seizures when they felt their heart rates rise, a sense of foreboding, and an overwhelming desire to persevere against a looming hardship.

The remarkable findings could help researchers develop treatments for depression and other disorders where people are debilitated by a lack of motivation.

One patient said the feeling was like driving a car into a raging storm. When his brain was stimulated, he sensed a shaking in his chest and a surge in his pulse. In six trials, he felt the same sensations time and again.

Comparing the feelings to a frantic drive towards a storm, the patient said: “You’re only halfway there and you have no other way to turn around and go back, you have to keep going forward.”

When asked by doctors to elaborate on whether the feeling was good or bad, he said: “It was more of a positive thing, like push harder, push harder, push harder to try and get through this.”

A second patient had similar feelings when his brain was stimulated in the same region, called the anterior midcingulate cortex (aMCC). He felt worried that something terrible was about to happen, but knew he had to fight and not give up, according to a case study in the journal Neuron.

Both men were having an exploratory procedure to find the focal point in their brains that caused them to suffer epileptic fits. In the procedure, doctors sink fine electrodes deep into different parts of the brain and stimulate them with tiny electrical currents until the patient senses the “aura” that precedes a seizure. Often, seizures can be treated by removing tissue from this part of the brain.

“In the very first patient this was something very unexpected, and we didn’t report it,” said Josef Parvizi at Stanford University in California. “But then I was doing functional mapping on the second patient and he suddenly experienced a very similar thing.”

“It’s extraordinary that two individuals with very different past experiences respond in a similar way to one or two seconds of very low intensity electricity delivered to the same area of their brain. These patients are normal individuals, they have their IQ, they have their jobs. We are not reporting these findings in sick brains,” Parvizi said.

The men were stimulated with between two and eight milliamps of electrical current, but in tests the doctors administered sham stimulation too. In the sham tests, they told the patients they were about to stimulate the brain, but had switched off the electrical supply. In these cases, the men reported no changes to their feelings. The sensation was only induced in a small area of the brain, and vanished when doctors implanted electrodes just five millimetres away.

Parvizi said a crucial follow-up experiment will be to test whether stimulation of the brain region really makes people more determined, or simply creates the sensation of perseverance. If future studies replicate the findings, stimulation of the brain region – perhaps without the need for brain-penetrating electrodes – could be used to help people with severe depression.

The anterior midcingulate cortex seems to be important in helping us select responses and make decisions in light of the feedback we get. Brent Vogt, a neurobiologist at Boston University, said patients with chronic pain and obsessive-compulsive disorder have already been treated by destroying part of the aMCC. “Why not stimulate it? If this would enhance relieving depression, for example, let’s go,” he said.

http://www.theguardian.com/science/2013/dec/05/determination-electrical-brain-stimulation

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Trouble With Math? Maybe You Should Get Your Brain Zapped

by Emily Underwood
ScienceNOW

If you are one of the 20% of healthy adults who struggle with basic arithmetic, simple tasks like splitting the dinner bill can be excruciating. Now, a new study suggests that a gentle, painless electrical current applied to the brain can boost math performance for up to 6 months. Researchers don’t fully understand how it works, however, and there could be side effects.

The idea of using electrical current to alter brain activity is nothing new—electroshock therapy, which induces seizures for therapeutic effect, is probably the best known and most dramatic example. In recent years, however, a slew of studies has shown that much milder electrical stimulation applied to targeted regions of the brain can dramatically accelerate learning in a wide range of tasks, from marksmanship to speech rehabilitation after stroke.

In 2010, cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford in the United Kingdom showed that, when combined with training, electrical brain stimulation can make people better at very basic numerical tasks, such as judging which of two quantities is larger. However, it wasn’t clear how those basic numerical skills would translate to real-world math ability.

To answer that question, Cohen Kadosh recruited 25 volunteers to practice math while receiving either real or “sham” brain stimulation. Two sponge-covered electrodes, fixed to either side of the forehead with a stretchy athletic band, targeted an area of the prefrontal cortex considered key to arithmetic processing, says Jacqueline Thompson, a Ph.D. student in Cohen Kadosh’s lab and a co-author on the study. The electrical current slowly ramped up to about 1 milliamp—a tiny fraction of the current an AA battery can deliver—then randomly fluctuated between high and low values. For the sham group, the researchers simulated the initial sensation of the increase by releasing a small amount of current, then turned it off.
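
A rough sketch of what such a stimulation schedule might look like in code (every parameter other than the ~1 milliamp level is an assumption, not a value from the study):

```python
import numpy as np

def stimulation_waveform(duration_s=1200.0, fs=100, target_ma=1.0,
                         ramp_s=30.0, noise_sd=0.2, sham=False, seed=0):
    """Current (in mA) sampled at fs Hz for one ~20-minute session."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration_s, 1.0 / fs)
    ramp = np.clip(t / ramp_s, 0.0, 1.0) * target_ma   # slow ramp toward ~1 mA
    if sham:
        # Sham: deliver only the initial ramp sensation, then switch off.
        return np.where(t < ramp_s, ramp, 0.0)
    # Active: after the ramp, fluctuate randomly between high and low values.
    noise = rng.normal(0.0, noise_sd, t.size)
    return ramp + np.where(t >= ramp_s, noise, 0.0)

active = stimulation_waveform()           # active 20-minute session
sham = stimulation_waveform(sham=True)    # sham: sensation without stimulation
print(active.mean().round(3), sham[-1])
```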

For roughly 20 minutes per day over 5 days, the participants memorized arbitrary mathematical “facts,” such as 4#10 = 23, then performed a more sophisticated task requiring multiple steps of arithmetic, also based on memorized symbols. A squiggle, for example, might mean “add 2,” or “subtract 1.” This is the first time that brain stimulation has been applied to improving such complex math skills, says neuroethicist Peter Reiner of the University of British Columbia, Vancouver, in Canada, who wasn’t involved in the research.

The researchers also used a brain imaging technique called near-infrared spectroscopy to measure how efficiently the participants’ brains were working as they performed the tasks.

Although the two groups performed at the same level on the first day, over the next 4 days people receiving brain stimulation along with training learned to do the tasks two to five times faster than people receiving a sham treatment, the authors reported in Current Biology. Six months later, the researchers called the participants back and found that people who had received brain stimulation were still roughly 30% faster at the same types of mathematical challenges. The targeted brain region also showed more efficient activity, Thompson says.

The fact that only participants who received electrical stimulation and practiced math showed lasting physiological changes in their brains suggests that experience is required to seal in the effects of stimulation, says Michael Weisend, a neuroscientist at the Mind Research Network in Albuquerque, New Mexico, who wasn’t involved with the study. That’s valuable information for people who hope to get benefits from stimulation alone, he says. “It’s not going to be a magic bullet.”

Although it’s not clear how the technique works, Thompson says, one hypothesis is that the current helps synchronize neuron firing, enabling the brain to work more efficiently. Scientists also don’t know if negative or unintended effects might result. Although no side effects of brain stimulation have yet been reported, “it’s impossible to say with any certainty” that there aren’t any, Thompson says.

“Math is only one of dozens of skills in which this could be used,” Reiner says, adding that it’s “not unreasonable” to imagine that this and similar stimulation techniques could replace the use of pills for cognitive enhancement.

In the future, the researchers hope to include groups that often struggle with math, such as people with neurodegenerative disorders and a condition called developmental dyscalculia. As long as further testing shows that the technique is safe and effective, children in schools could also receive brain stimulation along with their lessons, Thompson says. But there’s “a long way to go,” before the method is ready for schools, she says. In the meantime, she adds, “We strongly caution you not to try this at home, no matter how tempted you may be to slap a battery on your kid’s head.”

http://news.sciencemag.org/sciencenow/2013/05/trouble-with-math-maybe-you-shou.html?ref=hp

Brain implants: Restoring memory with a microchip

William Gibson’s popular science fiction tale “Johnny Mnemonic” foresaw sensitive information being carried by microchips in the brain by 2021. A team of American neuroscientists could be making this fantasy world a reality. Their motivation is different but the outcome would be somewhat similar. Hailed as one of 2013’s top ten technological breakthroughs by MIT Technology Review, the work by the University of Southern California, North Carolina’s Wake Forest University and other partners has actually spanned a decade.

But the U.S.-wide team now thinks that it will see a memory device being implanted in a small number of human volunteers within two years and available to patients in five to 10 years. They can’t quite contain their excitement. “I never thought I’d see this in my lifetime,” said Ted Berger, professor of biomedical engineering at the University of Southern California in Los Angeles. “I might not benefit from it myself but my kids will.”

Rob Hampson, associate professor of physiology and pharmacology at Wake Forest University, agrees. “We keep pushing forward, every time I put an estimate on it, it gets shorter and shorter.”

The scientists — who bring varied skills to the table, including mathematical modeling and psychiatry — believe they have cracked how long-term memories are made, stored and retrieved and how to replicate this process in brains that are damaged, particularly by stroke or localized injury.

Berger said they record a memory being made, in an undamaged area of the brain, then use that data to predict what a damaged area “downstream” should be doing. Electrodes are then used to stimulate the damaged area to replicate the action of the undamaged cells.
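
As a toy illustration of that record-predict-stimulate loop (a sketch only — the team's published models are nonlinear and far more sophisticated than the least-squares mapping used here, and all numbers below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Record a memory being made" upstream of the damaged region:
# firing rates on 20 recorded channels across 500 training events.
upstream = rng.poisson(5.0, size=(500, 20)).astype(float)

# What the intact downstream region produced during those same events
# (simulated here with a hidden linear mapping plus noise).
hidden_map = rng.normal(size=(20, 10))
downstream = upstream @ hidden_map + rng.normal(scale=0.5, size=(500, 10))

# Learn the input-output mapping from the healthy recordings; least
# squares stands in for the team's far richer mathematical models.
learned_map, *_ = np.linalg.lstsq(upstream, downstream, rcond=None)

# After injury: given fresh upstream activity, predict what the damaged
# area *should* be doing and hand that pattern to the stimulator.
new_input = rng.poisson(5.0, size=(1, 20)).astype(float)
stim_pattern = new_input @ learned_map
print(stim_pattern.round(2))
```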

They concentrate on the hippocampus — part of the cerebral cortex which sits deep in the brain — where short-term memories become long-term ones. Berger has looked at how electrical signals travel through neurons there to form those long-term memories and has used his expertise in mathematical modeling to mimic these movements using electronics.

Hampson, whose university has done much of the animal studies, adds: “We support and reinforce the signal in the hippocampus but we are moving forward with the idea that if you can study enough of the inputs and outputs to replace the function of the hippocampus, you can bypass the hippocampus.”

The team’s experiments on rats and monkeys have shown that certain brain functions can be replaced with signals via electrodes. You would think that the work of then creating an implant for people and getting such a thing approved would be a Herculean task, but think again.

For 15 years, people have been having brain implants to provide deep brain stimulation to treat epilepsy and Parkinson’s disease — a reported 80,000 people have now had such devices placed in their brains. So many of the hurdles have already been overcome — particularly the “yuck factor” and the fear factor.

“It’s now commonly accepted that humans will have electrodes put in them — it’s done for epilepsy, deep brain stimulation, (that has made it) easier for investigative research, it’s much more acceptable now than five to 10 years ago,” Hampson says.

Much of the work that remains now is in shrinking down the electronics.

“Right now it’s not a device, it’s a fair amount of equipment,” Hampson says. “We’re probably looking at devices in the five to 10 year range for human patients.”

The ultimate goal in memory research would be to treat Alzheimer’s disease, but unlike stroke or localized brain injury, Alzheimer’s tends to affect many parts of the brain, especially in its later stages, making these implants a less likely option any time soon.

Berger foresees a future, however, where drugs and implants could be used together to treat early dementia. Drugs could be used to enhance the action of cells that surround the most damaged areas, and the team’s memory implant could be used to replace a lot of the lost cells in the center of the damaged area. “I think the best strategy is going to involve both drugs and devices,” he says.

Unfortunately, the team found that its method can’t help patients with advanced dementia.

“When looking at a patient with mild memory loss, there’s probably enough residual signal to work with, but not when there’s significant memory loss,” Hampson said.

Constantine Lyketsos, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine in Baltimore, which is trialing a deep brain stimulator implant for Alzheimer’s patients, was a little skeptical of the other team’s claims.

“The brain has a lot of redundancy; it can function pretty well if it loses one or two parts. But memory involves circuits diffusely dispersed throughout the brain, so it’s hard to envision.” However, he added that it was more likely to be successful in helping victims of stroke or localized brain injury, as indeed its makers are aiming to do.

The UK’s Alzheimer’s Society is cautiously optimistic.

“Finding ways to combat symptoms caused by changes in the brain is an ongoing battle for researchers. An implant like this one is an interesting avenue to explore,” said Doug Brown, director of research and development.

Hampson says the team’s breakthrough is “like the difference between a cane, to help you walk, and a prosthetic limb — it’s two different approaches.”

It will still take time for many people to accept their findings and their claims, he says, but they don’t expect to have a shortage of volunteers stepping forward to try their implant — the project is partly funded by the U.S. military which is looking for help with battlefield injuries.

There are U.S. soldiers coming back from operations with brain trauma, and a neurologist at DARPA (the Defense Advanced Research Projects Agency) is asking “What can you do for my boys?” Hampson says.

“That’s what it’s all about.”

http://www.cnn.com/2013/05/07/tech/brain-memory-implants-humans/index.html?iref=allsearch

Researchers explore connecting the brain to machines

Behind a locked door in a white-walled basement in a research building in Tempe, Ariz., a monkey sits stone-still in a chair, eyes locked on a computer screen. From his head protrudes a bundle of wires; from his mouth, a plastic tube. As he stares, a green cursor on the black screen floats toward the corner of a cube. The monkey is moving it with his mind.

The monkey, a rhesus macaque named Oscar, has electrodes implanted in his motor cortex, detecting electrical impulses that indicate mental activity and translating them into the movement of the cursor on the screen. The computer isn’t reading his mind, exactly — Oscar’s own brain is doing a lot of the lifting, adapting itself by trial and error to the delicate task of accurately communicating its intentions to the machine. (When Oscar succeeds in controlling the cursor as instructed, the tube in his mouth rewards him with a sip of his favorite beverage, Crystal Light.) It’s not technically telekinesis, either, since that would imply that there’s something paranormal about the process. It’s called a “brain-computer interface” (BCI). And it just might represent the future of the relationship between human and machine.

Stephen Helms Tillery’s laboratory at Arizona State University is one of a growing number where researchers are racing to explore the breathtaking potential of BCIs and a related technology, neuroprosthetics. The promise is irresistible: from restoring sight to the blind, to helping the paralyzed walk again, to allowing people suffering from locked-in syndrome to communicate with the outside world. In the past few years, the pace of progress has been accelerating, delivering dazzling headlines seemingly by the week.

At Duke University in 2008, a monkey named Idoya walked on a treadmill, causing a robot in Japan to do the same. Then Miguel Nicolelis stopped the monkey’s treadmill — and the robotic legs kept walking, controlled by Idoya’s brain. At Andrew Schwartz’s lab at the University of Pittsburgh in December 2012, a quadriplegic woman named Jan Scheuermann learned to feed herself chocolate by mentally manipulating a robotic arm. Just last month, Nicolelis’ lab set up what it billed as the first brain-to-brain interface, allowing a rat in North Carolina to make a decision based on sensory data beamed via Internet from the brain of a rat in Brazil.

So far the focus has been on medical applications — restoring standard-issue human functions to people with disabilities. But it’s not hard to imagine the same technologies someday augmenting capacities. If you can make robotic legs walk with your mind, there’s no reason you can’t also make them run faster than any sprinter. If you can control a robotic arm, you can control a robotic crane. If you can play a computer game with your mind, you can, theoretically at least, fly a drone with your mind.

It’s tempting and a bit frightening to imagine that all of this is right around the corner, given how far the field has already come in a short time. Indeed, Nicolelis — the media-savvy scientist behind the “rat telepathy” experiment — is aiming to build a robotic bodysuit that would allow a paralyzed teen to take the first kick of the 2014 World Cup. Yet the same factor that has made the explosion of progress in neuroprosthetics possible could also make future advances harder to come by: the almost unfathomable complexity of the human brain.

From I, Robot to Skynet, we’ve tended to assume that the machines of the future would be guided by artificial intelligence — that our robots would have minds of their own. Over the decades, researchers have made enormous leaps in artificial intelligence (AI), and we may be entering an age of “smart objects” that can learn, adapt to, and even shape our habits and preferences. We have planes that fly themselves, and we’ll soon have cars that do the same. Google has some of the world’s top AI minds working on making our smartphones even smarter, to the point that they can anticipate our needs. But “smart” is not the same as “sentient.” We can train devices to learn specific behaviors, and even out-think humans in certain constrained settings, like a game of Jeopardy. But we’re still nowhere close to building a machine that can pass the Turing test, the benchmark for human-like intelligence. Some experts doubt we ever will.

Philosophy aside, for the time being the smartest machines of all are those that humans can control. The challenge lies in how best to control them. From vacuum tubes to the DOS command line to the Mac to the iPhone, the history of computing has been a progression from lower to higher levels of abstraction. In other words, we’ve been moving from machines that require us to understand and directly manipulate their inner workings to machines that understand how we work and respond readily to our commands. The next step after smartphones may be voice-controlled smart glasses, which can intuit our intentions all the more readily because they see what we see and hear what we hear.

The logical endpoint of this progression would be computers that read our minds, computers we can control without any physical action on our part at all. That sounds impossible. After all, if the human brain is so hard to compute, how can a computer understand what’s going on inside it?

It can’t. But as it turns out, it doesn’t have to — not fully, anyway. What makes brain-computer interfaces possible is an amazing property of the brain called neuroplasticity: the ability of neurons to form new connections in response to fresh stimuli. Our brains are constantly rewiring themselves to allow us to adapt to our environment. So when researchers implant electrodes in a part of the brain that they expect to be active in moving, say, the right arm, it’s not essential that they know in advance exactly which neurons will fire at what rate. When the subject attempts to move the robotic arm and sees that it isn’t quite working as expected, the person — or rat or monkey — will try different configurations of brain activity. Eventually, with time and feedback and training, the brain will hit on a solution that makes use of the electrodes to move the arm.
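
Here is a minimal sketch of the decoder half of that loop, under the simplifying assumption of a linear mapping from firing rates to cursor velocity (real systems typically use population-vector or Kalman-filter decoders, and the data below are simulated):

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_trials = 96, 400          # e.g. a 96-electrode array

# Calibration data: firing rates per trial, plus the 2-D cursor velocity
# the subject was attempting on each trial (simulated via hidden tuning).
rates = rng.poisson(10.0, size=(n_trials, n_channels)).astype(float)
hidden_tuning = rng.normal(size=(n_channels, 2))
velocity = rates @ hidden_tuning + rng.normal(scale=1.0, size=(n_trials, 2))

# Fit decoder weights by least squares on the calibration trials.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Map one vector of channel firing rates to a (vx, vy) cursor velocity."""
    return firing_rates @ weights

# In closed-loop use, the subject watches the cursor and, via
# neuroplasticity, gradually adjusts their own neural activity until the
# decoded movements match their intent.
print(decode(rates[0]).round(2))
```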

That’s the principle behind such rapid progress in brain-computer interface and neuroprosthetics. Researchers began looking into the possibility of reading signals directly from the brain in the 1970s, and testing on rats began in the early 1990s. The first big breakthrough for humans came in Georgia in 1997, when a scientist named Philip Kennedy used brain implants to allow a “locked in” stroke victim named Johnny Ray to spell out words by moving a cursor with his thoughts. (It took him six exhausting months of training to master the process.) In 2008, when Nicolelis got his monkey at Duke to make robotic legs run a treadmill in Japan, it might have seemed like mind-controlled exoskeletons for humans were just another step or two away. If he succeeds in his plan to have a paralyzed youngster kick a soccer ball at next year’s World Cup, some will pronounce the cyborg revolution in full swing.

Schwartz, the Pittsburgh researcher who helped Jan Scheuermann feed herself chocolate in December, is optimistic that neuroprosthetics will eventually allow paralyzed people to regain some mobility. But he says that full control over an exoskeleton would require a more sophisticated way to extract nuanced information from the brain. Getting a pair of robotic legs to walk is one thing. Getting robotic limbs to do everything human limbs can do may be exponentially more complicated. “The challenge of maintaining balance and staying upright on two feet is a difficult problem, but it can be handled by robotics without a brain. But if you need to move gracefully and with skill, turn and step over obstacles, decide if it’s slippery outside — that does require a brain. If you see someone go up and kick a soccer ball, the essential thing to ask is, ‘OK, what would happen if I moved the soccer ball two inches to the right?'” The idea that simple electrodes could detect things as complex as memory or cognition, which involve the firing of billions of neurons in patterns that scientists can’t yet comprehend, is far-fetched, Schwartz adds.

That’s not the only reason that companies like Apple and Google aren’t yet working on devices that read our minds (as far as we know). Another one is that the devices aren’t portable. And then there’s the little fact that they require brain surgery.

A different class of brain-scanning technology is being touted on the consumer market and in the media as a way for computers to read people’s minds without drilling into their skulls. It’s called electroencephalography, or EEG, and it involves headsets that press electrodes against the scalp. In an impressive 2010 TED Talk, Tan Le of the consumer EEG-headset company Emotiv Lifescience showed how someone can use her company’s EPOC headset to move objects on a computer screen.

Skeptics point out that these devices can detect only the crudest electrical signals from the brain itself, which is well-insulated by the skull and scalp. In many cases, consumer devices that claim to read people’s thoughts are in fact relying largely on physical signals like skin conductivity and tension of the scalp or eyebrow muscles.

Robert Oschler, a robotics enthusiast who develops apps for EEG headsets, believes the more sophisticated consumer headsets like the Emotiv EPOC may be the real deal in terms of filtering out the noise to detect brain waves. Still, he says, there are limits to what even the most advanced, medical-grade EEG devices can divine about our cognition. He’s fond of an analogy that he attributes to Gerwin Schalk, a pioneer in the field of invasive brain implants. The best EEG devices, he says, are “like going to a stadium with a bunch of microphones: You can’t hear what any individual is saying, but maybe you can tell if they’re doing the wave.” With some of the more basic consumer headsets, at this point, “it’s like being in a party in the parking lot outside the same game.”
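
The stadium analogy can be made concrete with a small simulation (illustrative assumptions throughout): an electrode that averages thousands of sources cannot recover any one of them, but a rhythm they share survives the averaging.

```python
import numpy as np

rng = np.random.default_rng(2)
n_sources, n_samples, fs = 2000, 2000, 200.0
t = np.arange(n_samples) / fs

# Desynchronized chatter: every source is independent noise.
independent = rng.normal(size=(n_sources, n_samples))
# Synchronized "wave": every source carries a weak shared 10 Hz rhythm.
wave = np.sin(2 * np.pi * 10.0 * t)
synchronized = 0.1 * wave + rng.normal(size=(n_sources, n_samples))

for name, sources in (("desynchronized", independent), ("synchronized", synchronized)):
    scalp = sources.mean(axis=0)                    # the electrode sums everything
    power = np.abs(np.fft.rfft(scalp)) ** 2
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    print(f"{name}: spectral peak at {freqs[power.argmax()]:.1f} Hz")
```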

It’s fairly safe to say that EEG headsets won’t be turning us into cyborgs anytime soon. But it would be a mistake to assume that we can predict today how brain-computer interface technology will evolve. Just last month, a team at Brown University unveiled a prototype of a low-power, wireless neural implant that can transmit signals to a computer over broadband. That could be a major step forward in someday making BCIs practical for everyday use. Meanwhile, researchers at Cornell last week revealed that they were able to use fMRI, a measure of brain activity, to detect which of four people a research subject was thinking about at a given time. Machines today can read our minds in only the most rudimentary ways. But such advances hint that they may be able to detect and respond to more abstract types of mental activity in the always-changing future.

http://www.ydr.com/living/ci_22800493/researchers-explore-connecting-brain-machines

Flip of a single molecular switch makes an old brain young

The flip of a single molecular switch helps create the mature neuronal connections that allow the brain to bridge the gap between adolescent impressionability and adult stability. Now Yale School of Medicine researchers have reversed the process, recreating a youthful brain that facilitated both learning and healing in the adult mouse.

Scientists have long known that the young and old brains are very different. Adolescent brains are more malleable or plastic, which allows them to learn languages more quickly than adults and speeds recovery from brain injuries. The comparative rigidity of the adult brain results in part from the function of a single gene that slows the rapid change in synaptic connections between neurons.

By monitoring the synapses in living mice over weeks and months, Yale researchers have identified the key genetic switch for brain maturation, in a study released March 6 in the journal Neuron. The Nogo Receptor 1 gene is required to suppress high levels of plasticity in the adolescent brain and to create the relatively quiescent levels of plasticity in adulthood. In mice without this gene, juvenile levels of brain plasticity persist throughout adulthood. When researchers blocked the function of this gene in old mice, they reset the old brain to adolescent levels of plasticity.

“These are the molecules the brain needs for the transition from adolescence to adulthood,” said Dr. Stephen Strittmatter, Vincent Coates Professor of Neurology, Professor of Neurobiology, and senior author of the paper. “It suggests we can turn back the clock in the adult brain and recover from trauma the way kids recover.”

Rehabilitation after brain injuries like strokes requires that patients re-learn tasks such as moving a hand. Researchers found that adult mice lacking Nogo Receptor recovered from injury as quickly as adolescent mice and mastered new, complex motor tasks more quickly than adults with the receptor.

“This raises the potential that manipulating Nogo Receptor in humans might accelerate and magnify rehabilitation after brain injuries like strokes,” said Feras Akbik, a Yale doctoral student who is first author of the study.

Researchers also showed that Nogo Receptor slows loss of memories. Mice without Nogo receptor lost stressful memories more quickly, suggesting that manipulating the receptor could help treat post-traumatic stress disorder.

“We know a lot about the early development of the brain,” Strittmatter said, “But we know amazingly little about what happens in the brain during late adolescence.”

Other Yale authors are Sarah M. Bhagat, Pujan R. Patel and William B.J. Cafferty.

The study was funded by the National Institutes of Health. Strittmatter is scientific founder of Axerion Therapeutics, which is investigating applications of Nogo research to repair spinal cord damage.

http://news.yale.edu/2013/03/06/flip-single-molecular-switch-makes-old-brain-young