Trouble With Math? Maybe You Should Get Your Brain Zapped


by Emily Underwood
ScienceNOW

If you are one of the 20% of healthy adults who struggle with basic arithmetic, simple tasks like splitting the dinner bill can be excruciating. Now, a new study suggests that a gentle, painless electrical current applied to the brain can boost math performance for up to 6 months. Researchers don’t fully understand how it works, however, and there could be side effects.

The idea of using electrical current to alter brain activity is nothing new—electroshock therapy, which induces seizures for therapeutic effect, is probably the best known and most dramatic example. In recent years, however, a slew of studies has shown that much milder electrical stimulation applied to targeted regions of the brain can dramatically accelerate learning in a wide range of tasks, from marksmanship to speech rehabilitation after stroke.

In 2010, cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford in the United Kingdom showed that, when combined with training, electrical brain stimulation can make people better at very basic numerical tasks, such as judging which of two quantities is larger. However, it wasn’t clear how those basic numerical skills would translate to real-world math ability.

To answer that question, Cohen Kadosh recruited 25 volunteers to practice math while receiving either real or “sham” brain stimulation. Two sponge-covered electrodes, fixed to either side of the forehead with a stretchy athletic band, targeted an area of the prefrontal cortex considered key to arithmetic processing, says Jacqueline Thompson, a Ph.D. student in Cohen Kadosh’s lab and a co-author on the study. The electrical current slowly ramped up to about 1 milliamp—far less current than an AA battery delivers—then randomly fluctuated between high and low values. For the sham group, the researchers simulated the initial sensation of the increase by releasing a small amount of current, then turned it off.
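
As a rough illustration of the protocol described (not the study’s actual stimulator software; the parameter values and names here are invented), a current profile of that shape might be generated like this:

```python
import random

def stimulation_profile(ramp_s=30, stim_s=1200, peak_ma=1.0, rate_hz=10):
    """Toy current profile: a slow ramp up to ~1 mA, then random
    fluctuation between high and low values. Purely illustrative;
    real stimulators enforce hardware safety limits."""
    samples = []
    n_ramp = ramp_s * rate_hz
    for i in range(n_ramp):
        samples.append((i / rate_hz, peak_ma * i / n_ramp))   # linear ramp-up
    for i in range(stim_s * rate_hz):
        t = ramp_s + i / rate_hz
        samples.append((t, random.uniform(0.1, peak_ma)))     # random fluctuation
    return samples

profile = stimulation_profile()
print(len(profile), profile[0], profile[-1])
```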

For roughly 20 minutes per day over 5 days, the participants memorized arbitrary mathematical “facts,” such as 4#10 = 23, then performed a more sophisticated task requiring multiple steps of arithmetic, also based on memorized symbols. A squiggle, for example, might mean “add 2” or “subtract 1.” This is the first time that brain stimulation has been applied to improving such complex math skills, says neuroethicist Peter Reiner of the University of British Columbia, Vancouver, in Canada, who wasn’t involved in the research.
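
To make the task concrete, here is a minimal sketch of symbol-based, multi-step arithmetic in the spirit of the study’s stimuli (the symbols and their meanings are invented for illustration):

```python
# Hypothetical squiggle-to-operation mapping: each symbol stands for
# a memorized arithmetic step, as in the training task described above.
OPS = {
    "~": lambda x: x + 2,   # "~" means "add 2"
    "^": lambda x: x - 1,   # "^" means "subtract 1"
    "*": lambda x: x * 3,   # "*" means "multiply by 3"
}

def evaluate(start, symbols):
    """Apply each memorized symbol-operation in sequence."""
    value = start
    for s in symbols:
        value = OPS[s](value)
    return value

print(evaluate(4, "~~^"))  # 4 -> 6 -> 8 -> 7
```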

The researchers also used a brain imaging technique called near-infrared spectroscopy to measure how efficiently the participants’ brains were working as they performed the tasks.

Although the two groups performed at the same level on the first day, over the next 4 days people receiving brain stimulation along with training learned to do the tasks two to five times faster than people receiving a sham treatment, the authors reported in Current Biology. Six months later, the researchers called the participants back and found that people who had received brain stimulation were still roughly 30% faster at the same types of mathematical challenges. The targeted brain region also showed more efficient activity, Thompson says.

The fact that only participants who received electrical stimulation and practiced math showed lasting physiological changes in their brains suggests that experience is required to seal in the effects of stimulation, says Michael Weisend, a neuroscientist at the Mind Research Network in Albuquerque, New Mexico, who wasn’t involved with the study. That’s valuable information for people who hope to get benefits from stimulation alone, he says. “It’s not going to be a magic bullet.”

Although it’s not clear how the technique works, Thompson says, one hypothesis is that the current helps synchronize neuron firing, enabling the brain to work more efficiently. Scientists also don’t know if negative or unintended effects might result. Although no side effects of brain stimulation have yet been reported, “it’s impossible to say with any certainty” that there aren’t any, Thompson says.

“Math is only one of dozens of skills in which this could be used,” Reiner says, adding that it’s “not unreasonable” to imagine that this and similar stimulation techniques could replace the use of pills for cognitive enhancement.

In the future, the researchers hope to include groups that often struggle with math, such as people with neurodegenerative disorders and a condition called developmental dyscalculia. As long as further testing shows that the technique is safe and effective, children in schools could also receive brain stimulation along with their lessons, Thompson says. But there’s “a long way to go” before the method is ready for schools, she says. In the meantime, she adds, “We strongly caution you not to try this at home, no matter how tempted you may be to slap a battery on your kid’s head.”

http://news.sciencemag.org/sciencenow/2013/05/trouble-with-math-maybe-you-shou.html?ref=hp

Pyjamas that read your child a story


Get ready to scan your children. The timeless onesie’s getting a digital upgrade with a set of jammies that link to stories and lullabies on a smart device.

Technology has tiptoed into kids’ pajamas with onesies covered in QR codes that link to bedtime stories.

“It’s time for bed, Tommy. Brush your teeth, put on your PJs, and let’s scan you.”

Smart PJs, billed as the world’s “first and only interactive pajamas,” require downloading a free app for iOS or Android and scanning one of dozens of codes printed on the pajamas with a smartphone or tablet. The device then reads aloud a story, sings a lullaby, or broadcasts pictures of animals or other bedtime-appropriate cuteness.

“We purposely created Smart PJ’s with the scannable dot patterns all over them so that parents can help the child scan the stories on their backs where they can’t reach,” Smart PJs founder Juan Murdoch told Tech Cocktail. “We also put words to all the stories and animal information on the screen so that parents can turn off the volume and help the child learn to read the stories and words themselves.”

Murdoch, an Idaho Falls, Idaho, real estate agent and father of six, was recently honored at an event showcasing Boise-area startups, where his company was named Hottest Showcasing Startup.

The smart jammies join a number of other QR-coded clothes on the market, including T-shirts that link back to the wearer’s social-networking profiles. The $25 cotton PJs come in four sizes for boys and girls.

“Now your child will be excited to go to bed,” says a promotional video for the product. One potential hitch: all those studies suggesting that staring into gadget screens at night can disrupt sleep patterns.

Also, we really need to know these innocent little onesies won’t start serving up ads for Barbies and Legos in the middle of “Winnie the Pooh.”

http://news.cnet.com/8301-17938_105-57581071-1/smart-pajamas-read-your-kids-a-bedtime-story/

Brain implants: Restoring memory with a microchip


William Gibson’s popular science fiction tale “Johnny Mnemonic” foresaw sensitive information being carried by microchips in the brain by 2021. A team of American neuroscientists could be making this fantasy world a reality. Their motivation is different, but the outcome would be somewhat similar. Hailed as one of 2013’s top ten technological breakthroughs by MIT Technology Review, the work by the University of Southern California, North Carolina’s Wake Forest University and other partners has actually spanned a decade.

But the U.S.-wide team now expects to see a memory device implanted in a small number of human volunteers within two years, and available to patients in five to 10 years. They can’t quite contain their excitement. “I never thought I’d see this in my lifetime,” said Ted Berger, professor of biomedical engineering at the University of Southern California in Los Angeles. “I might not benefit from it myself but my kids will.”

Rob Hampson, associate professor of physiology and pharmacology at Wake Forest University, agrees. “We keep pushing forward, every time I put an estimate on it, it gets shorter and shorter.”

The scientists — who bring varied skills to the table, including mathematical modeling and psychiatry — believe they have cracked how long-term memories are made, stored and retrieved and how to replicate this process in brains that are damaged, particularly by stroke or localized injury.

Berger said they record a memory being made, in an undamaged area of the brain, then use that data to predict what a damaged area “downstream” should be doing. Electrodes are then used to stimulate the damaged area to replicate the action of the undamaged cells.
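In highly simplified form, that record-predict-stimulate pipeline might look like the sketch below. The team’s real system is a nonlinear multi-input, multi-output model fitted to spike data; this linear toy version, and all the numbers in it, are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: firing rates recorded in the undamaged upstream region, and
# the corresponding activity of the downstream region being modeled.
upstream = rng.poisson(5.0, size=(200, 8)).astype(float)   # 200 samples, 8 channels
true_map = rng.normal(size=(8, 4))
downstream = upstream @ true_map + rng.normal(0.0, 0.1, size=(200, 4))

# Fit a linear map from upstream to downstream activity (least squares
# stands in for the real nonlinear MIMO model).
w, *_ = np.linalg.lstsq(upstream, downstream, rcond=None)

def stimulation_pattern(new_upstream):
    """Predict what the damaged downstream area *should* be doing, then
    hand that pattern to the stimulating electrodes."""
    return new_upstream @ w

print(stimulation_pattern(upstream[:1]).round(2))
```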

They concentrate on the hippocampus — part of the cerebral cortex which sits deep in the brain — where short-term memories become long-term ones. Berger has looked at how electrical signals travel through neurons there to form those long-term memories and has used his expertise in mathematical modeling to mimic these movements using electronics.

Hampson, whose university has done much of the animal studies, adds: “We support and reinforce the signal in the hippocampus but we are moving forward with the idea that if you can study enough of the inputs and outputs to replace the function of the hippocampus, you can bypass the hippocampus.”

The team’s experiments on rats and monkeys have shown that certain brain functions can be replaced with signals via electrodes. You would think that the work of then creating an implant for people and getting such a thing approved would be a Herculean task, but think again.

For 15 years, people have been having brain implants to provide deep brain stimulation to treat epilepsy and Parkinson’s disease — a reported 80,000 people have now had such devices placed in their brains. So many of the hurdles have already been overcome — particularly the “yuck factor” and the fear factor.

“It’s now commonly accepted that humans will have electrodes put in them — it’s done for epilepsy, deep brain stimulation, (that has made it) easier for investigative research, it’s much more acceptable now than five to 10 years ago,” Hampson says.

Much of the work that remains now is in shrinking down the electronics.

“Right now it’s not a device, it’s a fair amount of equipment,” Hampson says. “We’re probably looking at devices in the five to 10 year range for human patients.”

The ultimate goal in memory research would be to treat Alzheimer’s disease, but unlike in stroke or localized brain injury, Alzheimer’s tends to affect many parts of the brain, especially in its later stages, making these implants a less likely option any time soon.

Berger foresees a future, however, where drugs and implants could be used together to treat early dementia. Drugs could be used to enhance the action of cells that surround the most damaged areas, and the team’s memory implant could be used to replace a lot of the lost cells in the center of the damaged area. “I think the best strategy is going to involve both drugs and devices,” he says.

Unfortunately, the team found that its method can’t help patients with advanced dementia.

“When looking at a patient with mild memory loss, there’s probably enough residual signal to work with, but not when there’s significant memory loss,” Hampson said.

Constantine Lyketsos, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine in Baltimore, which is trialing a deep brain stimulator implant for Alzheimer’s patients, was a little skeptical of the other team’s claims.

“The brain has a lot of redundancy, it can function pretty well if it loses one or two parts. But memory involves circuits diffusely dispersed throughout the brain so it’s hard to envision.” However, he added that it was more likely to be successful in helping victims of stroke or localized brain injury, as indeed its makers are aiming to do.

The UK’s Alzheimer’s Society is cautiously optimistic.

“Finding ways to combat symptoms caused by changes in the brain is an ongoing battle for researchers. An implant like this one is an interesting avenue to explore,” said Doug Brown, director of research and development.

Hampson says the team’s breakthrough is “like the difference between a cane, to help you walk, and a prosthetic limb — it’s two different approaches.”

It will still take time for many people to accept their findings and their claims, he says, but they don’t expect a shortage of volunteers stepping forward to try their implant — the project is partly funded by the U.S. military, which is looking for help with battlefield injuries.

There are U.S. soldiers coming back from operations with brain trauma and a neurologist at DARPA (the Defense Advanced Research Projects Agency) is asking “what can you do for my boys?” Hampson says.

“That’s what it’s all about.”

http://www.cnn.com/2013/05/07/tech/brain-memory-implants-humans/index.html?iref=allsearch

Bionic superhumans on the horizon


Around 220,000 people worldwide already walk around with cochlear implants — devices worn around the ear that turn sound waves into electrical impulses shunted directly into the auditory nerve.

Tens of thousands of people have been implanted with deep brain stimulators, devices that send an electrode tunneling several inches in the brain. Deep brain stimulators are used to control Parkinson’s disease, though lately they’ve also been tested — with encouraging results — in use against severe depression and obsessive compulsive disorder.

The most obvious bionics are those that replace limbs. Olympian “Blade Runner” Oscar Pistorius, now awaiting trial for the alleged murder of his girlfriend, made a splash with his Cheetah carbon fiber prostheses. Yet those are a relatively simple technology — a curved piece of slightly springy, super-strong material. In the digital age, we’re seeing more sophisticated limbs.

Consider the thought-controlled bionic leg that Zac Vawter used to climb all 103 floors of Chicago’s Willis Tower. Or the nerve-controlled bionic hand that Iraq war veteran Glen Lehman had attached after the loss of his original hand.

Or the even more sophisticated i-limb Ultra, an artificial hand with five independently articulating artificial fingers. Those limbs don’t just react mechanically to pressure. They actually respond to the thoughts and intentions of their owners, flexing, extending, gripping, and releasing on mental command.

The age when prostheses were largely inert pieces of wood, metal, and plastic is passing. Advances in microprocessors, in techniques to interface digital technology with the human nervous system, and in battery technology to allow prostheses to pack more power with less weight are turning replacement limbs into active parts of the human body.

In some cases, they’re not even part of the body at all. Consider the case of Cathy Hutchinson. In 1997, Cathy had a stroke, leaving her without control of her arms. Hutchinson volunteered for an experimental procedure that could one day help millions of people with partial or complete paralysis. She let researchers implant a small device in the part of her brain responsible for motor control. With that device, she is able to control an external robotic arm by thinking about it.

That, in turn, brings up an interesting question: If the arm isn’t physically attached to her body, how far away could she be and still control it? The answer is at least thousands of miles. In animal studies, scientists have shown that a monkey with a brain implant can control a robot arm 7,000 miles away. The monkey’s mental signals were sent over the internet from Duke University in North Carolina to the robot arm in Japan. In this day and age, distance is almost irrelevant.

The 7,000-mile-away prosthetic arm makes an important point: These new prostheses aren’t just going to restore missing human abilities. They’re going to enhance our abilities, giving us powers we never had before, and augmenting other capabilities we have. While the current generation of prostheses is still primitive, we can already see this taking shape when a monkey moves a robotic arm on the other side of the planet just by thinking about it.

Other research is pointing to enhancements to memory and decision making.

The hippocampus is a small, seahorse-shaped part of the brain that’s essential in forming new memories. If it’s damaged — by an injury to the head, for example — people start having difficulty forming new long-term memories. In the most extreme cases, this can lead to the complete inability to form new long-term memories, as in the film Memento. Working to find a way to repair this sort of brain damage, researchers in 2011 created a “hippocampus chip” that can replace damaged brain tissue. When they implanted it in rats with a damaged hippocampus, they found that not only could their chip repair damaged memory — it could improve the rats’ ability to learn new things.

Nor is memory the end of it. Another study, in 2012, demonstrated that we can boost intelligence — at least one sort — in monkeys. Scientists at Wake Forest University implanted specialized brain chips in a set of monkeys and trained those monkeys to perform a picture-matching game. When the implant was activated, it raised their scores by an average of 10 points on a 100-point scale. The implant makes monkeys smarter.

Both of those technologies for boosting memory and intelligence are in very early stages, in small animal studies only, and years (or possibly decades) away from wide use in humans. Still, they make us wonder — what happens when it’s possible to improve on the human body and mind?

The debate has started already, of course. Oscar Pistorius had to fight hard for inclusion in the Olympics. Many objected that his carbon fiber prostheses gave him a competitive advantage. He was able — with the help of doctors and biomedical engineers — to make a compelling case that his Cheetah blades didn’t give him any advantage on the field. But how long will that be true? How long until we have prostheses (not to mention drugs and genetic therapies) that make athletes better in their sports?

But the issue is much, much wider than professional sports. We may care passionately about the integrity of the Olympics, professional cycling, and so on, but they directly affect only a very small number of us. In other areas of life — in the workforce in particular — enhancement technology might affect all of us.

When it’s possible to make humans smarter, sharper, and faster, how will that affect us? Will the effect be mostly positive, boosting our productivity and the rate of human innovation? Or will it be just another pressure to compete at work? Who will be able to afford these technologies? Will anyone be able to have their body, and more importantly, their brain upgraded? Or will only the rich have access to these enhancements?

We have a little while to consider these questions, but we ought to start. The technology will sneak its way into our lives, starting with people with disabilities, the injured, and the ill. It’ll improve their lives in ways that are unquestionably good. And then, one day, we’ll wake up and realize that we’re doing more than restoring lost function. We’re enhancing it.

Superhuman technology is on the horizon. Time to start thinking about what that means for us.

http://www.cnn.com/2013/04/24/opinion/bionic-superhumans-ramez-naam/index.html?iid=article_sidebar

Fundawear

Fundawear – a prototype concept from Durex Australia – adds an extra dimension to long-distance lovemaking through the use of hi-tech vibrating underwear that can be wirelessly stimulated via a mobile app. The intensity and location of the vibrations can be controlled with a flick of a finger.

To demonstrate, Durex recruited chirpy Bondi couple Nick and Dani to be the first to test it out. They were separated before the trial, and, in their own words, by the time they came to use Fundawear they felt like they hadn’t seen each other for “like, 100 years”.

Tech director Ben Moir, who is featured in the video, remains confident: “Fundawear is a project about transferring touch across vast distances and that really is a first globally,” he says. “People are gonna want this.”

Thanks to Ray Gaudette for bringing this to the attention of the It’s Interesting community.

Researchers explore connecting the brain to machines


Behind a locked door in a white-walled basement in a research building in Tempe, Ariz., a monkey sits stone-still in a chair, eyes locked on a computer screen. From his head protrudes a bundle of wires; from his mouth, a plastic tube. As he stares, a picture of a green cursor on the black screen floats toward the corner of a cube. The monkey is moving it with his mind.

The monkey, a rhesus macaque named Oscar, has electrodes implanted in his motor cortex, detecting electrical impulses that indicate mental activity and translating them to the movement of the ball on the screen. The computer isn’t reading his mind, exactly — Oscar’s own brain is doing a lot of the lifting, adapting itself by trial and error to the delicate task of accurately communicating its intentions to the machine. (When Oscar succeeds in controlling the ball as instructed, the tube in his mouth rewards him with a sip of his favorite beverage, Crystal Light.) It’s not technically telekinesis, either, since that would imply that there’s something paranormal about the process. It’s called a “brain-computer interface” (BCI). And it just might represent the future of the relationship between human and machine.

Stephen Helms Tillery’s laboratory at Arizona State University is one of a growing number where researchers are racing to explore the breathtaking potential of BCIs and a related technology, neuroprosthetics. The promise is irresistible: from restoring sight to the blind, to helping the paralyzed walk again, to allowing people suffering from locked-in syndrome to communicate with the outside world. In the past few years, the pace of progress has been accelerating, delivering dazzling headlines seemingly by the week.

At Duke University in 2008, a monkey named Idoya walked on a treadmill, causing a robot in Japan to do the same. Then Miguel Nicolelis stopped the monkey’s treadmill — and the robotic legs kept walking, controlled by Idoya’s brain. At Andrew Schwartz’s lab at the University of Pittsburgh in December 2012, a quadriplegic woman named Jan Scheuermann learned to feed herself chocolate by mentally manipulating a robotic arm. Just last month, Nicolelis’ lab set up what it billed as the first brain-to-brain interface, allowing a rat in North Carolina to make a decision based on sensory data beamed via Internet from the brain of a rat in Brazil.

So far the focus has been on medical applications — restoring standard-issue human functions to people with disabilities. But it’s not hard to imagine the same technologies someday augmenting capacities. If you can make robotic legs walk with your mind, there’s no reason you can’t also make them run faster than any sprinter. If you can control a robotic arm, you can control a robotic crane. If you can play a computer game with your mind, you can, theoretically at least, fly a drone with your mind.

It’s tempting and a bit frightening to imagine that all of this is right around the corner, given how far the field has already come in a short time. Indeed, Nicolelis — the media-savvy scientist behind the “rat telepathy” experiment — is aiming to build a robotic bodysuit that would allow a paralyzed teen to take the first kick of the 2014 World Cup. Yet the same factor that has made the explosion of progress in neuroprosthetics possible could also make future advances harder to come by: the almost unfathomable complexity of the human brain.

From I, Robot to Skynet, we’ve tended to assume that the machines of the future would be guided by artificial intelligence — that our robots would have minds of their own. Over the decades, researchers have made enormous leaps in artificial intelligence (AI), and we may be entering an age of “smart objects” that can learn, adapt to, and even shape our habits and preferences. We have planes that fly themselves, and we’ll soon have cars that do the same. Google has some of the world’s top AI minds working on making our smartphones even smarter, to the point that they can anticipate our needs. But “smart” is not the same as “sentient.” We can train devices to learn specific behaviors, and even out-think humans in certain constrained settings, like a game of Jeopardy. But we’re still nowhere close to building a machine that can pass the Turing test, the benchmark for human-like intelligence. Some experts doubt we ever will.

Philosophy aside, for the time being the smartest machines of all are those that humans can control. The challenge lies in how best to control them. From vacuum tubes to the DOS command line to the Mac to the iPhone, the history of computing has been a progression from lower to higher levels of abstraction. In other words, we’ve been moving from machines that require us to understand and directly manipulate their inner workings to machines that understand how we work and respond readily to our commands. The next step after smartphones may be voice-controlled smart glasses, which can intuit our intentions all the more readily because they see what we see and hear what we hear.

The logical endpoint of this progression would be computers that read our minds, computers we can control without any physical action on our part at all. That sounds impossible. After all, if the human brain is so hard to compute, how can a computer understand what’s going on inside it?

It can’t. But as it turns out, it doesn’t have to — not fully, anyway. What makes brain-computer interfaces possible is an amazing property of the brain called neuroplasticity: the ability of neurons to form new connections in response to fresh stimuli. Our brains are constantly rewiring themselves to allow us to adapt to our environment. So when researchers implant electrodes in a part of the brain that they expect to be active in moving, say, the right arm, it’s not essential that they know in advance exactly which neurons will fire at what rate. When the subject attempts to move the robotic arm and sees that it isn’t quite working as expected, the person — or rat or monkey — will try different configurations of brain activity. Eventually, with time and feedback and training, the brain will hit on a solution that makes use of the electrodes to move the arm.
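
A toy version of that trial-and-error loop is sketched below. Everything in it is an assumption-laden caricature: a fixed, arbitrary decoder, and a “brain” that simply keeps whichever activity pattern the visual feedback shows has reduced the error:

```python
import random

random.seed(1)

# Arbitrary fixed decoder: maps a 5-channel activity pattern to arm movement.
WEIGHTS = [random.uniform(-1, 1) for _ in range(5)]

def decode(activity):
    return sum(w * a for w, a in zip(WEIGHTS, activity))

target = 2.0                     # the movement the subject intends
activity = [0.0] * 5             # initial neural activity pattern
error = abs(decode(activity) - target)

for trial in range(2000):
    # The brain tries a small random variation of its current pattern...
    candidate = [a + random.gauss(0, 0.1) for a in activity]
    cand_err = abs(decode(candidate) - target)
    # ...and keeps it only if feedback shows the arm got closer to the goal.
    if cand_err < error:
        activity, error = candidate, cand_err

print(f"final error after training: {error:.4f}")
```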

That’s the principle behind such rapid progress in brain-computer interface and neuroprosthetics. Researchers began looking into the possibility of reading signals directly from the brain in the 1970s, and testing on rats began in the early 1990s. The first big breakthrough for humans came in Georgia in 1997, when a scientist named Philip Kennedy used brain implants to allow a “locked in” stroke victim named Johnny Ray to spell out words by moving a cursor with his thoughts. (It took him six exhausting months of training to master the process.) In 2008, when Nicolelis got his monkey at Duke to make robotic legs run a treadmill in Japan, it might have seemed like mind-controlled exoskeletons for humans were just another step or two away. If he succeeds in his plan to have a paralyzed youngster kick a soccer ball at next year’s World Cup, some will pronounce the cyborg revolution in full swing.

Schwartz, the Pittsburgh researcher who helped Jan Scheuermann feed herself chocolate in December, is optimistic that neuroprosthetics will eventually allow paralyzed people to regain some mobility. But he says that full control over an exoskeleton would require a more sophisticated way to extract nuanced information from the brain. Getting a pair of robotic legs to walk is one thing. Getting robotic limbs to do everything human limbs can do may be exponentially more complicated. “The challenge of maintaining balance and staying upright on two feet is a difficult problem, but it can be handled by robotics without a brain. But if you need to move gracefully and with skill, turn and step over obstacles, decide if it’s slippery outside — that does require a brain. If you see someone go up and kick a soccer ball, the essential thing to ask is, ‘OK, what would happen if I moved the soccer ball two inches to the right?'” The idea that simple electrodes could detect things as complex as memory or cognition, which involve the firing of billions of neurons in patterns that scientists can’t yet comprehend, is far-fetched, Schwartz adds.

That’s not the only reason that companies like Apple and Google aren’t yet working on devices that read our minds (as far as we know). Another one is that the devices aren’t portable. And then there’s the little fact that they require brain surgery.

A different class of brain-scanning technology is being touted on the consumer market and in the media as a way for computers to read people’s minds without drilling into their skulls. It’s called electroencephalography, or EEG, and it involves headsets that press electrodes against the scalp. In an impressive 2010 TED Talk, Tan Le of the consumer EEG-headset company Emotiv Lifesciences showed how someone can use her company’s EPOC headset to move objects on a computer screen.

Skeptics point out that these devices can detect only the crudest electrical signals from the brain itself, which is well-insulated by the skull and scalp. In many cases, consumer devices that claim to read people’s thoughts are in fact relying largely on physical signals like skin conductivity and tension of the scalp or eyebrow muscles.

Robert Oschler, a robotics enthusiast who develops apps for EEG headsets, believes the more sophisticated consumer headsets like the Emotiv EPOC may be the real deal in terms of filtering out the noise to detect brain waves. Still, he says, there are limits to what even the most advanced, medical-grade EEG devices can divine about our cognition. He’s fond of an analogy that he attributes to Gerwin Schalk, a pioneer in the field of invasive brain implants. The best EEG devices, he says, are “like going to a stadium with a bunch of microphones: You can’t hear what any individual is saying, but maybe you can tell if they’re doing the wave.” With some of the more basic consumer headsets, at this point, “it’s like being in a party in the parking lot outside the same game.”

It’s fairly safe to say that EEG headsets won’t be turning us into cyborgs anytime soon. But it would be a mistake to assume that we can predict today how brain-computer interface technology will evolve. Just last month, a team at Brown University unveiled a prototype of a low-power, wireless neural implant that can transmit signals to a computer over broadband. That could be a major step forward in someday making BCIs practical for everyday use. Meanwhile, researchers at Cornell last week revealed that they were able to use fMRI, a measure of brain activity, to detect which of four people a research subject was thinking about at a given time. Machines today can read our minds in only the most rudimentary ways. But such advances hint that they may be able to detect and respond to more abstract types of mental activity as the technology evolves.

http://www.ydr.com/living/ci_22800493/researchers-explore-connecting-brain-machines

Communication of thoughts between rats on different continents, connected via brain-to-brain interface

The world’s first brain-to-brain connection has given rats the power to communicate by thought alone.

“Many people thought it could never happen,” says Miguel Nicolelis at Duke University in Durham, North Carolina. Although monkeys have been able to control robots with their mind using brain-to-machine interfaces, work by Nicolelis’s team has, for the first time, demonstrated a direct interface between two brains – with the rats able to share both motor and sensory information.

The feat was achieved by first training rats to press one of two levers when an LED above that lever was lit. A correct action opened a hatch containing a drink of water. The rats were then split into two groups, designated as “encoders” and “decoders”.

An array of microelectrodes – each about one-hundredth the width of a human hair – was then implanted in the encoder rats’ primary motor cortex, an area of the brain that processes movement. The team used the implant to record the neuronal activity that occurs just before the rat made a decision in the lever task. They found that pressing the left lever produced a different pattern of activity from pressing the right lever, regardless of which was the correct action.

Next, the team recreated these patterns in decoder rats, using an implant in the same brain area that stimulates neurons rather than recording from them. The decoders received a few training sessions to prime them to pick the correct lever in response to the different patterns of stimulation.

The researchers then wired up the implants of an encoder and a decoder rat. The pair were given the same lever-press task again, but this time only the encoder rats saw the LEDs come on. Brain signals from the encoder rat were recorded just before it pressed the lever and transmitted to the decoder rat. The team found that the decoders, despite having no visual cue, pressed the correct lever between 60 and 72 per cent of the time.
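
A minimal sketch of the decoding step is below. A nearest-template classifier with invented firing rates stands in here; the article doesn’t specify the team’s actual decoding method:

```python
# Mean firing-rate templates recorded from an encoder rat just before
# pressing each lever (hypothetical numbers, one value per electrode).
LEFT_TEMPLATE = [12.0, 3.0, 8.0, 1.0]
RIGHT_TEMPLATE = [2.0, 10.0, 4.0, 9.0]

def classify(rates):
    """Assign a new activity pattern to the nearer template."""
    def dist(template):
        return sum((r - t) ** 2 for r, t in zip(rates, template))
    return "left" if dist(LEFT_TEMPLATE) < dist(RIGHT_TEMPLATE) else "right"

def stimulate_decoder(choice):
    """Stand-in for replaying the matching pattern in the decoder rat."""
    print(f"stimulating decoder with '{choice}' pattern")

trial = [11.0, 4.0, 7.5, 2.0]       # one pre-press recording from the encoder
stimulate_decoder(classify(trial))  # -> left
```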

The rats’ ability to cooperate was reinforced by rewarding both rats if the communication resulted in a correct outcome. Such reinforcement led to the transmission of clearer signals, improving the rats’ success rate compared with cases where decoders were given a pre-recorded signal. This was a big surprise, says Nicolelis. “The encoder’s brain activity became more precise. This could have happened because the animal enhanced its attention during the performance of the next trial after a decoder error.”

If the decoders had not been primed to relate specific activity with the left or right lever prior to being linked with an encoder, the only consequence would have been that it took a bit more time for them to learn the task while interacting with the encoder, says Nicolelis. “We simply primed the decoder so that it would get the gist of the task it had to perform.” In unpublished monkey experiments involving a similar task, the team did not need to prime the animals at all.

In a second experiment, rats were trained to explore a hole with their whiskers and indicate if it was narrow or wide by turning to the left or right. Pairs of rats were then connected as before, but this time the implants were placed in their primary somatosensory cortex, an area that processes touch. Decoder rats were able to indicate over 60 per cent of the time the width of a gap that only the encoder rats were exploring.

Finally, encoder rats were held still while their whiskers were stroked with metal bars. The researchers observed patterns of activity in the somatosensory cortex of the decoder rats that matched that of the encoder rats, even though the whiskers of the decoder rats had not been touched.

Pairs of rats were even able to cooperate across continents using cyberspace. Brain signals from an encoder rat at the Edmond and Lily Safra International Institute of Neuroscience of Natal in Brazil were sent to a decoder in Nicolelis’s lab in North Carolina via the internet. Though there was a slight transmission delay, the decoder rat still performed with an accuracy similar to that of rats in closer proximity to their encoders.

Christopher James at the University of Warwick, UK, who works on brain-to-machine interfaces for prostheses, says the work is a “wake-up call” for people who haven’t caught up with recent advances in brain research.

We have the technology to create implants for long-term use, he says. What is missing, though, is a full understanding of the brain processes involved. In this case, Nicolelis’s team is “blasting a relatively large area of the brain with a signal they’re not sure is 100 per cent correct,” he says.

That’s because the exact information being communicated between the rats’ brains is not clear. The brain activity of the encoders cannot be transferred precisely to the decoders because that would require matching the patterns neuron for neuron, which is not currently possible. Instead, the two patterns are closely related in terms of their frequency and spatial representation.

“We are still using a sledgehammer to crack a walnut,” says James. “They’re not hearing the voice of God.” But the rats are certainly sending and receiving more than a binary signal that simply points to one or other lever, he says. “I think it will be possible one day to transfer an abstract thought.”

The decoders have to interpret relatively complex brain patterns, says Marshall Shuler at Johns Hopkins University in Baltimore, Maryland. The animals learn the relevance of these new patterns and their brains adapt to the signals. “But the decoders are probably not having the same quality of experience as the encoders,” he says.

Patrick Degenaar at Newcastle University in the UK says that the military might one day be able to deploy genetically modified insects or small mammals that are controlled by the brain signals of a remote human operator. These would be drones that could feed themselves, he says, and could be used for surveillance or even assassination missions. “You’d probably need a flying bug to get near the head [of someone to be targeted],” he says.

Nicolelis is most excited about the future of multiple networked brains. He is currently trialling the implants in monkeys, getting them to work together telepathically to complete a task. For example, each monkey might only have access to part of the information needed to make the right decision in a game. Several monkeys would then need to communicate with each other in order to successfully complete the task.

“In the distant future we may be able to communicate via a brain-net,” says Nicolelis. “I would be very glad if the brain-net my great grandchildren used was due to their great grandfather’s work.”

Journal reference: Nature Scientific Reports, DOI: 10.1038/srep01319

New bionic hand allows person to feel what they are touching


The first bionic hand that allows an amputee to feel what they are touching will be fitted to a patient later this year in a pioneering operation that could introduce a new generation of artificial limbs with sensory perception.

The patient is an unnamed man in his 20s living in Rome who lost the lower part of his arm following an accident, said Silvestro Micera of the Ecole Polytechnique Federale de Lausanne in Switzerland.

The wiring of his new bionic hand will be connected to the patient’s nervous system, with the hope that the man will be able to control the movements of the hand as well as receive touch signals from the hand’s skin sensors.

Dr Micera said that the hand will be attached directly to the patient’s nervous system via electrodes clipped onto two of the arm’s main nerves, the median and the ulnar nerves.

This should allow the man to control the hand with his thoughts, as well as receive sensory signals in his brain from the hand’s sensors. It will effectively provide a fast, bidirectional flow of information between the man’s nervous system and the prosthetic hand.
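
Schematically, that bidirectional flow amounts to a closed loop like the one below. This is purely illustrative Python, not Dr Micera’s system; every function name and number is invented:

```python
import random

def read_motor_intent():
    """Stand-in for decoding efferent signals from the median/ulnar nerves."""
    return {"grip_force": random.uniform(0.0, 1.0)}

def drive_hand(intent):
    """Stand-in for commanding the prosthetic hand's motors."""
    return {"fingertip_pressure": intent["grip_force"] * 0.9}

def stimulate_nerve(sensation):
    """Stand-in for encoding sensor readings as afferent nerve stimulation."""
    print(f"feedback pulse amplitude ~ {sensation['fingertip_pressure']:.2f}")

# A few cycles of the fast, bidirectional flow described above:
for _ in range(3):
    intent = read_motor_intent()      # nervous system -> hand
    sensation = drive_hand(intent)    # hand acts on the world
    stimulate_nerve(sensation)        # hand sensors -> nervous system
```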

“This is real progress, real hope for amputees. It will be the first prosthetic that will provide real-time sensory feedback for grasping,” Dr Micera said.

“It is clear that the more sensory feeling an amputee has, the more likely you will get full acceptance of that limb,” he told the American Association for the Advancement of Science meeting in Boston.

“We could be on the cusp of providing new and more effective clinical solutions to amputees in the next year,” he said.

An earlier, portable model of the hand was temporarily attached in 2009 to Pierpaolo Petruzziello, who lost half his arm in a car accident. He was able to move the bionic hand’s fingers, clench them into a fist and hold objects. He said that he could feel the sensation of needles pricking the hand’s palm.

However, this earlier version of the hand had only two sensory zones, whereas the latest prototype will send sensory signals back from all the fingertips, as well as the palm and the wrist, to give a near life-like feeling in the limb, Dr Micera said.

“The idea would be that it could deliver two or more sensations. You could have a pinch and receive information from three fingers, or feel movement in the hand and wrist,” Dr Micera said.

“We have refined the interface [connecting the hand to the patient], so we hope to see much more detailed movement and control of the hand,” he told the meeting.

The plan is for the patient to wear the bionic hand for a month to see how he adapts to the artificial limb. If all goes well, a full working model will be ready for testing within two years, Dr Micera said.

One of the unresolved issues is whether patients will be able to tolerate having such a limb attached to them all the time, or whether they would need to remove it periodically to give them a rest.

Another problem is how to conceal the wiring under the patient’s skin to make it less obtrusive. The electrodes of the prototype hand to be fitted later this year will be inserted through the skin rather than underneath it, but there are plans under development to place the wiring subcutaneously, Dr Micera said.

http://www.independent.co.uk/life-style/gadgets-and-tech/news/a-sensational-breakthrough-the-first-bionic-hand-that-can-feel-8498622.html

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Lab rats given a 6th sense through a brain-machine interface


Duke University researchers have effectively given laboratory rats a “sixth sense” using an implant in their brains.

An experimental device allowed the rats to “touch” infrared light – which is normally invisible to them.

The team at Duke University fitted the rats with an infrared detector wired up to microscopic electrodes that were implanted in the part of their brains that processes tactile information.

The results of the study were published in Nature Communications journal.

The researchers say that, in theory at least, a human with a damaged visual cortex might be able to regain sight through a device implanted in another part of the brain.

Lead author Miguel Nicolelis said this was the first time a brain-machine interface has augmented a sense in adult animals.

The experiment also shows that a new sensory input can be interpreted by a region of the brain that normally does something else (without having to “hijack” the function of that brain region).

“We could create devices sensitive to any physical energy,” said Prof Nicolelis, from the Duke University Medical Center in Durham, North Carolina.

“It could be magnetic fields, radio waves, or ultrasound. We chose infrared initially because it didn’t interfere with our electrophysiological recordings.”

His colleague Eric Thomson commented: “The philosophy of the field of brain-machine interfaces has until now been to attempt to restore a motor function lost to lesion or damage of the central nervous system.

“This is the first paper in which a neuroprosthetic device was used to augment function – literally enabling a normal animal to acquire a sixth sense.”

In their experiments, the researchers used a test chamber with three light sources that could be switched on randomly.

They taught the rats to choose the active light source by poking their noses into a port to receive a sip of water as a reward. They then implanted the microelectrodes, each about a tenth the diameter of a human hair, into the animals’ brains. These electrodes were attached to the infrared detectors.
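
The detector-to-cortex coupling can be pictured as a simple mapping, sketched below with invented numbers (not the Duke group’s actual code): the stronger the infrared reading at the head-mounted detector, the faster the microstimulation delivered through the implanted electrodes.

```python
def ir_to_stim_rate(ir_intensity, max_rate_hz=400):
    """Map a normalized infrared reading (0-1) to a microstimulation rate.

    Stronger infrared light at the detector produces faster stimulation
    of the somatosensory electrodes. All values are hypothetical.
    """
    ir_intensity = max(0.0, min(1.0, ir_intensity))
    return ir_intensity * max_rate_hz

for reading in (0.0, 0.25, 1.0):
    print(f"IR {reading:.2f} -> {ir_to_stim_rate(reading):.0f} Hz")
```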

The scientists then returned the animals to the test chamber. At first, the rats scratched at their faces, indicating that they were interpreting the lights as touch. But after a month the animals learned to associate the signal in their brains with the infrared source.

They began to search actively for the signal, eventually achieving perfect scores in tracking and identifying the correct location of the invisible light source.

One key finding was that enlisting the touch cortex to detect infrared light did not reduce its ability to process touch signals.

http://www.bbc.co.uk/news/science-environment-21459745

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Mind-meld brain power is best for steering spaceships


Two people have successfully steered a virtual spacecraft by combining the power of their thoughts – and their efforts were far more accurate than one person acting alone. One day groups of people hooked up to brain-computer interfaces (BCIs) might work together to control complex robotic and telepresence systems, maybe even in space.

A BCI system records the brain’s electrical activity using EEG signals, which are detected with electrodes attached to the scalp. Machine-learning software learns to recognise the patterns generated by each user as they think of a certain concept, such as “left” or “right”. BCIs have helped people with disabilities to steer a wheelchair, for example.
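
As a flavor of that machine-learning step, here is a minimal sketch using synthetic “band power” features and scikit-learn’s linear discriminant analysis, a common choice for EEG classification (the article doesn’t say which method this particular system uses):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)

# Synthetic two-channel features for trials where the user imagined
# "left" (class 0) vs "right" (class 1).
left = rng.normal([1.0, 0.2], 0.3, size=(100, 2))
right = rng.normal([0.2, 1.0], 0.3, size=(100, 2))
X = np.vstack([left, right])
y = np.array([0] * 100 + [1] * 100)

# Train on labeled trials, then decode a new, unlabeled trial.
clf = LinearDiscriminantAnalysis().fit(X, y)
new_trial = rng.normal([0.9, 0.3], 0.3, size=(1, 2))
print("decoded:", "left" if clf.predict(new_trial)[0] == 0 else "right")
```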

Researchers are discovering, however, that they get better results in some tasks by combining the signals from multiple BCI users. Until now, this “collaborative BCI” technique has been used in simple pattern-recognition tasks, but a team at the University of Essex in the UK wanted to test it more rigorously.

So they developed a simulator in which pairs of BCI users had to steer a craft towards the dead centre of a planet by thinking about one of eight directions that they could fly in, like using compass points. Brain signals representing the users’ chosen direction, as interpreted by the machine-learning system, were merged in real time and the spacecraft followed that path.

The results, to be presented at an Intelligent User Interfaces conference in California in March, strongly favoured two-brain navigation. Simulation flights were 67 per cent accurate for a single user, but 90 per cent on target for two users. And when coping with sudden changes in the simulated planet’s position, reaction times were halved, too. Combining signals helps cancel out the random noise that dogs EEG recordings. “When you average signals from two people’s brains, the noise cancels out a bit,” says team member Riccardo Poli.
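
That averaging effect is easy to check in a toy simulation (invented numbers, not the Essex data): merging two independent noisy estimates of the intended heading shrinks the average error by roughly a factor of √2.

```python
import random

random.seed(0)
TRUE_HEADING = 90.0   # degrees: the direction both users intend
NOISE_SD = 30.0       # per-user decoding noise in the EEG signal

def mean_abs_error(n_users, trials=10000):
    total = 0.0
    for _ in range(trials):
        # Each user's decoded heading = intention + independent noise;
        # the merged command is the average across users.
        merged = sum(TRUE_HEADING + random.gauss(0, NOISE_SD)
                     for _ in range(n_users)) / n_users
        total += abs(merged - TRUE_HEADING)
    return total / trials

print(f"one brain:  {mean_abs_error(1):.1f} deg error")
print(f"two brains: {mean_abs_error(2):.1f} deg error")
```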

The technique can also compensate for a lapse in attention. “It is difficult to stay focused on the task at all times. So when a single user has momentary attention lapses, it matters. But when there are two users, a lapse by one will not have much effect, so you stay on target,” Poli says.

NASA’s Jet Propulsion Lab in Pasadena, California, has been observing the work while itself investigating BCI’s potential for controlling planetary rovers, for example. But don’t hold your breath, says JPL senior research scientist Adrian Stoica. “While potential uses for space applications exist, in terms of uses for planetary rover remote control, this is still a speculative idea,” he says.

http://www.newscientist.com/article/mg21729025.600-mindmeld-brain-power-is-best-for-steering-spaceships.html