Communication of thoughts between rats on different continents, connected via brain-to-brain interface

The world’s first brain-to-brain connection has given rats the power to communicate by thought alone.

“Many people thought it could never happen,” says Miguel Nicolelis at Duke University in Durham, North Carolina. Although monkeys have been able to control robots with their minds using brain-to-machine interfaces, work by Nicolelis’s team has, for the first time, demonstrated a direct interface between two brains – with the rats able to share both motor and sensory information.

The feat was achieved by first training rats to press one of two levers when an LED above that lever was lit. A correct action opened a hatch containing a drink of water. The rats were then split into two groups, designated as “encoders” and “decoders”.

An array of microelectrodes – each about one-hundredth the width of a human hair – was then implanted in the encoder rats’ primary motor cortex, an area of the brain that processes movement. The team used the implant to record the neuronal activity that occurred just before each rat made a decision in the lever task. They found that pressing the left lever produced a different pattern of activity from pressing the right lever, regardless of which was the correct action.
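
The article does not detail the decoding step, but the core idea of telling the two activity patterns apart can be illustrated with a toy classifier. The Python sketch below is purely illustrative: the spike counts are synthetic, and the nearest-mean rule, channel count and trial numbers are all assumptions standing in for whatever decoder the team actually used.

```python
# Purely illustrative: synthetic spike counts and a nearest-mean rule,
# standing in for the kind of pattern separation described above.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "trials": spike counts on 32 motor-cortex electrodes, with
# left- and right-lever presses drawn from slightly different firing rates.
n_channels, n_trials = 32, 200
left_rates = rng.uniform(5, 15, n_channels)
right_rates = np.clip(left_rates + rng.normal(0, 3, n_channels), 0.5, None)
left_trials = rng.poisson(left_rates, (n_trials, n_channels))
right_trials = rng.poisson(right_rates, (n_trials, n_channels))

# "Train": store the mean pattern for each lever on the first 150 trials.
templates = {"left": left_trials[:150].mean(axis=0),
             "right": right_trials[:150].mean(axis=0)}

def classify(trial):
    """Assign a trial to whichever stored mean pattern is closer."""
    return min(templates, key=lambda k: np.linalg.norm(trial - templates[k]))

# "Test" on the 100 held-out trials.
hits = sum(classify(t) == "left" for t in left_trials[150:]) + \
       sum(classify(t) == "right" for t in right_trials[150:])
print(f"held-out accuracy: {hits}/100")
```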

Next, the team recreated these patterns in decoder rats, using an implant in the same brain area that stimulates neurons rather than recording from them. The decoders received a few training sessions to prime them to pick the correct lever in response to the different patterns of stimulation.

The researchers then wired up the implants of an encoder and a decoder rat. The pair were given the same lever-press task again, but this time only the encoder rats saw the LEDs come on. Brain signals from the encoder rat were recorded just before it pressed the lever and transmitted to the decoder rat. The team found that the decoders, despite having no visual cue, pressed the correct lever between 60 and 72 per cent of the time.
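
Taken together, one trial of the linked task has four steps: read the encoder’s activity, classify it, convert the class into a stimulation cue, and score the decoder’s press. Here is a hedged sketch of that loop; the pulse counts, the 66 per cent decoder reliability and all function names are assumptions made for illustration, not the paper’s protocol.

```python
# Hedged sketch of one round of the linked task; names, pulse counts and
# the decoder's reliability are assumptions, not the published protocol.
import random

PULSES = {"left": 5, "right": 20}   # assumed: choice encoded as pulse count

def record_encoder_choice():
    """Stand-in for reading the encoder's motor cortex before a press."""
    return random.choice(["left", "right"])

def decoder_presses(n_pulses, reliability=0.66):
    """Decoder follows the stimulation cue with ~66% reliability, roughly
    the 60-72 per cent range reported in the article."""
    cued = "left" if n_pulses == PULSES["left"] else "right"
    other = "right" if cued == "left" else "left"
    return cued if random.random() < reliability else other

correct = 0
for trial in range(100):
    choice = record_encoder_choice()          # encoder sees the LED and acts
    n_pulses = PULSES[choice]                 # pattern sent to decoder's brain
    if decoder_presses(n_pulses) == choice:   # both rats rewarded on success
        correct += 1
print(f"decoder matched encoder on {correct}/100 trials")
```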

The rats’ ability to cooperate was reinforced by rewarding both rats if the communication resulted in a correct outcome. Such reinforcement led to the transmission of clearer signals, improving the rats’ success rate compared with cases where decoders were given a pre-recorded signal. This was a big surprise, says Nicolelis. “The encoder’s brain activity became more precise. This could have happened because the animal enhanced its attention during the performance of the next trial after a decoder error.”

If the decoders had not been primed to associate specific activity with the left or right lever prior to being linked with an encoder, the only consequence would have been that they took a bit more time to learn the task while interacting with the encoder, says Nicolelis. “We simply primed the decoder so that it would get the gist of the task it had to perform.” In unpublished experiments with monkeys performing a similar task, the team did not need to prime the animals at all.

In a second experiment, rats were trained to explore a hole with their whiskers and indicate if it was narrow or wide by turning to the left or right. Pairs of rats were then connected as before, but this time the implants were placed in their primary somatosensory cortex, an area that processes touch. Decoder rats were able to indicate the width of a gap that only the encoder rats were exploring more than 60 per cent of the time.

Finally, encoder rats were held still while their whiskers were stroked with metal bars. The researchers observed patterns of activity in the somatosensory cortex of the decoder rats that matched those of the encoder rats, even though the whiskers of the decoder rats had not been touched.

Pairs of rats were even able to cooperate across continents using cyberspace. Brain signals from an encoder rat at the Edmond and Lily Safra International Institute of Neuroscience of Natal in Brazil were sent to a decoder in Nicolelis’s lab in North Carolina via the internet. Though there was a slight transmission delay, the decoder rat still performed with an accuracy similar to that of rats paired with nearby encoders.
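
At heart, the long-distance version is ordinary networking: the decoded choice is serialised and sent over the internet to the machine driving the decoder’s implant. Below is a minimal sketch assuming a one-byte message per trial over TCP; the host, port and message format are invented, as the labs’ actual software is not described in the article.

```python
# Hedged sketch: one process streams decoded choices over TCP, another
# receives them and would drive the stimulator. Host, port and the
# one-byte message format are assumptions, not the labs' real software.
import socket
import threading

HOST, PORT = "127.0.0.1", 9999        # stand-in for the Natal-to-Durham link
srv = socket.create_server((HOST, PORT))

def decoder_side():
    conn, _ = srv.accept()
    with conn:
        while (msg := conn.recv(1)):                       # one byte per trial
            print("stimulate decoder for:", msg.decode())  # 'L' or 'R'

t = threading.Thread(target=decoder_side)
t.start()

with socket.create_connection((HOST, PORT)) as enc:
    for choice in "LRLL":             # decoded encoder choices, one per trial
        enc.sendall(choice.encode())

t.join()
srv.close()
```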

Christopher James at the University of Warwick, UK, who works on brain-to-machine interfaces for prostheses, says the work is a “wake-up call” for people who haven’t caught up with recent advances in brain research.

We have the technology to create implants for long-term use, he says. What is missing, though, is a full understanding of the brain processes involved. In this case, Nicolelis’s team is “blasting a relatively large area of the brain with a signal they’re not sure is 100 per cent correct,” he says.

That’s because the exact information being communicated between the rats’ brains is not clear. The brain activity of the encoders cannot be transferred precisely to the decoders because that would require matching the patterns neuron for neuron, which is not currently possible. Instead, the two patterns are closely related in terms of their frequency and spatial representation.

“We are still using a sledgehammer to crack a walnut,” says James. “They’re not hearing the voice of God.” But the rats are certainly sending and receiving more than a binary signal that simply points to one or other lever, he says. “I think it will be possible one day to transfer an abstract thought.”

The decoders have to interpret relatively complex brain patterns, says Marshall Shuler at Johns Hopkins University in Baltimore, Maryland. The animals learn the relevance of these new patterns and their brains adapt to the signals. “But the decoders are probably not having the same quality of experience as the encoders,” he says.

Patrick Degenaar at Newcastle University in the UK says that the military might one day be able to deploy genetically modified insects or small mammals that are controlled by the brain signals of a remote human operator. These would be drones that could feed themselves, he says, and could be used for surveillance or even assassination missions. “You’d probably need a flying bug to get near the head [of someone to be targeted],” he says.

Nicolelis is most excited about the future of multiple networked brains. He is currently trialling the implants in monkeys, getting them to work together telepathically to complete a task. For example, each monkey might only have access to part of the information needed to make the right decision in a game. Several monkeys would then need to communicate with each other in order to successfully complete the task.

“In the distant future we may be able to communicate via a brain-net,” says Nicolelis. “I would be very glad if the brain-net my great grandchildren used was due to their great grandfather’s work.”

Journal reference: Scientific Reports, DOI: 10.1038/srep01319

Lab rats given a 6th sense through a brain-machine interface


Duke University researchers have effectively given laboratory rats a “sixth sense” using an implant in their brains.

An experimental device allowed the rats to “touch” infrared light – which is normally invisible to them.

The team fitted the rats with an infrared detector wired up to microscopic electrodes implanted in the part of their brains that processes tactile information.

The results of the study were published in the journal Nature Communications.

The researchers say that, in theory at least, a human with a damaged visual cortex might be able to regain sight through a device implanted in another part of the brain.

Lead author Miguel Nicolelis said this was the first time a brain-machine interface had augmented a sense in adult animals.

The experiment also shows that a new sensory input can be interpreted by a region of the brain that normally does something else (without having to “hijack” the function of that brain region).

“We could create devices sensitive to any physical energy,” said Prof Nicolelis, from the Duke University Medical Center in Durham, North Carolina.

“It could be magnetic fields, radio waves, or ultrasound. We chose infrared initially because it didn’t interfere with our electrophysiological recordings.”

His colleague Eric Thomson commented: “The philosophy of the field of brain-machine interfaces has until now been to attempt to restore a motor function lost to lesion or damage of the central nervous system.

“This is the first paper in which a neuroprosthetic device was used to augment function – literally enabling a normal animal to acquire a sixth sense.”

In their experiments, the researchers used a test chamber with three light sources that could be switched on randomly.

They taught the rats to choose the active light source by poking their noses into a port to receive a sip of water as a reward. They then implanted the microelectrodes, each about a tenth the diameter of a human hair, into the animals’ brains. These electrodes were attached to the infrared detectors.
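
The article does not spell out how the detector’s output becomes cortical stimulation. One plausible scheme, sketched below purely as an assumption, maps infrared intensity to the rate of microstimulation pulses, so the evoked “touch” grows stronger as the detector points at the source; the threshold and maximum rate are invented for illustration.

```python
# Assumed coupling, for illustration only: detector intensity mapped to
# microstimulation pulse rate; threshold and maximum rate are invented.
def ir_to_pulse_rate(ir_reading, threshold=0.05, max_rate_hz=400.0):
    """Map a normalised IR reading (0 to 1) to a stimulation rate in Hz.

    Below `threshold` the detector is treated as seeing nothing; above
    it, the pulse rate grows linearly with intensity.
    """
    if ir_reading < threshold:
        return 0.0
    return max_rate_hz * (ir_reading - threshold) / (1.0 - threshold)

# As the rat sweeps its head past the active source, the evoked "touch"
# rises and falls with how directly the detector faces the light:
for reading in (0.0, 0.1, 0.4, 0.9):
    print(f"IR={reading:.1f} -> {ir_to_pulse_rate(reading):6.1f} Hz")
```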

The scientists then returned the animals to the test chamber. At first, the rats scratched at their faces, indicating that they were interpreting the lights as touch. But after a month the animals learned to associate the signal in their brains with the infrared source.

They began to search actively for the signal, eventually achieving perfect scores in tracking and identifying the correct location of the invisible light source.

One key finding was that enlisting the touch cortex to detect infrared light did not reduce its ability to process touch signals.

http://www.bbc.co.uk/news/science-environment-21459745

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Humans can learn a new sense: ‘Whisking’

Rats use a sense that humans don’t: whisking. They move their facial whiskers back and forth about eight times a second to locate objects in their environment. Could humans acquire this sense? And if they can, what could understanding the process of adapting to new sensory input tell us about how humans normally sense? At the Weizmann Institute, researchers explored these questions by attaching plastic “whiskers” to the fingers of blindfolded volunteers and asking them to carry out a location task. The findings, which recently appeared in the Journal of Neuroscience, have yielded new insight into the process of sensing, and they may point to new avenues in developing aids for the blind.

The scientific team, including Drs. Avraham Saig and Goren Gordon, and Eldad Assa in the group of Prof. Ehud Ahissar and Dr. Amos Arieli, all of the Neurobiology Department, attached a “whisker” – a 30-cm-long elastic “hair” with position and force sensors at its base – to the index finger of each hand of a blindfolded subject. Two poles were then placed at arm’s length on either side and slightly to the front of the seated subject, with one a bit farther back than the other. Using just their whiskers, the subjects were challenged to figure out which pole – left or right – was the back one. As the experiment continued, the displacement between front and back poles was reduced, up to the point at which the subject could no longer distinguish front from back.

On the first day of the experiment, subjects picked up the new sense so well that they could correctly identify a pole that was set back by only eight cm. An analysis of the data revealed that the subjects did this by deriving spatial information from sensory timing: moving their bewhiskered hands together, they could tell which pole was the back one because the whisker on that hand made contact earlier.

When they repeated the testing the next day, the researchers discovered that the subjects had improved their whisking skills significantly: the average sensory threshold went down to just three cm, with some subjects able to sense a displacement of just one cm. Interestingly, the subjects’ ability to sense time differences had not changed over the two days. Rather, they had improved the motor aspects of their whisking strategy: slowing down their hand motions – in effect lengthening the delay time – enabled them to sense a smaller spatial difference.
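
The arithmetic behind that strategy is simple: for a pole offset dx and hand speed v, the two whiskers’ contacts are separated by dt = dx / v, so halving the speed doubles the time cue for the same offset. A back-of-envelope sketch, with all numbers illustrative rather than taken from the study:

```python
# All numbers illustrative, not the study's measurements.
def contact_time_difference(dx_cm, hand_speed_cm_s):
    """Time between the two whiskers' contacts for a pole offset dx."""
    return dx_cm / hand_speed_cm_s

# Slowing the hands stretches the time cue for the same 3 cm offset:
for speed in (40.0, 20.0, 10.0):
    dt = contact_time_difference(dx_cm=3.0, hand_speed_cm_s=speed)
    print(f"hand speed {speed:4.0f} cm/s -> time cue {dt * 1000:5.1f} ms")
```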

Saig: “We know that our senses are linked to muscles, for example ocular and hand muscles. In order to sense the texture of cloth, for example, we move our fingers across it, and to see a stationary object, our eyes must be in constant motion. In this research, we see that changing our physical movements alone – without any corresponding change in the sensitivity of our senses – can be sufficient to sharpen our perception.”

Based on the experiments, the scientists created a statistical model to describe how the subjects updated their “world view” as they acquired new sensory information – up to the point at which they were confident enough to rely on that sense. The model, based on principles of information processing, could explain the number of whisking movements needed to arrive at the correct answer, as well as the pattern of scanning the subjects employed – a gradual change from long to short movements. With this strategy, the flow of information remains constant. “The experiment was conducted in a controlled manner, which allowed us direct access to all the relevant variables: hand motion, hand-pole contact and the reports of the subjects themselves,” says Gordon. “Not only was there a good fit between the theory and the experimental data, we obtained some useful quantitative information on the process of active sensing.”
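
The published model is not reproduced here, but the general shape of such an evidence-accumulation scheme can be sketched: each whisk yields a noisy timing observation, a belief about which pole is the back one is updated, and scanning stops once the belief crosses a confidence threshold. The priors, noise level and threshold below are assumptions, not the paper’s parameters.

```python
# Assumed priors, noise and threshold; not the published model's parameters.
import math
import random

def whisk(true_dt_ms=5.0, noise_ms=8.0):
    """One whisk: a noisy observation of the contact-time difference."""
    return random.gauss(true_dt_ms, noise_ms)

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

p_left = 0.5          # prior: either pole equally likely to be the back one
whisks = 0
while 0.05 < p_left < 0.95 and whisks < 100:
    obs = whisk()                                  # noisy timing evidence
    like_left = gauss_pdf(obs, +5.0, 8.0)          # left pole back: dt > 0
    like_right = gauss_pdf(obs, -5.0, 8.0)         # right pole back: dt < 0
    p_left = like_left * p_left / (like_left * p_left + like_right * (1 - p_left))
    whisks += 1

answer = "left" if p_left > 0.5 else "right"
print(f"decided '{answer}' after {whisks} whisks (P(left) = {p_left:.2f})")
```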

“Both sight and touch are based on arrays of receptors that scan the outside world in an active manner,” says Ahissar. “Our findings reveal some new principles of active sensing, and show us that activating a new artificial sense in a ‘natural’ way can be very efficient.” Arieli adds: “Our vision for the future is to help blind people ‘see’ with their fingers. Small devices that translate video to mechanical stimulation, based on principles of active sensing that are common to vision and touch, could provide an intuitive, easily used sensory aid.”