The possible risks of multitasking to your brain


by Dr. Travis Bradberry

You may have heard that multitasking is bad for you, but new studies show that it kills your performance and may even damage your brain. Every time you multitask you aren’t just harming your performance in the moment; you may very well be damaging an area of your brain that’s critical to your future success at work.

Research conducted at Stanford University found that multitasking is less productive than doing a single thing at a time. The researchers found that people who are regularly bombarded with several streams of electronic information cannot pay attention, recall information, or switch from one job to another as well as those who complete one task at a time.

A Special Skill?

But what if some people have a special gift for multitasking? The Stanford researchers compared groups of people based on their tendency to multitask and their belief that it helps their performance. They found that heavy multitaskers—those who multitask a lot and feel that it boosts their performance—were actually worse at multitasking than those who like to do a single thing at a time. The frequent multitaskers performed worse because they had more trouble organizing their thoughts and filtering out irrelevant information, and they were slower at switching from one task to another.

Multitasking reduces your efficiency and performance because your brain can only focus on one thing at a time. When you try to do two things at once, your brain lacks the capacity to perform both tasks successfully.

Multitasking Lowers IQ

Research also shows that, in addition to slowing you down, multitasking lowers your IQ. A study at the University of London found that participants who multitasked during cognitive tasks experienced IQ score declines similar to what they’d expect if they had smoked marijuana or stayed up all night. For multitasking men, IQ dropped 15 points, lowering their scores to the average range of an 8-year-old child.

So the next time you’re writing your boss an email during a meeting, remember that your cognitive capacity is being diminished to the point that you might as well let an 8-year-old write it for you.


Brain Damage From Multitasking?

It was long believed that cognitive impairment from multitasking was temporary, but new research suggests otherwise. Researchers at the University of Sussex in the UK compared the amount of time people spend on multiple devices (such as texting while watching TV) to MRI scans of their brains. They found that high multitaskers had less brain density in the anterior cingulate cortex, a region responsible for empathy as well as cognitive and emotional control.

While more research is needed to determine if multitasking is physically damaging the brain (versus existing brain damage that predisposes people to multitask), it’s clear that multitasking has negative effects.

Neuroscientist Kep Kee Loh, the study’s lead author, explained the implications:

“I feel that it is important to create an awareness that the way we are interacting with the devices might be changing the way we think and these changes might be occurring at the level of brain structure.”

The EQ Connection

Nothing turns people off quite like fiddling with your phone or tablet during a conversation. Multitasking in meetings and other social settings indicates low Self- and Social Awareness, two emotional intelligence (EQ) skills that are critical to success at work. TalentSmart has tested more than a million people and found that 90% of top performers have high EQs. If multitasking does indeed damage the anterior cingulate cortex (a key brain region for EQ) as current research suggests, doing so will lower your EQ while it alienates your coworkers.

Bringing It All Together

If you’re prone to multitasking, this is not a habit you’ll want to indulge—it clearly slows you down and decreases the quality of your work. Even if it doesn’t cause brain damage, allowing yourself to multitask will fuel any existing difficulties you have with concentration, organization, and attention to detail.

Small RNA identified that offers clues for quieting the “voices” of schizophrenia


St. Jude Children’s Research Hospital scientists have linked disruption of a brain circuit associated with schizophrenia to an age-related decline in levels of a single microRNA in one brain region.

St. Jude Children’s Research Hospital scientists have identified a small RNA (microRNA) that may be essential to restoring normal function in a brain circuit associated with the “voices” and other hallucinations of schizophrenia. The microRNA provides a possible focus for antipsychotic drug development. The findings appear today in the journal Nature Medicine.

The work was done in a mouse model of a human disorder that is one of the genetic causes of schizophrenia. Building on previous St. Jude research, the results offer important new details about the molecular mechanism that disrupts the flow of information along a neural circuit connecting two brain regions involved in processing auditory information. The findings also provide clues about why psychotic symptoms of schizophrenia are often delayed until late adolescence or early adulthood.

“In 2014, we identified the specific circuit in the brain that is targeted by antipsychotic drugs. However, the existing antipsychotics also cause devastating side effects,” said corresponding author Stanislav Zakharenko, M.D., Ph.D., a member of the St. Jude Department of Developmental Neurobiology. “In this study, we identified the microRNA that is a key player in disruption of that circuit and showed that depletion of the microRNA was necessary and sufficient to inhibit normal functioning of the circuit in the mouse models.

“We also found evidence suggesting that the microRNA, named miR-338-3p, could be targeted for development of a new class of antipsychotic drugs with fewer side effects.”

There are more than 2,000 microRNAs whose function is to silence expression of particular genes and regulate the supply of the corresponding proteins. Working in a mouse model of 22q11 deletion syndrome, researchers identified miR-338-3p as the microRNA that regulates production of the protein D2 dopamine receptor (Drd2), which is the prime target of antipsychotics.

Individuals with the deletion syndrome are at risk for behavior problems as children. Between 23 and 43 percent develop schizophrenia, a severe chronic disorder that affects thinking, memory and behavior. Researchers at St. Jude are studying schizophrenia and other brain disorders to improve understanding of how normal brains develop, which provides insights into the origins of diseases like cancer.

The scientists reported that Drd2 increased in the brain’s auditory thalamus when levels of the microRNA declined. Previous research from Zakharenko’s laboratory linked elevated levels of Drd2 in the auditory thalamus to brain-circuit disruptions in the mutant mice. Investigators also reported that the protein was elevated in the same brain region of individuals with schizophrenia, but not healthy adults.

Individuals with the deletion syndrome are missing part of chromosome 22, which leaves them with one rather than the normal two copies of more than 25 genes. The missing genes included Dgcr8, which facilitates production of microRNAs.

Working in mice, researchers have now linked the 22q11 deletion syndrome and deletion of a single Dgcr8 gene to age-related declines in miR-338-3p in the auditory thalamus. The decline was associated with an increase in Drd2 and reduced signaling in the circuit that links the thalamus and auditory cortex, a brain region implicated in auditory hallucination. Levels of miR-338-3p were lower in the thalamus of individuals with schizophrenia compared to individuals of the same age and sex without the diagnosis.

The miR-338-3p depletion did not disrupt other brain circuits in the mutant mice, and the findings offer a possible explanation. Researchers found that miR-338-3p levels were higher in the thalamus than in other brain regions. In addition, miR-338-3p was one of the most abundant microRNAs present in the thalamus.

Replenishing levels of the microRNA in the auditory thalamus of mutant mice reduced Drd2 protein and restored the circuit to normal functioning. That suggests that the microRNA could be the basis for a new class of antipsychotic drugs that act in a more targeted manner with fewer side effects. Antipsychotic drugs, which target Drd2, also restored circuit function.

The findings provide insight into the age-related delay in the onset of schizophrenia symptoms. Researchers noted that microRNA levels declined with age in all mice, but that mutant mice began with lower levels of miR-338-3p. “A minimum level of the microRNA may be necessary to prevent excessive production of the Drd2 that disrupts the circuit,” Zakharenko said. “While miR-338-3p levels decline as normal mice age, levels may remain above the threshold necessary to prevent overexpression of the protein. In contrast, the deletion syndrome may leave mice at risk for dropping below that threshold.”
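
Zakharenko’s threshold explanation can be made concrete with a toy calculation. The sketch below is purely illustrative and is not taken from the paper: the decline rate, starting levels, and threshold are arbitrary values chosen only to show how a lower starting level of miR-338-3p could cross a critical threshold around late adolescence while the normal trajectory never does.

```python
# Purely illustrative toy model (not from the paper). Two hypothetical
# miR-338-3p trajectories decline with age at the same rate; the 22q11
# "deletion" trajectory starts lower, so it is the first to fall below an
# assumed threshold at which Drd2 becomes overexpressed and the
# thalamocortical circuit is disrupted. All values are arbitrary units.

DECLINE_PER_YEAR = 0.008   # assumed rate of age-related miRNA loss
THRESHOLD = 0.55           # assumed minimum level that keeps Drd2 in check

def mir_level(age, start_level):
    """Hypothetical linear decline of miR-338-3p with age."""
    return max(start_level - DECLINE_PER_YEAR * age, 0.0)

for age in range(0, 41, 5):
    normal = mir_level(age, start_level=1.0)    # two intact gene copies
    deletion = mir_level(age, start_level=0.7)  # 22q11 deletion: lower start
    status = "below threshold -> Drd2 excess" if deletion < THRESHOLD else "above threshold"
    print(f"age {age:2d}: normal {normal:.2f}, deletion {deletion:.2f} ({status})")
```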

The study’s first authors are Sungkun Chun, Fei Du and Joby Westmoreland, all formerly of St. Jude. The other authors are Seung Baek Han, Yong-Dong Wang, Donnie Eddins, Ildar Bayazitov, Prakash Devaraju, Jing Yu, Marcia Mellado Lagarde and Kara Anderson, all of St. Jude.

https://www.stjude.org/media-resources/news-releases/2016-medicine-science-news/small-rna-identified-that-offers-clues-for-quieting-the-voices-of-schizophrenia.html

‘Brain wi-fi’ shown to reverse leg paralysis in primates

By James Gallagher

An implant that beams instructions out of the brain has been used to restore movement in paralysed primates for the first time, say scientists.

Rhesus monkeys were paralysed in one leg due to a damaged spinal cord. The team at the Swiss Federal Institute of Technology bypassed the injury by sending the instructions straight from the brain to the nerves controlling leg movement. Experts said the technology could be ready for human trials within a decade.

Spinal-cord injuries block the flow of electrical signals from the brain to the rest of the body resulting in paralysis. It is a wound that rarely heals, but one potential solution is to use technology to bypass the injury.

In the study, a chip was implanted into the part of the monkeys’ brain that controls movement. Its job was to read the spikes of electrical activity that are the instructions for moving the legs and send them to a nearby computer. It deciphered the messages and sent instructions to an implant in the monkey’s spine to electrically stimulate the appropriate nerves. The process all takes place in real time. The results, published in the journal Nature, showed the monkeys regained some control of their paralysed leg within six days and could walk in a straight line on a treadmill.
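
The paragraph above describes a three-stage, real-time pipeline: record motor-cortex spikes, decode the intended leg movement, and stimulate the matching spinal nerves below the injury. The sketch below shows that control loop in schematic form; every function and parameter in it (read_spikes, decode_gait_phase, stimulate, the channel count and update rate) is a hypothetical placeholder, not the study’s software or any real device API.

```python
# Minimal sketch of the decode-and-stimulate loop described above.
# All interfaces here are hypothetical placeholders for the implanted hardware.
import random
import time

N_CHANNELS = 96  # assumed electrode count for the motor-cortex implant

def read_spikes():
    """Placeholder: per-channel spike counts from the brain implant."""
    return [random.randint(0, 20) for _ in range(N_CHANNELS)]

def decode_gait_phase(spike_counts):
    """Placeholder decoder: map population activity to a gait phase."""
    return "swing" if sum(spike_counts) > 10 * N_CHANNELS else "stance"

def stimulate(phase):
    """Placeholder: tell the spinal implant to stimulate nerves for this phase."""
    print(f"stimulating leg nerves for {phase} phase")

for _ in range(5):                             # runs continuously on the real system
    phase = decode_gait_phase(read_spikes())   # 1. record, 2. decode intended movement
    stimulate(phase)                           # 3. stimulate below the injury
    time.sleep(0.02)                           # assumed ~50 Hz update cycle
```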

Dr Gregoire Courtine, one of the researchers, said: “This is the first time that a neurotechnology has restored locomotion in primates.” He told the BBC News website: “The movement was close to normal for the basic walking pattern, but so far we have not been able to test the ability to steer.” The technology used to stimulate the spinal cord is the same as that used in deep brain stimulation to treat Parkinson’s disease, so it would not be a technological leap to doing the same tests in patients. “But the way we walk is different to primates, we are bipedal and this requires more sophisticated ways to stimulate the muscle,” said Dr Courtine.

Jocelyne Bloch, a neurosurgeon from the Lausanne University Hospital, said: “The link between decoding of the brain and the stimulation of the spinal cord is completely new. For the first time, I can imagine a completely paralysed patient being able to move their legs through this brain-spine interface.”

Using technology to overcome paralysis is a rapidly developing field:
Brainwaves have been used to control a robotic arm
Electrical stimulation of the spinal cord has helped four paralysed people stand again
An implant has helped a paralysed man play a guitar-based computer game

Dr Mark Bacon, the director of research at the charity Spinal Research, said: “This is quite impressive work. Paralysed patients want to be able to regain real control, that is voluntary control of lost functions, like walking, and the use of implantable devices may be one way of achieving this. The current work is a clear demonstration that there is progress being made in the right direction.”

Dr Andrew Jackson, from the Institute of Neuroscience at Newcastle University, said: “It is not unreasonable to speculate that we could see the first clinical demonstrations of interfaces between the brain and spinal cord by the end of the decade.” However, he said, rhesus monkeys used all four limbs to move and only one leg had been paralysed, so it would be a greater challenge to restore the movement of both legs in people. “Useful locomotion also requires control of balance, steering and obstacle avoidance, which were not addressed,” he added.

The other approach to treating paralysis involves transplanting cells from the nasal cavity into the spinal cord to try to biologically repair the injury. Following this treatment, Darek Fidyka, who was paralysed from the chest down in a knife attack in 2010, can now walk using a frame.

Neither approach is ready for routine use.

http://www.bbc.com/news/health-37914543

Thanks to Kebmodee for bringing this to the It’s Interesting community.

US military enhancing human skills with electrical brain stimulation


Study paves way for personnel such as drone operators to have electrical pulses sent into their brains to improve effectiveness in high-pressure situations.

US military scientists have used electrical brain stimulators to enhance mental skills of staff, in research that aims to boost the performance of air crews, drone operators and others in the armed forces’ most demanding roles.

The successful tests of the devices pave the way for servicemen and women to be wired up at critical times of duty, so that electrical pulses can be beamed into their brains to improve their effectiveness in high-pressure situations.

The brain stimulation kits use five electrodes to send weak electric currents through the skull and into specific parts of the cortex. Previous studies have found evidence that by helping neurons to fire, these minor brain zaps can boost cognitive ability.

The technology is seen as a safer alternative to prescription drugs, such as modafinil and Ritalin, both of which have been used off-label as performance-enhancing drugs in the armed forces.

But while electrical brain stimulation appears to have no harmful side effects, some experts say its long-term safety is unknown, and raise concerns about staff being forced to use the equipment if it is approved for military operations.

Others are worried about the broader implications of the science for the general workforce, given the advance of an unregulated technology.

In a new report, scientists at Wright-Patterson Air Force Base in Ohio describe how the performance of military personnel can slump soon after they start work if the demands of the job become too intense.

“Within the air force, various operations such as remotely piloted and manned aircraft operations require a human operator to monitor and respond to multiple events simultaneously over a long period of time,” they write. “With the monotonous nature of these tasks, the operator’s performance may decline shortly after their work shift commences.”

But in a series of experiments at the air force base, the researchers found that electrical brain stimulation can improve people’s multitasking skills and stave off the drop in performance that comes with information overload. Writing in the journal Frontiers in Human Neuroscience, they say that the technology, known as transcranial direct current stimulation (tDCS), has a “profound effect”.

For the study, the scientists had men and women at the base take a test developed by Nasa to assess multitasking skills. The test requires people to keep a crosshair inside a moving circle on a computer screen, while constantly monitoring and responding to three other tasks on the screen.

To investigate whether tDCS boosted people’s scores, half of the volunteers had a constant two milliamp current beamed into the brain for the 36-minute-long test. The other half formed a control group and had only 30 seconds of stimulation at the start of the test.

According to the report, the brain stimulation group started to perform better than the control group four minutes into the test. “The findings provide new evidence that tDCS has the ability to augment and enhance multitasking capability in a human operator,” the researchers write. Larger studies must now look at whether the improvement in performance is real and, if so, how long it lasts.
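
To make the design concrete, here is a minimal sketch, with entirely invented numbers, of how an active-stimulation group might be compared with a sham group minute by minute over a 36-minute test like the one described above. Nothing here reproduces the study’s data; the scoring function, group sizes, and decay rates are assumptions for illustration only.

```python
# Illustrative only: synthetic scores for an active-tDCS group and a sham
# group on a 36-minute multitasking test, mirroring the design described
# above. All numbers are invented; this does not reproduce the study's data.
import random

random.seed(42)
GROUP_SIZE = 10  # assumed number of volunteers per group

def synthetic_score(group, minute):
    """Assumed pattern: sham performance decays with time-on-task; active holds up."""
    decay = 0.9 if group == "sham" else 0.2
    return 100 - decay * minute + random.gauss(0, 3)

for minute in (1, 4, 12, 24, 36):
    active = sum(synthetic_score("active", minute) for _ in range(GROUP_SIZE)) / GROUP_SIZE
    sham = sum(synthetic_score("sham", minute) for _ in range(GROUP_SIZE)) / GROUP_SIZE
    print(f"minute {minute:2d}: active {active:5.1f}  sham {sham:5.1f}  gap {active - sham:+5.1f}")
```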

The tests are not the first to claim beneficial effects from electrical brain stimulation. Last year, researchers at the same US facility found that tDCS seemed to work better than caffeine at keeping military target analysts vigilant after long hours at the desk. Brain stimulation has also been tested for its potential to help soldiers spot snipers more quickly in VR training programmes.

Neil Levy, deputy director of the Oxford Centre for Neuroethics, said that compared with prescription drugs, electrical brain stimulation could actually be a safer way to boost the performance of those in the armed forces. “I have more serious worries about the extent to which participants can give informed consent, and whether they can opt out once it is approved for use,” he said. “Even for those jobs where attention is absolutely critical, you want to be very careful about making it compulsory, or there being a strong social pressure to use it, before we are really sure about its long-term safety.”

But while the devices may be safe in the hands of experts, the technology is freely available, because the sale of brain stimulation kits is unregulated. They can be bought on the internet or assembled from simple components, which raises a greater concern, according to Levy. Young people whose brains are still developing may be tempted to experiment with the devices, and try higher currents than those used in laboratories, he says. “If you use high currents you can damage the brain,” he says.

In 2014 another Oxford scientist, Roi Cohen Kadosh, warned that while brain stimulation could improve performance at some tasks, it made people worse at others. In light of the work, Kadosh urged people not to use brain stimulators at home.

If the technology is proved safe in the long run though, it could help those who need it most, said Levy. “It may have a levelling-up effect, because it is cheap and enhancers tend to benefit the people that perform less well,” he said.

https://www.theguardian.com/science/2016/nov/07/us-military-successfully-tests-electrical-brain-stimulation-to-enhance-staff-skills

Thanks to Kebmodee for bringing this to the It’s Interesting community.

Pain Sensitivity Can Be Socially Transmitted Via Olfactory Cues

by Tori Rodriguez, MA, LPC

The social transmission of emotions has been reported in several studies in recent years. Research published in 2013, for example, found that joy and fear are transmissible between people, while a 2011 study showed that stress — as measured by an increase in cortisol — can be transmitted from others who are under pressure.1,2 Results of a new study that appeared in Science Advances suggest that pain may also be communicable.3

“Being able to perceive and communicate pain to others probably gives an evolutionary advantage to animals,” study co-author Andrey E. Ryabinin, PhD, a professor of behavioral neuroscience at Oregon Health & Science University, told Clinical Pain Advisor. Such awareness may trigger self-protective or caretaking behaviors, for instance, that facilitate the survival of the individual and the group.

In the current study, Ryabinin and colleagues investigated whether “bystander” mice would develop hyperalgesia after being housed in the same room as “primary” mice who had received a noxious stimulus. In one experiment, the paws of primary mice were injected with complete Freund’s adjuvant (CFA), which, as expected, induced persistent hypersensitivity that was apparent for 2 weeks. Bystander mice who had been injected with phosphate-buffered saline (PBS) similarly demonstrated hypersensitivity throughout the same 2-week period.

Bystander mice also displayed acquired hypersensitivity in another set of experiments in which primary mice experienced pain related to withdrawal from morphine and alcohol. This suggests that the transfer of hyperalgesia is not limited to the effects of inflammatory stimuli. In addition, the transfer was consistent across mechanical, thermal, and chemical modalities of nociception.

Tests revealed that nociceptive thresholds returned to basal levels in both primary and bystander mice within 4 days, and the transferred hyperalgesia was not accounted for by familiarity, as the effects were similar between mice that were not familiar with the others and those that were.

Finally, the authors determined that the transfer of hyperalgesia was mediated by olfactory cues (as measured by exposing naïve mice to the bedding of hypersensitive co-housed mice), and it could not be accounted for by anxiety, visual cues, or stress-induced hyperalgesia.

Future research is needed to pinpoint the molecular messenger involved in the transfer of hyperalgesia and to determine whether a similar process occurs in humans.

“Here we show for the first time that you do not need an injury or inflammation to develop a pain state–pain can develop simply because of social cues,” said Dr Ryabinin. These findings have important implications for the treatment of chronic pain patients. “We cannot dismiss people with chronic pain if they have no physical pathology. They can be in pain without the pathology and need to be treated for their pain despite lack of injury.”

References
1. Dezecache G, Conty L, Chadwick M, et al. Evidence for unintentional emotional contagion beyond dyads. PLoS One. 2013;8(6):e67371.
2. Buchanan TW, Bagley SL, Stansfield RB, Preston SD. The empathic, physiological resonance of stress. Soc Neurosci. 2012;7(2):191-201.
3. Smith ML, Hostetler CM, Heinricher MM, Ryabinin AE. Social transfer of pain in mice. Sci Adv. 2016;2(10):e1600855.

http://www.psychiatryadvisor.com/anxiety/social-transfer-of-hyperalgesia/article/571087/?DCMP=EMC-PA_Update_RD&cpn=psych_md%2cpsych_all&hmSubId=&NID=1710903786&dl=0&spMailingID=15837872&spUserID=MTQ4MTYyNjcyNzk2S0&spJobID=902320519&spReportId=OTAyMzIwNTE5S0

Birds with bigger brains appear to be better able to avoid getting shot

by Jaymi Heimbuch

When a flock of geese flies into the air and a hunter takes aim, which bird is most likely to drop from the sky? A new study published in the journal Biology Letters shows that those birds with larger brains relative to their body size are less likely to be shot by hunters.

PhysOrg reports:

The researchers found that those birds with smaller brains (relative to the size of their bodies) were more likely to be shot and catalogued—as were males and larger birds in general. The team looked at a variety of factors such as organ size, body mass, gender, species, color, etc., and found one factor that stood out very clearly from the rest—birds with larger brains were 30 times less likely to be shot and killed. This, the team suggests, indicates that hunting is very likely having an evolutionary impact on animals that are hunted by humans. They do not believe that hunters are specifically targeting smaller species; it’s more likely that those with larger brains have learned to be wary of humans.
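
As a rough illustration of what a figure like “30 times less likely” means, the snippet below computes an odds ratio from invented shot/not-shot tallies. The counts are made up for the example, and the actual analysis also controlled for the other factors listed above.

```python
# Worked example with invented counts (not the study's data) showing how an
# odds ratio like "30 times less likely to be shot" can be computed from
# shot / not-shot tallies for small-brained vs large-brained birds.
small_brained = {"shot": 60, "not_shot": 40}   # hypothetical tallies
large_brained = {"shot": 5, "not_shot": 100}

def odds(group):
    """Odds of being shot = shot / not shot."""
    return group["shot"] / group["not_shot"]

odds_ratio = odds(small_brained) / odds(large_brained)
print(f"odds (small-brained): {odds(small_brained):.2f}")
print(f"odds (large-brained): {odds(large_brained):.2f}")
print(f"odds ratio: {odds_ratio:.0f}x higher odds of being shot for small-brained birds")
```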

Brain size is of course not the only possible factor for which bird ends up on a hunter’s dinner table. But the ability to distinguish danger with more clarity than your compatriots certainly helps, and the researchers point out that brain size might be part of that ability.

Potential target identified for preventing long-term effects of explosion-mediated traumatic brain injury

BY: JENNIFER BROWN

More than 200,000 U.S. soldiers serving in the Middle East have experienced a blast-related traumatic brain injury, making it a common health problem and concern for that population.

Traumatic brain injury (TBI) can have various harmful long-term neurological effects, including problems with vision, coordination, memory, mood, and thinking. According to the Centers for Disease Control and Prevention, TBI from a head injury is a leading cause of death and disability in the United States, and close to 5 million Americans—soldiers and non-soldiers alike—are currently living with a TBI-related disability. Current therapy for these patients involves supportive care and rehabilitation, but no treatments are available that can prevent the development of chronic neurological symptoms.

Researchers from the University of Iowa believe they may have identified a potential approach for preventing the development of neurological problems associated with TBI. Their research in mice suggests that protecting axons—the fiber-like projections that connect brain cells—prevents the long-term neuropsychiatric problems caused by blast-related traumatic brain injury.

In a recent study, the UI team led by Andrew Pieper, professor of psychiatry at the UI Carver College of Medicine, investigated whether early damage to axons—an event that is strongly associated with many forms of brain injury, including blast-related TBI—is simply a consequence of the injury or whether it is a driving cause of the subsequent neurological and psychiatric symptoms.

To answer that question, the researchers used mice with a genetic mutation that protects axons from some forms of damage. The mutation works by maintaining normal levels of an important energy metabolite known as nicotinamide adenine dinucleotide (NAD) in brain cells after injury.

When mice with the mutation experienced blast-mediated TBI, their axons were protected from damage, and they did not develop the vision problems or the thinking and movement difficulties that were seen when mice without the mutation experienced blast-related TBI. The findings were published Oct. 11 in the online journal eNeuro.

“Our work strongly suggests that early axonal injury appears to be a critical driver of neurobehavioral complications after blast-TBI,” says Pieper, who also is a professor of neurology and radiation oncology, and a physician with the Iowa City Veterans Affairs Health Care System.

“Therefore, future therapeutic strategies targeted specifically at protecting or augmenting the health of axons may provide a uniquely beneficial approach for preventing these patients from developing neurologic symptoms after blast exposure.”

In confirming the critical relationship between axon degeneration and development of subsequent neurological complications, the new study builds on previous work from Pieper’s lab. The researchers also have discovered a series of neuroprotective compounds that appear to help axons survive the kind of early damage seen in TBI. These compounds activate a molecular pathway that preserves neuronal levels of NAD, the energy metabolite that has been shown to be critical to the health of axons. Pieper’s team previously demonstrated that these neuroprotective compounds block axonal degeneration and protect mice from harmful neurological effects of blast-TBI, even when the compounds are given 24 to 36 hours after the blast injury.

In addition to Pieper, the research team included Terry Yin, Jaymie Voorhees, Rachel Genova, Kevin Davis, Ashley Madison, Jeremiah Britt, Coral Cinton, Latisha McDaniel, and Matthew Harper. Pieper also is a member of the Pappajohn Biomedical Institute at the UI.

https://now.uiowa.edu/2016/10/study-traumatic-brain-injury

Boy wakes up from coma speaking an entirely different language

By Cari Romm

You may have heard of foreign-accent syndrome, a rare and mysterious condition in which someone suffers a brain injury and suddenly — true to the name — begins speaking in a new accent. Last year, for example, a woman from Ontario began speaking in the regional accent of the Canadian East Coast after a stroke, despite the fact that she’d never visited or met anyone from that particular part of the country. Just a few months ago, a woman in Texas developed a British accent following dental surgery.

Both women are members of a pretty exclusive club: Scientists estimate that foreign-accent syndrome strikes just one person in the world each year. And as Time reported earlier this week, a Georgia high-school student has taken things a step further: Sixteen-year-old Rueben Nsemoh recently woke up from a coma speaking fluent Spanish.

The patient: Last month, Nsemoh suffered a severe concussion during a soccer game, when another player accidentally kicked him in the head. When he woke up after three days in a coma, according to Time, he’d lost his English, but he could still speak: His first words were “tengo hambre,” Spanish for “I’m hungry” — and his family quickly discovered that he could now speak the language fluently, despite the fact that he had previously known only a handful of Spanish words.

The diagnosis: This isn’t the first time a patient has walked away from a head injury with a newfound linguistic ability: In 2014, an Australian man came to and discovered that he now spoke fluent Mandarin; in 2010, the same thing happened to a Croatian teen with German and a British man with French.

But these cases, like Nsemoh’s, can’t simply be explained as an extension of foreign-accent syndrome, which researchers believe isn’t really the development of a new accent at all: It’s a sign of damage to the area of the brain that controls the motor functions of speech. Any resemblance to a real foreign accent, then, is coincidental — the new speech pattern is just a new way of forcing words out of the mouth, affecting their sounds in random ways.

Seemingly absorbing an entire language overnight, on the other hand, has little to do with motor skills and everything to do with linguistic knowledge. While Nsemoh’s family hasn’t yet received an explanation for his newfound grasp of Spanish, Time noted that he’s heard the language in the past, from his brother (who studied abroad in Spain) and his classmates, meaning it’s not entirely new. For now, that remains just a clue, though the teen’s doctors may not have much longer to solve the case — for the past few weeks, their patient has been slowly regaining his English and losing his Spanish. This one, it seems, may remain un misterio for the ages.

Zika may hurt the adult brain.

By Meghan Rosen

Zika may harm grown-up brains.

The virus, which can cause brain damage in infants infected in the womb, kills stem cells and stunts their numbers in the brains of adult mice, researchers report August 18 in Cell Stem Cell. Though scientists have considered Zika primarily a threat to unborn babies, the new findings suggest that the virus may cause unknown — and potentially long-term — damage to adults as well.

In adults, Zika has been linked to Guillain-Barré syndrome, a rare neurological disorder (SN: 4/2/16, p. 29). But for most people, infection is typically mild: a headache, fever and rash lasting up to a week, or no symptoms at all. In pregnant women, though, the virus can lodge in the brain of a fetus and kill off newly developing cells (SN: 4/13/16).

If Zika targets newborn brain cells, adults may be at risk, too, reasoned neuroscientist Joseph Gleeson of Rockefeller University in New York City and colleagues. Parts of the forebrain and the hippocampus, which plays a crucial role in learning and memory, continue to generate nerve cells in adult brains.

In mice infected with Zika, the virus hit these brain regions hard. Nerve cells died and the regions generated one-fifth to one-half as many new cells compared with those of uninfected mice. The results might not translate to humans; the mice were genetically engineered to have weak immune systems, making them susceptible to Zika.

But Zika could potentially harm immunocompromised people and perhaps even healthy people in a similar way, the authors write.

A metabolic shift in neurons may provide insight into neurodegenerative diseases


A key metabolic pathway must be switched off during neuron development, or fewer neurons survive.

by Jennifer Hicks

Researchers at the Salk Institute for Biological Studies published a study in the July 12 issue of eLife that identifies the point at which there’s a dramatic metabolic shift in developing neurons. This discovery about the path a neuron takes during development could help provide insight into neurodegenerative diseases such as Alzheimer’s and Parkinson’s disease.

In a press release, Tony Hunter, American Cancer Society Professor in the Salk Molecular and Cell Biology Laboratory, said there’s relatively little understanding of how neuron metabolism is first established.

Oxidative stress leads to disruptions in neural cells, which are key players in neurodegenerative diseases like Parkinson’s or ALS. The brain needs oxygen to survive, but by knowing when and how neuron metabolism goes off track and mitochondria fail to function properly in these diseases, researchers can begin to devise ways to re-route metabolic processes to prevent degeneration.

“Aside from enabling us to understand this process during neuronal development, the work also allows us to better understand neurodegenerative disease,” added Hunter.

The researchers found that, as they develop, neurons shut off aerobic glycolysis and at the same time have to kick-start oxidative phosphorylation in order to survive. When the researchers stopped that metabolic switch from happening, the neurons died. A neuron dysfunction of any kind can potentially lead to neurodegenerative disease for a number of reasons.

http://www.forbes.com/sites/jenniferhicks/2016/07/31/a-look-at-the-metabolic-shift-in-neurons-for-insight-into-neurodegenerative-disease/#14296174e07b