DARPA Wants to Zap Your Brain to Boost Your Memory

We may go to sleep at night, but our brains don’t. Instead, they spend those quiet hours tidying up, and one of their chores is to lug memories into long-term storage boxes.

Now, a group of scientists may have found a way to give that memory-storing process a boost, by delivering precisely timed electric zaps to the brain at the exact right moments of sleep. These zaps, the researchers found, can improve memory.

And to make matters even more interesting, the team of researchers was funded by the Defense Advanced Research Projects Agency (DARPA), the U.S. agency tasked with developing technology for the military. They reported their findings July 23 in The Journal of Neuroscience.

If the findings are confirmed with additional research, the brain zaps could one day be used to help students study for a big exam, assist people at work or even treat patients with memory impairments, including those who experienced a traumatic brain injury in the military, said senior study author Praveen Pilly, a senior scientist at HRL Laboratories, a research facility focused on advancing technology.

The study involved 16 healthy adults from the Albuquerque, New Mexico, area. The first night, no experiments were run; instead, it was simply an opportunity for the participants to get accustomed to spending the night in the sleep lab while wearing the lumpy stimulation cap designed to deliver the tiny zaps to their brains. Indeed, when the researchers started the experiment, “our biggest worry [was] whether our subjects [could] sleep with all those wires,” Pilly told Live Science.

The next night, the experiment began: Before the participants fell asleep, they were shown war-like scenes and were asked to spot the location of certain targets, such as hidden bombs or snipers.

Then, the participants went to sleep, wearing the stimulation cap that not only delivered zaps but also measured brain activity using a device called an electroencephalogram (EEG). On the first night of the experiment, half of the participants received brain zaps, and half did not.

Using measurements from the EEG, the researchers aimed their electric zaps at a specific type of brain activity called “slow-wave oscillations.” These oscillations — which can be thought of as bursts of neuron activity that come and go with regularity — are known to be important for memory consolidation. They take place during two sleep stages: stage 2 (still a “light” sleep, when the heart rate slows down and body temperature drops) and stage 3 (deep sleep).

So, shortly after the participants in the zapping group fell into slow-wave oscillations, the stimulation cap would deliver slight zaps to the brain, in tune with the oscillations. The next morning, all of the participants were shown similar war-zone scenes, and the researchers measured how well they detected targets.
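The timing logic described here amounts to a closed-loop detector: watch the EEG for a slow oscillation and fire a weak pulse in phase with it. Below is a minimal Python sketch of that idea, not the HRL team's actual algorithm; the sampling rate, filter band, threshold and delay are assumed values chosen purely for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed EEG sampling rate, in Hz

def detect_down_states(eeg_uv, fs=FS, band=(0.5, 1.2), threshold_uv=-40.0):
    """Band-pass the EEG into the slow-oscillation range and return the sample
    indices where the filtered signal first crosses a negative threshold,
    a crude proxy for the down-state of each slow wave."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    slow = filtfilt(b, a, eeg_uv)
    crossings = np.where((slow[:-1] >= threshold_uv) & (slow[1:] < threshold_uv))[0]
    return slow, crossings

def stimulation_times(down_state_idx, fs=FS, delay_s=0.5):
    """Schedule a weak pulse a fixed delay after each detected down-state so it
    lands near the following up-state, i.e. roughly in phase with the wave."""
    return down_state_idx / fs + delay_s

# Synthetic demo: a 1 Hz slow oscillation plus noise, in microvolts.
t = np.arange(0, 30, 1 / FS)
eeg = 60 * np.sin(2 * np.pi * 1.0 * t) + 10 * np.random.randn(t.size)
_, down_idx = detect_down_states(eeg)
print(stimulation_times(down_idx)[:5])  # first few scheduled pulse times, in seconds
```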

Five days later, the groups were switched for the second night of experiments.

The researchers found that, the mornings after, the participants who received the brain zaps weren’t any better at detecting targets in the same scene they saw the night before, compared with those who slept without zaps. But those who received the zapping were much better at detecting the same targets in novel scenes. For example, if the original scene showed a target under a rock, the “novel” scene might show the same target-rock image, but from a different angle, according to a press release from HRL Laboratories.

Researchers call this “generalization.” Pilly explained it as follows: “If you’re [studying] for a test, you learn a fact, and then, when you’re tested the following morning on the same fact … our intervention may not help you. On the other hand, if you’re tested on some questions related to that fact [but] which require you to generalize or integrate previous information,” the intervention would help you perform better.

This is because people rarely recall events exactly as they happen, Pilly said, referring to what’s known as episodic memory. Rather, people generalize what they learn and access that knowledge when faced with various situations. (For example, we know to stay away from a snake in the city, even if the first time we saw it, it was in the countryside.)

Previous studies have also investigated the effects of brain stimulation on memory. But although they delivered the zaps during the same sleep stage as the new study, the researchers in the previous studies didn’t attempt to match the zaps with the natural oscillations of the brain, Pilly said.

Jan Born, a professor of behavioral neuroscience at the University of Tübingen in Germany who was not part of the study, said the new research showed that, “at least in terms of behavior, [such a] procedure is effective.”

The approaches examined in the study have “huge potential, but we are still in the beginning [of this type of research], so we have to be cautious,” Born told Live Science.

One potential problem is that the stimulation typically hits the whole surface of the brain, Born said. Because the brain is wrinkled, and some neurons hide deep in the folds and others sit atop ridges, the stimulations aren’t very effective at targeting all of the neurons necessary, he said. This may make it difficult to reproduce the results every time, he added.

Pilly said that because the zaps aren’t specialized, they could also, in theory, lead to side effects. But he thinks, if anything, the side effect might simply be better-quality sleep.

https://www.livescience.com/63329-darpa-brain-zapping-memory.html

Scientist thinks he has developed a genetic test for heart attack risk and wants to give it away free.

by Matthew Herper

A Harvard scientist thinks he’s reached a new milestone: a genetic test that helps identify people who are at high risk of having a heart attack. Can he convince doctors to use it?

“I think–in a few years, I think everybody will know this number, similar to the way we know our cholesterol right now,” muses Sekar Kathiresan, director of the Cardiovascular Disease Initiative at the Broad Institute and a professor at Harvard Medical School.

Not everyone else is so sure. “I think it’s a brilliant approach,” says Harlan Krumholz, the Harold H. Hines Jr. professor of cardiology at Yale University and one of Kathiresan’s collaborators. But he worries about whether Kathiresan’s tests are ready to compete with the plethora of diagnostic tests, from AI-boosted CT scans to tests for new types of “bad” cholesterol proteins, that are on offer. And he worries about cost. There is no commercial version of the gene test. But the very idea that such a test is not only possible, but also near, is the result of a cresting wave of new genetic science, the result of large efforts to gather genetic information from millions of volunteers.

The number in question is what is called a polygenic risk score. Instead of looking for one miswritten gene that causes heart attacks, or, for that matter, other health problems, geneticists are increasingly looking at thousands of genetic alterations without even being sure what each does. In the case of Kathiresan’s polygenic score, the test looks for 6.6 million single-letter genetic changes that are more prevalent in people who have had early heart attacks.

Our genetic inheritances, the current thinking goes, are not so much a set of declarative orders as a cacophony of noise. There are big genetic changes that can have a big effect, but most diseases are the result of lots of tiny changes that add up. In Kathiresan’s words, it’s mostly a gemish (Yiddish for “a mixture”). And it’s not clear which changes are biologically important – Kathiresan says only 6,000 or so of the 6.6 million genetic changes are probably actually causing heart attacks. But finding those specific changes will take a long time. The risk score could be used now.
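Conceptually, a polygenic risk score of this kind is just a weighted sum: count how many copies of each risk allele a person carries and weight each count by that variant's estimated effect size. The sketch below illustrates the arithmetic on a toy scale; the numbers are invented and this is not Kathiresan's actual scoring pipeline.

```python
import numpy as np

def polygenic_risk_score(genotypes, weights):
    """Weighted sum of risk-allele counts.
    genotypes: 0, 1 or 2 copies of the risk allele at each variant.
    weights:   per-variant effect sizes (e.g. log odds ratios from a GWAS)."""
    return float(np.dot(genotypes, weights))

# Toy example with 5 variants standing in for the 6.6 million used in the study.
genotypes = np.array([0, 1, 2, 1, 0])               # copies of the risk allele
weights = np.array([0.02, 0.05, 0.01, 0.10, 0.03])  # hypothetical effect sizes
print(polygenic_risk_score(genotypes, weights))

# In practice the raw score is compared with a reference distribution, for
# example flagging people above roughly the 95th percentile as high risk.
```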

The effect of this genetic cacophony can be huge. The most common single mutation that increases the risk of heart disease is a gene that causes a disease called heterozygous familial hypercholesterolemia (literally: inherited high cholesterol) that occurs in one person in 250 and triples a person’s risk of having a heart attack. But today, in a paper in Nature Genetics, Kathiresan and his colleagues present data that 5% to 8% of people have a polygenic score that also at least triples their risk of having a heart attack. That’s about 20 times as many people, Kathiresan says.

“These patients are currently unaware of their risk because the polygenic patients don’t have higher levels of the usual risk factors,” Kathiresan says. “Their cholesterol is not high. Their blood pressure is not that high. They are hidden from the current risk assessment tools.”

In the Nature Genetics paper, Kathiresan’s team tested the 6.6-million-variant polygenic score in two groups of patients numbering, respectively, 120,280 and 288,978 people, from the UK Biobank, a government-backed effort in the United Kingdom to collect genetic data. For some patients, the risk was even higher, with the genetic changes predicting a fivefold increase in heart attack risk. The paper also argues that polygenic risk scores could be used to predict risk of conditions such as type 2 diabetes and breast cancer.

Another study, yet to be published, looked at the prevalence of both familial hypercholesterolemia and the polygenic score in a population of people who had heart attacks in their 40s and 50s, Kathiresan says. Only 2% had familial hypercholesterolemia, but 20% had a high polygenic risk score. Knowing one’s polygenic risk score might matter. A 2016 paper in the New England Journal of Medicine showed that people with high polygenic scores had fewer heart attacks if they had healthier lifestyles, and a 2017 paper in the medical journal Circulation showed that patients with high polygenic risk scores got an outsize benefit from cholesterol-lowering statin drugs. Those papers, both by Kathiresan’s group, used a score that included only a few dozen gene variants.

Doctors should be skeptical of such a test. There’s a long history of tests in medicine that have done more harm than good by leading people to take drugs they do not need. Cardiologists have gotten used to even higher standards for data. For instance, many might want to see if the test can show a benefit in a large study in which people are tested at random. Many will want more evidence that the test can identify people at high risk they’d otherwise miss, as Kathiresan says, and that it doesn’t lead to treatment in those who don’t need it. Kathiresan says he hopes to do a study in the highest-risk individuals to prove that statin drugs can lower their risk. If the test becomes a commercial prospect, more studies will drive up the eventual cost.

Kathiresan is hoping to follow a less expensive path. He notes that 17 million people have already used genotyping services like 23andMe and Ancestry. He hopes that people who use those services (23andMe costs $99, Ancestry $59) will submit their data to a portal he’ll build for free. He also says he’s in discussions with commercial providers, but he’s hoping that people will be able to get their polygenic scores for about as much as the cost of a cholesterol test. For the people at the highest risk, he argues, this is information that could be important. For others, he argues, why deny people information that has been scientifically validated?

Whether Kathiresan can really pull off a low-cost version in a medical system that is optimized to make money is as big a question as whether the test is ready for prime time. Krumholz worried about the cost of the test until a reporter told him of Kathiresan’s planned website. “If you say it’s free, I’m going, ‘Why not?'” Krumholz says. “It’s a better family history,” he says, comparing the test to asking whether a relative has had a heart attack. But that may be the biggest ‘if’. If anything is more puzzling than genetics, it is the economics of healthcare in the U.S.A.
https://www.forbes.com/sites/matthewherper/2018/08/13/a-harvard-scientist-thinks-he-has-a-gene-test-for-heart-attack-risk-he-wants-to-give-it-away-free/#557490e85959

Stem cell transplants to be used in treating Crohn’s disease

Crohn’s disease is a long-term condition that causes inflammation of the lining of the digestive system, and results in diarrhoea, abdominal pain, extreme tiredness and other symptoms that significantly affect quality of life.

Current treatments include drugs to reduce inflammation but these have varying results, and surgery is often needed to remove the affected part of the bowel. In extreme cases, after multiple operations over the years, patients may require a final operation to divert the bowel from the anus to an opening in the abdomen, called a stoma, where stools are collected in a pouch.

Chief investigator Professor James Lindsay from Queen Mary’s Blizard Institute and a consultant at Barts Health NHS Trust said: “Despite the introduction of new drugs, there are still many patients who don’t respond, or gradually lose response, to all available treatments. Although surgery with the formation of a stoma may be an option that allows patients to return to normal daily activities, it is not suitable in some and others may not want to consider this approach.

“We’re hoping that by completely resetting the patient’s immune system through a stem cell transplant, we might be able to radically alter the course of the disease. While it may not be a cure, it may allow some patients to finally respond to drugs which previously did not work.”

Helen Bartlett, a Crohn’s disease patient who had stem cell therapy at John Radcliffe Hospital, Oxford, said: “Living with Crohn’s is a daily struggle. You go to the toilet so often, you bleed a lot and it’s incredibly tiring. You also always need to be careful about where you go. I’ve had to get off trains before because there’s been no toilet, and I needed to go there and then.

“I’ve been in and out of hospital for the last twenty years, operation after operation, drug after drug, to try to beat this disease. It’s frustrating, it’s depressing and you just feel so low.

“When offered the stem cell transplant, it was a complete no brainer as I didn’t want to go through yet more failed operations. I cannot describe how much better I feel since the treatment. I still have problems and I’m always going to have problems, but I’m not in that constant pain.”

The use of stem cell transplants to wipe out and replace patients’ immune systems has recently been found to be successful in treating multiple sclerosis. This new trial will investigate whether a similar treatment could reduce gut inflammation and offer hope to people with Crohn’s disease.

In the trial, patients undergo chemotherapy and hormone treatment to mobilise their stem cells, which are then harvested from their blood. Further chemotherapy is then used to wipe out their faulty immune system. When the stem cells are re-introduced back into the body, they develop into new immune cells which give the patient a fresh immune system.

In theory, the new immune system will then no longer react adversely to the patient’s own gut to cause inflammation, and it will also not act on drug compounds to remove them from their gut before they have had a chance to work.

Professor Tom Walley, Director of the NIHR Evaluation, Trials and Studies programmes, which funded the trial, said: “Stem cell therapies are an important, active and growing area of research with great potential. There are early findings showing a role for stem cells in replacing damaged tissue. In Crohn’s disease this approach could offer real benefits for the clinical care and long term health of patients.”

The current clinical trial, called ‘ASTIClite’, is a follow up to the team’s 2015 ‘ASTIC’ trial, which investigated a similar stem cell therapy. Although the therapy in the original trial did not cure the disease, the team found that many patients did see benefit from the treatment, justifying a further clinical trial. There were also some serious side effects from the doses of drugs used, so this follow-up trial will be using a lower dose of the treatment to minimise risks due to toxicity.

Patients will be recruited to the trial through Barts Health NHS Trust, Cambridge University Hospitals NHS Foundation Trust, Guy’s & St Thomas’ NHS Foundation Trust, NHS Lothian, Nottingham University Hospitals NHS Trust, Oxford University Hospitals NHS Foundation Trust, Royal Liverpool and Broadgreen University Hospital NHS Trust and Sheffield Teaching Hospitals NHS Foundation Trust.

The trial will involve academics from the University of Manchester, University of Nottingham, University of Sheffield, Nottingham Trent University, University of Edinburgh, University of Oxford, King’s College London, as well as Queen Mary University of London.

The study was funded by a Medical Research Council and NIHR partnership created to support the evaluation of interventions with potential to make a step-change in the promotion of health, treatment of disease and improvement of rehabilitation or long-term care.

https://www.qmul.ac.uk/media/news/2018/smd/stem-cell-transplants-to-be-used-in-treating-crohns-disease.html

Scientists Think They’ve Found The Part of The Brain That Makes People Pessimistic

by DAVID NIELD

A specific part of the brain called the caudate nucleus could control pessimistic responses, according to animal tests, a finding which might help us unlock better treatments for mental disorders like anxiety and depression.

These disorders often come with negative moods triggered by a pessimistic reaction, and if scientists can figure out how to control that reaction, we might stand a better chance of dealing with the neuropsychiatric problems that affect millions of people worldwide – and maybe discover the difference between glass-half-full and glass-half-empty people along the way.

The research team from MIT found that when the caudate nucleus was artificially stimulated in macaques, the animals were more likely to make negative decisions, and consider the potential drawback of a decision rather than the potential benefit.

This pessimistic decision-making continued right through the day after the original stimulation, the researchers found.

“We feel we were seeing a proxy for anxiety, or depression, or some mix of the two,” says lead researcher Ann Graybiel. “These psychiatric problems are still so very difficult to treat for many individuals suffering from them.”

The caudate nucleus has previously been linked to emotional decision-making, and the scientists stimulated it with a small electrical current while the monkeys were offered a reward (juice) and an unpleasant experience (a puff of air to the face) at the same time.

In each run-through, the amount of juice and the strength of the air blast varied, and the animals could choose whether or not to accept the reward – essentially measuring their ability to weigh up the costs of an action against the benefits.

When the caudate nucleus was stimulated, this decision-making got skewed, so the macaques started rejecting juice/air ratios they would have previously accepted. The negative aspects apparently began to seem greater, while the rewards became devalued.

“This state we’ve mimicked has an overestimation of cost relative to benefit,” says Graybiel. After a day or so, the effects gradually disappeared.
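That “overestimation of cost relative to benefit” can be written down as a very small decision rule: accept an offer only when the reward outweighs the weighted cost, with stimulation modelled as an inflated cost weight. The Python sketch below is a toy illustration of that interpretation, not the MIT group's actual model, and all of the numbers are made up.

```python
def accepts_offer(juice_ml, air_puff_strength, cost_weight=1.0):
    """Toy cost-benefit rule: accept when the subjective value of the reward
    outweighs the weighted cost of the air puff."""
    return juice_ml - cost_weight * air_puff_strength > 0

# Hypothetical offer: a moderate reward paired with a moderate air puff.
offer = dict(juice_ml=1.0, air_puff_strength=0.7)

print(accepts_offer(**offer))                   # baseline weighting -> True (accept)
print(accepts_offer(**offer, cost_weight=1.8))  # "stimulated", cost overweighted -> False (reject)
```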

The researchers also found brainwave activity in the caudate nucleus, part of the basal ganglia, changed when decision-making patterns changed. This might give doctors a marker to indicate whether someone would be responsive to treatment targeting this part of the brain or not.

The next stage is to see whether the same effect can be noticed in human beings – scientists have previously linked abnormal brain activity in people with mood disorders to regions connected to the caudate nucleus, but there’s a lot more work to be done to confirm these neural connections.

Making progress isn’t easy because of the incredible complexity of the brain, but the researchers think their results show the caudate nucleus could be disrupting dopamine activity in the brain, controlling mood and our sense of reward and pleasure.

“There must be many circuits involved,” says Graybiel. “But apparently we are so delicately balanced that just throwing the system off a little bit can rapidly change behaviour.”

The research has been published in Neuron.

https://www.sciencealert.com/we-found-the-brain-region-for-pessimism

New research shows that being forgetful is a sign of unusual intelligence

By Timothy Roberts

Being able to recall memories, whether short-term or long-term is something that we all need in life. It comes in handy when we are studying at school or when we are trying to remember where we left our keys. We also tend to use our memory at work and remembering somebody’s name is certainly a good thing.

Although many of us may consider ourselves to have a good memory, we are all going to forget things from time to time. When it happens, we might feel as if we are slipping but there may be more behind it than you realize.

Imagine this scenario: you go to the grocery store to pick up three items and suddenly, you forget why you were there. Even worse, you may walk from one room to another and forget why you got up in the first place!

If you often struggle with these types of problems, you will be happy to learn that there is probably nothing wrong with you. In fact, a study published in the journal Neuron has some rather good news: forgetting is part of a brain process that might actually make you smarter by the end of the day.

Researchers at the University of Toronto discovered that a perfect memory doesn’t necessarily reflect your level of intelligence.

You might even be surprised to learn that when you forget details on occasion, it can make you smarter.

Most people assume that remembering more means you are smarter.

According to the study, however, when you forget a detail on occasion, it’s perfectly normal. It has to do with remembering the big picture compared to remembering little details. Remembering the big picture is better for the brain and for our safety.

Our brains are perhaps more of a computer than many of us think. The hippocampus, which is the part of the brain where memories are stored, tends to filter out the unnecessary details.

In other words, it helps us to “optimize intelligent decision making by holding onto what’s important and letting go of what’s not.”

Think about it this way; is it easier to remember somebody’s face or their name? Which is the most important?

In a social setting, it is typically better to remember both but if we were part of the animal kingdom, remembering somebody as being a threat would mean our very lives. Remembering their name would be inconsequential.

The brain doesn’t automatically decide what we should remember and what we shouldn’t. It holds new memories but it sometimes overwrites old memories.

When the brain becomes cluttered with memories, they tend to conflict with each other and that can make it difficult to make important decisions.

That is why the brain tends to hold on to those big-picture memories, while fine-grained details have become less important to retain with the advent of technology.

As an example, at one time, we would have learned how to spell words but now, we just use Google if we don’t know how to spell them. We also tend to look everything up online, from how to change a showerhead to how to cook meatloaf for dinner.

If you forget everything, you may want to consider having a checkup but if you forget things on occasion, it’s perfectly okay.

The moral of the story is, the next time you forget something, just think of it as your brain doing what it was designed to do.

http://wetpaintlife.com/scientists-say-that-being-forgetful-is-actually-a-sign-you-are-unusually-intelligent/?utm_source=vn&utm_tracking=11&utm_medium=Social

RNA methylation discovered to be key to brain cell connections

Methyl chemical groups dot lengths of DNA, helping to control when certain genes are accessible by a cell. In new research, UCLA scientists have shown that at the connections between brain cells—which often are located far from the central control centers of the cells—methyl groups also dot chains of RNA. This methyl markup of RNA molecules is likely key to brain cells’ ability to quickly send signals to other cells and react to changing stimuli in a fraction of a second.

To dictate the biology of any cell, DNA in the cell’s nucleus must be transcribed into corresponding strands of RNA. Next, the messenger RNA, or mRNA—an intermediate genetic molecule between DNA and proteins—is translated into proteins. If a cell suddenly needs more of a protein—to adapt to an incoming signal, for instance—it must transcribe more DNA into mRNA. Then it must make more proteins and shuttle them through the cell to where they are needed. This process means that getting new proteins to a distant part of a cell, like the synapses of neurons where signals are passed, can take time.

Research has recently suggested that methyl chemical groups, which can control when DNA is transcribed into mRNA, are also found on strands of mRNA. The methylation of mRNA, researchers hypothesize, adds a level of control to when the mRNA can be translated into proteins, and their occurrence has been documented in a handful of organs throughout the bodies of mammals. The pattern of methyls on mRNA in any given cell is dubbed the “epitranscriptome.”

UCLA and Kyoto University researchers mapped out the location of methyls on mRNA found at the synapses, or junctions, of mouse brain cells. They isolated brain cells from adult mice and compared the epitranscriptome found at the synapses to the epitranscriptomes of mRNA elsewhere in the cells. At more than 4,000 spots on the genome, the mRNA at the synapse was methylated more often. More than half of these spots, the researchers went on to show, are in genes that encode proteins found mostly at the synapse. The researchers found that when they disrupted the methylation of mRNA at the synapse, the brain cells didn’t function normally.

The methylation of mRNA at the synapse is likely one of many ways that neurons speed up their ability to send messages, by allowing the mRNA to be poised and ready to translate into proteins when needed.

The levels of key proteins at synapses have been linked to a number of psychiatric disorders, including autism. Understanding how the epitranscriptome is regulated, and what role it plays in brain biology, may eventually provide researchers with a new way to control the proteins found at synapses and, in turn, treat disorders characterized by synaptic dysfunction.

More information: Daria Merkurjev et al. Synaptic N6-methyladenosine (m6A) epitranscriptome reveals functional partitioning of localized transcripts, Nature Neuroscience (2018). DOI: 10.1038/s41593-018-0173-6

https://phys.org/news/2018-08-methyl-rna-key-brain-cell.html

A Soviet-Era Fox Experiment May Finally Reveal The Genes Behind Domestication

by CAROLYN Y. JOHNSON

In 1959, Soviet scientists embarked on an audacious experiment to breed a population of tame foxes, a strain of animals that wouldn’t be aggressive or fearful of people.

Scientists painstakingly selected the friendliest foxes to start each new generation, and within 10 cycles they began to see differences from wild foxes – fox pups that wagged their tails eagerly at people or with ears that stayed folded like a dog’s.

This study in animal domestication, known as the Russian farm-fox experiment, might be just a fascinating historical footnote – a quirky corner in the otherwise fraught scientific heritage of Soviet Russia.

Instead, it spawned an ongoing area of research into how domestication, based purely on behavioral traits, can result in other changes – like curlier tails and changes to fur color.

Now, the tools of modern biology are revealing the genetic changes that underpin the taming of the Siberian foxes.

In a new study, published Monday in Nature Ecology & Evolution, scientists used genome sequencing to identify 103 stretches of the fox genome that appear to have been changed by breeding, a first pass at identifying the genes that make some foxes comfortable with humans and others wary and aggressive.

The scientists studied the genomes of 10 foxes from three different groups: the tame population, a strain that was bred to be aggressive toward people and a conventional group bred to live on a farm.

Having genetic information from all three groups allowed the researchers to identify regions of the genome that were likely to have changed due to the active selection of animals with different behaviors, rather than natural fluctuation over time.
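The logic of that comparison can be sketched in a few lines of code: scan the genome for windows where allele frequencies in the tame and aggressive populations have drifted unusually far apart. The Python below is a toy stand-in for the formal statistics (such as Fst) actually used in selection scans; all of the data and thresholds are synthetic.

```python
import numpy as np

def allele_freq_divergence(freq_tame, freq_aggressive, window=50):
    """Toy selection scan: average the absolute allele-frequency difference
    between two populations in fixed windows and flag windows where the
    divergence is unusually high (a crude stand-in for statistics like Fst)."""
    delta = np.abs(np.asarray(freq_tame) - np.asarray(freq_aggressive))
    means = np.array([delta[i:i + window].mean()
                      for i in range(0, len(delta) - window + 1, window)])
    cutoff = means.mean() + 3 * means.std()
    return np.where(means > cutoff)[0], means

# Synthetic demo: 10,000 variants drifting together, except one region
# (variants 3000-3049) strongly differentiated, as if under selection.
rng = np.random.default_rng(0)
p_tame = rng.uniform(0.05, 0.95, 10_000)
p_aggr = np.clip(p_tame + rng.normal(0, 0.03, 10_000), 0, 1)
p_aggr[3000:3050] = np.clip(p_tame[3000:3050] + 0.6, 0, 1)
hits, _ = allele_freq_divergence(p_tame, p_aggr)
print(hits)  # expected to flag window index 60 (variants 3000-3049)
```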

Those regions offer starting points in efforts to probe the genetic basis and evolution of complex traits, such as sociability or aggressiveness.

“The experiment has been going on for decades and decades, and to finally have the genome information, you get to look and see where in the genome and what in the genome has been likely driving these changes that we’ve seen – it’s a very elegant experimental design,” said Adam Boyko, an associate professor of biomedical sciences at Cornell University, who was not involved in the study.

While some genetic traits are relatively simple to unravel, the underpinnings of social behaviors aren’t easy to dissect. Behavior is influenced by hundreds or thousands of genes, as well as the environment – and typically behaviors fall on a wide spectrum.

The existence of fox populations bred solely for how they interact with people offers a rare opportunity to strip away some of the other complexity – with possible implications for understanding such traits in people and other animals, too, since evolution may work on the same pathways or even the same genes.

“We’re interested to see what are the genes that make such a big difference in behavior. There are not so many animal models which are good to study genetics of social behavior, and in these foxes it’s such a big difference between tame foxes compared to conventional foxes, and those selected for aggressive behavior,” said Anna Kukekova, an assistant professor at the University of Illinois at Urbana-Champaign, who led the work.

Kukekova and colleagues began studying one very large gene that they think may be linked to tame behavior, called SorCS1. The gene plays a role in sorting proteins that allow brain cells to communicate.

Kukekova is interested in determining what happens if the gene is deleted in a mouse and to search for specific mutations that might contribute to differences in behavior.

Bridgett vonHoldt, an assistant professor of ecology and evolutionary biology at Princeton University, said changes that occurred in foxes “overlap extensively with those observed in the transition of gray wolves to modern domestic dogs.”

She said the study may help dog and fox biologists determine if there are complex behavioral traits under the control of just a few genes.

Recent fox evolution in a domesticated population may seem to have little to do with understanding the genetics of human behavior, but domestication has grown as an area of scientific interest in part because genes involved in behavior in one animal may play a similar role in another.

“One reason why it is interesting is it gives us some insights about us. Humans are domesticated themselves, in a way,” Boyko said.

“We’re much more tolerant of being around other humans than probably we were as we were evolving; we’ve had to undergo a transformation, even relatively recently from the agricultural revolution.”

https://www.sciencealert.com/soviet-era-fox-taming-experiment-may-reveal-genes-behind-social-behavior

The neurobiological basis of leadership rests in low aversion to responsibility


Low responsibility aversion is an important determinant of the decision to lead.

Leaders are more willing to take responsibility for making decisions that affect the welfare of others. In a new study, researchers at the University of Zurich identified the cognitive and neurobiological processes that influence whether someone is more likely to take on leadership or to delegate decision-making.

Parents, company bosses and army generals, as well as teachers and heads of state, all have to make decisions that affect not only themselves, but also influence the welfare of others. Sometimes, the consequences will be borne by individuals, but sometimes by whole organizations or even countries.

Researchers from the Department of Economics investigated what distinguishes people with high leadership abilities. In the study, which has just been published in the journal Science, they identify a common decision process that may characterize followers: Responsibility aversion, or the unwillingness to make decisions that also affect others.

Controlled experiments and brain imaging

In the study, leaders of groups could either make a decision themselves or delegate it to the group. A distinction was drawn between “self” trials, in which the decision only affected the decision-makers themselves, and “group” trials, in which there were consequences for the whole group. The neurobiological processes taking place in the brains of the participants as they were making the decisions were examined using functional magnetic resonance imaging (fMRI).

The scientists tested several common intuitive beliefs, such as the notion that individuals who are less afraid of potential losses or taking risks, or who like being in control, will be more willing to take on responsibility for others. These characteristics, however, did not explain the differing extent of responsibility aversion found in the study participants. Instead, they found that responsibility aversion was driven by a greater need for certainty about the best course of action when the decision also had an effect on others. This shift in the need for certainty was particularly pronounced in people with a strong aversion to responsibility.

“Because this framework highlights the change in the amount of certainty required to make a decision, and not the individual’s general tendency for assuming control, it can account for many different leadership types,” says lead author Micah Edelson. “These can include authoritarian leaders who make most decisions themselves, and egalitarian leaders who frequently seek a group consensus.”
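One way to picture this certainty-threshold account is as a simple decision rule in which the evidence needed to keep a decision rises when others are affected. The sketch below is a toy illustration under that assumption, not the computational model used in the Science paper, and the numbers are arbitrary.

```python
def decides_alone(evidence_for_best_option, base_threshold=0.6,
                  affects_group=False, responsibility_aversion=0.2):
    """Toy certainty-threshold rule: keep the decision only if the evidence for
    the best option exceeds a threshold, and raise that threshold when the
    outcome also affects others, scaled by the person's responsibility aversion."""
    threshold = base_threshold + (responsibility_aversion if affects_group else 0.0)
    return evidence_for_best_option >= threshold

evidence = 0.7
print(decides_alone(evidence, affects_group=False))  # "self" trial: decides alone
print(decides_alone(evidence, affects_group=True))   # "group" trial: delegates
```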

More information: Computational and neurobiological foundations of leadership decisions. Science: August 2, 2018. DOI: 10.1126/science.aat0036

Reducing NOVA1 gene helps prevent tumor growth in most common type of lung cancer


Researchers have identified a gene that when inhibited or reduced, in turn, reduced or prevented human non-small cell lung cancer tumors from growing.

When mice were injected with non-small cell lung cancer cells that contained the gene NOVA1, three of four mice formed tumors. When the mice were injected with cancer cells without NOVA1, three of four mice remained tumor-free.

The fourth developed a tumor, but it was very small compared to the mice with the NOVA1 tumor cells, said Andrew Ludlow, first author on the study and assistant professor at the University of Michigan School of Kinesiology.

The research appears online today in Nature Communications. Ludlow did the work while a postdoctoral fellow at the University of Texas Southwestern Medical Center, in the shared lab of Woodring Wright, professor of cell biology and internal medicine, and Jerry Shay, professor of cell biology.

The study suggests that in cancer cells, the NOVA1 gene activates telomerase, the enzyme that maintains telomeres—the protective caps on the ends of chromosomes that preserve genetic information during cell division (think of the plastic aglets that prevent shoelace ends from fraying).

Telomerase isn’t active in healthy adult tissues, so telomeres degrade and shorten as we age. When they get too short, the body knows to remove those damaged or dead cells.

In most cancers, telomerase is reactivated and telomeres are maintained, thus preserving the genetic material, and these are the cells that mutate and become immortal.
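The telomere logic described in the last few paragraphs can be captured in a tiny simulation: without telomerase each division shortens the telomeres until the cell stops dividing, while active telomerase keeps the length topped up. The Python below is a deliberately simplified illustration; the lengths, loss per division and cutoff are invented numbers, not measurements from the study.

```python
def divisions_until_senescence(telomere_bp=10_000, loss_per_division_bp=100,
                               critical_bp=4_000, telomerase_active=False,
                               max_divisions=200):
    """Toy model: each division shortens telomeres by a fixed amount unless
    telomerase is active and restores the loss; the cell stops dividing once
    telomeres fall below a critical length."""
    divisions = 0
    while telomere_bp > critical_bp and divisions < max_divisions:
        divisions += 1
        if not telomerase_active:
            telomere_bp -= loss_per_division_bp
    return divisions

print(divisions_until_senescence())                        # finite lifespan
print(divisions_until_senescence(telomerase_active=True))  # hits max_divisions ("immortal")
```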

Telomerase is present in most cancer types, and it’s an attractive therapeutic target for cancer. However, scientists haven’t had much luck inhibiting telomerase activity in cancer, Ludlow said.

Ludlow’s group wanted to try a new approach, so they screened lung cancer cell lines for splicing genes (genes that modify RNA) that might regulate telomerase in cancer, and identified NOVA1.

They found that reducing the NOVA1 gene reduced telomerase activity, which led to shorter telomeres, and cancer cells couldn’t survive and divide.

Researchers only looked at non-small cell lung cancers, and NOVA1 was present in about 70 percent of them.

“Non-small cell lung cancer is the most prevalent form of age-related cancer, and 80 to 85 percent of all lung cancers are non-small cell,” Ludlow said. “But there really aren’t that many treatments for it.”

According to the American Cancer Society, lung cancer causes the most cancer deaths among men and women, and is the second most common cancer, aside from skin cancer.

Before researchers can target NOVA1 or telomerase splicing as a serious potential therapy for non-small cell lung cancer, they must gain a much better understanding of how telomerase is regulated. This research is a step in that direction.

Ludlow’s group is also looking at ways to directly impact telomerase splicing, in addition to reducing NOVA1.

More information: Andrew T. Ludlow et al, NOVA1 regulates hTERT splicing and cell growth in non-small cell lung cancer, Nature Communications (2018). DOI: 10.1038/s41467-018-05582-x

https://medicalxpress.com/news/2018-08-nova1-gene-tumor-growth-common.html

Magnetic particle mapped in the human brain

Researchers at Ludwig-Maximilians-Universität München (LMU Munich) have for the first time mapped the distribution of magnetic particles in the human brain. The study reveals that the particles are primarily located in the cerebellum and the brainstem, which are the more ancient parts of the brain.

Many living organisms, such as migratory birds, are thought to possess a magnetotactic sense, which enables them to respond to the Earth’s magnetic field. Whether or not humans are capable of sensing magnetism is the subject of debate. However, several studies have already shown that one of the preconditions required for such a magnetic sensory system is indeed met: magnetic particles exist in the human brain. Now a team led by Stuart A. Gilder (a professor at LMU’s Department of Earth and Environmental Sciences) and Christoph Schmitz (a professor at LMU’s Department of Neuroanatomy) has systematically mapped the distribution of magnetic particles in human post mortem brains. Their findings were published in the journal Scientific Reports (Nature Publishing Group).

In their study, the LMU researchers confirmed the presence of magnetic particles in human brains. The particles were found primarily in the cerebellum and the brainstem, and there was striking asymmetry in the distribution between the left and right hemispheres of the brain. “The human brain exploits asymmetries in sensory responses for spatial orientation, and also for sound-source localization,” Schmitz explains. The asymmetric distribution of the magnetic particles is therefore compatible with the idea that humans might have a magnetic sensor. But in all probability, this sensor is much too insensitive to serve any useful biological function, he adds. Furthermore, the chemical nature of the magnetic particles remains unknown. “We assume that they are all made of magnetite (Fe3O4), but it is not yet possible to be sure,” says Gilder.

The study was funded by the Volkswagen Foundation’s “Experiment!” program, which is designed specifically to get daring new research projects, whose ultimate outcome is uncertain, off the ground. This is in contrast to traditional NIH-style support, which largely supports research that has already been conducted and for which the outcome is almost certain. The data were obtained from seven human post mortem brains, which had been donated for use in medical research. In all, a total of 822 tissue samples were subjected to magnetometry. The measurements were performed under the supervision of Stuart Gilder in a magnetically shielded laboratory located in a forest 80 km from Munich, which is largely free of the pervasive magnetic pollution characteristic of urban settings.

In further experiments, the LMU team plans to characterize the properties of the magnetic particles found in human brains. In collaboration with Professor Patrick R. Hof (Fishberg Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York), they also hope to perform analogous localization studies on far larger mammals – whales. These huge marine mammals are known to migrate between feeding and breeding grounds across great distances in the world’s oceans. “We want to determine whether we can detect magnetic particles in the brains of whales, and if so, whether they are also asymmetrically distributed,” says Schmitz. “It goes without saying that such studies will be carried out on animals that have died of natural causes.”

https://www.en.uni-muenchen.de/news/newsarchiv/2018/schmitz_magnetite.html

Distribution of magnetic remanence carriers in the human brain
Stuart A. Gilder, Michael Wack, Leon Kaub, Sophie C. Roud, Nikolai Petersen, Helmut Heinsen, Peter Hillenbrand, Stefan Milz & Christoph Schmitz
Scientific Reports, volume 8, Article number: 11363 (2018)