Researchers find new phase of carbon, Q-carbon, that is brighter and harder than diamond and can be made easily and inexpensively in less than a second.

Researchers from North Carolina State University have discovered a new phase of solid carbon, called Q-carbon, which is distinct from the known phases of graphite and diamond. They have also developed a technique for using Q-carbon to make diamond-related structures at room temperature and at ambient atmospheric pressure in air.

Phases are distinct forms of the same material. Graphite is one of the solid phases of carbon; diamond is another.

“We’ve now created a third solid phase of carbon,” says Jay Narayan, the John C. Fan Distinguished Chair Professor of Materials Science and Engineering at NC State and lead author of three papers describing the work. “The only place it may be found in the natural world would be possibly in the core of some planets.”

Q-carbon has some unusual characteristics. For one thing, it is ferromagnetic – which other solid forms of carbon are not.

“We didn’t even think that was possible,” Narayan says.

In addition, Q-carbon is harder than diamond, and glows when exposed to even low levels of energy.

“Q-carbon’s strength and low work-function – its willingness to release electrons – make it very promising for developing new electronic display technologies,” Narayan says.

But Q-carbon can also be used to create a variety of single-crystal diamond objects. To understand that, you have to understand the process for creating Q-carbon.

Researchers start with a substrate, such as sapphire, glass or a plastic polymer. The substrate is then coated with amorphous carbon – elemental carbon that, unlike graphite or diamond, does not have a regular, well-defined crystalline structure. The carbon is then hit with a single laser pulse lasting approximately 200 nanoseconds. During this pulse, the temperature of the carbon is raised to 4,000 Kelvin (or around 3,727 degrees Celsius) and then rapidly cooled. This operation takes place at one atmosphere – the same pressure as the surrounding air.
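
As a sanity check on the figures above, the temperature conversion and the implied quench rate can be sketched in a few lines of Python. The quench duration below is an illustrative assumption on the order of the 200-nanosecond pulse, not a value reported in the papers.

```python
# Back-of-the-envelope check of the figures quoted above. The quench
# duration is a hypothetical assumption (same order as the 200 ns
# laser pulse), not a value taken from the papers.

def kelvin_to_celsius(t_k):
    """Convert a temperature from kelvin to degrees Celsius."""
    return t_k - 273.15

peak_k = 4000.0
print(round(kelvin_to_celsius(peak_k)))  # 3727, as quoted in the text

# Hypothetical quench: cooling from ~4,000 K to ~300 K in ~200 ns
assumed_quench_s = 200e-9
cooling_rate_k_per_s = (peak_k - 300.0) / assumed_quench_s
print(cooling_rate_k_per_s)  # on the order of 1e10 K per second
```

Even under these rough assumptions, the cooling rate comes out around ten billion kelvin per second, which is why the carbon is quenched into a new phase rather than relaxing back to graphite.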

The end result is a film of Q-carbon, and researchers can control the process to make films between 20 nanometers and 500 nanometers thick.

By using different substrates and changing the duration of the laser pulse, the researchers can also control how quickly the carbon cools. By changing the rate of cooling, they are able to create diamond structures within the Q-carbon.

“We can create diamond nanoneedles or microneedles, nanodots, or large-area diamond films, with applications for drug delivery, industrial processes and for creating high-temperature switches and power electronics,” Narayan says. “These diamond objects have a single-crystalline structure, making them stronger than polycrystalline materials. And it is all done at room temperature and at ambient atmosphere – we’re basically using a laser like the ones used for laser eye surgery. So, not only does this allow us to develop new applications, but the process itself is relatively inexpensive.”

And, if researchers want to convert more of the Q-carbon to diamond, they can simply repeat the laser-pulse/cooling process.

If Q-carbon is harder than diamond, why would someone want to make diamond nanodots instead of Q-carbon ones? Because we still have a lot to learn about this new material.

“We can make Q-carbon films, and we’re learning its properties, but we are still in the early stages of understanding how to manipulate it,” Narayan says. “We know a lot about diamond, so we can make diamond nanodots. We don’t yet know how to make Q-carbon nanodots or microneedles. That’s something we’re working on.”

NC State has filed two provisional patents on the Q-carbon and diamond creation techniques.

The work is described in two papers, both of which were co-authored by NC State Ph.D. student Anagh Bhaumik. “Novel Phase of Carbon, Ferromagnetism and Conversion into Diamond” will be published online Nov. 30 in the Journal of Applied Physics. “Direct conversion of amorphous carbon into diamond at ambient pressures and temperatures in air” was published Oct. 7 in the journal APL Materials.

New progress in understanding what may give animals a magnetic sense: a protein that acts as a compass

Quick – can you tell where north is? Animals as diverse as sea turtles, birds, worms, butterflies and wolves can, thanks to sensing Earth’s magnetic field.

But the magnet-sensing structures inside their cells that allow them to do this have evaded scientists – until now.

A team led by Can Xie at Peking University in China has now found a protein in fruit flies, butterflies and pigeons that they believe is responsible for this magnetic sense.

“It’s provocative and potentially groundbreaking,” says neurobiologist Steven Reppert of the University of Massachusetts, who was not involved in the work. “It took my breath away.”

There used to be two competing theories about magnetic sense: some thought it came from iron-binding molecules, others thought it came from a protein called cryptochrome, which senses light and has been linked to magnetic sense in birds.

Xie’s group was the first to guess these two were part of the same system, and has now figured out how they fit together.

“This was a very creative approach,” says Reppert. “Everyone thought they were two separate systems.”

Xie’s team first screened the fruit fly genome for a protein that would fit a very specific bill.

The molecule had to bind iron, it had to be expressed inside a cell instead of on the cell membrane and do so in the animal’s head – where animals tend to sense magnetic fields – and it also had to interact with cryptochrome.
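
The multi-criteria screen described above can be pictured as a simple filter over gene annotations. This is a purely illustrative sketch with invented field names and records, not the team’s actual pipeline.

```python
# Illustrative sketch of the screening logic: keep only candidates
# that satisfy every criterion listed in the text. The records and
# field names here are invented for the example.

candidates = [
    {"gene": "GeneA", "binds_iron": True, "intracellular": False,
     "expressed_in_head": True, "binds_cryptochrome": True},
    {"gene": "GeneB", "binds_iron": False, "intracellular": True,
     "expressed_in_head": True, "binds_cryptochrome": False},
    {"gene": "MagR", "binds_iron": True, "intracellular": True,
     "expressed_in_head": True, "binds_cryptochrome": True},
]

hits = [g["gene"] for g in candidates
        if g["binds_iron"] and g["intracellular"]
        and g["expressed_in_head"] and g["binds_cryptochrome"]]
print(hits)  # ['MagR']
```

The point of the screen is that each criterion on its own matches many genes; it is the conjunction of all four that narrows the fly genome down to a single candidate.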

“We found one [gene] fit all of our predictions,” says Xie. They called it MagR and then used techniques including electron microscopy and computer modelling to figure out the protein’s structure.

They found that MagR and cryptochrome proteins formed a cylinder, with an inside filling of 20 MagR molecules surrounded by 10 cryptochromes.

The researchers then identified and isolated this protein complex from pigeons and monarch butterflies.

In the lab, the proteins snapped into alignment in response to a magnetic field. They were so strongly magnetic that they flew up and stuck to the researchers’ tools, which contained iron. So the team had to use custom tools made of plastic.

The team hasn’t yet tried to remove the MagR protein from an animal like a fruit fly to see if it loses its magnetic sense, but Xie believes the proteins work the same way in a living animal.

Although this protein complex seems to form the basis of magnetic sense, the exact mechanism is still to be figured out.

One idea is that when an animal changes direction, the proteins may swing around to point north, “just like a compass needle,” says Xie. Perhaps the proteins’ movement could trigger a connected molecule, which would send a signal to the nervous system.

Journal reference: Nature Materials, DOI: 10.1038/nmat4484

https://www.newscientist.com/article/dn28494-animal-magnetic-sense-comes-from-protein-that-acts-as-a-compass

Thanks to Kebmodee for bringing this to the It’s Interesting community.

The Power of Music in Alleviating Dementia Symptoms

by Tori Rodriguez, MA, LPC

As the search continues for effective drug treatments for dementia, patients and caregivers may find some measure of relief from a common, non-pharmaceutical source. Researchers have found that music-related memory appears to be largely spared from the memory impairment generally associated with dementia, and studies report promising results for several different types of musical experiences across a variety of settings and formats.

“We can say that perception of music can be intact, even when explicit judgments and overt recognition have been lost,” Manuela Kerer, PhD, told Psychiatry Advisor. “We are convinced that there is a specialized memory system for music, which is distinct from other domains, like verbal or visual memory, and may be very resilient against Alzheimer’s disease.”

Kerer is a full-time musical composer with a doctoral degree in psychology who co-authored a study on the topic while working at the University of Innsbruck in Austria. She and her colleagues investigated explicit memory for music among ten patients with early-stage Alzheimer’s disease (AD) and ten patients with mild cognitive impairment (MCI), and compared their performance to that of 23 healthy participants. Not surprisingly, the patient group performed worse on tasks involving verbal memory, but they did significantly better than controls on the music-perception tasks of detecting distorted tunes and judging timbre.

“The temporal brain structures necessary for verbal musical memory were mildly affected in our clinical patients, therefore attention might have shifted to the discrimination tasks which led to better results in this area,” she said. “Our results enhance the notion of an explicit memory for music that can be distinguished from other types of explicit memory — that means that memory for music could be spared in this patient group.”

Other findings suggest that music might even improve certain aspects of memory among people with dementia. In a randomized controlled trial published last month in the Journal of Alzheimer’s Disease, music coaching interventions improved multiple outcomes for both patients with dementia and their caregivers. The researchers divided 89 pairs of patients with dementia and their caregivers into three groups: two groups were assigned to caregiver-led interventions that involved either singing or listening to music, while a third group received standard care. Before the 10-week intervention, immediately afterward, and six months later, participants were assessed on measures of mood, quality of life and neuropsychological functioning.

Results showed that the singing intervention improved working memory among patients with mild dementia and helped to preserve executive function and orientation among younger patients, and it also improved the well-being of caregivers. The listening intervention was found to have a positive impact on general cognition, working memory and quality of life, particularly among patients in institutional care with moderate dementia not caused by AD. Both interventions led to reductions in depression.

The findings suggest that “music has the power to improve mood and stimulate cognitive functions in dementia, most likely by engaging limbic and medial prefrontal brain regions, which are often preserved in the early stages of the illness,” study co-author Teppo Särkämö, PhD, a researcher at the University of Helsinki, Finland, told Psychiatry Advisor. “The results indicate that when used regularly, caregiver-implemented musical activities can be an important and easily applicable way to maintain the emotional and cognitive well-being of persons with dementia and also to reduce the psychological burden of family caregivers.”

Singing has also been shown to increase learning and retention of new verbal material in patients with AD, according to research published this year in the Journal of Clinical & Experimental Neuropsychology, and findings published in 2013 show that listening to familiar music improves the verbal narration of autobiographical memories in such patients. Another study found that a music intervention delivered in a group format reduced depression and delayed the deterioration of cognitive functions, especially short-term recall, in patients with mild and moderate dementia. Group-based music therapy appears to also decrease agitation among patients in all stages of dementia, as described in a systematic review published in 2014 in Nursing Times.

In addition to the effects of singing and listening to music on patients who already have dementia, playing a musical instrument may also offer some protection against the condition, according to a population-based twin study reported in 2014 in the International Journal of Alzheimer’s Disease. Researchers at the University of Southern California found that older adults who played an instrument were 64% less likely than their non-musician twin to develop dementia or cognitive impairment.

“Playing an instrument is a unique activity in that it requires a wide array of brain regions and cognitive functions to work together simultaneously, throughout both the right and left hemispheres,” co-author Alison Balbag, PhD, told Psychiatry Advisor. While the study did not examine causal mechanisms, “playing an instrument may be a very effective and efficient way to engage the brain, possibly granting older musicians better maintained cognitive reserve and possibly providing compensatory abilities to mitigate age-related cognitive declines.”

She notes that clinicians might consider suggesting that patients incorporate music-making into their lives as a preventive activity, or encouraging them to keep it up if they already play an instrument.

Further research, particularly neuroimaging studies, is needed to elucidate the mechanisms behind the effects of music on dementia, but in the meantime it could be a helpful supplement to patients’ treatment plans. “Music has considerable potential and it should be introduced much more in rehabilitation and neuropsychological assessment,” Kerer said.

http://www.psychiatryadvisor.com/alzheimers-disease-and-dementia/neurocognitive-neurodegenerative-memory-musical-alzheimers/article/452120/3/

References

Kerer M, Marksteiner J, Hinterhuber H, et al. Explicit (semantic) memory for music in patients with mild cognitive impairment and early-stage Alzheimer’s disease. Experimental Aging Research; 2013; 39(5):536-64.

Särkämö T, Laitinen S, Numminen A, et al. Clinical and Demographic Factors Associated with the Cognitive and Emotional Efficacy of Regular Musical Activities in Dementia. Journal of Alzheimer’s Disease; 2015; published online ahead of print.

Palisson J, Roussel-Baclet C, Maillet D, et al. Music enhances verbal episodic memory in Alzheimer’s disease. Journal of Clinical & Experimental Neuropsychology; 2015; 37(5):503-17.

El Haj M, Clément S, Fasotti L, Allain P. Effects of music on autobiographical verbal narration in Alzheimer’s disease. Journal of Neurolinguistics; 2013; 26(6):691-700.

Chu H, Yang CY, Lin Y, et al. The impact of group music therapy on depression and cognition in elderly persons with dementia: a randomized controlled study. Biological Research for Nursing; 2014; 16(2):209-17.

Craig J. Music therapy to reduce agitation in dementia. Nursing Times; 2014; 110(32-33):12-5.

Balbag MA, Pedersen NL, Gatz M. Playing a Musical Instrument as a Protective Factor against Dementia and Cognitive Impairment: A Population-Based Twin Study. International Journal of Alzheimer’s Disease; 2014; 2014:836748.

Exploring the Biology of Eating Disorders

With the pressure for a certain body type prevalent in the media, eating disorders are on the rise. But these diseases are not completely socially driven; researchers have uncovered important genetic and biological components as well and are now beginning to tease out the genes and pathways responsible for eating disorder predisposition and pathology.

As we enter the holiday season, shoppers will once again rush into crowded department stores searching for the perfect gift. They will be jostled and bumped, yet for the most part, remain cheerful because of the crisp air, lights, decorations, and the sound of Karen Carpenter’s contralto voice ringing out familiar carols.

While Carpenter is mainly remembered for her musical talents, unfortunately, she is also known for introducing the world to anorexia nervosa (AN), a severe life-threatening mental illness characterized by altered body image and stringent eating patterns that claimed her life just before her 33rd birthday in 1983.

Even though eating disorders (ED) carry one of the highest mortality rates of any mental illness, many researchers and clinicians still view them as socially reinforced behaviors and diagnose them based on criteria such as “inability to maintain body weight,” “undue influence of body weight or shape on self-evaluation,” and “denial of the seriousness of low body weight” (1). This way of thinking was prevalent when Michael Lutter, then an MD/PhD student at the University of Texas Southwestern Medical Center, began his psychiatry residency in an eating disorders unit. “I just remember the intense fear of eating that many patients exhibited and thought that it had to be biologically driven,” he said.

Lutter carried this impression with him when he established his own research laboratory at the University of Iowa. Although clear evidence supports the idea that EDs are biologically driven—they predominantly affect women and significantly alter energy homeostasis—a lack of well-defined animal models combined with the view that they are mainly behavioral abnormalities have hindered studies of the neurobiology of EDs. Still, Lutter is determined to find the biological roots of the disease and tease out the relationship between the psychiatric illness and metabolic disturbance using biochemistry, neuroscience, and human genetics approaches.

We’ve Only Just Begun

Like many diseases, EDs result from complex interactions between genes and environmental risk factors. They tend to run in families, but of course, for many family members, genetics and environment are similar enough that teasing apart the influences of nature and nurture is not easy. Researchers estimate that 50-80% of the predisposition for developing an ED is genetic, but preliminary genome-wide analyses and candidate gene studies failed to identify specific genes that contribute to the risk.

According to Lutter, finding ED study participants can be difficult. “People are either reluctant to participate, or they don’t see that they have a problem,” he reported. Set on finding the genetic underpinnings of EDs, his team began recruiting volunteers and found two families: one with 20 members, 10 of whom had an ED, and another with five of eight members affected. Rather than doing large-scale linkage and association studies, the team decided to characterize rare single-gene mutations in these families, which led them to identify mutations in two genes, estrogen-related receptor α (ESRRA) and histone deacetylase 4 (HDAC4), that clearly associated with ED predisposition in 2013 (2).

“We have larger genetic studies on-going, including the collection of more families. We just happened to publish these two families first because we were able to collect enough individuals and because there is a biological connection between the two genes that we identified,” Lutter explained.

ESRRA appears to be a transcription factor upregulated by exercise and calorie restriction that plays a role in energy balance and metabolism. HDAC4, on the other hand, is a well-described histone deacetylase that has previously been implicated in locomotor activity, body weight homeostasis, and neuronal plasticity.

Using immunoprecipitation, the researchers found that ESRRA interacts with HDAC4, in both the wild type and mutant forms, and transcription assays showed that HDAC4 represses ESRRA activity. When Lutter’s team repeated the transcription assays using mutant forms of the proteins, they found that the ESRRA mutation seen in one family significantly reduced the induction of target gene transcription compared to wild type, and that the mutation in HDAC4 found in the other family increased transcriptional repression for ESRRA target genes.

“ESRRA is a well known regulator of mitochondrial function, and there is an emerging view that mitochondria in the synapse are critical for neurotransmission,” Lutter said. “We are working on identifying target pathways now.”

Bless the Beasts and the Children

Finding genes associated with EDs provides the groundwork for molecular studies, but EDs cannot be completely explained by the actions of altered transcription factors. Individuals suffering these disorders often experience intense anxiety, intrusive thoughts, hyperactivity, and poor coping strategies that lead to rigid and ritualized behaviors and severe crippling perfectionism. They are less aware of their emotions and often try to avoid emotion altogether. To study these complex behaviors, researchers need animal models.

Until recently, scientists relied on mice with access to a running wheel and restricted access to food. Under these conditions, the animals quickly increase their locomotor activity and reduce eating, frequently resulting in death. While some characteristics of EDs—excessive exercise and avoiding food—can be studied in these mice, the model doesn’t allow researchers to explore how the disease actually develops. However, Lutter’s team has now introduced a promising new model (3).

Based on their previous success with identifying the involvement of ESRRA and HDAC4 in EDs, the researchers wondered if mice lacking ESRRA might make suitable models for studies on ED development. To find out, they first performed immunohistochemistry to understand more about the potential cognitive role of ESRRA.

“ESRRA is not expressed very abundantly in areas of the brain typically implicated in the regulation of food intake, which surprised us,” Lutter said. “It is expressed in many cortical regions that have been implicated in the etiology of EDs by brain imaging like the prefrontal cortex, orbitofrontal cortex, and insula. We think that it probably affects the activity of neurons that modulate food intake instead of directly affecting a core feeding circuit.”

With these data, the team next tried providing only 60% of the normal daily calories to their mice for 10 days and looked again at ESRRA expression. Interestingly, ESRRA levels increased significantly when the mice were insufficiently fed, indicating that the protein might be involved in the response to energy balance.

Lutter now believes that upregulation of ESRRA helps organisms adapt to calorie restriction, an effect possibly not happening in those with ESRRA or HDAC4 mutations. “This makes sense for the clinical situation where most individuals will be doing fine until they are challenged by something like a diet or heavy exercise for a sporting event. Once they start losing weight, they don’t adapt their behaviors to increase calorie intake and rapidly spiral into a cycle of greater and greater weight loss.”

When Lutter’s team obtained mice lacking ESRRA, they found that these animals were 15% smaller than their wild type littermates and put forth less effort to obtain food both when fed restricted calorie diets and when they had free access to food. These phenotypes were more pronounced in female mice than male mice, likely due to the role of estrogen signaling. Loss of ESRRA increased grooming behavior, obsessive marble burying, and made mice slower to abandon an escape hole after its relocation, indicating behavioral rigidity. And the mice demonstrated impaired social functioning and reduced locomotion.

Some people with AN exercise extensively, but this isn’t seen in all cases. “I would say it is controversial whether or not hyperactivity is due to a genetic predisposition (trait), secondary to starvation (state), or simply a ritual that develops to counter the anxiety of weight-related obsessions. Our data would suggest that it is not due to genetic predisposition,” Lutter explained. “But I would caution against over-interpretation of mouse behavior. The locomotor activity of mice is very different from people and it’s not clear that you can directly translate the results.”

For All We Know

Going forward, Lutter’s group plans to drill down into the behavioral phenotypes seen in their ESRRA null mice. They are currently deleting ESRRA from different neuronal cell types to pair individual neurons with the behaviors they mediate in the hope of working out the neural circuits involved in ED development and pathology.

In addition, the team has created a mouse line carrying one of the HDAC4 mutations previously identified in their genetic study. So far, this mouse “has interesting parallels to the ESRRA-null mouse line,” Lutter reported.

The team continues to recruit volunteers for larger-scale genetic studies. Eventually, they plan to perform RNA-seq to identify the targets of ESRRA and HDAC4 and look into their roles in mitochondrial biogenesis in neurons. Lutter suspects that this process is a key target of ESRRA and could shed light on the cognitive differences, such as altered body image, seen in EDs. In the end, a better understanding of the cells and pathways involved with EDs could create new treatment options, reduce suffering, and maybe even avoid the premature loss of talented individuals to the effects of these disorders.

References

1. Lutter M, Croghan AE, Cui H. Escaping the Golden Cage: Animal Models of Eating Disorders in the Post-Diagnostic and Statistical Manual Era. Biol Psychiatry. 2015 Feb 12.

2. Cui H, Moore J, Ashimi SS, Mason BL, Drawbridge JN, Han S, Hing B, Matthews A, McAdams CJ, Darbro BW, Pieper AA, Waller DA, Xing C, Lutter M. Eating disorder predisposition is associated with ESRRA and HDAC4 mutations. J Clin Invest. 2013 Nov;123(11):4706-13.

3. Cui H, Lu Y, Khan MZ, Anderson RM, McDaniel L, Wilson HE, Yin TC, Radley JJ, Pieper AA, Lutter M. Behavioral disturbances in estrogen-related receptor alpha-null mice. Cell Rep. 2015 Apr 21;11(3):344-50.

http://www.biotechniques.com/news/Exploring-the-Biology-of-Eating-Disorders/biotechniques-361522.html

Scientists Sequence Genome of Eurasian Wild Aurochs, an Extinct Species of Ox that Gave Rise to Modern Cattle

A multinational team of researchers has sequenced the nuclear genome of the aurochs (Bos primigenius), an extinct species of ox that inhabited Europe, Asia and North Africa.

“This is the first complete nuclear genome sequence from the extinct Eurasian aurochs,” said Dr David MacHugh of University College Dublin, Ireland, corresponding author of a paper published online in the journal Genome Biology.

Domestication of the now-extinct wild aurochs gave rise to the two major extant domestic cattle species – Bos taurus and B. indicus.

While previous genetic studies have shed some light on the evolutionary relationships between European aurochs and modern cattle, important questions remain unanswered, including the phylogenetic status of aurochs, whether gene flow from aurochs into early domestic populations occurred, and which genomic regions were subject to selection processes during and after domestication.

To build a clearer picture of the ancestry of European cattle breeds, Dr MacHugh and his colleagues from the United States, the UK, China and Ireland extracted genetic material from a bone of a 6,750-year-old wild aurochs discovered in a cave in Derbyshire, England.

The scientists then sequenced its complete genome and compared it with the genomes of 81 domesticated Bos taurus and B. indicus animals, and DNA marker information from more than 1,200 modern cows.

They discovered clear evidence of breeding between wild British aurochs and early domesticated cattle.

“Our results show the ancestors of modern British and Irish breeds share more genetic similarities with this ancient specimen than other European cattle,” Dr MacHugh said.

“This suggests that early British farmers may have restocked their domesticated herds with wild aurochs.”

“Genes linked to neurobiology and muscle development were also found to be associated with domestication of the ancestors of European cattle, indicating that a key part of the domestication process was the selection of cattle based on behavioral and meat traits.”

The study contradicts earlier simple models of cattle domestication and evolution that researchers proposed based on mitochondrial DNA or Y chromosomes.

“What now emerges from high-resolution studies of the nuclear genome is a more nuanced picture of crossbreeding and gene flow between domestic cattle and wild aurochs as early European farmers moved into new habitats such as Britain during the Neolithic,” Dr MacHugh concluded.

_____

Stephen D.E. Park et al. 2015. Genome sequencing of the extinct Eurasian wild aurochs, Bos primigenius, illuminates the phylogeography and evolution of cattle. Genome Biology 16: 234; doi: 10.1186/s13059-015-0790-2

http://www.sci-news.com/genetics/science-genome-eurasian-wild-aurochs-bos-primigenius-03377.html

Scientists develop system to predict supervolcano eruptions

Researchers claim to have worked out how to accurately predict the eruption of ‘supervolcanoes’ that blanket the earth in giant ash clouds triggering a ‘nuclear winter’.

They say the discovery could reveal exactly when giant pools of magma, greater than 100 cubic miles in volume and formed a few miles below the surface, will erupt.

Repeatedly throughout Earth’s history, such magma bodies have fueled super-eruptions – gigantic volcanic outbursts that throw 100 times more superheated gas, ash and rock into the atmosphere than run-of-the-mill eruptions – enough to blanket continents and plunge the globe into decades-long volcanic winters.

The most recent super-eruption took place about 27,000 years ago in New Zealand, well before humans kept records of volcanic eruptions and their aftermath.

Geologists today are studying deposits from past super-eruptions to try to understand where and how rapidly these magma bodies develop and what causes them to eventually erupt.

Despite considerable study, geologists are still debating how quickly these magma pools can be activated and erupted, with estimates ranging from millions to hundreds of years.

Now a team of geologists have developed a new ‘geospeedometer’ that they argue can help resolve this controversy by providing direct measurements of how long the most explosive types of magma existed as melt-rich bodies of crystal-poor magma before they erupted.

They have applied their new technique to two super-eruption sites and a pair of very large eruptions and found that the magma bodies took no more than 500 years to move from formation to eruption.

These results are described in the article ‘Melt inclusion shapes: Timekeepers of short-lived giant magma bodies’ appearing in the November issue of the journal Geology.

‘Geologists have developed a number of different ‘timekeepers’ for volcanic deposits,’ said Guilherme Gualda, associate professor of earth and environmental sciences at Vanderbilt University, who directed the project.

‘The fact that these techniques measure different processes and have different resolutions has contributed to this lack of consensus.

‘Our new method indicates that the process can take place within historically relevant spans of time.’

The method was developed as part of the doctoral thesis of Ayla Pamukcu, who is now a post-doctoral researcher at Brown and Princeton Universities.

‘The hot spot under Yellowstone National Park has produced several super-eruptions in the past.

‘The measurements that have been made indicate that this magma body doesn’t currently have a high-enough percentage of melt to produce a super-eruption.

‘But now we know that, when or if it does reach such a state, we will only have a few hundred years to prepare ourselves for the consequences,’ Gualda said.

The researchers’ geospeedometer is based on millimeter-sized quartz crystals that grew within the magma bodies that produced these giant eruptions.

Quartz crystals are typically found in magmas that have a high percentage of silica.

This type of magma is very viscous and commonly produces extremely violent eruptions. Mount St. Helens was a recent example.

When the crystals form, they often capture small blobs of molten magma – known as blebs or melt inclusions. Blebs are initially round.

While the crystals float in the hot magma, diffusion causes the inclusions to gradually acquire the polygonal shape of the crystal voids they occupy. But this faceting process is halted if the eruption occurs before faceting is complete.

Using advanced 3-D X-ray tomography, the researchers were able to measure the size and shape of the melt inclusions with exquisite precision.

In cases where the inclusions had not become completely faceted, the researchers could determine how much time had elapsed since they were enclosed.

‘Previous studies provided us with the data we needed to calculate the rate of the faceting process. We then used this rate, in combination with our shape measurements, to calculate how long the crystal existed in the magma before the eruption,’ said Pamukcu.
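
As a purely schematic illustration of the logic Pamukcu describes (not the authors' actual model; both numbers below are invented), a known faceting rate plus a measured degree of faceting yields a residence time:

```python
# Schematic illustration of the faceting "geospeedometer" logic.
# Both numbers are invented for illustration; the real rate comes from
# the diffusion physics calibrated in the earlier studies cited above.

facet_rate_per_year = 0.002   # hypothetical fraction of faceting completed per year
measured_faceting = 0.6       # hypothetical degree of faceting seen in one inclusion

# Assuming a roughly constant faceting rate, the crystal's pre-eruption
# residence time is the observed progress divided by the rate:
residence_years = measured_faceting / facet_rate_per_year
```

With these made-up numbers the inclusion sat in the magma for about 300 years, comfortably inside the "no more than 500 years" window the team reports.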

In addition, the researchers checked the faceting results against results obtained using an independent technique based on titanium diffusion.

Crystallization may cause variations in concentration of certain elements. In quartz, the element titanium can vary sharply between different zones or layers within the crystal.

Over time, however, the process of diffusion gradually smooths out these variations.

This process also stops at the eruption, so the shallower the slope of titanium concentrations across these boundaries today, the longer the crystal spent in magmatic conditions.

The physics of this process is also well known, so the researchers could use these measurements to provide an independent estimate of how long a crystal spent floating around in the melt.
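
That scaling is easy to sketch. The toy functions below use the generic diffusion relation t ≈ w²/D together with an Arrhenius form for D; the constants are placeholders, not the calibrated titanium-in-quartz values the researchers used:

```python
import math

def diffusivity(T_kelvin, D0=1e-8, Ea=3.0e5, R=8.314):
    """Arrhenius diffusivity D = D0 * exp(-Ea / (R * T)) in m^2/s.
    D0 (m^2/s) and Ea (J/mol) are placeholder constants, not measured
    values for titanium diffusion in quartz."""
    return D0 * math.exp(-Ea / (R * T_kelvin))

def residence_time_years(boundary_width_m, T_kelvin):
    """Order-of-magnitude residence time from t ~ w^2 / D: the wider the
    diffusion-smoothed titanium step, the longer the crystal spent hot."""
    t_seconds = boundary_width_m ** 2 / diffusivity(T_kelvin)
    return t_seconds / (365.25 * 24 * 3600)

# A sharper (narrower) titanium boundary implies a shorter residence time;
# quadrupling the width lengthens the inferred time sixteenfold.
t_sharp = residence_time_years(0.5e-6, 1050.0)
t_smooth = residence_time_years(2.0e-6, 1050.0)
```

Because time grows with the square of the boundary width, even modest smoothing of a titanium step records a substantial stretch of time in the magma.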

They found that the duration times they derived from the titanium diffusion measurements agreed closely with those produced by the faceting method.

‘Our current method will also work on smaller volcanic systems, as long as they erupt magmas that contain quartz crystals,’ said Pamukcu.

‘We are also confident that we can adapt these techniques to work with other minerals, which will allow us to make similar timescale calculations for other types of magmas and volcanoes, like the low-silica basalts commonly erupted from Hawaiian volcanoes.’

http://www.dailymail.co.uk/sciencetech/article-3281859/Phew-Scientists-claim-developed-predict-cataclysmic-SUPERVOLCANO-eruptions-end-life-Earth.html

Scientists encode memories in a way that bypasses damaged brain tissue

Researchers at the University of Southern California (USC) and Wake Forest Baptist Medical Center have developed a brain prosthesis that is designed to help individuals suffering from memory loss.

The prosthesis, which includes a small array of electrodes implanted into the brain, has performed well in laboratory testing in animals and is currently being evaluated in human patients.

Designed originally at USC and tested at Wake Forest Baptist, the device builds on decades of research by Ted Berger and relies on a new algorithm created by Dong Song, both of the USC Viterbi School of Engineering. The development also builds on more than a decade of collaboration with Sam Deadwyler and Robert Hampson of the Department of Physiology & Pharmacology of Wake Forest Baptist who have collected the neural data used to construct the models and algorithms.

When your brain receives sensory input, it creates a memory in the form of a complex electrical signal that travels through multiple regions of the hippocampus, the memory center of the brain. At each region, the signal is re-encoded until it reaches the final region as a wholly different signal that is sent off for long-term storage.

If there’s damage at any region that prevents this translation, then there is the possibility that long-term memory will not be formed. That’s why an individual with hippocampal damage (for example, due to Alzheimer’s disease) can recall events from a long time ago – things that were already translated into long-term memories before the brain damage occurred – but have difficulty forming new long-term memories.

Song and Berger found a way to accurately mimic how a memory is translated from short-term memory into long-term memory, using data obtained by Deadwyler and Hampson, first from animals, and then from humans. Their prosthesis is designed to bypass a damaged hippocampal section and provide the next region with the correctly translated memory.

That’s despite the fact that there is currently no way of “reading” a memory just by looking at its electrical signal.

“It’s like being able to translate from Spanish to French without being able to understand either language,” Berger said.

Their research was presented at the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society in Milan on August 27, 2015.

The effectiveness of the model was tested by the USC and Wake Forest Baptist teams. With the permission of patients who had electrodes implanted in their hippocampi to treat chronic seizures, Hampson and Deadwyler read the electrical signals created during memory formation at two regions of the hippocampus, then sent that information to Song and Berger to construct the model. The team then fed those signals into the model and read how the signals generated from the first region of the hippocampus were translated into signals generated by the second region of the hippocampus.
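
The team's actual model is a nonlinear multi-input multi-output (MIMO) model fitted to recorded spike data; as a purely schematic stand-in for the idea of learning a "translation" between two regions' signals, here is a toy linear version trained on synthetic data:

```python
import numpy as np

# Toy stand-in for learning a signal "translation" between two brain regions.
# The real USC model is nonlinear and fitted to recorded neural data; this
# sketch uses ordinary least squares on synthetic numbers purely to
# illustrate fitting a mapping from paired recordings, then predicting.

rng = np.random.default_rng(0)

n_trials, n_in, n_out = 200, 16, 16
X = rng.normal(size=(n_trials, n_in))               # "recordings" from region A
W_true = rng.normal(size=(n_in, n_out))             # unknown translation to recover
Y = X @ W_true + 0.1 * rng.normal(size=(n_trials, n_out))  # noisy region-B signals

# Fit the translation from the paired recordings...
W_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

# ...then predict region-B signals from fresh region-A activity.
X_new = rng.normal(size=(10, n_in))
Y_pred = X_new @ W_fit
```

As in the experiment, the mapping is learned without ever "reading" what any individual signal means; it only has to reproduce the input-output relationship between the two regions.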

In hundreds of trials conducted with nine patients, the algorithm accurately predicted how the signals would be translated with about 90 percent accuracy.

“Being able to predict neural signals with the USC model suggests that it can be used to design a device to support or replace the function of a damaged part of the brain,” Hampson said.

Next, the team will attempt to send the translated signal back into the brain of a patient with damage at one of the regions in order to try to bypass the damage and enable the formation of an accurate long-term memory.

http://medicalxpress.com/news/2015-09-scientists-bypass-brain-re-encoding-memories.html

Scientific testing of the ‘5 second rule’ of food on the floor

The five-second rule is based on the not-entirely-scientific belief that bacteria cannot contaminate food within five seconds, so you won’t get sick eating things you have picked up from the floor.

The first person to investigate this urban myth scientifically was Jillian Clarke, an American high-school student, during an apprenticeship in a microbiology laboratory at the University of Illinois in 2003. Clarke and her colleagues inoculated rough and smooth tiles with the bacterium E coli (certain strains of which cause stomach cramps, diarrhoea and vomiting) and put gummy bears or cookies on the tiles for five seconds. She found that E coli was transferred to gummy bears within five seconds, more so from smooth than rough tiles. As a side issue, Clarke also established in her work that university floors are remarkably clean and that people are more likely to pick up cookies from the floor than cauliflower.

Paul Dawson, professor of food science at Clemson University in South Carolina, is a five-second-rule expert. His 2007 study, published in the Journal of Applied Microbiology, found that the dirtiness of the floor was more important than how long the food lay on it. His study was a progression from Clarke’s because it measured the amount of contamination. Using bread or bologna, he showed that it was better to drop either of them on carpet inoculated with salmonella, where less than 1% of the bacteria were transferred, than on tiles or wood, where up to 70% got on to the food. A similar study from Aston University found that, as soon as food hit the floor, it became contaminated – especially on smooth surfaces – but that the number of bacteria on the food increased up to tenfold as the time on the floor increased from three seconds to 30 seconds.
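
To put those transfer percentages in perspective, here is a quick back-of-the-envelope calculation (the starting count of 10,000 cells is an invented figure; the percentages come from the studies above):

```python
# Illustrative arithmetic using the transfer rates reported above.
# The inoculum of 10,000 cells is an invented example figure.

inoculum = 10_000          # hypothetical salmonella cells on the surface
carpet_transfer = 0.01     # "less than 1%" transferred from carpet
tile_transfer = 0.70       # "up to 70%" transferred from tile or wood

carpet_cells = inoculum * carpet_transfer   # roughly 100 cells picked up
tile_cells = inoculum * tile_transfer       # roughly 7,000 cells picked up
```

On these numbers, the same dropped slice of bread picks up roughly seventy times more bacteria from tile than from carpet, which is why Dawson found that surface type mattered more than contact time.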

Dawson says that the five-second rule is simply not true because, if food hits a virulent strain of E coli, even the small number of bacteria it attracts immediately will make you sick. He doesn’t eat food that has fallen on the floor. The very young or old shouldn’t use the five-second rule, as their immune systems may not cope with even tiny amounts of bacteria. If the floor is filthy, then the rule is invalid on the grounds of grossness anyway. But the likelihood is that, for most of us, eating food off the floor isn’t going to hurt us. So if you are very hungry and you must pick food off the floor, then do it quickly, and preferably off a carpet.

http://www.theguardian.com/lifeandstyle/2015/sep/28/is-the-five-second-food-rule-really-true?channel=us

New 3-million-year-old human-like species discovered in South Africa indicates ritualistic behavior and symbolic thought, which were not previously considered possible earlier than 200,000 years ago.

By Pallab Ghosh
Science correspondent, BBC News, Johannesburg

Scientists have discovered a new human-like species in a burial chamber deep in a cave system in South Africa. The discovery of 15 partial skeletons is the largest single discovery of its type in Africa.

The researchers claim that the discovery will change ideas about our human ancestors.

The studies, which have been published in the journal eLife, also indicate that these individuals were capable of ritualistic behaviour.

The species, which has been named naledi, has been classified in the grouping, or genus, Homo, to which modern humans belong.

The researchers who made the find have not been able to find out how long ago these creatures lived – but the scientist who led the team, Prof Lee Berger, told BBC News that he believed they could be among the first of our kind (genus Homo) and could have lived in Africa up to three million years ago.

Like all those working in the field, he is at pains to avoid the term “missing link”. Prof Berger says naledi could be thought of as a “bridge” between more primitive bipedal primates and humans.

“We’d gone in with the idea of recovering one fossil. That turned into multiple fossils. That turned into the discovery of multiple skeletons and multiple individuals.

“And so by the end of that remarkable 21-day experience, we had discovered the largest assemblage of fossil human relatives ever discovered in the history of the continent of Africa. That was an extraordinary experience.”

Prof Chris Stringer of the Natural History Museum said naledi was “a very important discovery”.

“What we are seeing is more and more species of creatures that suggest that nature was experimenting with how to evolve humans, thus giving rise to several different types of human-like creatures originating in parallel in different parts of Africa. Only one line eventually survived to give rise to us,” he told BBC News.

I went to see the bones which are kept in a secure room at Witwatersrand University. The door to the room looks like one that would seal a bank vault. As Prof Berger turned the large lever on the door, he told me that our knowledge of very early humans is based on partial skeletons and the occasional skull.

The haul of 15 partial skeletons includes both males and females of varying ages – from infants to elderly. The discovery is unprecedented in Africa and will shed more light on how the first humans evolved.

“We are going to know everything about this species,” Prof Berger told me as we walked over to the remains of H. naledi.

“We are going to know when its children were weaned, when they were born, how they developed, the speed at which they developed, the difference between males and females at every developmental stage from infancy, to childhood to teens to how they aged and how they died.”

I was astonished to see how well preserved the bones were. The skull, teeth and feet looked as if they belonged to a human child – even though the skeleton was that of an elderly female.

Its hand looked human-like too, except for its fingers, which curl around a bit like those of an ape.

Homo naledi is unlike any primitive human found in Africa. It has a tiny brain – about the size of a gorilla’s – and a primitive pelvis and shoulders. But it is put into the same genus as humans because of the more progressive shape of its skull, relatively small teeth, characteristic long legs and modern-looking feet.

“I saw something I thought I would never see in my career,” Prof Berger told me.

“It was a moment that 25 years as a paleoanthropologist had not prepared me for.”

One of the most intriguing questions raised by the find is how the remains got there.

I visited the site of the find, the Rising Star cave, an hour’s drive from the university in an area known as the Cradle of Humankind. The cave leads to a narrow underground tunnel through which some of Prof Berger’s team crawled in an expedition funded by the National Geographic Society.

Small women were chosen because the tunnel was so narrow. They crawled through darkness, lit only by their head torches, on a precarious 20-minute-long journey to find a chamber containing hundreds of bones.

Among them was Marina Elliott. She showed me the narrow entrance to the cave and then described how she felt when she first saw the chamber.

“The first time I went to the excavation site I likened it to the feeling that Howard Carter must have had when he opened Tutankhamen’s tomb – that you are in a very confined space and then it opens up and all of a sudden all you can see are all these wonderful things – it was incredible,” she said.

Ms Elliott and her colleagues believe that they have found a burial chamber. The Homo naledi people appear to have carried individuals deep into the cave system and deposited them in the chamber – possibly over generations.

If that is correct, it suggests naledi was capable of ritual behaviour and possibly symbolic thought – something that until now had only been associated with much later humans within the last 200,000 years.

Prof Berger said: “We are going to have to contemplate some very deep things about what it is to be human. Have we been wrong all along about this kind of behaviour that we thought was unique to modern humans?

“Did we inherit that behaviour from deep time and is it something that (the earliest humans) have always been able to do?”

Prof Berger believes that the discovery of a creature that has such a mix of modern and primitive features should make scientists rethink the definition of what it is to be human – so much so that he himself is reluctant to describe naledi as human.

Other researchers working in the field, such as Prof Stringer, believe that naledi should be described as a primitive human. But he agrees that current theories need to be re-evaluated and that we have only just scratched the surface of the rich and complex story of human evolution.

http://www.bbc.com/news/science-environment-34192447

New study identifies potential new class of more rapidly acting antidepressant medications

A new study by researchers at the University of Maryland School of Medicine has identified promising compounds that could successfully treat depression in less than 24 hours while minimizing side effects. Although they have not yet been tested in people, the compounds could offer significant advantages over current antidepressant medications.

The research, led by Scott Thompson, PhD, Professor and Chair of the Department of Physiology at the University of Maryland School of Medicine (UM SOM), was published this month in the journal Neuropsychopharmacology.

“Our results open up a whole new class of potential antidepressant medications,” said Dr. Thompson. “We have evidence that these compounds can relieve the devastating symptoms of depression in less than one day, and can do so in a way that limits some of the key disadvantages of current approaches.”

Currently, most people with depression take medications that increase levels of the neurochemical serotonin in the brain. The most common of these drugs, such as Prozac and Lexapro, are selective serotonin reuptake inhibitors, or SSRIs. Unfortunately, SSRIs are effective in only a third of patients with depression. In addition, even when these drugs work, they typically take between three and eight weeks to relieve symptoms. As a result, patients often suffer for months before finding a medicine that makes them feel better. This is not only emotionally excruciating; in the case of patients who are suicidal, it can be deadly. Better treatments for depression are clearly needed.

Dr. Thompson and his team focused on another neurotransmitter besides serotonin, an inhibitory compound called GABA. Brain activity is determined by a balance of opposing excitatory and inhibitory communication between brain cells. Dr. Thompson and his team argue that in depression, excitatory messages in some brain regions are not strong enough. Because there is no safe way to directly strengthen excitatory communication, they examined a class of compounds that reduce the inhibitory messages sent via GABA. They predicted that these compounds would restore excitatory strength. These compounds, called GABA-NAMs, minimize unwanted side effects because they are precise: they work only in the parts of the brain that are essential for mood.

The researchers tested the compounds in rats that were subjected to chronic mild stress that caused the animals to act in ways that resemble human depression. Giving stressed rats GABA-NAMs successfully reversed experimental signs of a key symptom of depression, anhedonia, or the inability to feel pleasure. Remarkably, the beneficial effects of the compounds appeared within 24 hours – much faster than the multiple weeks needed for SSRIs to produce the same effects.

“These compounds produced the most dramatic effects in animal studies that we could have hoped for,” Dr. Thompson said. “It will now be tremendously exciting to find out whether they produce similar effects in depressed patients. If these compounds can quickly provide relief of the symptoms of human depression, such as suicidal thinking, it could revolutionize the way patients are treated.”

In tests on the rats’ brains, the researchers found that the compounds rapidly increased the strength of excitatory communication in regions that were weakened by stress and are thought to be weakened in human depression. No effects of the compound were detected in unstressed animals, raising hopes that they will not produce side effects in human patients.

“This work underscores the importance of basic research to our clinical future,” said Dean E. Albert Reece, MD, PhD, MBA, who is also the vice president for Medical Affairs, University of Maryland, and the John Z. and Akiko K. Bowers Distinguished Professor and Dean of the School of Medicine. “Dr. Thompson’s work lays the crucial groundwork to transform the treatment of depression and reduce the tragic loss of lives to suicide.”

http://www.news-medical.net/news/20150714/New-study-identifies-potential-antidepressant-medications-with-few-side-effects.aspx