New 3-million-year-old human-like species discovered in South Africa shows signs of ritualistic behaviour and symbolic thought, which were previously not considered possible earlier than 200,000 years ago.

By Pallab Ghosh
Science correspondent, BBC News, Johannesburg

Scientists have discovered a new human-like species in a burial chamber deep in a cave system in South Africa. The discovery of 15 partial skeletons is the largest single discovery of its type in Africa.

The researchers claim that the discovery will change ideas about our human ancestors.

The studies, which have been published in the journal eLife, also indicate that these individuals were capable of ritualistic behaviour.

The species, which has been named naledi, has been classified in the grouping, or genus, Homo, to which modern humans belong.

The researchers who made the find have not been able to find out how long ago these creatures lived – but the scientist who led the team, Prof Lee Berger, told BBC News that he believed they could be among the first of our kind (genus Homo) and could have lived in Africa up to three million years ago.

Like all those working in the field, he is at pains to avoid the term “missing link”. Prof Berger says naledi could be thought of as a “bridge” between more primitive bipedal primates and humans.

“We’d gone in with the idea of recovering one fossil. That turned into multiple fossils. That turned into the discovery of multiple skeletons and multiple individuals.

“And so by the end of that remarkable 21-day experience, we had discovered the largest assemblage of fossil human relatives ever discovered in the history of the continent of Africa. That was an extraordinary experience.”

Prof Chris Stringer of the Natural History Museum said naledi was “a very important discovery”.

“What we are seeing is more and more species of creatures that suggest nature was experimenting with how to evolve humans, giving rise to several different types of human-like creatures originating in parallel in different parts of Africa. Only one line eventually survived to give rise to us,” he told BBC News.

I went to see the bones which are kept in a secure room at Witwatersrand University. The door to the room looks like one that would seal a bank vault. As Prof Berger turned the large lever on the door, he told me that our knowledge of very early humans is based on partial skeletons and the occasional skull.

The haul of 15 partial skeletons includes both males and females of varying ages – from infants to the elderly. The discovery is unprecedented in Africa and will shed more light on how the first humans evolved.

“We are going to know everything about this species,” Prof Berger told me as we walked over to the remains of H. naledi.

“We are going to know when its children were weaned, when they were born, how they developed, the speed at which they developed, the difference between males and females at every developmental stage from infancy, to childhood to teens to how they aged and how they died.”

I was astonished to see how well preserved the bones were. The skull, teeth and feet looked as if they belonged to a human child – even though the skeleton was that of an elderly female.
Its hand looked human-like too, except for its fingers, which curl around a bit like those of an ape.

Homo naledi is unlike any primitive human found in Africa. It has a tiny brain – about the size of a gorilla’s – and a primitive pelvis and shoulders. But it is placed in the same genus as humans because of the more progressive shape of its skull, relatively small teeth, characteristically long legs and modern-looking feet.

“I saw something I thought I would never see in my career,” Prof Berger told me.

“It was a moment that 25 years as a paleoanthropologist had not prepared me for.”

One of the most intriguing questions raised by the find is how the remains got there.

I visited the site of the find, the Rising Star cave, an hour’s drive from the university in an area known as the Cradle of Humankind. The cave leads to a narrow underground tunnel through which some of Prof Berger’s team crawled in an expedition funded by the National Geographic Society.

Small women were chosen because the tunnel was so narrow. They crawled through darkness, lit only by their head torches, on a precarious 20-minute journey to find a chamber containing hundreds of bones.

Among them was Marina Elliott. She showed me the narrow entrance to the cave and then described how she felt when she first saw the chamber.

“The first time I went to the excavation site I likened it to the feeling that Howard Carter must have had when he opened Tutankhamen’s tomb – that you are in a very confined space and then it opens up and all of a sudden all you can see are all these wonderful things – it was incredible,” she said.

Ms Elliott and her colleagues believe that they have found a burial chamber. The Homo naledi people appear to have carried individuals deep into the cave system and deposited them in the chamber – possibly over generations.

If that is correct, it suggests naledi was capable of ritual behaviour and possibly symbolic thought – something that until now had only been associated with much later humans within the last 200,000 years.

Prof Berger said: “We are going to have to contemplate some very deep things about what it is to be human. Have we been wrong all along about this kind of behaviour that we thought was unique to modern humans?

“Did we inherit that behaviour from deep time and is it something that (the earliest humans) have always been able to do?”

Prof Berger believes that the discovery of a creature that has such a mix of modern and primitive features should make scientists rethink the definition of what it is to be human – so much so that he himself is reluctant to describe naledi as human.

Other researchers working in the field, such as Prof Stringer, believe that naledi should be described as a primitive human. But he agrees that current theories need to be re-evaluated and that we have only just scratched the surface of the rich and complex story of human evolution.

http://www.bbc.com/news/science-environment-34192447

New research shows that people with ‘O’ blood type have decreased risk of cognitive decline

A pioneering study conducted by leading researchers at the University of Sheffield has revealed blood types play a role in the development of the nervous system and may impact the risk of developing cognitive decline.

The research, carried out in collaboration with the IRCCS San Camillo Hospital Foundation in Venice, shows that people with an ‘O’ blood type have more grey matter in their brains than those with ‘A’, ‘B’ or ‘AB’ blood types, which may help to protect against diseases such as Alzheimer’s.

Research fellow Matteo De Marco and Professor Annalena Venneri, from the University’s Department of Neuroscience, made the discovery after analysing the results of 189 Magnetic Resonance Imaging (MRI) scans from healthy volunteers.

The researchers calculated the volumes of grey matter within the brain and explored the differences between different blood types.

The results, published in the Brain Research Bulletin, show that individuals with an ‘O’ blood type have more grey matter in the posterior portion of the cerebellum.

In comparison, those with ‘A’, ‘B’ or ‘AB’ blood types had smaller grey matter volumes in temporal and limbic regions of the brain, including the left hippocampus, which is one of the earliest parts of the brain damaged by Alzheimer’s disease.

These findings indicate that smaller volumes of grey matter are associated with non-‘O’ blood types.

A reduction in grey matter volume is normally seen in the brain as we age, so this difference between blood types is likely to intensify later in life as a consequence of ageing.

“The findings seem to indicate that people who have an ‘O’ blood type are more protected against diseases in which volumetric reduction is seen in temporal and mediotemporal regions of the brain, as with Alzheimer’s disease for instance,” said Matteo De Marco.

“However additional tests and further research are required as other biological mechanisms might be involved.”

Professor Annalena Venneri added: “What we know today is that a significant difference in volumes exists, and our findings confirm established clinical observations. In all likelihood the biology of blood types influences the development of the nervous system. We now have to understand how and why this occurs.”

More information: “‘O’ blood type is associated with larger grey-matter volumes in the cerebellum,” Brain Research Bulletin, Volume 116, July 2015, Pages 1-6, ISSN 0361-9230, dx.doi.org/10.1016/j.brainresbull.2015.05.005

Scientists Have Figured Out How to Recover Forgotten Memories Still Lurking in the Brain


All might not be lost. Researchers recently announced a discovery that could have significant implications later down the road for helping people with severe amnesia or Alzheimer’s disease.

The research tackles a highly debated question: whether memory loss due to damaged brain cells means that memories can no longer be stored, or whether access to those memories is merely inhibited in some way.

Scientists from MIT found in new research that the latter is most likely the case, demonstrating how lost memories could be recovered using technology known as optogenetics, which a news release about the study described as when “proteins are added to neurons to allow them to be activated with light.”

“The majority of researchers have favored the storage theory, but we have shown in this paper that this majority theory is probably wrong,” Susumu Tonegawa, a professor in MIT’s biology department and director of the RIKEN-MIT Center at the Picower Institute for Learning and Memory, said in a statement. “Amnesia is a problem of retrieval impairment.”

First, the scientists demonstrated how “memory engram cells” — brain cells that trigger a memory upon experiencing a related sight or smell, for example — could be strengthened in mice.

The researchers then gave the mice anisomycin, which blocked protein synthesis in neurons, after they had formed a new memory. In doing so, the researchers prevented the engram cells from strengthening.

A day later, the scientists tried to trigger the memory in mice, but couldn’t see any activation that would indicate the mice were remembering it.

“So even though the engram cells are there, without protein synthesis those cell synapses are not strengthened, and the memory is lost,” Tonegawa explained of this part of the research.

The team first developed a clever technique to selectively label the neurons representing what is known as a memory engram – in other words, the brain cells involved in forming a specific memory. They did this by genetically engineering mice so they had extra genes in all their neurons. As a result, when neurons fire as a memory is formed, they produce red proteins visible under a microscope, allowing the researchers to tell which cells were part of the engram. They also inserted a gene that made the neurons fire when illuminated by blue light.

After the researchers induced amnesia, they used optogenetic tools on the mice and witnessed the animals experiencing full recollection.

“If you test memory recall with natural recall triggers in an anisomycin-treated animal, it will be amnesiac, you cannot induce memory recall. But if you go directly to the putative engram-bearing cells and activate them with light, you can restore the memory,” Tonegawa said.

With this discovery, the researchers wrote in the study published this week in the journal Science that they believe a “specific pattern of connectivity of engram cells may be crucial for memory information storage and that strengthened synapses in these cells critically contribute to the memory retrieval process.”

James Bisby, a neuroscientist at University College London, told New Scientist that it’s “not surprising that they could trigger the memories, but it is a cool way to do it.”

http://www.newscientist.com/article/dn27618-lost-memories-recovered-in-mice-with-a-flash-of-light.html

Thanks to Steven Weihing for bringing this to the It’s Interesting community.

An axon self-destruct mechanism that kills neurons

Just as losing a limb can spare a life, parting with a damaged axon by way of Wallerian degeneration can spare a neuron. A protein called SARM1 acts as the self-destruct button, and now researchers led by Jeffrey Milbrandt of Washington University Medical School in St. Louis believe they have figured out how. They report in the April 24 Science that SARM1 forms dimers that trigger the destruction of NAD+. Basic biochemistry dictates that this enzyme cofactor is essential for cell survival.

SARM1 and NAD+ have emerged as key players in the complex, orderly process underlying Wallerian degeneration. Scientists are still filling in other parts of the pathway. SARM1, short for sterile alpha and TIR motif-containing 1, seems to act as a damage sensor, but researchers are not sure how. Recently, researchers led by Marc Tessier-Lavigne at Rockefeller University, New York, found that SARM1 turns on a mitogen-activated protein (MAP) kinase cascade that is involved. Loss of NAD+ may also contribute to axon degeneration, because its concentration drops in dying axons, and Wlds mutant mice that overproduce an NAD+ synthase show slower Wallerian degeneration.

Now, first author Josiah Gerdts confirms that SARM1 is the self-destruct switch. He engineered a version of the protein with a target sequence for tobacco etch virus (TEV) protease embedded in it. Using a rapamycin-activated form of TEV, he eliminated SARM1 from axons he had sliced off of mouse dorsal root ganglion (DRG) neurons. Without SARM1, the severed axons survived.

SARM1 contains SAM and TIR domains, which promote protein-protein interactions. Previously, Gerdts discovered that the TIR domain was sufficient to induce degeneration, even in healthy axons, but it relied on the SAM region to bring multiple SARM1 molecules together. He hypothesized that axonal SARM1 multimerizes upon axon damage. To test this idea, he used a standard biochemical technique to force the SARM1 TIR domains together. He fused domains to one or another of the rapamycin-binding peptides Frb and Fkbp and expressed them in DRG neurons. When he added rapamycin to the cultures, the Frb and Fkbp snapped the TIR domains together within minutes. As Gerdts had predicted, this destroyed axons, confirming that SARM1 activates via dimerization.

Next, the authors investigated what happens to NAD+ during that process. Using high-performance liquid chromatography, Gerdts measured the concentration of NAD+ in the disembodied axons. Normally, its level dropped by about two-thirds within 15 minutes of severing. In axons from SARM1 knockout mice, however, the NAD+ concentration stayed unchanged. In neurons carrying the forced-dimerization constructs, adding rapamycin was sufficient to knock down NAD+ levels—Gerdts did not even have to cut the axons. Ramping up NAD+ production by overexpressing its synthases, NMNAT and NAMPT, overcame the effects of TIR dimerization, and the axons survived. Gerdts concluded that loss of NAD+ was a crucial, SARM1-controlled step on the way to degeneration.

He still wondered what caused the loss of NAD+. It might be that the axon simply stopped making it, or maybe the Wallerian pathway actively destroyed it. To distinguish between these possibilities, Gerdts added radiolabeled exogenous NAD+ to human embryonic kidney HEK293 cultures expressing the forced-dimerization TIR domains. Rapamycin caused them to rapidly degrade the radioactive NAD+, confirming that the cell actively disposes of it.

Gerdts suspects that with this essential cofactor gone, the axon runs out of energy and can no longer survive. He speculated that the MAP kinase cascade reportedly turned on by SARM1 might lead to NAD+ destruction. Alternatively, SARM1 might induce distinct MAP kinase and NAD+ destruction pathways in parallel, he suggested.

“Demonstrating how NAD+ is actively and locally degraded in the axon is a big advance,” commented Andrew Pieper of the Iowa Carver College of Medicine in Iowa City, who was not involved in the study. Jonathan Gilley and Michael Coleman of the Babraham Institute in Cambridge, U.K., predict that there will be more to the story. They note that a drug called FK866, which prevents NAD+ production, protects axons in some instances. Gerdts suggested that FK866 acts on processes upstream of SARM1, delaying the start of axon degeneration. In contrast, his paper only addressed what happens after SARM1 activates. “It will be fascinating to see how the apparent contradictions raised by this new study will be resolved,” wrote Gilley and Coleman.

Could these findings help researchers looking for ways to prevent neurodegeneration? “The study supports the notion that augmenting NAD+ levels is potentially a valuable approach,” said Pieper. He and his colleagues developed a small molecule that enhances NAD+ synthesis, now under commercial development. It improved symptoms in ALS model mice, and protected neurons in mice mimicking Parkinson’s. NAD+ also activates sirtuin, an enzyme important for longevity and stress resistance as well as learning and memory.

However, both Pieper and Gerdts cautioned that they cannot clearly predict which conditions might benefit from an anti-SARM1 or NAD+-boosting therapy. At this point, Gerdts said, researchers do not fully understand how much axon degeneration contributes to symptoms of diseases like Alzheimer’s and Parkinson’s. He suggested that crossing SARM1 knockout mice with models for various neurodegenerative conditions would indicate how well an anti-Wallerian therapy might work.

—Amber Dance

http://www.alzforum.org/news/research-news/axon-self-destruct-button-triggers-energy-woes

Chinese researchers report first-ever gene editing of human embryos

In an ethically charged first, Chinese researchers have used gene editing to modify human embryos obtained from an in-vitro fertilization clinic.

The 16-person scientific team, based at the Sun Yat-Sen University in Guangzhou, China, set out to see whether it could correct the gene defect that causes beta-thalassemia, a blood disease, by editing the DNA of fertilized eggs.

The team’s report showed the method is not yet very accurate, confirming scientific doubts around whether gene editing could be practical in human embryos, and whether genetically engineered people are going to be born anytime soon.

The authors’ report appeared on April 18 in a low-profile scientific journal called Protein & Cell. The authors, led by Junjiu Huang, say there is a “pressing need” to improve the accuracy of gene editing before it can be applied clinically, for instance to produce children with repaired genes.

The team did not try to establish a pregnancy and says that, for ethical reasons, it carried out its tests only in embryos that were abnormal.

“These authors did a very good job pointing out the challenges,” says Dieter Egli, a researcher at the New York Stem Cell Foundation in Manhattan. “They say themselves this type of technology is not ready for any kind of application.”

The paper had previously circulated among researchers and had provoked concern by highlighting how close medical science may be to tinkering with the human gene pool.

In March, an industry group called for a complete moratorium on experiments of the kind being reported from China, citing risks and the chance they would open the door to eugenics – changing nonmedical traits in embryos, such as stature or intelligence.

Other scientists recommended high-level meetings of experts, regulators, and ethicists to debate if there are acceptable uses for such engineering.

The Chinese team reported editing the genes of more than 80 embryos using a technology called CRISPR-Cas9. While in some cases they were successful, in others the CRISPR technology didn’t work or introduced unexpected mutations. Some of the embryos ended up being mosaics, with a repaired gene in some cells, but not in others.

Parents who are carriers of beta-thalassemia could choose to test their IVF embryos, selecting those that have not inherited the disease-causing mutation. However, gene editing opens the possibility of germline modification, or permanently repairing the gene in an embryo, egg, or sperm in a way that is passed onto the offspring and to future generations.

That idea is the subject of intense debate, since some think the human gene pool is sacrosanct and should never be the subject of technological alteration, even for medical reasons. Others allow that germline engineering might one day be useful, but needs much more testing. “You can’t discount it,” says Egli. “It’s very interesting.”

The Chinese team performed the gene editing in eggs that had been fertilized in an IVF clinic but were abnormal because they had been fertilized by two sperm, not one. “Ethical reasons precluded studies of gene editing in normal embryos,” they said.

Abnormal embryos are widely available for research, both in China and the U.S. At least one U.S. genetics center is also using CRISPR in abnormal embryos rejected by IVF clinics. That group described aspects of its work on the condition that it would not be identified, since the procedure remains controversial.

Making repairs using CRISPR harnesses a cell’s own DNA repair machinery to correct genes. The technology guides a cutting protein to a particular site on the DNA molecule, chopping it open. If a DNA “repair template” is provided—in this case a correct version of the beta-globin gene—the DNA will mend itself using the healthy sequence.
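The template-guided repair described here can be illustrated with a toy string model: find the faulty site and splice in the corrected sequence from a template. This is a drastic simplification of homology-directed repair, and the sequences below are made up for illustration only:

```python
# Toy illustration of template-guided repair: locate a 'mutant' site in a
# sequence and replace it with the corrected version from a repair template.
# Sequences are invented; real HDR is far more complex and error-prone.

def repair_with_template(genome, mutant_site, template):
    """Replace the first occurrence of mutant_site with the template sequence."""
    cut = genome.find(mutant_site)
    if cut == -1:
        return genome                      # no target found: nothing is edited
    return genome[:cut] + template + genome[cut + len(mutant_site):]

broken = "ATGGTGAACCTG"                    # toy sequence with a 'mutant' AAC codon
fixed = repair_with_template(broken, mutant_site="AAC", template="GAG")
print(fixed)  # ATGGTGGAGCTG
```

The failure modes the Chinese team reported, such as the embryo copying a similar gene from its own genome instead of the template, have no counterpart in this deterministic sketch.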

The Chinese group says that among the problems they encountered, the embryo sometimes ignored the template, and instead repaired itself using similar genes from its own genome, “leading to untoward mutations.”

Huang said he stopped the research after the poor results. “If you want to do it in normal embryos, you need to be close to 100 percent,” Huang told Nature News. “That’s why we stopped. We still think it’s too immature.”

http://www.technologyreview.com/news/536971/chinese-team-reports-gene-editing-human-embryo/

Thanks to Michael Moore for bringing this to the It’s Interesting community.

Blueberries may be effective in the treatment for post-traumatic stress disorder (PTSD)

Researchers from Louisiana State University have found that blueberries may be effective in the treatment for post-traumatic stress disorder (PTSD). Findings from the study have been presented at the Experimental Biology Meeting in Boston, MA.

Presently, the only therapy approved by the Food and Drug Administration (FDA) for PTSD is selective serotonin reuptake inhibitors (SSRIs) such as sertraline and paroxetine. Study authors have previously shown that SSRIs increase the levels of serotonin (5-HT) and norepinephrine, and that the increased norepinephrine may be a possible reason for the reduced efficacy of SSRI therapy.

For this study, the team examined the ability of blueberries to modulate neurotransmitter levels in a rat model of PTSD. Some of the rats received a 2% blueberry-enriched diet and others received a control diet. A third control group consisted of rats without PTSD that received a standard diet without blueberries. Scientists used high-performance liquid chromatography to measure monoamines and related metabolite levels.

Rats with PTSD that did not receive blueberries showed a predictable increase in 5-HT and norepinephrine levels compared with the control group. But rats with PTSD that received blueberries showed a beneficial increase in 5-HT levels with no impact on norepinephrine levels, which suggests that blueberries can alter neurotransmitter levels in PTSD. More studies are needed to understand the protective effects of blueberries and their potential as a treatment for PTSD.

http://www.empr.com/benefits-of-blueberries-for-post-traumatic-stress-disorder-explored-in-study/article/405810/

Scientists achieve implantation of memory into the brains of mice while they sleep

Sleeping minds: prepare to be hacked. For the first time, conscious memories have been implanted into the minds of mice while they sleep. The same technique could one day be used to alter memories in people who have undergone traumatic events.

When we sleep, our brain replays the day’s activities. The pattern of brain activity exhibited by mice when they explore a new area during the day, for example, will reappear, speeded up, while the animal sleeps. This is thought to be the brain practising an activity – an essential part of learning. People who miss out on sleep do not learn as well as those who get a good night’s rest, and when the replay process is disrupted in mice, so too is their ability to remember what they learned the previous day.

Karim Benchenane and his colleagues at the Industrial Physics and Chemistry Higher Educational Institution in Paris, France, hijacked this process to create new memories in sleeping mice. The team targeted the rodents’ place cells – neurons that fire in response to being in or thinking about a specific place. These cells are thought to help us form internal maps, and their discoverers won a Nobel prize last year.

Benchenane’s team used electrodes to monitor the activity of mice’s place cells as the animals explored an enclosed arena, and in each mouse they identified a cell that fired only in a certain arena location. Later, when the mice were sleeping, the researchers monitored the animals’ brain activity as they replayed the day’s experiences. A computer recognised when the specific place cell fired; each time it did, a separate electrode would stimulate brain areas associated with reward.
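The closed-loop logic described above, watching the spike stream for the chosen place cell during sleep and triggering reward stimulation on each detection, can be sketched in simplified form. Every name here is an illustrative stand-in for the real-time electrophysiology rig, not an actual API:

```python
# Minimal sketch of the closed-loop protocol described above: during sleep,
# each spike from the chosen place cell triggers reward-circuit stimulation.
# The spike stream and stimulate_reward callback are illustrative stand-ins.

def closed_loop_session(spike_stream, target_cell_id, stimulate_reward):
    """Trigger reward stimulation whenever the target place cell fires."""
    stim_count = 0
    for cell_id in spike_stream:          # stream of detected spikes (cell IDs)
        if cell_id == target_cell_id:     # the cell tied to one arena location
            stimulate_reward()            # stimulate reward-associated areas
            stim_count += 1
    return stim_count

# Toy usage: cell 7 is the target; count how often we would have stimulated.
events = [3, 7, 1, 7, 7, 2]
count = closed_loop_session(events, target_cell_id=7,
                            stimulate_reward=lambda: None)
print(count)  # 3 stimulations for this toy spike stream
```

The essential design point is the tight detect-then-stimulate coupling: the reward must follow the place cell's spike quickly enough for the sleeping brain to associate the two.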

When the mice awoke, they made a beeline for the location represented by the place cell that had been linked to a rewarding feeling in their sleep. A brand new memory – linking a place with reward – had been formed.

It is the first time a conscious memory has been created in animals during sleep. In recent years, researchers have been able to form subconscious associations in sleeping minds – smokers keen to quit can learn to associate cigarettes with the smells of rotten eggs and fish in their sleep, for example.

Previous work suggested that if this kind of subconscious learning had occurred in Benchenane’s mice, they would have explored the arena in a random manner, perhaps stopping at the reward-associated location. But these mice headed straight for the location, suggesting a conscious memory. “The mouse develops a goal-directed behaviour to go towards the place,” says Benchenane. “It proves that it’s not an automatic behaviour. What we create is an association between a particular place and a reward that can be consciously accessed by the mouse.”

“The mouse is remembering enough abstract information to think ‘I want to go to a certain place’, and go there when it wakes up,” says neuroscientist Neil Burgess at University College London. “It’s a bigger breakthrough [than previous studies] because it really does show what the man in the street would call a memory – the ability to bring to mind abstract knowledge which can guide behaviour in a directed way.”

Benchenane doesn’t think the technique can be used to implant many other types of memories, such as skills – at least for the time being. Spatial memories are easier to modify because they are among the best understood.

His team’s findings also provide some of the strongest evidence for the way in which place cells work. It is almost impossible to test whether place cells function as an internal map while animals are awake, says Benchenane, because these animals also use external cues, such as landmarks, to navigate. By specifically targeting place cells while the mouse is asleep, the team were able to directly test theories that specific cells represent specific places.

“Even when those place cells fire in sleep, they still convey spatial information,” says Benchenane. “That provides evidence that when you’ve got activation of place cells during the consolidation of memories in sleep, you’ve got consolidation of the spatial information.”

Benchenane hopes that his technique could be developed to help alter people’s memories, perhaps of traumatic events (see “Now it’s our turn”, below).

Loren Frank at the University of California, San Francisco, agrees. “I think this is a really important step towards helping people with memory impairments or depression,” he says. “It is surprising to me how many neurological and psychiatric illnesses have something to do with memory, including schizophrenia and obsessive compulsive disorder.”

“In principle, you could selectively change brain processing during sleep to soften memories or change their emotional content,” he adds.

Journal reference: Nature Neuroscience, doi:10.1038/nn.3970

http://www.newscientist.com/article/dn27115-new-memories-implanted-in-mice-while-they-sleep.html#.VP_L9uOVquD

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

The eternity drive: Why DNA could be the future of data storage

By Peter Shadbolt, for CNN

How long will the data last in your hard-drive or USB stick? Five years? 10 years? Longer?

Already a storage company called Backblaze is running 25,000 hard drives simultaneously to get to the bottom of the question. As each hard drive coughs its last, the company replaces it and logs its lifespan.

While this census has only been running five years, the statistics show a 22% attrition rate over four years.

Some may last longer than a decade, the company says, others may last little more than a year; but the short answer is that storage devices don’t last forever.
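The 22% four-year attrition figure implies an annualized failure rate of roughly 6%, under the simplifying assumption that drives fail independently at a constant rate (real drives follow a "bathtub curve", with more failures early and late in life). A quick back-of-envelope sketch:

```python
# Back-of-envelope: annual failure rate implied by 22% attrition over 4 years,
# assuming a constant, independent failure probability each year (a simplification).
four_year_attrition = 0.22
survival_4yr = 1 - four_year_attrition          # 0.78 of drives survive 4 years
annual_survival = survival_4yr ** (1 / 4)       # constant-rate assumption
annual_failure_rate = 1 - annual_survival
print(f"Implied annual failure rate: {annual_failure_rate:.1%}")  # roughly 6%
```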

Science is now looking to nature, however, to find the best way to store data in a way that will make it last for millions of years.

Researchers at ETH Zurich, in Switzerland, believe the answer may lie in the data storage system that exists in every living cell: DNA.

So compact and complex are its strands that just 1 gram of DNA is theoretically capable of containing all the data of internet giants such as Google and Facebook, with room to spare.

In data storage terms, that gram would be capable of holding 455 exabytes, where one exabyte is equivalent to a billion gigabytes.
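That 455-exabyte figure can be sanity-checked from first principles: a single-stranded nucleotide weighs roughly 330 daltons on average (an approximation, not a figure from the article), and each of the four bases can encode up to 2 bits. A rough sketch:

```python
# Sanity check of the "455 exabytes per gram" figure.
# Assumed: ~330 daltons average nucleotide mass, 2 bits encoded per base.
AVOGADRO = 6.022e23          # molecules per mole
NT_MASS_G_PER_MOL = 330      # approximate grams per mole of nucleotide (assumption)
BITS_PER_NT = 2              # four bases -> log2(4) = 2 bits

nucleotides_per_gram = AVOGADRO / NT_MASS_G_PER_MOL
bytes_per_gram = nucleotides_per_gram * BITS_PER_NT / 8
exabytes_per_gram = bytes_per_gram / 1e18
print(f"~{exabytes_per_gram:.0f} exabytes per gram")   # on the order of 455 EB
```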

Fossilization has been known to preserve DNA in strands long enough to recover an animal’s entire genome — the complete set of genes present in a cell or organism.

So far, scientists have extracted and sequenced the genome of a 110,000-year-old polar bear and more recently a 700,000-year-old horse.

Robert Grass, lecturer at the Department of Chemistry and Applied Biosciences, said the problem with DNA is that it degrades quickly. The project, he said, wanted to find ways of combining the possibility of the large storage density in DNA with the stability of the DNA found in fossils.

“We have found elegant ways of making DNA very stable,” he told CNN. “So we wanted to combine these two stories — to get the high storage density of DNA and combine it with the archaeological aspects of DNA.”

The synthetic process of preserving DNA actually mimics processes found in nature.

As with fossils, keeping the DNA cool, dry and encased — in this case, with microscopic spheres of glass – could keep the information contained in its strands intact for thousands of years.

“The time limit with DNA in fossils is about 700,000 years but people speculate about finding one-million-year storage of genomic material in fossil bones,” he said.

“We were able to show that our DNA, and the information stored in it, decays at the same rate as fossil DNA, so we get to similar time frames of close to a million years.”

Fresh fossil discoveries are throwing up new surprises about the preservation of DNA.

Human bones discovered in the Sima de los Huesos cave network in Spain show maternally inherited “mitochondrial” DNA that is 400,000 years old – a new record for human remains.

The fact that the DNA survived in the relatively cool climate of a cave — rather than in a frozen environment as with the DNA extracted from mammoth remains in Siberia – has added to the mystery about DNA longevity.

“A lot of it is not really known,” Grass says. “What we’re trying to understand is how DNA decays and what the mechanisms are to get more insight into that.”

What is known is that water and oxygen are the enemies of DNA survival. DNA left in a test tube exposed to air will last little more than two to three years. Encasing it in glass — an inert, neutral material — and cooling it increases its chances of survival.

Grass says sol-gel technology, which produces solid materials from small molecules, has made it a relatively easy process to get the glass around the DNA molecules.

While the team’s work invites immediate comparison with Jurassic Park, where DNA was extracted from amber fossils, Grass says that prehistoric insects encased in amber are a poor source of prehistoric DNA.

“The best DNA comes from sources that are ceramic and dry — so teeth, bones and even eggshells,” he said.

So far the team has tested its storage method by preserving just 83 kilobytes of data, encoding two historical documents.

“The first is the Swiss Federal Charter of 1291 — it’s like the Swiss Magna Carta — and the other was the Archimedes Palimpsest, a copy of an ancient Greek mathematics treatise made by a monk in the 10th century but overwritten by other monks in the 15th century.

“We wanted to preserve these documents to show not just that the method works, but that the method is important too,” he said.
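The basic idea of writing bytes into DNA can be illustrated with a minimal mapping of 2 bits per base. This is only a sketch of the principle: the actual ETH scheme layers error-correcting codes on top of the raw mapping to survive the chemical decay discussed above, and none of the function names below come from the project itself.

```python
# Minimal sketch: map bytes to DNA bases at 2 bits per base.
# Illustrative only; a real scheme adds error correction on top.
BASES = "ACGT"  # each of the four bases carries 2 bits

def encode(data: bytes) -> str:
    """Turn each byte into 4 bases, most significant bits first."""
    out = []
    for b in data:
        for shift in (6, 4, 2, 0):
            out.append(BASES[(b >> shift) & 0b11])
    return "".join(out)

def decode(strand: str) -> bytes:
    """Invert encode(): every 4 bases become one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        b = 0
        for base in strand[i:i + 4]:
            b = (b << 2) | BASES.index(base)
        out.append(b)
    return bytes(out)

strand = encode(b"1291")
assert decode(strand) == b"1291"  # round-trips without loss
```

At this density, each byte costs four synthesized bases — which is why, as the article notes next, synthesis cost rather than capacity is the current bottleneck.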

He estimates that the information will be readable in 10,000 years’ time, and if frozen, as long as a million years.

Encoding just 83KB of data cost about $2,000, making it a relatively expensive process, but Grass is optimistic that the price will come down over time. Advances in technology for medical analysis, he said, are likely to help with this.
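Those two figures put a concrete price on the method — a quick division, using only the numbers reported above:

```python
# Rough cost arithmetic from the reported figures:
# $2,000 to encode 83 kilobytes of data.
cost_usd = 2000
kilobytes = 83
cost_per_kb = cost_usd / kilobytes

print(f"${cost_per_kb:.2f} per kilobyte")  # about $24.10
```

At roughly $24 per kilobyte, DNA storage is many orders of magnitude more expensive per byte than magnetic or optical media — the trade being longevity, not price.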

“Already the prices for human genome sequences have dropped from several millions of dollars a few years ago to just hundreds of dollars now,” Grass said.

“It makes sense to integrate these advances in medical and genome analysis into the world of IT.”

http://www.cnn.com/2015/02/25/tech/make-create-innovate-fossil-dna-data-storage/index.html

Risk of American ‘megadroughts’ for decades, NASA warns

There is no precedent in contemporary weather records for the kinds of droughts the country’s West will face, if greenhouse gas emissions stay on course, a NASA study said.

No precedent even in the past 1,000 years.

The feared droughts would cover most of the western half of the United States — the Central Plains and the Southwest.

Those regions have suffered severe drought in recent years. But it doesn’t compare in the slightest to the ‘megadroughts’ likely to hit them before the century is over due to global warming.
These will be epochal, worthy of a chapter in Earth’s natural history.

Even if emissions drop moderately, droughts in those regions will get much worse than they are now, NASA said.

The space agency’s study conjures visions of the sun scorching cracked earth that is baked dry of moisture for feet below the surface, across vast landscapes, for decades. Great lake reservoirs could dwindle to ponds, leaving cities to ration water to residents who haven’t fled east.

“Our projections for what we are seeing is that, with climate change, many of these types of droughts will likely last for 20, 30, even 40 years,” said NASA climate scientist Ben Cook.

That’s worse and longer than the historic Dust Bowl of the 1930s, when “black blizzards” — towering, blustery dust walls — buried Southern Plains homes, buggies and barns in dirt dunes.

It lasted about 10 years. Though long, it was still within the range of natural droughts seen in the modern record.

To find something almost as extreme as what looms, one must go back to Medieval times.

Nestled in the shade of Southwestern mountain rock, earthen Ancestral Pueblo housing offers a foreshadowing. The tight, lively villages emptied out in the 13th century’s Great Drought that lasted more than 30 years.

No water. No crops. Starvation drove populations out to the east and south.

If NASA’s worst case scenario plays out, what’s to come could be worse.

Its computations are based on greenhouse gas emissions continuing on their current course. And they produce an 80% chance of at least one drought that could last for decades.

One “even exceeding the duration of the long term intense ‘megadroughts’ that characterized the really arid time period known as the Medieval Climate Anomaly,” Cook said.

That was a period of heightened global temperatures that lasted from about 1100 to 1300 — when those Ancestral Pueblos dispersed. Global average temperatures are already higher now than they were then, the study said.

The NASA team’s study was heavily data-driven.

It examined past wet and dry periods using tree rings going back 1,000 years and compared them with soil moisture from 17 climate models, NASA said in the study published in Science Advances.

Scientists used supercomputers to run the models forward under human-induced global warming scenarios. The models all showed a much drier planet.
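The “80% chance of at least one drought lasting for decades” is an ensemble statistic: count how many model runs contain a sufficiently long dry spell. The toy sketch below is purely illustrative — it is not NASA’s actual method, and the threshold, function names, and data shapes are all hypothetical — but it shows the kind of calculation that turns many model runs into a single probability.

```python
# Toy illustration (not NASA's method): estimate the chance of at
# least one multidecadal drought across an ensemble of model runs,
# each run being a list of annual soil-moisture anomalies.
def longest_dry_run(soil_moisture, threshold=-0.5):
    """Length of the longest run of consecutive years drier than threshold."""
    longest = current = 0
    for anomaly in soil_moisture:
        current = current + 1 if anomaly < threshold else 0
        longest = max(longest, current)
    return longest

def megadrought_probability(runs, min_years=20):
    """Fraction of runs containing at least one dry spell of min_years+."""
    hits = sum(1 for run in runs if longest_dry_run(run) >= min_years)
    return hits / len(runs)
```

An ensemble in which most runs contain a 20-year-plus dry spell would yield a probability like the 80% figure quoted in the study.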

Some Southwestern areas that are currently drought-stricken are filling up with more people, creating more demand for water while reservoirs are already strained.

The predicted megadroughts will strain water supplies far harder, NASA’s Goddard Space Flight Center said.

“These droughts really represent events that nobody in the history of the United States has ever had to deal with,” Cook said.

Compared with the last millennium, the dryness will be unprecedented. Adapting to it will be tough.

http://www.cnn.com/2015/02/14/us/nasa-study-western-megadrought/index.html

Among New York Subway’s Millions of Riders, a Study Finds Many Mystery Microbes

Have you ever been on the subway and seen something that you did not quite recognize, something mysteriously unidentifiable?

Well, there is a good chance scientists do not know what it is either.

Researchers at Weill Cornell Medical College released a study on Thursday that mapped DNA found in New York’s subway system — a crowded, largely subterranean behemoth that carries 5.5 million riders on an average weekday, and is filled with hundreds of species of bacteria (mostly harmless), the occasional spot of bubonic plague, and a universe of enigmas. Almost half of the DNA found on the system’s surfaces did not match any known organism and just 0.2 percent matched the human genome.

“People don’t look at a subway pole and think, ‘It’s teeming with life,’ ” said Dr. Christopher E. Mason, a geneticist at Weill Cornell Medical College and the lead author of the study. “After this study, they may. But I want them to think of it the same way you’d look at a rain forest, and be almost in awe and wonder, effectively, that there are all these species present — and that you’ve been healthy all along.”

Dr. Mason said the inspiration for the study struck about four years ago when he was dropping off his daughter at day care. He watched her explore her new surroundings by happily popping objects into her mouth. As is the custom among tiny children, friendships were made on the floor, by passing back and forth toys that made their way from one mouth to the next.

“I couldn’t help thinking, ‘How much is being transferred, and on which kinds of things?’ ” Dr. Mason said. So he considered a place where adults can get a little too close to each other, the subway.

Thus was the project, called PathoMap, born. Over the past 17 months, a team mainly composed of medical students, graduate students and volunteers fanned out across the city, using nylon swabs to collect DNA, in triplicate, from surfaces that included wooden benches, stairway handrails, seats, doors, poles and turnstiles.

In addition to the wealth of mystery DNA — which was not unexpected given that only a few thousand of the world’s genomes have been fully mapped — the study’s other findings reflected New York’s famed diversity, both human and microbial.

The Bronx was found to be the most diverse borough in terms of microbial species. Brooklyn claimed second place, followed by Manhattan, Queens and Staten Island, where researchers took samples on the Staten Island Railway.

On the human front, Dr. Mason said that, in some cases, the DNA that was found in some subway stations tended to match the neighborhood’s demographic profile. An area with a high concentration of Hispanic residents near Chinatown in Manhattan, for example, yielded a large amount of Hispanic and Asian genes.

In an area of Brooklyn to the south of Prospect Park that roughly encompassed the Kensington and Windsor Terrace neighborhoods, the DNA gathered frequently read as British, Tuscan, and Finnish, three groups not generally associated with the borough. Dr. Mason had an explanation for the finding: Scientists have not yet compiled a reliable database of Irish genes, so the many people of Irish descent who live in the area could be the source of DNA known to be shared with other European groups.

The study produced some less appetizing news. Live, antibiotic-resistant bacteria were discovered in 27 percent of the collected samples, though among all the bacteria, only 12 percent could be associated with disease.

Researchers also found three samples associated with bubonic plague and two with DNA fragments of anthrax, though they noted that none of those samples showed evidence of being alive, and that neither disease had been diagnosed in New York for some time. The presence of anthrax, Dr. Mason said, “is consistent with the many documented cases of anthrax in livestock in New York State and the East Coast broadly.”

The purpose of the study was not simply to satisfy scientific curiosity, the authors said. By cataloging species now, researchers can compare them against samples taken in the future to determine whether certain diseases, or even substances used as bioterrorism weapons, had spread.

City and transit officials did not sound grateful for the examination.

“As the study clearly indicates, microbes were found at levels that pose absolutely no danger to human life and health,” Kevin Ortiz, a spokesman for the Metropolitan Transportation Authority, said in an email. And the city’s health department called the study “deeply flawed” and misleading.

Dr. Mason responded by saying he and his team had simply presented their complete results.

“For us to not report the fragments of anthrax and plague in the context of a full analysis would have been irresponsible,” he said. “Our findings indicate a normal, healthy microbiome, and we welcome others to review the publicly available data and run the same analysis.”

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.