Posts Tagged ‘DNA’

Methyl chemical groups dot lengths of DNA, helping to control when certain genes are accessible to a cell. In new research, UCLA scientists have shown that at the connections between brain cells—which often are located far from the central control centers of the cells—methyl groups also dot chains of RNA. This methyl markup of RNA molecules is likely key to brain cells’ ability to quickly send signals to other cells and react to changing stimuli in a fraction of a second.

To dictate the biology of any cell, DNA in the cell’s nucleus must be transcribed into corresponding strands of RNA. Next, the messenger RNA, or mRNA—an intermediate genetic molecule between DNA and proteins—is translated into proteins. If a cell suddenly needs more of a protein—to adapt to an incoming signal, for instance—it must transcribe more DNA into mRNA. Then it must make more proteins and shuttle them through the cell to where they are needed. This process means that getting new proteins to a distant part of a cell, like the synapses of neurons where signals are passed, can take time.

Research has recently suggested that methyl chemical groups, which can control when DNA is transcribed into mRNA, are also found on strands of mRNA. The methylation of mRNA, researchers hypothesize, adds a level of control over when the mRNA can be translated into proteins, and such methyl marks have been documented in a handful of organs throughout the bodies of mammals. The pattern of methyls on mRNA in any given cell is dubbed the “epitranscriptome.”

UCLA and Kyoto University researchers mapped out the location of methyls on mRNA found at the synapses, or junctions, of mouse brain cells. They isolated brain cells from adult mice and compared the epitranscriptome found at the synapses to the epitranscriptomes of mRNA elsewhere in the cells. At more than 4,000 spots on the genome, the mRNA at the synapse was methylated more often. More than half of these spots, the researchers went on to show, are in genes that encode proteins found mostly at the synapse. The researchers found that when they disrupted the methylation of mRNA at the synapse, the brain cells didn’t function normally.

The methylation of mRNA at the synapse is likely one of many ways that neurons speed up their ability to send messages, by allowing the mRNA to be poised and ready to translate into proteins when needed.

The levels of key proteins at synapses have been linked to a number of psychiatric disorders, including autism. Understanding how the epitranscriptome is regulated, and what role it plays in brain biology, may eventually provide researchers with a new way to control the proteins found at synapses and, in turn, treat disorders characterized by synaptic dysfunction.

More information: Daria Merkurjev et al. Synaptic N6-methyladenosine (m6A) epitranscriptome reveals functional partitioning of localized transcripts, Nature Neuroscience (2018). DOI: 10.1038/s41593-018-0173-6

Read more at: https://phys.org/news/2018-08-methyl-rna-key-brain-cell.html#jCp

by CAROLYN Y. JOHNSON

In 1959, Soviet scientists embarked on an audacious experiment to breed a population of tame foxes, a strain of animals that wouldn’t be aggressive or fearful of people.

Scientists painstakingly selected the friendliest foxes to start each new generation, and within 10 cycles they began to see differences from wild foxes – fox pups that wagged their tails eagerly at people, or whose ears stayed folded like a dog’s.

This study in animal domestication, known as the Russian farm-fox experiment, might have remained just a fascinating historical footnote – a quirky corner in the otherwise fraught scientific heritage of Soviet Russia.

Instead, it spawned an ongoing area of research into how domestication, based purely on behavioral traits, can result in other changes – like curlier tails and changes to fur color.

Now, the tools of modern biology are revealing the genetic changes that underpin the taming of these Siberian foxes.

In a new study, published Monday in Nature Ecology & Evolution, scientists used genome sequencing to identify 103 stretches of the fox genome that appear to have been changed by breeding, a first pass at identifying the genes that make some foxes comfortable with humans and others wary and aggressive.

The scientists studied the genomes of 10 foxes from three different groups: the tame population, a strain that was bred to be aggressive toward people and a conventional group bred to live on a farm.

Having genetic information from all three groups allowed the researchers to identify regions of the genome that were likely to have changed due to the active selection of animals with different behaviors, rather than natural fluctuation over time.

Those regions offer starting points in efforts to probe the genetic basis and evolution of complex traits, such as sociability or aggressiveness.

“The experiment has been going on for decades and decades, and to finally have the genome information, you get to look and see where in the genome and what in the genome has been likely driving these changes that we’ve seen – it’s a very elegant experimental design,” said Adam Boyko, an associate professor of biomedical sciences at Cornell University, who was not involved in the study.

While some genetic traits are relatively simple to unravel, the underpinnings of social behaviors aren’t easy to dissect. Behavior is influenced by hundreds or thousands of genes, as well as the environment – and typically behaviors fall on a wide spectrum.

The existence of fox populations bred solely for how they interact with people offers a rare opportunity to strip away some of the other complexity – with possible implications for understanding such traits in people and other animals, too, since evolution may work on the same pathways or even the same genes.

“We’re interested to see what are the genes that make such a big difference in behavior. There are not so many animal models which are good to study genetics of social behavior, and in these foxes it’s such a big difference between tame foxes compared to conventional foxes, and those selected for aggressive behavior,” said Anna Kukekova, an assistant professor at the University of Illinois at Urbana-Champaign, who led the work.

Kukekova and colleagues began studying one very large gene, called SorCS1, that they think may be linked to tame behavior. The gene plays a role in sorting proteins that allow brain cells to communicate.

Kukekova is interested in determining what happens if the gene is deleted in a mouse, and in searching for specific mutations that might contribute to differences in behavior.

Bridgett vonHoldt, an assistant professor of ecology and evolutionary biology at Princeton University, said changes that occurred in foxes “overlap extensively with those observed in the transition of gray wolves to modern domestic dogs.”

She said the study may help dog and fox biologists determine if there are complex behavioral traits under the control of just a few genes.

Recent fox evolution in a domesticated population may seem to have little to do with understanding the genetics of human behavior, but domestication has grown as an area of scientific interest in part because genes involved in behavior in one animal may play a similar role in another.

“One reason why it is interesting is it gives us some insights about us. Humans are domesticated themselves, in a way,” Boyko said.

“We’re much more tolerant of being around other humans than probably we were as we were evolving; we’ve had to undergo a transformation, even relatively recently from the agricultural revolution.”

https://www.sciencealert.com/soviet-era-fox-taming-experiment-may-reveal-genes-behind-social-behavior

by PETER DOCKRILL

The appearance of wrinkled, weathered skin and the disappearance of hair are two of the regrettable hallmarks of getting older, but new research suggests these physical manifestations of ageing might not be permanent – and can potentially be reversed.

New experiments with mice show that by treating a mutation-based imbalance in mitochondrial function, animals that looked physically aged regrew hair and lost their wrinkles – restoring them to a healthy, youthful appearance in just weeks.

“To our knowledge, this observation is unprecedented,” says geneticist Keshav Singh from the University of Alabama at Birmingham.

One of the focal points of anti-ageing research is investigating the so-called mitochondrial theory of ageing, which posits that mutations in the DNA of our mitochondria – the ‘powerhouse of the cell’ – contribute over time to defects in these organelles, giving rise to ageing itself, associated chronic diseases, and other human pathologies.

To investigate these mechanisms, Singh and fellow researchers genetically modified mice to have depleted mitochondrial DNA (mtDNA).

They did this by adding the antibiotic doxycycline to the food and drinking water of transgenic mice. This turned on a mutation which causes mitochondrial dysfunction and depletes their healthy levels of mtDNA.

In the space of eight weeks, the previously healthy mice developed numerous physical changes reminiscent of natural ageing: greying and significantly thinning hair, wrinkled skin, along with slowed movements and lethargy.

The depleted mice also showed an increased number of skin cells, contributing to an abnormal thickening of the outer layer of their skin, in addition to dysfunctional hair follicles and an imbalance between enzymes and their inhibitors that would normally keep collagen fibres intact and skin free of wrinkles.

But once the doxycycline was no longer fed to the animals, and their mitochondria could get back to doing what they do best, the mice regained their healthy, youthful appearance within just four weeks.

Effectively, they reverted to the animals they were before their mitochondrial DNA content was tampered with – which could mean mitochondria are reversible regulators of skin ageing and hair loss.

“It suggests that epigenetic mechanisms underlying mitochondria-to-nucleus cross-talk must play an important role in the restoration of normal skin and hair phenotype,” says Singh.

“Further experiments are required to determine whether phenotypic changes in other organs can also be reversed to wildtype level by restoration of mitochondrial DNA.”

Even though the mitochondrial depletion affected the entire animal, for the most part the induced mutation did not seem to greatly affect other organs – suggesting hair and skin tissue are most susceptible to the depletion.

But it could also mean the discovery here isn’t the fountain of youth for slowing or reversing the wider physiological causes of ageing – only its more surface, cosmetic symptoms. And at least some in the scientific community aren’t persuaded yet.

“While this is a clever proof of principle, I am not convinced of the clinical relevance of this,” biologist Lindsay Wu, from the Laboratory for Ageing Research at the University of New South Wales, who was not involved in the study, told ScienceAlert.

“The rate of mitochondrial DNA mutations here is many orders of magnitude higher than the rate of mitochondrial DNA mutations observed during normal ageing.”

“I would be really keen to see what happens when they turn down the rate of mutations to a lower level more relevant to normal ageing,” Wu added.

In that vein – with further research, and assuming these effects can be replicated outside the bodies of mice, which isn’t yet known – it’s possible this could turn out to be a major discovery in the field.

For their part, at least, the researchers are convinced mtDNA mutations can teach us a lot more about how the clocks in our bodies might be stopped (or wound back to another time entirely).

“This mouse model should provide an unprecedented opportunity for the development of preventative and therapeutic drug development strategies to augment the mitochondrial functions for the treatment of ageing-associated skin and hair pathology,” the authors write in their paper, “and other human diseases in which mitochondrial dysfunction plays a significant role.”

The findings are reported in Cell Death and Disease.

https://www.sciencealert.com/unprecedented-dna-discovery-actually-reverses-wrinkles-and-hair-loss-mitochondria-mutation-mtdna

Scientists have revealed a new link between alcohol, heart health and our genes.

The researchers investigated faulty versions of a gene called titin, which are carried by one in 100 people – around 600,000 people in the UK.

Titin is crucial for maintaining the elasticity of the heart muscle, and faulty versions are linked to a type of heart failure called dilated cardiomyopathy.

Now new research suggests that in some patients who carry the faulty gene, it may interact with alcohol to accelerate heart failure, even if they drink only moderate amounts of alcohol.

The research was carried out by scientists from Imperial College London, Royal Brompton Hospital, and MRC London Institute of Medical Sciences, and published this week in the latest edition of the Journal of the American College of Cardiology.

The study was supported by the Department of Health and Social Care and the Wellcome Trust through the Health Innovation Challenge Fund.

In the first part of the study, the team analysed 141 patients with a type of heart failure called alcoholic cardiomyopathy (ACM). This condition is triggered by drinking more than 70 units a week (roughly seven bottles of wine) for five years or more. In severe cases the condition can be fatal, or leave patients requiring a heart transplant.

The team found that the faulty titin gene may also play a role in the condition. In the study, 13.5 per cent of patients were found to carry the mutation – much higher than the proportion of people who carry it in the general population.

These results suggest this condition is not simply the result of alcohol poisoning, but arises from a genetic predisposition – and that other family members may be at risk too, explained Dr James Ware, study author from the National Heart and Lung Institute at Imperial.

“Our research strongly suggests alcohol and genetics are interacting – and genetic predisposition and alcohol consumption can act together to lead to heart failure. At the moment this condition is assumed to be simply due to too much alcohol. But this research suggests these patients should also be checked for a genetic cause – by asking about a family history and considering testing for a faulty titin gene, as well as other genes linked to heart failure,” he said.

He added that relatives of patients with ACM should receive assessment and heart scans – and in some cases have genetic tests – to see if they unknowingly carry the faulty gene.

In a second part of the study, the researchers investigated whether alcohol may play a role in another type of heart failure called dilated cardiomyopathy (DCM). This condition causes the heart muscle to become stretched and thin, and has a number of causes including viral infections and certain medications. The condition can also be genetic, and around 12 per cent of cases of DCM are thought to be linked to a faulty titin gene.

In the study the team asked 716 patients with dilated cardiomyopathy how much alcohol they consumed.

None of the patients consumed the high levels of alcohol needed to cause ACM. But the team found that in patients whose DCM was caused by the faulty titin gene, even moderately increased alcohol intake (defined as drinking above the weekly recommended limit of 14 units) affected the heart’s pumping power.

Compared to DCM patients who didn’t consume excess alcohol (and whose condition wasn’t caused by the faulty titin gene), excess alcohol was linked to a 30 per cent reduction in heart output.

More research is now needed to investigate how alcohol may affect people who carry the faulty titin gene, but do not have heart problems, added Dr Paul Barton, study co-author from the National Heart and Lung Institute at Imperial:

“Alcohol and the heart have a complicated relationship. While moderate levels may have benefits for heart health, too much can cause serious cardiac problems. This research suggests that in people with titin-related heart failure, alcohol may worsen the condition.

“An important wider question is also raised by the study: do mutations in titin predispose people to heart failure when exposed to other things that stress the heart, such as cancer drugs or certain viral infections? This is something we are actively seeking to address.”

The research was supported by the Department of Health and Social Care and Wellcome Trust through the Health Innovation Challenge Fund, the Medical Research Council, the NIHR Cardiovascular Biomedical Research Unit at Royal Brompton & Harefield NHS Foundation Trust and the British Heart Foundation.

Reference: Ware, J. S., Amor-Salamanca, A., Tayal, U., Govind, R., Serrano, I., Salazar-Mendiguchía, J., … Garcia-Pavia, P. (2018). Genetic Etiology for Alcohol-Induced Cardiac Toxicity. Journal of the American College of Cardiology, 71(20), 2293–2302. https://doi.org/10.1016/j.jacc.2018.03.462

https://www.technologynetworks.com/genomics/news/faulty-gene-leads-to-alcohol-induced-heart-failure-304365

Prof Neil Gemmell, the New Zealand scientist leading a project to survey DNA in the waters of Loch Ness, said he did not believe in Nessie, but was confident of finding genetic codes for other creatures.

He said a “biological explanation” might be found to explain some of the stories about the Loch Ness Monster.

The team will collect tiny fragments of skin and scales for two weeks in June.

Prof Gemmell, from the University of Otago in Dunedin, said: “I don’t believe in the idea of a monster, but I’m open to the idea that there are things yet to be discovered and not fully understood.

“Maybe there’s a biological explanation for some of the stories.”

The University of the Highlands and Islands’ UHI Rivers and Lochs Institute in Inverness is assisting in the project.

Other organisms

After the research team’s trip to Loch Ness, the samples will be sent to laboratories in New Zealand, Australia, Denmark and France to be analysed against a genetic database.

Prof Gemmell said: “There’s absolutely no doubt that we will find new stuff. And that’s very exciting.

“While the prospect of looking for evidence of the Loch Ness monster is the hook to this project, there is an extraordinary amount of new knowledge that we will gain from the work about organisms that inhabit Loch Ness – the UK’s largest freshwater body.”

The scientist said the team expected to find sequences of DNA from plants, fish and other organisms.

He said it would be possible to identify these plants and animals by comparing the sequences of their DNA against sequences held on a large, international database.
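As a toy illustration of that matching step, the sketch below (in Python) scores a DNA fragment against a handful of reference sequences by counting shared k-mers. Real environmental-DNA surveys use dedicated alignment tools such as BLAST and curated reference databases; the species and sequences here are invented purely for illustration.

# Toy k-mer matching: identify which reference a DNA fragment most resembles.
# Real eDNA pipelines use alignment tools (e.g. BLAST) against databases such
# as GenBank; these reference sequences are made up for illustration only.

def kmers(seq: str, k: int = 5) -> set[str]:
    """All overlapping substrings of length k in the sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_match(fragment: str, references: dict[str, str]) -> str:
    """Return the reference species sharing the most k-mers with the fragment."""
    frag = kmers(fragment)
    return max(references, key=lambda sp: len(frag & kmers(references[sp])))

references = {
    "Atlantic salmon": "ATGGCCTTAGGCTTACCGGTA",   # invented sequence
    "European eel":    "ATGCGTTACCGGAATTCGATC",   # invented sequence
}
print(best_match("GGCTTACCGGT", references))      # -> Atlantic salmon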

Prof Gemmell added: “There is this idea that an ancient Jurassic Age reptile might be in Loch Ness.

“If we find any reptilian DNA sequences in Loch Ness, that would be surprising and would be very, very interesting.”

The Loch Ness Monster is one of Scotland’s oldest and most enduring myths. It inspires books, TV shows and films, and sustains a major tourism industry around its home.

The story of the monster can be traced back 1,500 years, to when the Irish missionary St Columba is said to have encountered a beast in the River Ness in 565 AD.

Later, in the 1930s, The Inverness Courier reported the first modern sighting of Nessie.

Whale-like creature

In 1933, the newspaper’s Fort Augustus correspondent, Alec Campbell, reported a sighting by Aldie Mackay of what she believed to be Nessie.

Mr Campbell’s report described a whale-like creature and the loch’s water “cascading and churning”.

The editor at the time, Evan Barron, suggested the beast be described as a “monster”, kick-starting the modern myth of the Loch Ness Monster.

Over the years various efforts have tried and failed to find the beast.

In tourism terms, there are two exhibitions dedicated to the monster, and there is scarcely a tourist shop in the Highlands – or indeed across Scotland – where a cuddly Nessie toy cannot be found.

In 2016, the inaugural Inverness Loch Ness International Knitting Festival exhibited knitted Nessies from all parts of the world.

‘Record high’

In popular culture, the Loch Ness Monster has reared its head many times, including in 1975’s four-part Doctor Who – Terror of the Zygons, the 1980s cartoon The Family-Ness as well as The Simpsons and 1996’s Loch Ness starring Ted Danson.

In 2014, it was reported that for the first time in almost 90 years no “confirmed sightings” had been made of the Loch Ness Monster.

Gary Campbell, who keeps a register of sightings, said no-one had come forward in 18 months to say they had seen the monster.

But last year, sightings hit a record high.

http://www.bbc.com/news/uk-scotland-highlands-islands-44223259

In the age of big data, we are quickly producing far more digital information than we can possibly store. Last year, $20 billion was spent on new data centers in the US alone, doubling the capital expenditure on data center infrastructure from 2016. And even with skyrocketing investment in data storage, corporations and the public sector are falling behind.

But there’s hope.

With a nascent technology leveraging DNA for data storage, this may soon become a problem of the past. By encoding bits of data into tiny molecules of DNA, researchers and companies like Microsoft hope to fit entire data centers in a few flasks of DNA by the end of the decade.

But let’s back up.

Backdrop

Over the past several decades, we graduated from magnetic tape, floppy disks, and CDs to sophisticated semiconductor memory chips capable of holding data in countless tiny transistors. In keeping with Moore’s Law, we’ve seen an exponential increase in the storage capacity of silicon chips. At the same time, however, the rate at which humanity produces new digital information is exploding. The size of the global datasphere is increasing exponentially, predicted to reach 160 zettabytes (160 trillion gigabytes) by 2025. As of 2016, digital users produced over 44 billion gigabytes of data per day. By 2025, the International Data Corporation (IDC) estimates this figure will surpass 460 billion. And with private sector efforts to improve global connectivity—such as OneWeb and Google’s Project Loon—we’re about to see an influx of data from five billion new minds.

By 2020, three billion new minds are predicted to join the web. With private sector efforts, this number could reach five billion. While companies and services are profiting enormously from this influx, it’s extremely costly to build data centers at the rate needed. At present, about $50 million worth of new data center construction is required just to keep up, not to mention millions in furnishings, equipment, power, and cooling. Moreover, memory-grade silicon is rarely found pure in nature, and researchers predict it will run out by 2040.

Take DNA, on the other hand. At its theoretical limit, we could fit 215 million gigabytes of data in a single gram of DNA.

But how?

Crash Course

DNA is a double helix whose two strands are built from four nucleotide bases—adenine (A), thymine (T), cytosine (C), and guanine (G). Once formed, these chains fold tightly into extremely dense, space-saving data stores. To encode data files into these bases, we can use various algorithms that convert binary into nucleotides—0s and 1s into A, T, C, and G. “00” might be encoded as A, “01” as G, “10” as C, and “11” as T, for instance. Once encoded, the information is stored by synthesizing DNA with those specific base patterns, and the final sequences are kept in vials with an extraordinary shelf-life. To retrieve the data, the encoded DNA can be read using any number of sequencing technologies, such as Oxford Nanopore’s portable MinION.
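To make the two-bit mapping concrete, here is a minimal sketch in Python of the scheme described above. It is an illustration only: real DNA storage pipelines add error correction and avoid problematic base patterns, which this toy version ignores.

# Minimal two-bit encoding: "00" -> A, "01" -> G, "10" -> C, "11" -> T.
BITS_TO_BASE = {"00": "A", "01": "G", "10": "C", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Convert raw bytes into a nucleotide string (4 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(sequence: str) -> bytes:
    """Convert a nucleotide string back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hello")
print(strand)                      # GCCAGCGGGCTAGCTAGCTT
assert decode(strand) == b"hello"  # round trip recovers the original bytes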

Still in its deceptive growth phase, DNA data storage—or NAM (nucleic acid memory)—is only beginning to approach the knee of its exponential growth curve. But while the process remains costly and slow, several players are beginning to crack its greatest challenge: retrieval. Just as you might click on a specific file or filter by a search term on your desktop, random access across large data stores has become a top priority for scientists at Microsoft Research and the University of Washington.

Storing over 400 megabytes of DNA-encoded data, the University of Washington’s DNA storage system now offers random access across all of it with no bit errors.
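As a rough software analogy for how such random access can work, the sketch below tags each encoded chunk with a short address prefix, standing in for the primer sites that let a specific file be pulled out of a pooled sample. The names and lengths here are hypothetical; this is not the actual Microsoft/University of Washington protocol.

# Illustrative addressing scheme: each strand = fixed-length address + payload.
ADDRESS_LEN = 8  # number of bases reserved for the address prefix (arbitrary)

def tag(address: str, payload: str) -> str:
    """Prepend a fixed-length address to an encoded payload strand."""
    assert len(address) == ADDRESS_LEN
    return address + payload

def random_access(pool: list[str], address: str) -> list[str]:
    """Return the payloads of all strands in the pool carrying this address."""
    return [s[ADDRESS_LEN:] for s in pool if s.startswith(address)]

pool = [
    tag("AAGGTTCC", "GCCAGCGG"),   # chunk belonging to file 1
    tag("CCTTAAGG", "GCTAGCTT"),   # chunk belonging to file 2
]
print(random_access(pool, "AAGGTTCC"))   # -> ['GCCAGCGG']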

Applications

Even before we guarantee random access for data retrieval, DNA data storage has immediate market applications. According to IDC’s Data Age 2025 study (Figure 5), a huge proportion of enterprise data goes straight to an archive. Over time, the majority of stored data becomes only potentially critical, making it less of a target for immediate retrieval.

Particularly for storing past legal documents, medical records, and other archive data, why waste precious computing power, infrastructure, and overhead?

Data-encoded DNA can last 10,000 years—guaranteed—in cold, dark, and dry conditions at a fraction of the storage cost.

Now that we can easily use natural enzymes to replicate DNA, companies have tons to gain (literally) by using DNA as a backup system—duplicating files for later retrieval and risk mitigation.

And as retrieval algorithms and biochemical technologies improve, random access across data-encoded DNA may become as easy as clicking a file on your desktop.

As you scroll, researchers are already investigating the potential of molecular computing, completely devoid of silicon and electronics.

Harvard professor George Church and his lab, for instance, envision capturing data directly in DNA. As Church has stated, “I’m interested in making biological cameras that don’t have any electronic or mechanical components,” whereby information “goes straight into DNA.” According to Church, DNA recorders would capture audiovisual data automatically. “You could paint it up on walls, and if anything interesting happens, just scrape a little bit off and read it—it’s not that far off.” One day, we may even be able to record biological events in the body. In pursuit of this end, Church’s lab is working to develop an in vivo DNA recorder of neural activity, skipping electrodes entirely.

Perhaps the most ultra-compact, long-lasting, and universal storage mechanism at our fingertips, DNA offers us unprecedented applications in data storage—perhaps even computing.

Potential

As the cost of DNA data storage plummets and its speed rises, commercial user interfaces will become both critical and wildly profitable. Once corporations, startups, and individuals alike can easily save files, images, or even neural activity to DNA, opportunities for disruption abound. Imagine uploading files to the cloud and having them travel to encrypted DNA vials, as opposed to massive and inefficient silicon-enabled data centers. Corporations could run their own warehouses, and local data networks could allow for heightened cybersecurity—particularly for archives.

And since DNA lasts millennia without maintenance, forget the need to copy databases and power digital archives. As long as we’re human, regardless of technological advances and changes, DNA will always be relevant and readable for generations to come.

But perhaps the most exciting potential of DNA is its portability. If we were to send a single exabyte of data (one billion gigabytes) to Mars using silicon binary media, it would take five Falcon Heavy rockets and cost $486 million in freight alone.

With DNA, we would need five cubic centimeters.
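A quick back-of-the-envelope check, assuming the roughly 215 million gigabytes per gram theoretical limit quoted earlier and a DNA density on the order of 1 g per cubic centimetre (both assumptions for illustration, not figures from the article), lands in the same ballpark:

# Rough order-of-magnitude check on the "five cubic centimeters" claim.
EXABYTE_GB = 1_000_000_000              # one exabyte, in gigabytes
CAPACITY_GB_PER_GRAM = 215_000_000      # ~215 PB per gram (theoretical limit)
DENSITY_G_PER_CM3 = 1.0                 # assumed approximate density of DNA

grams = EXABYTE_GB / CAPACITY_GB_PER_GRAM
volume_cm3 = grams / DENSITY_G_PER_CM3
print(f"~{grams:.1f} g, ~{volume_cm3:.1f} cm^3")   # ~4.7 g, ~4.7 cm^3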

At scale, DNA has the true potential to dematerialize entire space colonies worth of data. Throughout evolution, DNA has unlocked extraordinary possibilities—from humans to bacteria. Soon hosting limitless data in almost zero space, it may one day unlock many more.

https://singularityhub.com/2018/04/26/the-answer-to-the-digital-data-tsunami-is-literally-in-our-dna/

by Michelle Z. Donahue

A baby girl who lived some 11,500 years ago survived for just six weeks in the harsh climate of central Alaska, but her brief life is providing a surprising and challenging wealth of information to modern researchers.

Her genome is the oldest complete genetic profile yet of a New World human. And if that isn’t enough, her genes also reveal the existence of a previously unknown population of people who are related to—but older and genetically distinct from—modern Native Americans.

This new information helps sketch in more details about how, when, and where the ancestors of all Native Americans became a distinct group, and how they may have dispersed into and throughout the New World.

The baby’s DNA showed that she belonged to a population that was genetically separate from other native groups present elsewhere in the New World at the end of the Pleistocene. Ben Potter, the University of Alaska Fairbanks archaeologist who unearthed the remains at the Upward Sun River site in 2013, named this new group “Ancient Beringians.”

The discovery of the baby’s bones, named Xach’itee’aanenh T’eede Gaay, or Sunrise Child-Girl in a local Athabascan language, was completely unexpected, as were the genetic results, Potter says.

Found in 2006 and accessible only by helicopter, the Upward Sun River site is located in the dense boreal forest of central Alaska’s Tanana River Valley. The encampment was buried under feet of sand and silt, an acidic environment that makes the survival of organic artifacts exceedingly rare. Potter previously excavated the cremated remains of a three-year-old child from a hearth pit in the encampment, and it was beneath this first burial that the six-week-old baby and a second, even younger infant were found.

A genomics team in Denmark, including University of Copenhagen geneticist Eske Willerslev, performed the sequencing work on the remains, comparing the child’s genome with the genes of 167 ancient and contemporary populations from around the world. The results appeared today in the journal Nature.

“We didn’t know this population even existed,” Potter says. “Now we know they were here for many thousands of years, and that they were really successful. How did they do it? How did they change? We now have examples of two genetic groups of people who were adapting to this very harsh landscape.”

The genetic analysis points towards a divergence of all ancient Native Americans from a single east Asian source population somewhere between 36,000 and 25,000 years ago—well before humans crossed into Beringia, an area that includes the land bridge connecting Siberia and Alaska at the end of the last ice age. That means that somewhere along the way, either in eastern Asia or in Beringia itself, a group of people became isolated from other east Asians for about 10,000 years, long enough to become a unique strain of humanity.

The girl’s genome also shows that the Beringians became genetically distinct from all other Native Americans around 20,000 years ago. But since humans in North America are not reliably documented before 14,600 years ago, how and where these two groups could have been separated long enough to become genetically distinct is still unclear.

The new study posits two new possibilities for how the separation could have happened.

The first is that the two groups became isolated while still in east Asia, and that they crossed the land bridge separately—perhaps at different times, or using different routes.

A second theory is that a single group moved out of Asia, then split into Beringians and ancient Native Americans once in Beringia. The Beringians lingered in the west and interior of Alaska, while the ancestors of modern Native Americans continued on south some time around 15,700 years ago.

“It’s less like a tree branching out and more like a delta of streams and rivers that intersect and then move apart,” says Miguel Vilar, lead scientist for National Geographic’s Genographic Project. “Twenty years ago, we thought the peopling of America seemed quite simple, but then it turns out to be more complicated than anyone thought.”

John Hoffecker, who studies the paleoecology of Beringia at the University of Colorado-Boulder, says there is still plenty of room for debate about the geographic locations of the ancestral splits. But the new study fits well with where the thinking has been heading for the last decade, he adds.

“We think there was a great deal more diversity in the original Native American populations than is apparent today, so this is consistent with a lot of other evidence,” Hoffecker says.

However, that same diversity—revealed through research on Native American cranial morphology and tooth structure—creates its own dilemma. How does a relatively small group of New World migrants, barricaded by a challenging climate with no access to fresh genetic material, evolve such a deep bank of differences from their east Asian ancestors? It certainly doesn’t happen over just 15,000 years, Hoffecker insists, referring to the estimated date of divergence of ancient Native Americans from Beringians.

“We’ve been getting these signals of early divergence for decades—the first mitochondrial work in the 1990s from Native Americans were coming up with estimates of 30, 35, even 40,000 years ago,” Hoffecker says. “They were being dismissed by everybody, myself included. Then people began to suspect there were two dates: one for divergence, and one for dispersal, and this study supports that.”

“Knowing about the Beringians really informs us as to how complex the process of human migration and adaptation was,” adds Potter. “It prompts the scientist in all of us to ask better questions, and to be in awe of our capacity as a species to come into such a harsh area and be very successful.”

https://news.nationalgeographic.com/2018/01/alaska-dna-ancient-beringia-genome/