Yale researchers have devised a way to peer into the brains of two people simultaneously while they are engaged in discussion. What they found will not surprise anyone who has argued about politics or social issues.
When two people agree, their brains exhibit a calm synchronicity of activity focused on sensory areas of the brain. When they disagree, however, many other regions of the brain involved in higher cognitive functions become mobilized as each individual combats the other’s argument, a Yale-led research team reports Jan. 13 in the journal Frontiers in Human Neuroscience.
“Our entire brain is a social processing network,” said senior author Joy Hirsch, the Elizabeth Mears and House Jameson Professor of Psychiatry and professor of comparative medicine and neuroscience. “However, it just takes a lot more brain real estate to disagree than to agree.”
For the study, researchers from Yale and University College London recruited 38 adults who were asked to say whether they agreed or disagreed with a series of statements such as “same-sex marriage is a civil right” or “marijuana should be legalized.” After matching up pairs based on their responses, the researchers used an imaging technology called functional near-infrared spectroscopy to record brain activity while the pairs engaged in face-to-face discussions.
When the people were in agreement, brain activity was harmonious and tended to be concentrated on sensory areas of the brain such as the visual system, presumably in response to social cues from their partner. However, during disputes these areas of the brain were less active. Meanwhile, activity increased in the brain’s frontal lobes, home of higher order executive functions.
“There is a synchronicity between the brains when we agree,” Hirsch said. “But when we disagree, the neural coupling disconnects.”
Understanding how our brains function while disagreeing or agreeing is particularly important in a polarized political environment, Hirsch noted.
In discord, she said, two brains engage many emotional and cognitive resources “like a symphony orchestra playing different music.” In agreement, there “is less cognitive engagement and more social interaction between brains of the talkers, similar to a musical duet.”
The lead investigator of the paper is Alex Salama-Manteau, a former graduate student of economics at Yale and now a data scientist at Airbnb. Mark Tiede, a research scientist at Haskins Laboratories at Yale, is second author of the paper.
A major mystery in Alzheimer’s disease research is why some brain cells succumb to the creeping pathology of the disease years before symptoms first appear, while others seem impervious to the degeneration surrounding them until the disease’s final stages.
Now, in a study published January 10, 2021 in Nature Neuroscience, a team of molecular biologists and neuropathologists from the UC San Francisco Weill Institute for Neurosciences has joined forces to identify, for the first time, the neurons that are among the first victims of the disease, accumulating toxic “tangles” and dying off earlier than neighboring cells.
“We know which neurons are first to die in other neurodegenerative diseases like Parkinson’s disease and ALS, but not Alzheimer’s,” said co-senior author Martin Kampmann, Ph.D., an associate professor in the UCSF Institute for Neurodegenerative Diseases and a Chan Zuckerberg Biohub investigator. “If we understood why these neurons are so vulnerable, maybe we could identify interventions that could make them, and the brain as a whole, more resilient to the disease.”
Alzheimer’s researchers have long studied why certain cells are more prone to producing the toxic tangles of the protein known as tau, whose spread through the brain drives widespread cell death and the resulting progressive memory loss, dementia, and other symptoms. But researchers have not looked closely at whether all cells are equally vulnerable to the toxic effects of these protein accumulations.
“The belief in the field has been that once these trash proteins are there, it’s always ‘game over’ for the cell, but our lab has been finding that that is not the case,” said Lea Grinberg, MD, the study’s other senior author, an associate professor and John Douglas French Alzheimer’s Foundation Endowed Professor in the UCSF Memory and Aging Center. “Some cells end up with high levels of tau tangles well into the progression of the disease, but for some reason don’t die. It has become a pressing question for us to understand the specific factors that make some cells selectively vulnerable to Alzheimer’s pathology, while other cells appear able to resist it for years, if not decades.”
To identify selectively vulnerable neurons, the researchers studied brain tissue from people who had died at different stages of Alzheimer’s disease, obtained from the UCSF Neurodegenerative Disease Brain Bank and the Brazilian BioBank for Aging Studies, a unique resource co-founded by Grinberg. The São Paulo-based biobank collects tissue samples from a broad population of deceased individuals, including many without a neurological diagnosis whose brains nevertheless show signs of very early-stage neurodegenerative disease, which is otherwise very difficult to study in humans.
First, led by Kampmann lab MD/Ph.D. student Kun Leng and Ph.D. student Emmi Li, the study’s co-first authors, the team studied tissue from 10 donor brains using a technique called single-nucleus RNA sequencing, which let them group neurons based on patterns of gene activity. In a brain region called the entorhinal cortex, one of the first areas attacked by Alzheimer’s, the researchers identified a particular subset of neurons that began to disappear very early on in the disease. Later on in the course of the disease, the researchers found, a similar group of neurons was also the first to die off when degeneration reached the brain’s superior frontal gyrus.
In both regions, these vulnerable cells were distinguished by their expression of a protein called RORB. This allowed researchers in Grinberg’s neuropathology lab, led by former lab manager Rana Eser, to examine RORB-expressing neurons in more detail in brain tissue from a larger cohort of 26 donors. They used histological staining techniques to examine the fate of cells from both healthy individuals and those with early and late stage Alzheimer’s. This work validated that RORB-expressing neurons do in fact die off early on in the disease and also accumulate tau tangles earlier than neighboring, non-RORB-expressing neurons.
“These findings support the view that tau buildup is a critical driver of neurodegeneration, but we also know from other data from the Grinberg lab that not every cell that builds up these aggregates is equally susceptible,” said Leng, who plans to continue studying factors underlying RORB neurons’ selective vulnerability using CRISPR-based technology the Kampmann lab has developed.
It’s not clear whether RORB itself causes the cells’ selective vulnerability, the researchers said, but the protein provides a valuable new molecular “handle” for future studies to understand what makes these cells succumb to Alzheimer’s pathology, and how their vulnerability could potentially be reversed.
“Our discovery of a molecular identifier for these selectively vulnerable cells gives us the opportunity to study in detail exactly why they succumb to tau pathology, and what could be done to make them more resilient,” Leng said. “This would be a totally new and much more targeted approach to developing therapies to slow or prevent the spread of Alzheimer’s disease.”
Despite decades of rigorous research, scientists are still struggling to crack the mystery of Alzheimer’s disease. Promising preclinical research has consistently led to frustrating clinical trial failures, and some have started to question whether we are even targeting the correct pathological mechanisms.
One possible explanation for why we can’t crack the Alzheimer’s code is that we are mistakenly considering the disease as a single homogeneous entity. Currently, Alzheimer’s disease (AD) is really only separated into two types, early-onset Alzheimer’s and late-onset Alzheimer’s, depending on the stage of life at which a person begins displaying symptoms.
A robust 2018 study investigated the cognitive and genomic characteristics of several thousand patients diagnosed with late-onset Alzheimer’s and concluded Alzheimer’s should be considered six distinctly different conditions instead of one single disease.
This new study arose from a similar foundation, trying to understand why the disease manifests with such a variety of clinical symptoms from patient to patient. One third of patients displaying clinical characteristics of Alzheimer’s, for example, show no toxic accumulation of amyloid proteins in their brain. And the opposite is also seen, with postmortem brain tissue examinations revealing comprehensive pathological signs of Alzheimer’s despite no indication of cognitive decline during the person’s life.
“Such differences strongly suggest there are subtypes of AD with different biological and molecular factors driving disease progression,” says lead author on the new study, Bin Zhang.
The new research set out to understand the specific molecular characteristics of different Alzheimer’s cases. Using RNA sequencing, the researchers analyzed over 1,500 brain tissue samples spanning five different brain regions.
Three major molecular subtypes of Alzheimer’s were identified based on factors including synaptic signaling, immune activity, mitochondrial organization, myelination and specific gene activity. Only one third of the cases studied displayed “typical” Alzheimer’s hallmarks, such as decreased synaptic signaling and increased immune response. This subtype was dubbed “class C.”
Importantly, the study suggests the other two identified subtypes (classes A and B) showed unique and distinct characteristics. In some instances the subtypes displayed opposite gene regulation, leading the researchers to suggest their findings could help explain previous clinical trial failures.
“This may partially explain how many existing clinical trials that showed promising efficacy in one particular mouse model later do not align with human trial results, assuming that study participants had consisted of a heterogeneous group of participants across many AD subtypes,” the researchers write in the study.
The challenge moving forward will be to find ways to detect these disease subtypes easily in living patients. The comprehensive brain tissue analysis in the study cannot translate into a diagnostic tool, so more work is needed to find biomarkers that correspond with these three subtypes.
“These findings lay down a foundation for determining more effective biomarkers for early prediction of AD, studying causal mechanisms of AD, developing next-generation therapeutics for AD and designing more effective and targeted clinical trials, ultimately leading to precision medicine for AD,” explains Zhang. “The remaining challenges for future research include replication of the findings in larger cohorts, validation of subtype specific targets and mechanisms, identification of peripheral biomarkers and clinical features associated with these molecular subtypes.”
Typically characterized as poisonous, corrosive and smelling of rotten eggs, hydrogen sulfide’s reputation may soon get a facelift. In experiments in mice, researchers have shown the foul-smelling gas may help protect aging brain cells against Alzheimer’s disease. The discovery of the biochemical reactions that make this possible opens doors to the development of new drugs to combat neurodegenerative disease.
The study was led by Johns Hopkins Medicine, working with the University of Exeter. The findings are reported in the Proceedings of the National Academy of Sciences.
“Our new data firmly link aging, neurodegeneration and cell signaling using hydrogen sulfide and other gaseous molecules within the cell,” says Bindu Paul, M.Sc., Ph.D., Faculty Research Instructor in neuroscience in the Solomon H. Snyder Department of Neuroscience at the Johns Hopkins University School of Medicine and lead corresponding author on the study.
The human body naturally creates small amounts of hydrogen sulfide to help regulate functions across the body from cell metabolism to dilating blood vessels. The rapidly burgeoning field of gasotransmission shows that gases are major cellular messenger molecules, with particular importance in the brain. However, unlike conventional neurotransmitters, gases can’t be stored in vesicles. Thus, gases act through very different mechanisms to rapidly facilitate cellular messaging. In the case of hydrogen sulfide, this entails the modification of target proteins by a process called chemical sulfhydration, which modulates their activity, says Solomon Snyder, D.Phil., D.Sc., M.D., professor of neuroscience at the Johns Hopkins University School of Medicine and co-corresponding author on the study.
Previous studies using a new method have shown that sulfhydration levels in the brain decrease with age, a trend that is amplified in patients with Alzheimer’s disease. “Here, using the same method, we now confirm a decrease in sulfhydration in the AD brain,” says collaborator Milos Filipovic, Ph.D., Principal Investigator, Leibniz-Institut für Analytische Wissenschaften—ISAS.
For the current research, the Johns Hopkins Medicine scientists studied mice genetically engineered to mimic human Alzheimer’s disease. They injected the mice with a hydrogen sulfide-carrying compound, called NaGYY, developed by their collaborators at the University of Exeter, that slowly releases the passenger hydrogen sulfide molecules while traveling throughout the body. The researchers then tested the mice for changes in memory and motor function over a 12-week period.
Behavioral tests on the mice showed that hydrogen sulfide improved cognitive and motor function by 50 percent compared with mice that did not receive the injections of NaGYY. Treated mice were able to better remember the locations of platform exits and appeared more physically active than their untreated counterparts with simulated Alzheimer’s disease.
“Up until recently, researchers lacked the pharmacological tools to mimic how the body slowly makes tiny quantities of H2S inside cells. The compound used in this study does just that and shows that by correcting brain levels of H2S, we could successfully reverse some aspects of Alzheimer’s disease,” says collaborator on the study Matt Whiteman, Ph.D., Professor of Experimental Therapeutics at the University of Exeter Medical School.
The results showed that the behavioral outcomes of Alzheimer’s disease could be reversed by introducing hydrogen sulfide, but the researchers wanted to investigate how the brain chemically reacted to the gaseous molecule.
A series of biochemical experiments revealed a change to a common enzyme called glycogen synthase kinase 3β (GSK3β). In the presence of healthy levels of hydrogen sulfide, GSK3β typically acts as a signaling molecule, adding chemical markers to other proteins and altering their function. However, the researchers observed that in the absence of hydrogen sulfide, GSK3β is over-attracted to another protein in the brain, called Tau.
When GSK3β interacts with Tau, Tau changes into a form that tangles and clumps inside nerve cells. As Tau clumps grow, the tangled proteins block communication between nerves, eventually causing them to die. This leads to the deterioration and eventual loss of cognition, memory and motor function that is characteristic of Alzheimer’s disease.
“Understanding the cascade of events is important to designing therapies that can block this interaction like natural hydrogen sulfide is able to do,” says Daniel Giovinazzo, M.D./Ph.D. student, the first author of the study.
The Johns Hopkins Medicine team and their international collaborators plan to continue studying how sulfur groups interact with GSK3β and other proteins involved in the pathogenesis of Alzheimer’s disease in other cell and organ systems. The team also plans to test novel hydrogen sulfide delivery molecules as part of their ongoing venture.
More information: Daniel Giovinazzo et al. Hydrogen sulfide is neuroprotective in Alzheimer’s disease by sulfhydrating GSK3β and inhibiting Tau hyperphosphorylation, Proceedings of the National Academy of Sciences (2021). DOI: 10.1073/pnas.2017225118
While it is known that sleep is critical for healthy brain function, it is not known when animals started to require sleep. A surprising new study from Kyushu University suggests that animals needed sleep before they even had brains.
The investigation focused on hydras – tiny freshwater organisms that lack a central nervous system. The researchers discovered that hydras not only show signs of a sleep-like state despite having no brain, but also respond to molecules associated with sleep in more evolved animals.
“We now have strong evidence that animals must have acquired the need to sleep before acquiring a brain,” said study lead author Professor Taichi Q. Itoh.
Three years ago, scientists at Caltech were the first to document sleeping behavior in a brainless animal, the Cassiopea jellyfish, a relative of hydras that is also known as the upside-down jellyfish.
In collaboration with experts at the Ulsan National Institute of Science and Technology in Korea, the Kyushu team found that several chemicals which cause drowsiness even in humans had similar effects on the species Hydra vulgaris.
“Based on our findings and previous reports regarding jellyfish, we can say that sleep evolution is independent of brain evolution,” said Professor Itoh.
“Many questions still remain regarding how sleep emerged in animals, but hydras provide an easy-to-handle creature for further investigating the detailed mechanisms producing sleep in brainless animals to help possibly one day answer these questions.”
To monitor resting behavior among hydras, the researchers used a video system to track their movement and determine when they were in a sleep-like state.
The hydras displayed four-hour cycles of active and sleep-like states. At the molecular and genetic level, the researchers uncovered many aspects of sleep regulation that were similar to those of animals that do possess a brain.
Melatonin moderately increased the amount and frequency of sleep among hydras, while the inhibitory neurotransmitter GABA greatly increased sleeping activity.
When the hydras were exposed to dopamine, they also slept more. In humans and other animals, dopamine typically causes arousal. “While some sleep mechanisms appear to have been conserved, others may have switched function during evolution of the brain,” explained Professor Itoh.
The researchers used vibrations and temperature changes to disturb the hydras’ sleep and induce signs of sleep deprivation. This caused the hydras to sleep longer during the following day and even suppressed cell proliferation.
Upon further analysis, the experts found that sleep deprivation led to changes in the expression of 212 genes. One of these genes in particular is related to PRKG, a protein involved in sleep regulation in a wide range of animals.
“Taken all together, these experiments provide strong evidence that animals acquired sleep-related mechanisms before the evolutional development of the central nervous system and that many of these mechanisms were conserved as brains evolved,” said Professor Itoh.
In extremely rare instances, newborns can contract cancer from their pregnant moms during delivery, a new case report suggests.
Two boys, a 23-month-old and a 6-year-old, developed lung cancers that proved an exact genetic match to the cervical cancers present in their mothers at the time of birth, Japanese researchers report.
It appears that the boys breathed in cancer cells from their mothers’ tumors while they were being born, cancer experts say.
“In our cases, we think that tumors arose from mother-to-infant vaginal transmission through aspiration of tumor-contaminated vaginal fluids during birth,” said lead researcher Dr. Ayumu Arakawa, a pediatric oncologist with the National Cancer Center Hospital in Tokyo.
Transmission of cancer from a mom to her offspring is a very rare event, occurring in only 1 infant for every 500,000 mothers with cancer, researchers said in background notes. By comparison, about 1 in every 1,000 live births involves a mother with cancer.
The small number of previously observed cases typically have involved cancer cells traveling across the placenta and into the still-developing fetus, researchers said. Leukemia, lymphoma and melanoma are the most common cancers that children contract through suspected transplacental transmission.
These are the first cases in which newborns appear to have contracted lung cancer by breathing in cancer cells from cervical tumors, cancer experts said.
“I found it fascinating, personally. I didn’t know this was possible,” said Debbie Saslow, senior director of HPV-related and women’s cancers at the American Cancer Society.
Most cervical cancers are caused by human papillomavirus (HPV), a virus against which there is an effective vaccine. Cases like this will become even rarer as more boys and girls are vaccinated against HPV, Saslow said.
“I think it’s interesting that this study was from Japan, where they’ve had a lot of backlash against the HPV vaccine and they saw vaccination rates plummet because of unfounded concerns,” Saslow said. “I also know Japan has had particularly low cervical screening rates.”
Doctors discovered cancer in both lungs of the 23-month-old boy after his family took him to the hospital for a cough that had gone on for two weeks. His mother had received a diagnosis of cervical cancer three months after the infant’s birth.
The 6-year-old boy went to a local hospital with chest pain on his left side, and a CT scan revealed a 6-centimeter mass in his left lung. His mother had a cervical tumor that was thought to be benign at the time of delivery; she died from cervical cancer two years after his birth.
“Neither mother was known to have a cervical cancer. The first patient had a negative Pap smear, and the second had a cervical mass, but it was thought to be benign. I don’t know that the obstetricians would have done anything differently based on the information they had,” said Dr. Shannon Neville Westin, a gynecologic oncologist with the University of Texas MD Anderson Cancer Center.
In both cases, doctors used genetic testing to positively link the mothers’ cervical cancers to the lung cancers in their sons.
“If we hadn’t been able to test the tumors from the mother and the infant, you never would have known those were truly related,” Westin said. “Because they determined that they were, they were able to direct therapy in a way that was very successful for the infants.”
Both boys still are alive following successful cancer treatment, the Japanese researchers said. The findings were reported Jan. 7 in the New England Journal of Medicine.
Arakawa and his colleagues suggest that pregnant women with cervical cancers consider having a C-section, to avoid the risk of passing cancer to their newborn.
“Mother-to-infant transmission of tumor may be a risk of vaginal delivery among women with cervical cancers,” Arakawa said. “Cesarean section should be recommended for mothers with uterine cervical cancer.”
Westin and Saslow both disagree, arguing that too little is known to immediately change recommendations around this specific and rare situation.
“If we are diligent about testing more of these infants with cancer, we may be able to move forward and change practice and say every patient with cervical cancer should have a cesarean section,” said Westin, an expert with the American Society of Clinical Oncology. “We just need to gather up that data to be able to change the practice.”
“Only about 1% to 3% of all women with cervical cancer are pregnant or postpartum at the time of diagnosis. The incidence of cervical cancer ranges, but it’s around 12,000 a year,” Westin said. “You’re really selecting out such a tiny group of patients to even begin with.”
Researchers have successfully used a DNA-editing technique to extend the lifespan of mice with the genetic variation associated with progeria, a rare genetic disease that causes extreme premature aging in children and can significantly shorten their life expectancy. The study was published in the journal Nature, and was a collaboration between the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health; Broad Institute of Harvard and MIT, Boston; and the Vanderbilt University Medical Center, Nashville, Tennessee.
DNA is made up of four chemical bases—A, C, G and T. Progeria, which is also known as Hutchinson-Gilford progeria syndrome, is caused by a mutation in the nuclear lamin A (LMNA) gene in which one DNA base C is changed to a T. This change increases the production of the toxic protein progerin, which causes the rapid aging process.
Approximately 1 in 4 million children is diagnosed with progeria within the first two years of life, and virtually all of these children develop health issues in childhood and adolescence that are normally associated with old age, including cardiovascular disease (heart attacks and strokes), hair loss, skeletal problems, subcutaneous fat loss and hardened skin.
For this study, researchers used a breakthrough DNA-editing technique called base editing, which substitutes a single DNA letter for another without damaging the DNA, to study how changing this mutation might affect progeria-like symptoms in mice.
“The toll of this devastating illness on affected children and their families cannot be overstated,” said Francis S. Collins, M.D., Ph.D., a senior investigator in NHGRI’s Medical Genomics and Metabolic Genetics Branch, NIH director and a corresponding author on the paper. “The fact that a single specific mutation causes the disease in nearly all affected children made us realize that we might have tools to fix the root cause. These tools could only be developed thanks to long-term investments in basic genomics research.”
David Liu, Ph.D., and his lab at the Broad Institute developed the base-editing method in 2016, funded in part by NHGRI.
“CRISPR editing, while revolutionary, cannot yet make precise DNA changes in many kinds of cells,” said Dr. Liu, a senior author on the paper. “The base-editing technique we’ve developed is like a find-and-replace function in a word processor. It is extremely efficient in converting one base pair to another, which we believed would be powerful in treating a disease like progeria.”
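Dr. Liu’s find-and-replace analogy can be made concrete with a toy string operation. This is purely an illustration of the analogy, not of the actual chemistry; the sequence fragment, position, and function name below are hypothetical.

```python
# Illustrative analogy only: base editing pictured as "find and replace"
# at one known position in a sequence. This is NOT a model of the real
# molecular machinery; the fragment and index below are entirely made up.

def correct_point_mutation(seq: str, pos: int, expected: str, replacement: str) -> str:
    """Swap the single base at `pos`, first checking it matches the variant."""
    if seq[pos] != expected:
        raise ValueError(f"position {pos} holds {seq[pos]!r}, not {expected!r}")
    return seq[:pos] + replacement + seq[pos + 1:]

# Progeria arises from a C-to-T change, so the "edit" reverts the T to a C.
mutant = "GGCTTAGGAT"  # hypothetical fragment with the variant T at index 4
corrected = correct_point_mutation(mutant, 4, "T", "C")
print(corrected)  # GGCTCAGGAT
```

The key point the analogy captures is precision: exactly one letter changes, and only if the expected variant is actually present at that position.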
To test the effectiveness of their base-editing method, the team initially collaborated with the Progeria Research Foundation to obtain connective tissue cells from progeria patients. The team used the base editor on the LMNA gene within the patients’ cells in a laboratory setting. The treatment fixed the mutation in 90% of the cells.
“The Progeria Research Foundation was thrilled to collaborate on this seminal study with Dr. Collins’s group at the NIH and Dr. Liu’s group at Broad Institute,” said Leslie Gordon, M.D., Ph.D., a co-author and medical director of The Progeria Research Foundation, which partially funded the study. “These study results present an exciting new pathway for investigation into new treatments and the cure for children with progeria.”
Following this success, the researchers tested the gene-editing technique by delivering a single intravenous injection of the DNA-editing mix into nearly a dozen mice with the progeria-causing mutation soon after birth. The gene editor successfully restored the normal DNA sequence of the LMNA gene in a significant percentage of cells in various organs, including the heart and aorta.
Many of the mice’s cell types still maintained the corrected DNA sequence six months after the treatment. In the aorta, the results were even better than expected, as the edited cells appeared to have replaced those that carried the progeria mutation and dropped out from early deterioration. Most dramatically, the treated mice’s lifespan increased from seven months to almost 1.5 years. The average normal lifespan of the mice used in the study is two years.
“As a physician-scientist, it’s incredibly exciting to think that an idea you’ve been working on in the laboratory might actually have therapeutic benefit,” said Jonathan D. Brown, M.D., assistant professor of medicine in the Division of Cardiovascular Medicine at Vanderbilt University Medical Center. “Ultimately our goal will be to try to develop this for humans, but there are additional key questions that we need to first address in these model systems.”
Two prevailing theories exist about the origin of domesticated dogs. One proposes that prehistoric humans used early dogs as hunting partners, and the other says that wolves were attracted to our garbage piles. New research suggests both theories are wrong and that the real reason has to do with our limited capacity to digest protein.
Dogs were domesticated from wild wolves during the last ice age between 14,000 and 29,000 years ago, and they were the first animals to be domesticated by humans. That humans and wolves should form a collaborative relationship is an odd result, given that both species are pack hunters who often target the same prey.
“The domestication of dogs has increased the success of both species to the point that dogs are now the most numerous carnivore on the planet,” wrote the authors of a new study published today in Scientific Reports. “How this mutually beneficial relationship emerged, and specifically how the potentially fierce competition between these two carnivores was ameliorated, needs to be explained.”
Indeed, given this context, it’s not immediately obvious why humans would want to keep wolves around. Moreover, the two prevailing theories about the origin of dogs—either as partners used for hunting or as self-domesticated animals attracted to our garbage—aren’t very convincing. Wolves, even when tamed, would’ve made for awful hunting partners, as they lacked the collaborative and advanced communication skills found in domesticated dogs. And sure, wild wolves were probably attracted to human scraps, but this would’ve required some unlikely interactions between humans and wolves.
“In our opinion, the self-domestication in this way is not fully explained,” Maria Lahtinen, a chemist and archaeologist at the Finnish Food Authority in Finland and the first author of the new study, said in an email. “Hunter-gatherers do not necessarily leave waste in the same place over and over again. And why would they tolerate a dangerous carnivore group in their close surroundings? Humans tend to kill their competitors and other carnivores.”
Lahtinen and her colleagues say there’s a more likely reason for the domestication of dogs, and it has to do with an abundance of protein during the harsh ice age winters, which subsequently reduced competition between the two species. This in turn allowed humans and incipient dogs to live in symbiotic harmony, paving the way for the ongoing evolution of both species.
The researchers have “introduced a really interesting hypothesis that seeks to address the long-debated mechanism by which early dog domestication occurred,” James Cole, an archaeologist at the University of Brighton who’s not involved with the new study, wrote in an email. “The idea is that human populations and wolves could have lived alongside each other during the harsh climatic conditions [of the last ice age] because human populations would have produced enough protein, through hunting activities, to keep both populations fed during the harsh winter months.”
Seems hard to believe, but humans likely had more food during ice age winters than they could handle. This is due to our inability to subsist exclusively on lean protein for months at a time—something wolves have no issues with. For humans, excessive consumption of protein can lead to hyperinsulinemia (excess insulin in the blood), hyperammonemia (excess ammonia in the blood), diarrhea, and in some extreme cases even death, according to the authors. To overcome this biological limitation, Pleistocene hunters adapted their diets during the winter months, targeting animal parts rich in fat, grease, and oils, such as lower limbs, organs, and the brain. And in fact, “there is evidence for such processing behavior during the Upper Palaeolithic,” according to the paper.
Consequently, wolves and humans were able to “share their game without competition in cold environments,” said Lahtinen. This in turn made it possible for humans to keep wolves as pets.
“Therefore, in the short term over the critical winter months, wolves and humans would not have been in competition over resources and may have mutually benefited from each other’s companionship,” wrote the authors. “This would have been critical in keeping the first proto-dogs for years and generations.”
It’s very possible, said Lahtinen, that the earliest dogs were wolf pups. Hunter-gatherers, she said, “do take pets in most cultures, and humans tend to find young animals cute,” so it would “not be a surprise if this would have happened.”
So dogs exist because wolf pups were cute and we had plenty of leftovers? Seems a legit theory, if you ask me.
Only later, owing to traits introduced through artificial selection, were dogs used for hunting, guarding, pulling sleds, and so on, according to the researchers. This theory may help explain the complexity of early dog domestication, which appears to have occurred multiple times across Eurasia, with dogs continuing to interbreed with wild wolves. It may also explain why dog domestication appears to have begun in arctic and subarctic regions.
As for the summer months, food competition wasn’t as crucial an issue for humans, given the relative abundance of alternatives. During the critical winter months, however, “hunter-gatherers tend to give up their pets if there is a need to give up resources from humans,” said Lahtinen.
Importantly, Lahtinen and her colleagues did not pull this theory from thin air. To reach this conclusion, the team performed energy content calculations to estimate the amount of energy that would be left over from prey animals also hunted by wolves, such as deer, moose, and horses. The authors reasoned that, if humans and wolves were having to compete for these resources, there would be little to no cooperation between the two species. But their calculations showed that, aside from animals like weasels, all animals preyed upon by humans would have provided more lean protein than required.
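The arithmetic behind this kind of energy-content comparison can be sketched roughly as follows. Note that the protein ceiling and the per-species protein fractions below are illustrative placeholders, not the actual figures from the Lahtinen paper:

```python
# Illustrative sketch of the excess-protein argument (hypothetical numbers,
# not the paper's data).
# Assumption: humans can safely draw only a limited share of their calories
# from protein, while wolves face no such ceiling.

PROTEIN_CEILING = 0.35  # hypothetical max fraction of human energy from protein

# Hypothetical carcass compositions: fraction of total energy from lean protein.
prey = {
    "horse": 0.55,
    "moose": 0.50,
    "weasel": 0.30,  # protein share below the human ceiling -> no surplus
}

def protein_surplus(protein_fraction, ceiling=PROTEIN_CEILING):
    """Fraction of a carcass's energy that is protein humans cannot use."""
    return max(0.0, protein_fraction - ceiling)

for animal, fraction in prey.items():
    print(f"{animal}: {protein_surplus(fraction):.0%} of carcass energy is surplus protein")
```

Any prey species with a positive surplus would, on this logic, leave protein on the table for wolves without costing humans anything, dissolving the competitive niche between the two species.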
“Therefore, the early domesticated wolves could have survived living alongside human populations by consuming the excess protein from hunting that humans could not,” explained Cole. “By having enough food for both populations, the competitive niche between the species is eliminated, thereby paving the way to domestication and the benefits of such a relationship to the two species.”
Cole described it as a “really intriguing hypothesis” because it provides a “mechanism that can explain the domestication of the wolf across a wide geographic and temporal range,” and it does so by “explaining how two carnivorous species could overcome the competition…under harsh climatic conditions.” Looking ahead, Cole said a similar approach would be useful for studying the interactions of humans and other species on this planet over time.
As a relevant aside, Cole is the author of a fascinating Scientific Reports paper published in 2017 arguing that ancient humans didn’t turn to cannibalism for nutrition. Using an approach similar to the one taken in the Lahtinen paper, Cole showed that human flesh simply doesn’t pack the same amount of calories as wild animals, and cannibalism wouldn’t have been worth all the trouble.
“We bring a skill set that can fill a real need – sleeping shelters that can serve as a transitional stop. Tiny houses aren’t legal in Vancouver but a shed less than 100 square feet and under 15 feet high is.”
Davidson has been pitching the idea for a while, but got fed up with the city talking about the problem with no action, and says he just decided “to get on with it.” He built the prototype himself with donations from his usual suppliers, his own resources, and donations from the public via GoFundMe.
The 8-foot by 12-foot-6-inch units are built out of structural insulated panels (SIPs) and include a Zehnder heat recovery ventilator to control moisture buildup and deliver fresh air. The units could be built on-site or prefabricated and delivered in two pieces on a conventional flatbed trailer. They cost about $11,700 (C$15,000) to build.
The one complaint I had with the design was the inclusion of lofts, which I consider to be dangerous and often uncomfortable. Davidson noted that because of the 15-foot height limit, the loft was a very inexpensive expansion of space that could be used for storage or other uses that did not necessarily involve climbing ladders in the middle of the night.
The accommodation here is pretty minimal, but as Davidson notes, it is meant to be transitional. The brilliance of the idea is that because it meets the definition of a shed, it’s legal; and because it is ephemeral, with no foundations, it can be picked up and moved on short notice. That’s critical if you are going to install a community without a massive NIMBY battle.
Years ago I was involved in a proposal to build temporary housing for homeless people on the Toronto waterfront, with a very similar solution. After months of work, my partner and I sat around a big table at city hall where the head of every department laid out their reasons why it couldn’t be done, whether it was health or safety or plumbing or, the final nail in the coffin, the fact that the site was on a flood plain. But in the interim, the problem has only gotten worse, exacerbated by the Covid-19 crisis.
Bryn Davidson has proposed a solution that addresses many of the problems and complications faced when tackling homelessness. Because of the unit sizes, it dances around building code and zoning issues. It can accommodate a lot of people on a small site. And unlike a tent, it is warm, dry, and secure.
From the looks of it, Bryn and his son enjoyed their night in it too. Help him finish the project by contributing through GoFundMe; I just did.
Children with self-control are more likely to grow up to be healthier adults with younger brains and bodies, according to a new study from Duke University. The researchers tracked 1,000 individuals from birth to age 45 and found that people who had higher levels of self-control as children were aging more slowly than their peers.
Self-control is the ability to control one’s own emotions and behaviors, even when faced with difficult situations. Interviews with the study participants indicated that those in the higher self-control group were better equipped to handle health, financial, and social challenges later in life.
Furthermore, individuals who had more self-control in childhood expressed more positive views of aging and felt more satisfied with life in middle age.
“Our population is growing older, and living longer with age-related diseases,” said study first author Professor Leah Richmond-Rakerd. “It’s important to identify ways to help individuals prepare successfully for later-life challenges, and live more years free of disability. We found that self-control in early life may help set people up for healthy aging.”
The researchers emphasized that childhood is not destiny, and that some study participants had shifted their self-control levels as adults and had better health outcomes than their childhood assessments would have predicted.
Self-control can be taught, and the experts propose that a societal investment in such training could improve life span and quality of life, not only in childhood, but also perhaps in midlife. A growing collection of evidence suggests that changing behaviors in midlife, such as quitting smoking or taking up exercise, can lead to improved outcomes.
“Everyone fears an old age that’s sickly, poor, and lonely, so aging well requires us to get prepared, physically, financially, and socially,” said study co-author Professor Terrie Moffitt. “We found people who have used self-control since childhood are far more prepared for aging than their same-age peers.”
The Dunedin Multidisciplinary Health and Development Study, based in New Zealand, has tracked the individuals since they were born in 1972 and 1973. The participants were assessed using a variety of psychological and health assessments at regular intervals, most recently at age 45.
Childhood self-control was assessed by teachers, parents, and the children themselves from the ages of three to 11. The experts measured factors such as impulsive aggression, over-activity, perseverance, and inattention.
From ages 26 to 45, the participants were measured for physiological signs of aging in several organ systems, including the brain. Across all of these measures, higher self-control in childhood was associated with slower aging. In addition, the people with the highest self-control were found to walk faster and have younger-looking faces at age 45.