Surprising new way to treat obesity

A UNSW-led team researching a drug to avoid insulin resistance was greeted with an unexpected result that could have implications for the nation’s rising rates of obesity and associated disease.

A novel drug is being touted as a major step forward in the battle against Australia’s escalating rates of obesity and associated metabolic diseases.

Two in three adults in Australia are overweight or obese. A long-term collaboration between researchers at the Centenary Institute and UNSW Sydney has led to the creation of a drug which targets an enzyme linked to insulin resistance – a key contributor to metabolic diseases such as type 2 diabetes.

The project has been a collaboration between the Centenary Institute’s Associate Professor Anthony Don, UNSW’s Metabolic research group and its leader Associate Professor Nigel Turner, and UNSW Professor Jonathan Morris’ synthetic chemistry group. Together, they set out to create a drug that targeted enzymes within the Ceramide Synthase family, which produce lipid molecules believed to promote insulin resistance in skeletal muscle, as well as liver and fat tissue.

The study has been published in the highly regarded scientific journal Nature Communications. Surprisingly, although the drug was very effective at reducing the lipids of interest in skeletal muscle, it did not prevent mice (which had been fed a high-fat diet to induce metabolic disease) from developing insulin resistance. Instead, it prevented the mice from depositing and storing fat by increasing their ability to burn fat in skeletal muscle.

“We anticipated that targeting this enzyme would have insulin-sensitising, rather than anti-obesity, effects. However, since obesity is a strong risk factor for many different diseases including cardiovascular disease and cancer, any new therapy in this space could have widespread benefits,” says UNSW Associate Professor Nigel Turner.

While the study produced some unexpected results, it’s the first time scientists have been able to develop a drug that successfully targets a specific Ceramide Synthase enzyme in metabolic disease, making it a significant advancement in the understanding and prevention of a range of chronic health conditions.

“From here, I would like to develop drugs which target both the Ceramide Synthase 1 and 6 enzymes together, and see whether it produces a much stronger anti-obesity and insulin sensitising response. Although these drugs need more work before they are suitable for use in the clinic, our work so far has been a very important step in that direction,” says Centenary Institute’s Associate Professor Anthony Don.

https://newsroom.unsw.edu.au/news/health/surprise-result-researchers-targeting-high-rates-obesity

Gut Bacteria Hold the Key to Creating Universal Blood

In January, raging storms caused medical emergencies along the U.S. East Coast, prompting the Red Cross to issue an urgent call for blood donations. The nation’s blood supply was especially in need of O-type blood that can be universally administered in an emergency. Now, scientists say they have identified enzymes — from the human gut — that can turn type A and B blood into O, as much as 30 times more efficiently than previously studied enzymes.

The researchers will present their results today at the 256th National Meeting & Exposition of the American Chemical Society (ACS). ACS, the world’s largest scientific society, is holding the meeting here through Thursday. It features more than 10,000 presentations on a wide range of science topics.

A brand-new video on the research is available at http://bit.ly/acsblood.

“We have been particularly interested in enzymes that allow us to remove the A or B antigens from red blood cells,” Stephen Withers, Ph.D., says. “If you can remove those antigens, which are just simple sugars, then you can convert A or B to O blood.” He says scientists have pursued the idea of adjusting donated blood to a common type for a while, but they have yet to find efficient, selective enzymes that are also safe and economical.

To assess potential enzyme candidates more quickly, Withers collaborated with a colleague at his institution, the University of British Columbia (UBC), who uses metagenomics to study ecology. “With metagenomics, you take all of the organisms from an environment and extract the sum total DNA of those organisms all mixed up together,” Withers explains. Casting such a wide net allows Withers’ team to sample the genes of millions of microorganisms without the need for individual cultures. The researchers then use E. coli to select for DNA containing genes that code for enzymes that can cleave sugar residues. So instead of using metagenomics as a means of learning about microbial ecology, Withers uses it to discover new biocatalysts. “This is a way of getting that genetic information out of the environment and into the laboratory setting and then screening for the activity we are interested in,” he says.

Withers’ team considered sampling DNA from mosquitoes and leeches, the types of organisms that degrade blood, but ultimately found successful candidate enzymes in the human gut microbiome. Glycosylated proteins called mucins line the gut wall, providing sugars that serve as attachment points for gut bacteria while also feeding them as they assist in digestion. Some of the mucin sugars are similar in structure to the antigens on A- and B-type blood. The researchers homed in on the enzymes the bacteria use to pluck the sugars off mucin and found a new family of enzymes that are 30 times more effective at removing red blood cell antigens than previously reported candidates.

Withers is now working with colleagues at the Centre for Blood Research at UBC to validate these enzymes and test them on a larger scale for potential clinical testing. In addition, he plans to carry out directed evolution, a protein engineering technique that simulates natural evolution, with the goal of creating the most efficient sugar-removing enzyme.

“I am optimistic that we have a very interesting candidate to adjust donated blood to a common type,” Withers says. “Of course, it will have to go through lots of clinical trials to make sure that it doesn’t have any adverse consequences, but it is looking very promising.”

The researchers acknowledge support and funding from the Canadian Institutes of Health Research.

https://www.acs.org/content/acs/en/pressroom/newsreleases/2018/august/gut-bacteria-provide-key-to-making-universal-blood-video.html

Creationists and conspiracy theorists share the same core process of teleological thinking

It’s not uncommon to hear someone espouse the idea that “everything happens for a reason” or that something that happened was “meant to be.” Now, researchers reporting in Current Biology on August 20 have found that this kind of teleological thinking is linked to two seemingly unrelated beliefs: creationism, the belief that life on Earth was purposely created by a supernatural agent, and conspiracism, the tendency to explain historical or current events in terms of secret conspiracies or conspiracy theories.

“We find a previously unnoticed common thread between believing in creationism and believing in conspiracy theories,” says Sebastian Dieguez of the University of Fribourg. “Although very different at first glance, both these belief systems are associated with a single and powerful cognitive bias named teleological thinking, which entails the perception of final causes and overriding purpose in naturally occurring events and entities.”

A teleological thinker, for example, will accept as true propositions such as “the sun rises in order to give us light” or “the purpose of bees is to ensure pollination,” he says. “This type of thinking is anathema to scientific reasoning, and especially to evolutionary theory, and was famously mocked by Voltaire, whose character Pangloss believed that ‘noses were made to wear spectacles.’ Yet it is very resilient in human cognition, and we show that it is linked not only to creationism, but also to conspiracism.”

In previous work, Dieguez and colleagues showed that conspiracism wasn’t explained by the tendency to assume that “nothing happens by accident.” They realized that conspiracism isn’t driven by a rejection of the idea that the world is random and complex, but that it still could be linked to the notion that events in the world are actively and purposely fabricated. They also noticed that this looked “strikingly similar” to creationism. If correct, they reasoned, then conspiracism, like creationism, should be associated with teleological thinking, and both types of beliefs should be correlated with each other.

To find out whether this was the case, the researchers asked more than 150 college students in Switzerland to complete a questionnaire including teleological claims and conspiracist statements, as well as measures of analytical thinking, esoteric and magical beliefs, and a randomness perception task. The survey data showed that the tendency to ascribe function and meaning to natural facts and events was significantly, though modestly, correlated with conspiracist belief scales. Drawing on a large-scale survey of people in France, the researchers also found a strong association between creationism and conspiracism.

To look more closely at this pattern, the researchers next recruited more than 700 people to complete questionnaires online. Those data again confirmed associations among teleological thinking, creationism, and conspiracism. The data also show that those relationships are partly distinct from other variables, including gender, age, analytical thinking, political orientation, education, and agency detection.

“By drawing attention to the analogy between creationism and conspiracism, we hope to highlight one of the major flaws of conspiracy theories and therefore help people detect it, namely that they rely on teleological reasoning by ascribing a final cause and overriding purpose to world events,” Dieguez says. “We think the message that conspiracism is a type of creationism that deals with the social world can help clarify some of the most baffling features of our so-called ‘post-truth era.'”

The researchers say the findings have important implications for science educators and communicators. They may also help in formulating policies to “discourage the endorsement of socially debilitating and sometimes dangerous beliefs and belief systems.”

The researchers are now in the process of assessing the effectiveness of ongoing attempts to educate kids and adolescents about the nature of conspiracy theories and other types of misinformation. They say what’s ultimately needed is a thorough understanding of the factors that contribute to a conspiracist mindset, which is relevant to many beliefs, including global warming denialism and vaccine rejection, and they are developing a general framework to help disentangle the relevant factors.

The findings may help to explain how certain types of misinformation spread so easily aided by social media channels. “It’s possible that content framed in teleological terms are easier to process and spread faster than other types of information, and this could be tested on a much larger scale,” Dieguez says.

https://medicalxpress.com/news/2018-08-core-error-underlies-belief-creationism.html

Cleveland Clinic Researchers Discover Novel Subtype of Multiple Sclerosis


Reprinted from The Lancet Neurology, http://dx.doi.org/10.1016/S1474-4422(18)30245-X, Trapp et al, Cortical neuronal densities and cerebral white matter demyelination in multiple sclerosis: a retrospective study, Copyright (2018), with permission from Elsevier


Bruce Trapp, Ph.D., chair of Cleveland Clinic’s Lerner Research Institute Department of Neurosciences

Cleveland Clinic researchers have discovered a new subtype of multiple sclerosis (MS), providing a better understanding of the individualized nature of the disease.

MS has long been characterized as a disease of the brain’s white matter, where immune cells destroy myelin – the fatty protective covering on nerve cells. The destruction of myelin (called demyelination) was believed to be responsible for nerve cell (neuron) death that leads to irreversible disability in patients with MS.

However, in the new findings, a research team led by Bruce Trapp, Ph.D., identified for the first time a subtype of the disease that features neuronal loss but no demyelination of the brain’s white matter. The findings, published in Lancet Neurology, could potentially lead to more personalized diagnosis and treatments.

The team’s findings support the concept that neurodegeneration and demyelination can occur independently in MS and underscore the need for more sensitive MRI techniques for evaluating brain pathology in real time and monitoring treatment response in patients with the disease. The new subtype of MS, called myelocortical MS (MCMS), was indistinguishable from traditional MS on MRI: the researchers observed that in MCMS, parts of neurons become swollen and, on scans, resemble the typical MS lesions that indicate white matter myelin loss. The disease could therefore be diagnosed only in post-mortem tissue.

“This study opens up a new arena in MS research. It is the first to provide pathological evidence that neuronal degeneration can occur without white matter myelin loss in the brains of patients with the disease,” said Trapp, chair of Cleveland Clinic’s Lerner Research Institute Department of Neurosciences. “This information highlights the need for combination therapies to stop disability progression in MS.”

In the study of brain tissue from 100 MS patients who donated their brains after death, the researchers observed that 12 brains did not have white matter demyelination. They compared microscopic tissue characteristics from the brains and spinal cords of these 12 MCMS patients, 12 traditional MS patients, and individuals without neurological disease. Although both MCMS and traditional MS patients had typical MS lesions in the spinal cord and cerebral cortex, only the traditional MS group had lesions in the brain’s white matter.

Despite having no typical MS lesions in the white matter, MCMS brains did have reduced neuronal density and cortical thickness, which are hallmarks of brain degeneration also observed in traditional MS. Contrary to previous belief, these observations show that neuronal loss can occur independently of white matter demyelination.

“The importance of this research is two-fold. The identification of this new MS subtype highlights the need to develop more sensitive strategies for properly diagnosing and understanding the pathology of MCMS,” said Daniel Ontaneda, M.D., clinical director of the brain donation program at Cleveland Clinic’s Mellen Center for Treatment and Research in MS. “We are hopeful these findings will lead to new tailored treatment strategies for patients living with different forms of MS.”

Dr. Trapp is internationally known for his work on mechanisms of neurodegeneration and repair in MS and has published more than 240 peer-reviewed articles and 40 book chapters. He also holds the Morris R. and Ruth V. Graham Endowed Chair in Biomedical Research. In 2017 he received the prestigious Outstanding Investigator award by the National Institute of Neurological Disorders and Stroke to examine the biology of MS and to seek treatments that could slow or reverse the disease.

New Research Suggests It’s all About the Bass

When we listen to music, we often tap our feet or bob our head along to the beat – but why do we do it? New research led by Western Sydney University’s MARCS Institute suggests the reason could be related to the way our brain processes low-frequency sounds.

The study, published in PNAS, recorded the electrical activity of volunteers’ brains while they listened to rhythmic patterns played in either low- or high-pitched tones. The study found that while they listened, the volunteers’ brain activity became synchronized with the rhythmic structure of the sound – particularly at the frequency of the beat.

Co-author of the paper, Dr Sylvie Nozaradan from the MARCS Institute, says these findings strongly suggest that the bass exploits a neurophysiological mechanism in the brain – essentially forcing it to lock onto the beat.

“There is mounting evidence supporting the hypothesis that selective synchronization of large pools of neurons of the brain to the beat frequency may support perception and movement to the musical beat”, says Dr Nozaradan.
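The kind of analysis behind such findings can be illustrated with synthetic data. The sketch below is not the study’s actual EEG pipeline – the sampling rate, beat frequency, signal amplitude, and noise level are all invented – but it shows how neural synchronization to a beat would appear as a spectral peak at exactly the beat frequency:

```python
import numpy as np

fs = 500.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)     # 60 s of recording
beat_hz = 2.0                    # a 120-beats-per-minute rhythm

# Toy "EEG": a small component locked to the beat, buried in noise.
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * beat_hz * t) + rng.normal(0, 2, t.size)

# Synchronization to the beat shows up as a peak in the spectrum
# at the beat frequency, despite the heavy noise.
spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(peak_hz)  # → 2.0
```

In studies of this kind, the size of the spectral peak at the beat frequency is taken as a measure of how strongly neural activity has locked onto the rhythm.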

While this research is an important step in answering the mystery of why we ‘dance to the beat of the drum’, according to co-author Dr Peter Keller from the MARCS Institute, these findings could also prove important in clinical rehabilitation.

“Music is increasingly being used in clinical rehabilitation of cognitive and motor disorders caused by brain damage and these findings, and a better understanding of the relationship between music and movement, could help develop such treatments,” says Dr Keller.

The research team – also comprising co-authors Dr Manuel Varlet and Tomas Lenc – suggests that while this research is an important step in understanding the relationship between bass and movement, there are still many open questions about the mechanisms behind this phenomenon.

“Future research is needed to clarify what networks of brain areas are responsible for this synchronization to the beat and how it develops from early in infancy,” says Dr Nozaradan.

https://www.westernsydney.edu.au/newscentre/news_centre/more_news_stories/new_research_suggests_its_all_about_the_bass

Can Eyes Predict Parkinson’s Disease? Retinal thinning from dopamine loss may be an early disease marker.

by Judy George

Retinal thinning was linked to dopaminergic neuronal atrophy in a cross-sectional analysis, raising the possibility that it could be a way to detect pathologic changes in early Parkinson’s disease (PD) patients, researchers said.

Drug-naïve patients with early Parkinson’s showed retinal thinning as measured by optical coherence tomography (OCT) that correlated with both disease severity and nigral dopaminergic degeneration, reported Jee-Young Lee, MD, PhD, of the Seoul National University Boramae Medical Center, and colleagues in Neurology.

“Our study is the first to show a link between the thinning of the retina and a known sign of the progression of the disease — the loss of brain cells that produce dopamine,” Lee said in a statement.

“We also found the thinner the retina, the greater the severity of disease. These discoveries may mean that neurologists may eventually be able to use a simple eye scan to detect Parkinson’s disease in its earliest stages, before problems with movement begin.”

Retinal pathology has been tied to other neurodegenerative disorders including dementia. In previous studies, retinal nerve fiber layer thickness has been linked to Parkinson’s disease, and OCT is a potential PD biomarker.

The search for a definitive Parkinson’s biomarker has been extensive and includes clinical (anosmia; REM behavior disorder), genetic (GBA mutation; LRRK2 mutation), and biochemical (blood and cerebrospinal fluid) techniques, along with positron emission tomography (PET), magnetic resonance imaging (MRI), and single photon emission computed tomography (SPECT) imaging.

No biomarker has been validated for clinical practice, noted Jamie Adams, MD, of the University of Rochester Medical Center in New York, and Chiara La Morgia, MD, PhD, of the University of Bologna in Italy, in an accompanying editorial: “Because of the complexity of the disease, combining biomarkers from different categories is likely the best strategy to accurately predict PD status and progression.”

In this analysis, Lee and colleagues studied 49 Parkinson’s patients with an average age of 69, along with 54 age-matched controls, including only early-stage, drug-naïve PD patients without ophthalmologic disease.

The researchers used high-resolution OCT to measure retinal nerve fiber layer thickness, microperimetry to measure retinal function, and dopamine transporter analysis to measure N(3-[18F]fluoropropyl)-2-carbomethoxy-3-(4-iodophenyl) nortropane uptake in the basal ganglia. Retinal layer thickness and volume were measured and compared in PD patients and controls.

Retinal thinning was found in the inferior and temporal perifoveal sectors of the PD patients, particularly the inner plexiform and ganglion cell layers, along with an association between retinal thinning and dopaminergic loss in the left substantia nigra. The team also reported an inverse association between inner retinal thickness in the inferior perifoveal sector and disease severity (Hoehn and Yahr stage), and a positive correlation between macular sensitivity and retinal layer thickness.

“Overall, these data support the presence of an association between retinal thinning and dopaminergic loss in PD,” said Adams and La Morgia. “Inner retinal thinning in individuals with PD has been reported in previous studies, but this is the first study that demonstrates a correlation between inner retinal thinning and nigral dopaminergic loss.”

“These findings may point to a pathologic connection between the retina and basal ganglia in PD and are in line with previous studies reporting asymmetric retinal nerve fiber layer loss, more evident in the eye contralateral to the most affected body side.”

The results need to be interpreted with caution, Lee and co-authors noted. Retina analysis was limited to the macular area in this research. Studies with larger numbers of Parkinson’s patients are needed to confirm the findings. And this study was a cross-sectional analysis, so correlations between retinal changes and PD severity need to be established over time.

But if the findings are confirmed, “retina scans may not only allow earlier treatment of Parkinson’s disease, but more precise monitoring of treatments that could slow progression of the disease as well,” Lee said.

https://www.medpagetoday.com/neurology/parkinsonsdisease/74575

Medical school at NYU will now be free for all students

New York University said Thursday that it will offer free tuition to all its medical school students, in the hope of encouraging more doctors to choose lower-paying specialties.

Many surveys have shown that medical school graduates gravitate to the more lucrative specialties, in part to pay off enormous student debts.

“Every student enrolled in our MD degree program receives a full-tuition scholarship, regardless of merit or financial need, that covers the majority of the cost of attendance,” the school says on its website.

NYU said it got a batch of grants to pay for the full scholarship option, including some from Home Depot co-founder Kenneth Langone, who chairs the medical school’s board of trustees.

“This decision recognizes a moral imperative that must be addressed, as institutions place an increasing debt burden on young people who aspire to become physicians,” Dr. Robert Grossman, dean of NYU’s school of medicine, said in a statement.

Medical school is expensive. The Association of American Medical Colleges calculates that it costs an average of more than $240,000 to attend a public medical school. It costs $322,000 for four years at a private school, the group calculates.

NYU says its scholarship, which begins in the 2018-19 school year, is worth $55,000 a year.

According to the American Academy of Family Physicians, the average debt for medical students is more than $100,000. The medical college association pegs the average debt at nearly twice that, or $180,000.

To pay it off fast, medical school graduates often choose high-paying specialties such as orthopedics or plastic surgery. A survey last year by Medscape showed that orthopedists make $489,000 a year, compared with family practice physicians and pediatricians, who earn $200,000 a year.
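The arithmetic behind that pull is easy to sketch. The toy calculation below uses invented repayment terms (10 percent of gross salary per year, 6 percent annual interest – none of this is from the article) to compare how long the average debt takes to clear on the two salaries:

```python
def years_to_repay(debt, salary, fraction=0.10, rate=0.06):
    """Years to clear a loan, paying a fixed fraction of gross
    salary each year while interest accrues annually."""
    years = 0
    while debt > 0:
        debt = debt * (1 + rate) - salary * fraction
        years += 1
        if years > 100:           # payments don't cover the interest
            return float("inf")
    return years

# Figure from the article: ~$180,000 average debt.
print(years_to_repay(180_000, 489_000))  # orthopedist: 5 years
print(years_to_repay(180_000, 200_000))  # family physician: 14 years
```

Under these assumed terms the gap is stark, which is the incentive the surveys describe.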

The result is a shortage of the general care practitioners who are most needed, especially in rural parts of the country and in the so-called Rust Belt across the Midwest, according to several studies.

https://www.nbcnews.com/health/health-news/medical-school-will-be-free-nyu-n901431

Here’s what robots could learn from fire ants

Robots, take note: When working in tight, crowded spaces, fire ants know how to avoid too many cooks in the kitchen.

Observations of fire ants digging an underground nest reveal that a few industrious ants do most of the work while others dawdle. Computer simulations confirm that, while this strategy may not be the fairest, it is the most efficient because it helps reduce overcrowding in tunnels that would gum up the works. Following fire ants’ example could help robot squads work together more efficiently, researchers report in the Aug. 17 Science.

Robots that can work in close, crowded quarters without tripping each other up may be especially good at digging through rubble for search-and-rescue missions, disaster cleanup or construction, says Justin Werfel, a collective behavior researcher at Harvard University who has designed insect-inspired robot swarms.

Daniel Goldman, a physicist at Georgia Tech in Atlanta, and colleagues pored over footage of about 30 fire ants digging tunnels during 12-hour stretches. “To our surprise, we found that there’s only about three to five ants doing anything” at a time, Goldman says. Although individual ants’ activity levels varied over time, about 30 percent of the ants did about 70 percent of the work in any given 12-hour period.

To investigate why fire ants divvy up work this way, Goldman’s team created computer simulations of two ant colonies digging tunnels. In one, the virtual ants mimicked the real insects’ unequal work split; in the other, all the ants pitched in equally. The colony with fewer heavy lifters was better at keeping tunnel traffic moving; in three hours, that colony dug a tunnel that was about three times longer than the group of ants that all did their fair share.
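A minimal model captures the intuition. This is an illustrative sketch, not the authors’ simulation – the ant counts, activity probabilities, and tunnel capacity are invented – but it shows why a few workhorses can beat an equal division of labor in a confined space:

```python
import random

def soil_removed(p_active, capacity, steps, seed=1):
    """Toy digging model: each step, every ant tries to reach the
    tunnel face with its own probability.  If more ants arrive than
    the face can hold, they jam and nothing is removed that step;
    otherwise each arrival removes one unit of soil."""
    rng = random.Random(seed)
    removed = 0
    for _ in range(steps):
        arrivals = sum(rng.random() < p for p in p_active)
        if arrivals <= capacity:
            removed += arrivals   # smooth traffic: everyone digs
        # else: a pile-up at the face, no progress this step
    return removed

# Ten ants, a tunnel face with room for three at a time.
fair = soil_removed([0.3] * 10, capacity=3, steps=10_000)
# Unequal colony: 3 workhorses dig almost always, 7 mostly idle
# (both colonies attempt roughly the same total effort per step).
lazy = soil_removed([0.9] * 3 + [0.04] * 7, capacity=3, steps=10_000)
print(lazy > fair)  # → True: the unequal split clears more soil
```

Taking some workers out of the rotation costs raw effort but avoids the jams that dominate in a narrow tunnel – the same trade-off the robot experiments exploited.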

Goldman’s team then tested the fire ants’ teamwork strategy on autonomous robots. These robots trundled back and forth along a narrow track, scooping up plastic balls at one end and dumping them at the other. Programming the robots to do equal work is “not so bad when you have two or three,” Goldman says, “but when you get four in that little narrow tunnel, forget about it.” The four-bot fleet tended to get stuck in pileups. Programming the robots to share the workload unequally helped avoid these smashups and move material 35 percent faster, the researchers found.

J. Aguilar et al. Collective clog control: Optimizing traffic flow in confined biological and robophysical excavation. Science. Vol. 361, August 17, 2018, p. 672.

Migraines are more common in women, but why?


Migraines are not typical headaches; they are extremely painful events and are often accompanied by nausea, blurred vision, or ultrasensitivity to smells, light, or sounds. These episodes can be debilitating and highly disruptive to day-to-day life. More women than men tend to experience them, and researchers ask why.

By Maria Cohut

Scientists at the Universitas Miguel Hernández in Elche, Spain, believe that the answer as to why migraines are more common among women may lie with the activity of sex hormones.

“We can observe significant differences in our experimental migraine model between males and females and are trying to understand the molecular correlates responsible for these differences,” says Prof. Antonio Ferrer-Montiel.

The trigeminovascular system is made up of neurons found in the trigeminal nerve, a cranial nerve. Researchers have suggested that this system is involved in migraine mechanisms.

In the new study, Prof. Ferrer-Montiel and his team argue that the activity of sex-specific hormones interacts with the trigeminal system in a way that renders its nerve cells more sensitive to migraine triggers.

These findings now appear in the journal Frontiers in Molecular Biosciences, as part of a special issue focusing on the importance of targeting proteins in cell membranes as an effective therapeutic approach in medicine.

In the future, Prof. Ferrer-Montiel and colleagues hope that their findings may lead to a better, more personalized approach to migraine management.

The researchers conducted a review of existing studies about sex hormones, what drives migraine sensitivity, and how nerves react to migraine triggers. In doing so, they were looking to understand how specific sex hormones might facilitate the development of migraines.

Soon enough, they found that certain sex hormones — such as testosterone — actually appear to play a protective role. However, other hormones — such as prolactin — seem to intensify the severity of migraines, according to the scientists.

These hormones, the authors say, either boost cells’ sensitivity to migraine triggers or desensitize them by interacting with the cells’ ion channels. These are a type of membrane protein that allows ions (charged particles) to pass through, influencing the cells’ sensitivity to various stimuli.

Through their research, Prof. Ferrer-Montiel and team identified the hormone estrogen as a key player in the development of migraines.

At first, the team saw that estrogen was tied to higher migraine prevalence in women experiencing menstruation. Moreover, they also found that certain types of migraine were linked to changes in hormone levels around menstruation.

Specifically, Prof. Ferrer-Montiel and colleagues noticed that changes in estrogen levels mean that trigeminal nerve cells may become more sensitive to external stimuli, which can lead to a migraine episode.

At the same time, the researchers warn that nobody should jump to any conclusions based on the evidence gathered so far. This study, they say, is preliminary, and much more research is needed to determine the exact roles that hormones play in the development and prevention of migraine.

Also, the new study has focused on findings from research conducted in vitro, or on animal models, so Prof. Ferrer-Montiel and colleagues advise that in the future, it will be important to conduct longitudinal studies with human participants.

If their findings are confirmed and consolidated, the scientists believe they could lead to improved strategies for the management of migraines.

“If successful, we will contribute to better personalized medicine for migraine therapy,” concludes Prof. Ferrer-Montiel.

https://www.medicalnewstoday.com/articles/322767.php

Scientists have found that our big toe was one of the last parts of the foot to evolve

As our early ancestors began to walk on two legs, they would also have hung about in trees, using their feet to grasp branches. They walked differently on the ground, but were still able to move around quite efficiently. The rigid big toe that eventually evolved gives efficient push-off power during walking and running.

The findings have been published in the journal Proceedings of the National Academy of Sciences.

In this new study, scientists made 3D scans of the toe bone joints from living and fossil human relatives, including primates such as apes and monkeys, and then compared them to modern day humans.

They overlaid this information onto an evolutionary tree, revealing the timing and sequence of events that produced the human forefoot.

The main finding is that the current shape of the bones in the big toe, or “hallux” in anatomical language, must have evolved quite late in comparison with the rest of the bones that they investigated.

In an interview with the BBC, lead author of the study Dr Peter Fernandez, from Marquette University in Milwaukee, said: “Our ability to efficiently walk and run on two feet, or be ‘bipedal’, is a crucial feature that enabled humans to become what they are today. For everything to work together, the foot bones first had to evolve to accommodate the unique biomechanical demands of bipedalism”.

He then said: “The big toe is mechanically very important for walking. In our study, we showed that it did not reach its modern form until considerably later than the other toes.”

When asked whether the rigid big toe evolved last because it is most or least important, Dr Fernandez commented: “It might have been last because it was the hardest to change. We also think there was a compromise. The big toe could still be used for grasping, as our ancestors spent a fair amount of their time in the trees, before becoming fully committed to walking on the ground.”

He added: “Modern humans have increased the stability of the joint to put the toe in an orientation that is useful for walking, but the foot is no longer dextrous like an ape.”

The reason that our ancestors stood upright and then walked on two feet is still a mystery, but there are plenty of ideas. Scientists think that walking may have evolved, either because it freed our hands to carry tools, or because climate change led to a loss of forests, or that overhead arms can be used to support walking on two legs along thin branches.

Studies such as this new one show that early human ancestors must have been able to walk upright for millions of years, since the 4.4-million-year-old fossil Ardipithecus ramidus, but that they did not fully transition to a modern walk until much later, perhaps in closer relatives within our own group, Homo.

This new study, alongside other work, now confirms that early walking humans, or “hominins” still used their feet to grasp objects.

Dr William Harcourt-Smith from City University of New York, who was not involved in this study, said: “They are suggesting that one of the earliest hominins, Ardipithecus, was already adapting in a direction away from the predicted morphology of the last common ancestor of chimps and modern humans, but not ‘towards’ modern humans. To me this implies that there were several lineages within hominins that were likely experimenting with bipedalism in different ways to each other.”

Professor Fred Spoor, an expert in human anatomy at the Natural History Museum, London said: “It was a bit of shock when hominins were found that have a grasping, or opposable, big toe, as this was thought to be incompatible with effective bipedalism. This work shows that different parts of the foot can have different functions. When a big toe is opposable, you can still function properly as a biped.”

The scientists involved say that this work shows that early hominin feet had a mixed and versatile set of functions. Becoming human was not a giant step, but a series of gradual changes, with some of the last and arguably most important changes being made to big toes. Peter Fernandez said that they would like to conduct similar analyses on the remaining bones of the forefoot, in order to fully characterise the changes involved in the evolution of bipedal walking.

https://www.bbc.co.uk/news/science-environment-45183651