Scientists Identify The Location of ‘Taste’ in Your Head, And It’s Not The Tongue

by David Nield

How exactly do our brains sort between the five taste groups: sweet, sour, salty, bitter and umami? We’ve now got a much better idea, thanks to research that has pinned down where in the brain this taste processing happens.

Step forward: the insular cortex. Already thought to be responsible for everything from motor control to social empathy, we can now add flavour identification to its list of jobs.

It’s an area of the brain scientists have previously suspected of sorting tastes, and one that has been linked to taste in rodents, but the new study is much more precise in pinning down the role it plays in decoding what our tongues are telling us.

“We have known that tastes activate the human brain for some time, but not where primary taste types such as sweet, sour, salty, and bitter are distinguished,” says one of the team, Adam Anderson from Cornell University in New York.

“By using some new techniques that analyse fine-grained activity patterns, we found a specific portion of the insular cortex – an older cortex in the brain hidden behind the neocortex – represents distinct tastes.”

Anderson and his team used detailed fMRI scans of 20 adults as well as a new statistical model to dig deeper than previous studies into the link between the insular cortex and taste. This helped separate the taste response from other related responses – like the disgust we might feel when eating something sour or bitter.

Part of the problem in pinning down the taste-testing parts of the brain is that multiple regions of neurons get busy whenever we’re eating something. However, this study helps to cut through some of that noise.

In particular, it seems that different tastes don’t necessarily affect different parts of the insular cortex, but rather prompt different patterns of activity. Those patterns help the brain determine what it’s tasting.

For example, one particular section of the insular cortex was found to light up – in terms of neural activity – whenever something sweet was tasted. It’s a literal sweet spot, in other words, but it also showed that different brains have different wiring.

“While we identified a potential sweet spot, its precise location differed across people and this same spot responded to other tastes, but with distinct patterns of activity,” says Anderson.

“To know what people are tasting, we have to take into account not only where in the insula is stimulated, but also how.”

The work follows on from previous research showing just how big a role the brain plays in perceiving taste. It used to be thought that receptors on the tongue did most of the taste testing, but now it seems the brain is largely in charge of the process.

That prior study showed how switching certain brain cells on and off in mice was enough to prevent them from distinguishing between sweet and bitter. The conclusion is that while the tongue does identify certain chemicals, it’s the brain that interprets them.

The new research adds even more insight into what’s going on in the brain in humans when we need to work out what we’re tasting – and shows just how important a job the insular cortex is doing.

“The insular cortex represents experiences from inside our bodies,” says Anderson. “So taste is a bit like perceiving our own bodies, which is very different from other external senses such as sight, touch, hearing or smell.”

The research has been published in Nature Communications.

https://www.sciencealert.com/now-we-know-the-part-of-the-brain-that-tells-us-what-we-re-tasting

Neuroscience proves Nietzsche right: some people are wired to be more spontaneous than others

by Parashkev Nachev

“Why can’t you just relax into it?” is a question many of us have asked in frustration with ourselves or others – be it on the dance floor, the sporting field or in rather more private circumstances. The task typically requires us to respond spontaneously to external events, without any deliberation whatsoever. It ought to be easy – all you have to do is let go – yet it can be infuriatingly difficult.

“Stop thinking about it!” is the standard remedial advice, although cancelling thought with thought is something of a paradox. The retort, “I am trying!”, is equally puzzling, for deliberate intent is precisely what we are here struggling to avoid. So what is this act of choosing not to choose, of consciously relinquishing control over our actions? Our new study, published in Communications Biology, has finally provided insights into how this capacity is expressed in the brain.

Astonishingly, this fundamental human phenomenon has no name. It might have escaped academic recognition entirely had the German philosopher Friedrich Nietzsche not given it a brilliant gloss in his first book The Birth of Tragedy, itself a paradoxical work of philosophy in tacitly encouraging the reader to stop reading and get a drink instead. Whereas other thinkers saw culture on a single continuum, evolving into ever greater refinement, order and rationality, Nietzsche saw it as distributed across two radically different but equally important planes.

Perpendicular to the conventional “Apolline” dimension of culture, he introduced the “Dionysiac”: chaotic, spontaneous, vigorous and careless of the austere demands of rationality. Neither aspect was held to be superior, each may be done badly or well, and both are needed for a civilisation to find its most profound creative expression. Every Batman needs a Joker, he might have said, had he lived in a more comical age.

Of course, Nietzsche was not the first to observe that human beings sometimes behave with wanton abandon. His innovation consisted in realising it is a constitutional characteristic we could and should develop. And as with any behavioural characteristic, the facility to acquire it will vary from one person to another.

Seeing the light

As Dionysus and neuroscientists are mostly strangers, it should come as no surprise that the capacity for “meta-volition” – to give it a name that captures the notion of choosing not to choose one’s actions – has until now escaped experimental study. To find out how our brains allow us to give up control and explain why some of us are better at it than others, my colleagues and I wanted to develop a behavioural test and examine the patterns of brain activity that go with lesser or greater ability.

Most tests in behavioural neuroscience pit conscious, deliberate, complex actions against their opposites, measuring the power to suppress them. A classic example is the anti-saccade task, which purportedly measures “cognitive control”. Participants are instructed not to look towards the light when they see a brief flash in the visual periphery, but instead to the opposite side. That’s hard to do because looking towards the light is the natural inclination. People who are better at this are said to have greater cognitive control.

To measure how good people are at relinquishing control, we cannot simply flip a task around. If people are asked to look into the light, will and instinct are placed in perfect agreement. To put the two in opposition, we must make the automatic task unconscious so that volition could only be a hindrance.


White matter map of the brain (ray traced rendering), with the area correlated with spontaneity in red. Credit: Parashkev Nachev

It turns out that this is easy to do by flashing two lights on opposite sides of the visual periphery nearly simultaneously, and asking the subject to orient as fast as possible to the one they see first. If one flash comes a few dozen milliseconds before the other, people typically show an automatic bias towards the first flash. Consciously detecting which one came first requires at least double that interval. Thinking about what came first can only impair performance, because instinct operates well beneath the threshold at which the conscious mind gets a foothold.

Amazingly for such a simple task, people vary dramatically in their ability. Some – the Dionysiacs – effortlessly relax into allowing themselves to be guided by the first light, requiring no more than a few milliseconds between the flashes. Others – the Apollines – cannot let go, even when the flashes are many times further apart. Since trying harder does not help, the differences are not a matter of effort but appear to be part of who we are.

We used magnetic resonance imaging to investigate the brains of people performing the task, focusing on white matter – the brain’s wiring. A striking picture emerged. Extensive sections of the wiring of the right prefrontal lobe, a region heavily implicated in complex decision making, were revealed to be stronger in those who were worse at the task: the Apollines. The more developed the neural substrates of volition, it seems, the harder they are to switch off.

By contrast, no part of the Dionysiac brain showed evidence of stronger wiring. Suppressing volition appears to depend less on a “meta-volitional centre” that is better developed than on the interplay between spontaneous and deliberate actions. We can think of it as two coalitions of brain cells in competition, with the outcome dependent on the relative strength of the teams, not the qualities of any umpire.

The competitive brain

The results demonstrate how the brain operates by competition at least as much as by cooperation. It may fail at a task not because it does not have the power, but because another, more dominant power stands in opposition. Our decisions reflect the outcomes of battles between warring factions that differ in their characteristics and evolutionary lineage, battles we can do little to influence because we are ourselves their products.

People also differ widely in their qualities, including spontaneity, not because evolution has not yet arrived at an optimum, but because it seeks to diversify the field as far as possible. That’s why it creates individuals tuned to respond to their environment in very different ways. The task of evolution is less to optimise a species for the present than to prepare it for a multiplicity of futures unknown.

That our lives are now dominated by a rational, Apolline order does not mean we shall not one day descend into an instinctual, Dionysiac chaos. Our brains are ready for it – our culture should be too.

https://medicalxpress.com/news/2019-03-neuroscience-nietzsche-people-wired-spontaneous.html

Scientists Have Devised a Blood Test That Can Accurately Diagnose Fibromyalgia

by CARLY CASSELLA

Scientists are closing in on a blood test for fibromyalgia, and the result could save patients from what is currently a lengthy and vague process of diagnosis.

Researchers at Ohio State University are now aiming to have a diagnostic blood test available for widespread use within the next five years.

Their confidence stems from a recently discovered biomarker – a “metabolic fingerprint” as the researchers put it – traceable in the blood of those with the disorder.

“We found clear, reproducible metabolic patterns in the blood of dozens of patients with fibromyalgia,” says lead author Kevin Hackshaw, a rheumatologist at Ohio State University.

“This brings us much closer to a blood test than we have ever been.”

Fibromyalgia is a common, debilitating, and poorly understood disorder, marked by widespread pain and fatigue, with no known cause and absolutely no cure.

In the United States, it’s the most common cause of chronic widespread pain, and that’s not even counting the thousands of patients who go undiagnosed every year.

Without a reliable way to detect the disorder, it’s estimated that up to three out of four people with the condition remain undiagnosed. And on average, it takes five years from the first appearance of symptoms for a person to actually receive a diagnosis.

In total, the US Centers for Disease Control and Prevention estimates that about two percent of the population – around four million adults – have fibromyalgia, with women making up a disproportionate slice.

Left with few options, many patients are simply forced to live with their pain. With nowhere else to turn, some become desperate and resort to potentially harmful treatments.

“When you look at chronic pain clinics, about 40 percent of patients on opioids meet the diagnostic criteria for fibromyalgia,” says Hackshaw.

“Fibromyalgia often gets worse, and certainly doesn’t get better, with opioids.”

It was Hackshaw’s goal to intervene sooner. Using vibrational spectroscopy, a technique which measures the energy of molecules, his team analysed blood samples from 50 people with fibromyalgia, 29 with rheumatoid arthritis, 19 with osteoarthritis, and 23 with lupus.

Despite the fact these disorders can present with similar symptoms, the blood of those participants with fibromyalgia was distinct.

Using these unique patterns, the researchers then attempted to predict participants’ diagnoses blind. Without knowing anyone’s true disorder, they were able to accurately classify every study participant based on the molecular fingerprint in their blood.

“These initial results are remarkable,” says co-author Luis Rodriguez-Saona, an expert in vibrational spectroscopy at Ohio State University.

“If we can help speed diagnosis for these patients, their treatment will be better and they’ll likely have better outlooks. There’s nothing worse than being in a grey area where you don’t know what disease you have.”

While the sample size is undoubtedly small, the results are promising. If the team can replicate their results on a larger scale, with a couple hundred diverse participants, then a blood test in five years might not seem so far-fetched.

Not to mention what that would mean for treatment. If the researchers can prove they really have identified a biological fingerprint for fibromyalgia, this could give us new drug targets in the future.

“Thus,” the authors conclude, “our studies have great importance both from development of a reproducible biomarker as well as identifying potential new therapeutic targets for treatment.”

This study has been published in the Journal of Biological Chemistry.

https://www.sciencealert.com/scientists-have-devised-a-blood-test-that-can-accurately-diagnose-fibromyalgia

Sun bears copy each other’s facial expressions to communicate

The world’s smallest bears copy one another’s facial expressions as a means of communication.

A team at the University of Portsmouth, UK, studied 22 sun bears at the Bornean Sun Bear Conservation Centre in Malaysia. In total, 21 matched the open-mouthed expressions of their playmates during face-to-face interactions.

When they were facing each other, 13 bears made the expressions within 1 second of observing a similar expression from their playmate.

“Mimicking the facial expressions of others in exact ways is one of the pillars of human communication,” says Marina Davila-Ross, who was part of the team. “Other primates and dogs are known to mimic each other, but only great apes and humans were previously known to show such complexity in their facial mimicry.”

Sun bears have no special evolutionary link to humans, unlike monkeys or apes, nor are they domesticated animals like dogs. The team believes this means the behaviour must also be present in various other species.

Also known as honey bears, sun bears are the smallest members of the bear family. They grow to between 120 centimetres and 150 centimetres long and weigh up to 80 kilograms. The species is endangered and lives in the tropical forests of South-East Asia.

While the bears prefer a solitary life, the team says that they engage in gentle and rough play and may use facial mimicry to indicate they are ready to play more roughly or strengthen social bonds.

“It is widely believed that we only find complex forms of communication in species with complex social systems,” says Derry Taylor, also on the team. “As sun bears are a largely solitary species, our study of their facial communication questions this belief, because it shows a complex form of facial communication that until now was known only in more social species.”

Journal reference: Scientific Reports, DOI: 10.1038/s41598-019-39932-6

Gravity Influences How We Make Decisions, Says New Study

Returning to Earth from the International Space Station, Canadian astronaut Chris Hadfield remarked how making the right decision is vital in high pressure environments, saying:

Most of the time, you only really get one try to do most of the critical stuff and the consequences are life or death.

Mankind is preparing for a new space age: manned missions to Mars are no longer a distant dream, and commercial ventures may open up the prospect of non-astronauts visiting other planets. Understanding how gravity shapes the way we make decisions has never been more pressing.

All living organisms on Earth have evolved under a constant gravitational field. Gravity is always there, part of the background of our perceptual world: we cannot see it, smell it or touch it.

Nevertheless, gravity plays a fundamental role in human behaviour and cognition.

The central nervous system does not have “specialised” sensors for gravity. Rather, gravity is inferred through the integration of several sensory signals in a process termed graviception. This involves vision, our balance system and information from the joints and muscles.

Sophisticated organs inside the inner ear are particularly important in this process. Under terrestrial gravity, when our head is upright, small stones – the vestibular otoliths – are perfectly balanced on a viscous fluid.

When we move the head, for instance looking up, gravity makes the fluid move and this triggers a signal which informs the brain that our head is no longer upright.

Long-duration exposure to zero gravity, such as during space missions, leads to several structural and functional changes in the human body. While the influence of zero gravity on our physical functions has been largely investigated, the effects on decision-making are not yet fully understood.

Given the technical limitations and the expected gap of a few minutes in communication with Earth if we go to Mars, knowing the impact of altered gravity on how people make decisions is essential.

Novelty versus routine

In a nutshell, human behaviour is a constant trade off between the exploitation of familiar but possibly sub-optimal choices and the exploration of new and potentially more profitable alternatives.

For example, in a restaurant you can exploit by choosing your usual chocolate cake, or you can explore by trying that tiramisu that you’ve not had before. Thus, exploitation involves routine behaviour, while exploration involves varying choices.

We investigated whether alterations in gravity impact the choice between routine and novel behaviour. We asked participants to come to the lab and produce sequences of numbers as randomly as possible.

Every time they heard a beep sound, they needed to name a number between one and nine. Importantly, there was no time to think or to count, just name a number.

Critically, this task requires our brain to suppress routine responses and generate novel responses, and it can be considered a proxy for successful adaptive behaviour.

But how does this change under the influence of gravity? We manipulated how the otoliths sense gravity by changing the orientation of participants’ bodies with respect to the direction of terrestrial gravity by asking them to lie down.

When we are upright, our body and otoliths are congruent with the direction of gravity, while when we are lying down they are orthogonal (at right angles).

This is a very efficient laboratory manipulation, which allows us to mimic alterations of gravitational signals reaching the brain. It is actually a better way to study the effects of gravity than sending someone to space.

That’s because when we are in space we are also affected by weightlessness, radiation and isolation – and it can be hard to separate what effect the lack of gravity alone has.

Our results indicate that lying down does influence how people make decisions: participants struggled with random number generation, suggesting that people are less prone to generating novel behaviour when the gravitational signals reaching the brain are altered.

This may be of importance to the planning of actual space missions. Astronauts are in an extremely challenging environment in which decisions must be made quickly and efficiently. An automatic preference for routine or stereotyped options might not help with complex problem solving, and could even place life at risk.

The results add to research suggesting that people also suffer changes in perception and cognition when under conditions mimicking zero-gravity. The absence of gravity can be profoundly unsettling, and can potentially compromise performance levels in many ways.

This suggests that astronauts may benefit from some sort of cognitive enhancement training to help them overcome the effects of altered gravity on the brain, and to ensure successful and safe manned space missions.

The Conversation

https://www.sciencealert.com/exposure-to-zero-gravity-can-change-how-human-make-decisions

Having great-grandparents or cousins with Alzheimer’s disease substantially increases your risk of developing it, too.

Having a parent with Alzheimer’s disease has been known to raise a person’s risk of developing the disease, but new research published in Neurology suggests that having second- and third-degree relatives who have had Alzheimer’s may also increase risk.

“Family history is an important indicator of risk for Alzheimer’s disease, but most research focuses on dementia in immediate family members, so our study sought to look at the bigger family picture,” said Lisa A. Cannon-Albright, PhD, University of Utah School of Medicine, Salt Lake City, Utah. “We found that having a broader view of family history may help better predict risk. These results potentially could lead to better diagnoses and help patients and their families in making health-related decisions.”

For the study, researchers looked at the Utah Population Database, which includes the genealogy of Utah pioneers from the 1800s and their descendants up until modern day. The database is linked to Utah death certificates, which show causes of death, and in a majority of cases, contributing causes of death.

In that database, researchers analysed data from over 270,800 people who had at least 3 generations of genealogy connected to the original Utah pioneers, including genealogy data for both parents, all 4 grandparents, and at least 6 of 8 great-grandparents. Of those, 4,436 had a death certificate indicating Alzheimer’s disease as a cause of death.

Results showed that people with 1 first-degree relative with Alzheimer’s disease (18,494 people) had a 73% increased risk of developing the disease. Of this group of people, 590 developed Alzheimer’s disease; the researchers would have expected this group to have 341 cases.

People with 2 first-degree relatives were 4 times more likely to develop the disease; those with 3 were 2.5 times more likely; and those with 4 were nearly 15 times more likely to develop Alzheimer’s disease.

Of the 21 people in the study with 4 first-degree relatives with Alzheimer’s, 6 had the disease. The researchers would have expected only 0.4 people to develop the disease.

Those with 1 first-degree relative and 1 second-degree relative had a 21 times greater risk. Examples of this would be a parent and one grandparent with the disease, or a parent and one aunt or uncle. There were 25 people in this category in the study; 4 of them had the disease when researchers would have expected 0.2 cases.

Those who had 3 third-degree relatives with Alzheimer’s disease, but no closer affected relatives, had a 43% greater risk of developing the disease. An example would be two great-grandparents with the disease along with one great uncle, but no parents or grandparents with it. Of the 5,320 people in this category, 148 had the disease, when researchers would have expected 103.
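The risk figures quoted above are relative risks: the number of observed cases divided by the number expected in a comparable group without that family history. A quick sketch of the arithmetic, using the observed and expected counts reported here (the study’s published estimates come from an adjusted statistical model, so the raw ratios land close to, but not exactly on, the quoted figures):

```python
# Relative risk = observed cases / cases expected with no family history.
# Observed and expected counts are the ones reported in the article; the
# study's published estimates come from an adjusted model, so these raw
# ratios approximate, rather than exactly match, the quoted figures.
groups = {
    "1 first-degree relative": (590, 341),
    "4 first-degree relatives": (6, 0.4),
    "1 first- plus 1 second-degree relative": (4, 0.2),
    "3 third-degree relatives only": (148, 103),
}

for label, (observed, expected) in groups.items():
    rr = observed / expected
    print(f"{label}: RR = {rr:.2f}, i.e. a {(rr - 1) * 100:.0f}% increased risk")
    # e.g. "1 first-degree relative: RR = 1.73, i.e. a 73% increased risk"
```

Note how a 73% increased risk and a 15-times risk are the same kind of quantity, just at very different magnitudes; the tiny expected counts (0.4, 0.2) in the rarest family-history categories are why those estimates carry the most uncertainty.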

“More and more, people are increasingly seeking an estimate of their own genetic risk for Alzheimer’s disease,” said Dr. Cannon-Albright. “Our findings indicate the importance of clinicians taking a person’s full family history that extends beyond their immediate family members.”

She noted that among all of the study participants, 3 percent had a family history that doubled their risk of Alzheimer’s disease, and a little over half a percent had a family history that raised their risk to three or more times that of a person without a family history of the disease.

Limitations of the study include that not all individuals dying from Alzheimer’s disease may have had a death certificate listing it as cause of death. Dr. Cannon-Albright said death certificates are known to underestimate the prevalence of the disease.

“There are still many unknowns about why a person develops Alzheimer’s disease,” she said. “A family history of the disease is not the only possible cause. There may be environmental causes, or both. There is still much more research needed before we can give people a more accurate prediction of their risk of the disease.”

Reference:
https://n.neurology.org/content/early/2019/03/13/WNL.0000000000007231

https://dgnews.docguide.com/having-great-grandparents-cousins-alzheimer-s-linked-higher-risk?overlay=2&nl_ref=newsletter&pk_campaign=newsletter&nl_eventid=20119

AI and MRIs at birth can predict cognitive development at age 2


Researchers at the University of North Carolina School of Medicine used machine learning on MRI brain scans taken at birth to predict cognitive development at age 2 years with 95 percent accuracy.

“This prediction could help identify children at risk for poor cognitive development shortly after birth with high accuracy,” said senior author John H. Gilmore, MD, Thad and Alice Eure Distinguished Professor of psychiatry and director of the UNC Center for Excellence in Community Mental Health. “For these children, an early intervention in the first year or so of life – when cognitive development is happening – could help improve outcomes. For example, in premature infants who are at risk, one could use imaging to see who could have problems.”

The study, which was published online by the journal NeuroImage, used an application of artificial intelligence called machine learning to look at white matter connections in the brain at birth and the ability of these connections to predict cognitive outcomes.

Gilmore said researchers at UNC and elsewhere are working to find imaging biomarkers of risk for poor cognitive outcomes and for risk of neuropsychiatric conditions such as autism and schizophrenia. In this study, the researchers replicated the initial finding in a second sample of children who were born prematurely.

“Our study finds that the white matter network at birth is highly predictive and may be a useful imaging biomarker. The fact that we could replicate the findings in a second set of children provides strong evidence that this may be a real and generalizable finding,” he said.

Jessica B. Girault, PhD, a postdoctoral researcher at the Carolina Institute for Developmental Disabilities, is the study’s lead author. UNC co-authors are Barbara D. Goldman, PhD, of UNC’s Frank Porter Graham Child Development Institute, Juan C. Prieto, PhD, assistant professor, and Martin Styner, PhD, director of the Neuro Image Research and Analysis Laboratory in the department of psychiatry.

15 Foods That Naturally Lower Blood Pressure

High blood pressure is one of those conditions that can quietly and slowly damage your body. For starters, uncontrolled blood pressure can lead to stroke, narrowing of arteries, heart failure, kidney problems, damage to eye vessels, dementia, and other serious conditions. That’s the bad news. The good news is that what you eat can significantly help you reduce your blood pressure, especially if it’s already elevated or in the borderline range.

Here are the best 15 foods that you can eat to naturally lower your blood pressure and safeguard your long-term health:

1. Swiss chard, spinach, arugula, turnip greens, beet greens, collard greens, and other leafy greens

What do all these foods have in common? Potassium. This nutrient reduces blood pressure by balancing electrolytes in the body and helping the kidneys get rid of excess sodium. Aiming for 4,700 mg of potassium daily from foods like leafy greens can help you do that! Bananas tend to get all the fame when it comes to potassium, but a cup of cooked Swiss chard contains 960 mg potassium, and a cup of cooked spinach contains 839 mg potassium, while one banana contains only about half of that (422 mg). Since they are lower in carbohydrates and calories, leafy greens may fit better into your overall health goals.

Magnesium is another mineral that helps lower blood pressure by dilating blood vessels. A meta-analysis found that, on average, 400 mg of magnesium per day lowers diastolic blood pressure by 2.2 points. A cup of cooked spinach contains 157 mg of magnesium.

Spinach and arugula also contain nitrate, which dilates arteries and reduces blood pressure. Don’t confuse nitrate, found naturally in spinach, arugula, celery, and other vegetables, with the nitrites found in cured and aged meats. In one study, healthy women who ate nitrate-rich vegetables for a week showed reduced systolic blood pressure. Another study found that a Japanese diet high in nitrate reduced diastolic blood pressure by 4.5 points compared to a diet low in nitrates.

2. Acorn squash, yams, sweet potatoes, and other winter squashes

One baked potato contains 926 mg of potassium, and a cup of acorn squash contains 896 mg of potassium. Sweet potatoes and butternut squash are runners-up. These vegetables are starchy, so stick to about one cup a day and use them to replace other high-carbohydrate foods like processed grains, sweets, or pastries. Roast acorn squash and mix with collard greens for a nice fall side dish. Make butternut squash soup. A baked white or sweet potato can fit into a healthy diet—as long as your plate has other nonstarchy veggies.

3. Berries

Berries are rich in polyphenols and vitamin C, which can help reduce inflammation in arteries. Two servings of berries a day for eight weeks reduced systolic and diastolic blood pressure in people who had mild hypertension. Those who had higher blood pressure levels at the beginning of the study showed the most reductions. Incorporate a variety of berries in your diet—in smoothies, snacks, or salads.

4. Beans and lentils

Beans and lentils are excellent sources of potassium and magnesium. Cooked lentils have 731 mg of potassium per cup, and a cup of cooked lima, white, pinto, or kidney beans has between 700 and 950 mg of potassium. Beans also contain magnesium, with as much as 120 mg packed into just one cup of cooked black beans.

5. Oats

It might come as a surprise, but oats are also a great food to eat if you want to be mindful of your blood pressure. This is thanks to a special fiber in oats, called beta-glucan, which helps reduce blood pressure. A study found that consuming oat beta-glucan daily lowered blood pressure in obese men and women with elevated blood pressure at baseline. A different small study found that 5.5 g of beta-glucan daily from oats for six weeks reduced systolic and diastolic blood pressure by 7.5 and 5.5 points in people who had mild or borderline hypertension.

Unfortunately, the amount of beta-glucan in oats will vary and isn’t listed on nutrition labels. A rule of thumb to follow is that higher fiber content in general means more beta-glucan. Rolled oats contain 3.3 grams of fiber per ⅓ cup while the same amount of oat bran packs 6 grams. Just be aware that oat fiber may increase bowel movement frequency or cause stomach upset as it gets fermented by your gut bacteria. If your diet is low in fiber, start gradually. If it causes severe diarrhea or stomach pain that won’t go away with slow introduction, consult with a dietitian who has experience in digestive health to see if oats are a good fit for you.

6. Beetroot juice

If you’re a fan of beets, you’ll be happy to learn that beetroot contains nitrate, which dilates vessels and reduces blood pressure, in addition to potassium and polyphenols. One study found that a little less than 5 ounces of beetroot juice reduced systolic and diastolic blood pressure by 7.9 and 5.7 points just three hours after drinking it. A meta-analysis also found that beet juice reduced systolic and diastolic blood pressure, especially when consumed for 14 days or more.

Roasted beets as a side dish, or fresh beets added to salads, make a healthy and beautiful addition to everyday meals. However, the research on blood pressure was done with beet juice. If you want to replicate the benefit at home, pull the juicer out and add some fresh beet juice to your daily routine.

7. Salmon

The health benefits of salmon seem to be never-ending—and blood pressure is no different. In one study, researchers found that eating 150 g (5 ounces) of salmon containing 2.1 g of omega-3 fatty acids three times a week reduced diastolic blood pressure by 2 points. Fish oil capsules that contained 1.3 g of omega-3s had a similar effect. While cod had no effect on blood pressure in that study, it packs—along with tuna, halibut, and scallops—an excellent amount of potassium, making it a great addition to a healthy diet.

8. Olive oil

Olive oil is another food that has endless health benefits. One study showed that a daily intake of 1 ounce of polyphenol-rich olive oil for two months reduced systolic and diastolic blood pressure by 7.91 and 6.65 points. Improvement was more significant in people who had higher blood pressure levels to start with.

Olive oil products have a wide range of polyphenol levels, so keep in mind that the fresher and more bitter and pungent the olive oil, the more polyphenols it has. Obtain olive oil from high-quality sources and eat it raw as much as possible. Drizzle it over salads and on vegetables after you finish cooking them. In the study, polyphenol-depleted olive oil didn’t lower blood pressure.

9. Pistachios

If you’re willing to put in the work of de-shelling pistachios one by one, your blood pressure will thank you. In one study, people with high cholesterol followed three diets for four weeks each: a control low-fat diet, a diet with one serving of pistachios a day (10 percent of calories), and a diet with two servings of pistachios a day (20 percent of calories). One serving of pistachios a day reduced systolic blood pressure the most, by 4.8 points.

All you cashew and almond butter fans might be wondering: What about other nuts? Mixed nuts lowered blood pressure but only in people without type 2 diabetes—and pistachios were still the most effective.

10. Flaxseeds

Flaxseeds aren’t just great for their high fiber content. One study showed that people with high blood pressure who ate 30 g of milled flaxseed a day for six months reduced systolic and diastolic blood pressure by 10 and 7 points.

If possible, buy whole flaxseed and grind as much as you need every few days. Add to your oatmeal or smoothie, or use instead of white flour for pancakes, muffins, or breading. Flaxseed oil may not lower systolic blood pressure, but diastolic blood pressure may improve with both the oil and the meal.

Other seeds like pumpkin and chia seeds may also help lower blood pressure as they are excellent magnesium sources.

11. Dairy foods

Dairy isn’t for everyone, but if you can tolerate it, you should know that a study that tracked over 2,500 people with normal blood pressure for 14.6 years showed that those who ate three or more servings of dairy per day or per week, compared to fewer than one serving, had slower increases in blood pressure. In other words, dairy consumption delayed hypertension but didn’t completely prevent it.

A meta-analysis also showed that low-fat dairy and milk reduced the risk of hypertension while cheese, yogurt, fermented dairy, or full-fat milk had no effects. A later systematic review agreed, but the authors concluded that it is not clear whether low-fat dairy was more beneficial than regular-fat dairy when it comes to blood pressure.

If you can tolerate dairy, enjoy it daily or weekly as it may help reduce your risk for developing high blood pressure. If you can’t tolerate it due to food sensitivities, allergies, digestive discomforts, or autoimmune issues, don’t stress. You can get benefits from the other foods on this list.

12. Pomegranate juice

Pomegranate juice offers more than just a beautiful color. One study showed that men and women who drank 11 ounces of pomegranate juice daily for four weeks reduced systolic and diastolic blood pressure by 3.14 and 3.33 points. Another study found that drinking 5 ounces reduced systolic blood pressure by 7 percent and diastolic blood pressure by 6 percent when measured six hours later.

But do you really need 11 ounces to get the benefit? A meta-analysis found that any amount of pomegranate juice (higher or lower than 8 ounces) and for any duration (longer or shorter than 12 weeks) reduced systolic blood pressure. However, diastolic blood pressure reduction was significant only with more than 8 ounces a day. Start with a small amount, about 4 to 8 ounces, if you’re trying to manage your sugar intake.

13. Garlic

Garlic is one of those low-key superfoods we tend to underestimate. But several studies found that taking garlic powders and extracts for one to three months can lower systolic and diastolic blood pressure in people with high or normal blood pressure. However, it is difficult to extrapolate the exact benefit of garlic as a food from studies that looked at concentrated doses. Allicin is the active ingredient in garlic and makes up only 1 percent of its weight.

Fresh garlic has more allicin than cooked, so eat a few raw garlic cloves daily to see significant changes in blood pressure. Add minced or chopped garlic to salad dressings or dips. Raw garlic goes well with tahini and lemon, while parsley or cilantro help neutralize garlic breath! Don’t take garlic supplements without consulting your dietitian or doctor, as they may cause heartburn, burping, upset stomach, or too much blood thinning.

14. Dark chocolate

Good news for all the chocolate lovers out there! Polyphenol-rich chocolate can lower blood pressure by 2 points on average, especially if your blood pressure is already elevated. In one study, people who had slightly elevated blood pressure reduced systolic and diastolic levels by 2.9 and 1.9 points after eating 0.2 ounces of dark chocolate daily for 18 weeks. However, another study on middle-aged overweight women found that 22 g of cocoa daily had no effect on blood pressure (they found other cardiovascular benefits from chocolate, though!).

15. Hibiscus tea

The dried flowers and stems of the hibiscus plant have been used throughout history for blood pressure and other ailments. In patients who were pre-hypertensive or had mild hypertension, drinking three cups of hibiscus tea daily for six weeks reduced systolic blood pressure. Reductions were most significant in people who started with higher levels.

Enjoy hibiscus tea warm or cold. It has a sour taste, so resist the temptation to add too much sugar. Keep in mind that it’s not safe during pregnancy as it can affect hormone levels and induce early labor. If you’re not pregnant but have hormone fluctuations, start slowly and monitor how your body reacts.

https://www.mindbodygreen.com/articles/foods-that-lower-blood-pressure

Laser probe detects deadly melanoma in seconds

By Lauren Sharkey

Melanoma is the deadliest form of skin cancer. With its incidence rates continuing to rise, researchers are looking for ways to spot it early on. A new laser device may be able to do so instantly.

“With skin cancer, there’s a saying that if you can spot it you can stop it — and that’s exactly what this probe is designed to do,” says Daniel Louie, a Ph.D. student at the University of British Columbia (UBC) in Canada.

Louie has helped design a low-cost device that can quickly detect cancerous skin cells.

Skin cancer is the most common cancer in the United States, according to the Centers for Disease Control and Prevention (CDC).

Typically split into two categories — melanoma and nonmelanoma — the condition can result in a series of complications if a person does not seek treatment.

While nonmelanoma cases may lead to disfigurement, melanoma can be deadly. Also, melanoma’s rates have been going up for the past 30 years, according to the American Cancer Society.

It is now one of the most common cancers in young adults, particularly young women.

How light waves detect cancer

Detecting the cancer early is essential for a good prognosis. One way to do so is with light waves: as they pass through objects, they scatter in characteristic patterns. Louie used this principle to design a laser probe that can interpret these patterns within seconds.

“Because cancer cells are denser, larger, and more irregularly shaped than normal cells, they cause distinctive scattering in the light waves as they pass through,” he explains.

Researchers from UBC, BC Cancer, and the Vancouver Coastal Health Research Institute analyzed these light beam changes. They examined 69 lesions from 47 people at the Vancouver General Hospital Skin Care Centre in Canada.

This research — the results of which now appear in the Journal of Biomedical Optics — informed the probe’s design. The probe not only captures the precise scattering patterns of the laser beams but can also read them quickly to detect the presence of cancer.

https://www.medicalnewstoday.com/articles/324690.php

Humans couldn’t pronounce ‘f’ and ‘v’ sounds before farming developed

By Alison George

Human speech contains more than 2000 different sounds, from the ubiquitous “m” and “a” to the rare clicks of some southern African languages. But why are certain sounds more common than others? A ground-breaking, five-year investigation shows that diet-related changes in human bite led to new speech sounds that are now found in half the world’s languages.

More than 30 years ago, the linguist Charles Hockett noted that speech sounds called labiodentals, such as “f” and “v”, were more common in the languages of societies that ate softer foods. Now a team of researchers led by Damián Blasi at the University of Zurich, Switzerland, has pinpointed how and why this trend arose.

They found that the upper and lower incisors of ancient human adults were aligned, making it hard to produce labiodentals, which are formed by touching the lower lip to the upper teeth. Later, our jaws changed to an overbite structure, making it easier to produce such sounds.

The team showed that this change in bite correlated with the development of agriculture in the Neolithic period. Food became easier to chew at this point, which led to changes in human jaws and teeth: for instance, because it takes less pressure to chew softer, farmed foods, the jawbone doesn’t have to do as much work and so doesn’t grow to be so large.

Analyses of a language database also confirmed that there was a global change in the sound of world languages after the Neolithic era, with the use of “f” and “v” increasing dramatically in recent millennia. These sounds are still not found in the languages of many hunter-gatherer people today.

This research overturns the prevailing view that all human speech sounds were present when Homo sapiens evolved around 300,000 years ago. “The set of speech sounds we use has not necessarily remained stable since the emergence of our species, but rather the immense diversity of speech sounds that we find today is the product of a complex interplay of factors involving biological change and cultural evolution,” said team member Steven Moran, a linguist at the University of Zurich, at a briefing about this study.

This new approach to studying language evolution is a game changer, says Sean Roberts at the University of Bristol, UK. “For the first time, we can look at patterns in global data and spot new relationships between the way we speak and the way we live,” he says. “It’s an exciting time to be a linguist.”

Journal reference: Science, DOI: 10.1126/science.aav3218

https://www.newscientist.com/article/2196580-humans-couldnt-pronounce-f-and-v-sounds-before-farming-developed/