An explanation of the Standard Model of Physics

The Standard Model. What a dull name for the most accurate scientific theory known to human beings.

More than a quarter of the Nobel Prizes in physics of the last century are direct inputs to or direct results of the Standard Model. Yet its name suggests that if you can afford a few extra dollars a month you should buy the upgrade. As a theoretical physicist, I’d prefer The Absolutely Amazing Theory of Almost Everything. That’s what the Standard Model really is.

Many recall the excitement among scientists and media over the 2012 discovery of the Higgs boson. But that much-ballyhooed event didn’t come out of the blue – it capped a five-decade undefeated streak for the Standard Model. Every fundamental force but gravity is included in it. Every attempt to overturn it – to demonstrate in the laboratory that it must be substantially reworked – has failed, and there have been many such attempts over the past 50 years.

In short, the Standard Model answers this question: What is everything made of, and how does it hold together?

The smallest building blocks

You know, of course, that the world around us is made of molecules, and molecules are made of atoms. Chemist Dmitri Mendeleev figured that out in the 1860s and organized all atoms – that is, the elements – into the periodic table that you probably studied in middle school. But there are 118 different chemical elements. There’s antimony, arsenic, aluminum, selenium … and 114 more.


But these elements can be broken down further.

Physicists like things simple. We want to boil things down to their essence, a few basic building blocks. Over a hundred chemical elements is not simple. The ancients believed that everything is made of just five elements – earth, water, fire, air and aether. Five is much simpler than 118. It’s also wrong.

By 1932, scientists knew that all those atoms are made of just three particles – neutrons, protons and electrons. The neutrons and protons are bound together tightly into the nucleus. The electrons, thousands of times lighter, whirl around the nucleus at speeds approaching that of light. Physicists Planck, Bohr, Schroedinger, Heisenberg and friends had invented a new science – quantum mechanics – to explain this motion.

That would have been a satisfying place to stop. Just three particles. Three is even simpler than five. But held together how? The negatively charged electrons and positively charged protons are bound together by electromagnetism. But the protons are all huddled together in the nucleus and their positive charges should be pushing them powerfully apart. The neutral neutrons can’t help.

What binds these protons and neutrons together? “Divine intervention” a man on a Toronto street corner told me; he had a pamphlet, I could read all about it. But this scenario seemed like a lot of trouble even for a divine being – keeping tabs on every single one of the universe’s 10⁸⁰ protons and neutrons and bending them to its will.

Expanding the zoo of particles

Meanwhile, nature cruelly declined to keep its zoo of particles to just three. Really four, because we should count the photon, the particle of light that Einstein described. Four grew to five when Anderson measured electrons with positive charge – positrons – striking the Earth from outer space. At least Dirac had predicted these first anti-matter particles. Five became six when the pion, which Yukawa predicted would hold the nucleus together, was found.

Then came the muon – 200 times heavier than the electron, but otherwise a twin. “Who ordered that?” I.I. Rabi quipped. That sums it up. Number seven. Not only not simple, redundant.

By the 1960s there were hundreds of “fundamental” particles. In place of the well-organized periodic table, there were just long lists of baryons (heavy particles like protons and neutrons), mesons (like Yukawa’s pions) and leptons (light particles like the electron, and the elusive neutrinos) – with no organization and no guiding principles.

Into this breach sidled the Standard Model. It was not an overnight flash of brilliance. No Archimedes leapt out of a bathtub shouting “eureka.” Instead, there was a series of crucial insights by a few key individuals in the mid-1960s that transformed this quagmire into a simple theory, and then five decades of experimental verification and theoretical elaboration.

Quarks. They come in six varieties we call flavors. Like ice cream, except not as tasty. Instead of vanilla, chocolate and so on, we have up, down, strange, charm, bottom and top. In 1964, Gell-Mann and Zweig taught us the recipes: Mix and match any three quarks to get a baryon. Protons are two ups and a down quark bound together; neutrons are two downs and an up. Choose one quark and one antiquark to get a meson. A pion is an up or a down quark bound to an anti-up or an anti-down. All the material of our daily lives is made of just up and down quarks and anti-quarks and electrons.


The Standard Model of elementary particles provides an ingredients list for everything around us.

Simple. Well, simple-ish, because keeping those quarks bound is a feat. They are tied to one another so tightly that you never ever find a quark or anti-quark on its own. The theory of that binding, and the particles called gluons (chuckle) that are responsible, is called quantum chromodynamics. It’s a vital piece of the Standard Model, but mathematically difficult, even posing an unsolved problem of basic mathematics. We physicists do our best to calculate with it, but we’re still learning how.

The other aspect of the Standard Model is “A Model of Leptons.” That’s the name of the landmark 1967 paper by Steven Weinberg that pulled together quantum mechanics with the vital pieces of knowledge of how particles interact and organized the two into a single theory. It incorporated the familiar electromagnetism, joined it with what physicists called “the weak force” that causes certain radioactive decays, and explained that they were different aspects of the same force. It incorporated the Higgs mechanism for giving mass to fundamental particles.

Since then, the Standard Model has predicted the results of experiment after experiment, including the discovery of several varieties of quarks and of the W and Z bosons – heavy particles that are for weak interactions what the photon is for electromagnetism. The possibility that neutrinos aren’t massless was overlooked in the 1960s, but slipped easily into the Standard Model in the 1990s, a few decades late to the party.

Discovering the Higgs boson in 2012, long predicted by the Standard Model and long sought after, was a thrill but not a surprise. It was yet another crucial victory for the Standard Model over the dark forces that particle physicists have repeatedly warned loomed over the horizon. Concerned that the Standard Model didn’t adequately embody their expectations of simplicity, worried about its mathematical self-consistency, or looking ahead to the eventual necessity to bring the force of gravity into the fold, physicists have made numerous proposals for theories beyond the Standard Model. These bear exciting names like Grand Unified Theories, Supersymmetry, Technicolor, and String Theory.

Sadly, at least for their proponents, beyond-the-Standard-Model theories have not yet successfully predicted any new experimental phenomenon or any experimental discrepancy with the Standard Model.

After five decades, far from requiring an upgrade, the Standard Model is worthy of celebration as the Absolutely Amazing Theory of Almost Everything.

https://theconversation.com/the-standard-model-of-particle-physics-the-absolutely-amazing-theory-of-almost-everything-94700#?utm_source=ls-newsletter&utm_medium=email&utm_campaign=05272018-ls

Faulty Gene Leads to Alcohol-Induced Heart Failure

Scientists have revealed a new link between alcohol, heart health and our genes.

The researchers investigated faulty versions of a gene called titin, which are carried by one in 100 people – around 600,000 people in the UK.

Titin is crucial for maintaining the elasticity of the heart muscle, and faulty versions are linked to a type of heart failure called dilated cardiomyopathy.

Now new research suggests the faulty gene may interact with alcohol to accelerate heart failure in some carriers, even if they only drink moderate amounts of alcohol.

The research was carried out by scientists from Imperial College London, Royal Brompton Hospital, and MRC London Institute of Medical Sciences, and published this week in the latest edition of the Journal of the American College of Cardiology.

The study was supported by the Department of Health and Social Care and the Wellcome Trust through the Health Innovation Challenge Fund.

In the first part of the study, the team analysed 141 patients with a type of heart failure called alcoholic cardiomyopathy (ACM). This condition is triggered by drinking more than 70 units a week (roughly seven bottles of wine) for five years or more. In severe cases the condition can be fatal, or leave patients requiring a heart transplant.
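The “70 units a week (roughly seven bottles of wine)” figure follows from the standard UK definition, where one unit is 10 ml of pure alcohol, so units = volume (ml) × ABV (%) ÷ 1,000. A quick sketch, assuming a typical 13% ABV wine (the wine strength is an illustrative assumption, not a figure from the study):

```python
def alcohol_units(volume_ml, abv_percent):
    """UK alcohol units: one unit is 10 ml of pure ethanol."""
    return volume_ml * abv_percent / 1000

# A 750 ml bottle of 13% wine is just under 10 units, so roughly
# seven bottles a week approaches the 70-unit threshold for ACM risk.
per_bottle = alcohol_units(750, 13)
print(per_bottle, 7 * per_bottle)  # 9.75 68.25
```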

The team found that the faulty titin gene may also play a role in the condition. In the study 13.5 per cent of patients were found to carry the mutation – much higher than the proportion of people who carry them in the general population.

These results suggest this condition is not simply the result of alcohol poisoning, but arises from a genetic predisposition – and that other family members may be at risk too, explained Dr James Ware, study author from the National Heart and Lung Institute at Imperial.

“Our research strongly suggests alcohol and genetics are interacting – and genetic predisposition and alcohol consumption can act together to lead to heart failure. At the moment this condition is assumed to be simply due to too much alcohol. But this research suggests these patients should also be checked for a genetic cause – by asking about a family history and considering testing for a faulty titin gene, as well as other genes linked to heart failure,” he said.

He added that relatives of patients with ACM should receive assessment and heart scans – and in some cases have genetic tests – to see if they unknowingly carry the faulty gene.

In a second part of the study, the researchers investigated whether alcohol may play a role in another type of heart failure called dilated cardiomyopathy (DCM). This condition causes the heart muscle to become stretched and thin, and has a number of causes including viral infections and certain medications. The condition can also be genetic, and around 12 per cent of cases of DCM are thought to be linked to a faulty titin gene.

In the study the team asked 716 patients with dilated cardiomyopathy how much alcohol they consumed.

None of the patients consumed the high levels of alcohol needed to cause ACM. But the team found that in patients whose DCM was caused by the faulty titin gene, even moderately increased alcohol intake – defined as drinking above the weekly recommended limit of 14 units – affected the heart’s pumping power.

In patients whose DCM was caused by the faulty titin gene, excess alcohol was linked to a 30 per cent reduction in heart output compared with DCM patients who didn’t consume excess alcohol (and whose condition wasn’t caused by the faulty gene).

More research is now needed to investigate how alcohol may affect people who carry the faulty titin gene, but do not have heart problems, added Dr Paul Barton, study co-author from the National Heart and Lung Institute at Imperial:

“Alcohol and the heart have a complicated relationship. While moderate levels may have benefits for heart health, too much can cause serious cardiac problems. This research suggests that in people with titin-related heart failure, alcohol may worsen the condition.

“An important wider question is also raised by the study: do mutations in titin predispose people to heart failure when exposed to other things that stress the heart, such as cancer drugs or certain viral infections? This is something we are actively seeking to address.”

The research was supported by the Department of Health and Social Care and Wellcome Trust through the Health Innovation Challenge Fund, the Medical Research Council, the NIHR Cardiovascular Biomedical Research Unit at Royal Brompton & Harefield NHS Foundation Trust and the British Heart Foundation.

Reference: Ware, J. S., Amor-Salamanca, A., Tayal, U., Govind, R., Serrano, I., Salazar-Mendiguchía, J., … Garcia-Pavia, P. (2018). Genetic Etiology for Alcohol-Induced Cardiac Toxicity. Journal of the American College of Cardiology, 71(20), 2293–2302. https://doi.org/10.1016/j.jacc.2018.03.462

https://www.technologynetworks.com/genomics/news/faulty-gene-leads-to-alcohol-induced-heart-failure-304365?utm_campaign=Newsletter_TN_BreakingScienceNews&utm_source=hs_email&utm_medium=email&utm_content=63228690&_hsenc=p2ANqtz-9oqDIw3te1NPoj51s94kxnA1ClK8Oiecfela6I4WiITEbm_-SWdmw6pjMTwm2YP24gqSzRaBvUK1kkb2kZEJKPcL5JtQ&_hsmi=63228690

An explanation of blood sugar

By Alina Bradford

Blood sugar, or glucose, is the main sugar found in blood. The body gets glucose from the food we eat. This sugar is an important source of energy and provides nutrients to the body’s organs, muscles and nervous system. The absorption, storage and production of glucose is regulated constantly by complex processes involving the small intestine, liver and pancreas.

Glucose enters the bloodstream after a person has eaten carbohydrates. The endocrine system helps keep the bloodstream’s glucose levels in check using the pancreas. This organ produces the hormone insulin, releasing it after a person consumes protein or carbohydrates. Insulin moves excess glucose from the blood into the liver, where it is stored as glycogen.

The pancreas also produces a hormone called glucagon, which does the opposite of insulin, raising blood sugar levels when needed. The two hormones work together to keep glucose balanced.

When the body needs more sugar in the blood, the glucagon signals the liver to turn the glycogen back into glucose and release it into the bloodstream. This process is called glycogenolysis.

When there isn’t enough sugar to go around, the liver hoards the resource for the parts of the body that need it, including the brain, red blood cells and parts of the kidney. To fuel the rest of the body, the liver breaks down fat into molecules called ketones, a process called ketogenesis. The liver can also make sugar out of other things in the body, like amino acids, waste products and fat byproducts.

Glucose vs. dextrose
Dextrose is also a sugar. It’s chemically identical to glucose but is made from corn and rice, according to Healthline. It is often used as a sweetener in baking products and in processed foods. Dextrose also has medicinal purposes. It is dissolved in solutions that are given intravenously to increase a person’s blood sugar levels.

Normal blood sugar
For most people, 80 to 99 milligrams of glucose per deciliter (mg/dl) before a meal and 80 to 140 mg/dl after a meal is normal. The American Diabetes Association says that most nonpregnant adults with diabetes should aim for 80 to 130 mg/dl before a meal and less than 180 mg/dl at 1 to 2 hours after beginning the meal.

These variations in blood-sugar levels, both before and after meals, reflect the way that the body absorbs and stores glucose. After you eat, your body breaks down the carbohydrates in food into smaller parts, including glucose, which the small intestine can absorb.
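The pre-meal thresholds quoted above can be read as a simple lookup. A minimal, illustrative sketch (the function name and labels are invented for demonstration, and this is not diagnostic advice):

```python
def classify_fasting_glucose(mg_dl):
    """Rough banding of a before-meal reading, using the general-population
    ranges and the 70 mg/dl hypoglycemia treatment threshold quoted in the text."""
    if mg_dl <= 70:
        return "low (hypoglycemia treatment threshold)"
    elif mg_dl < 80:
        return "below the typical pre-meal range"
    elif mg_dl <= 99:
        return "normal"
    else:
        return "above the typical pre-meal range"

print(classify_fasting_glucose(85))   # normal
print(classify_fasting_glucose(120))  # above the typical pre-meal range
```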

Problems
Diabetes happens when the body lacks insulin or when the insulin it produces is not working effectively, according to Dr. Jennifer Loh, chief of the department of endocrinology for Kaiser Permanente in Hawaii. The disorder can be linked to many causes, including obesity, diet and family history, said Dr. Alyson Myers of Northwell Health in New York.

“To diagnose diabetes, we do an oral glucose-tolerance test with fasting,” Myers said.

Cells may develop a resistance to insulin, making it necessary for the pancreas to produce and release more insulin to lower your blood sugar levels by the required amount. Eventually, the body can fail to produce enough insulin to keep up with the sugar coming into the body.

It can take decades to diagnose high blood-sugar levels, though. This may happen because the pancreas is so good at its job that a doctor can continue to get normal blood-glucose readings while insulin resistance continues to increase, said Joy Stephenson-Laws, founder of Proactive Health Labs (pH Labs), a nonprofit that provides health care education and tools. She also wrote “Minerals – The Forgotten Nutrient: Your Secret Weapon for Getting and Staying Healthy” (Proactive Health Labs, 2016).

Health professionals can check blood sugar levels with an A1C test, which is a blood test for type 2 diabetes and prediabetes, according to the U.S. National Library of Medicine. This test measures your average blood glucose, or blood sugar, level over the previous three months.

Doctors may use the A1C alone or in combination with other diabetes tests to make a diagnosis. They also use the A1C to see how well you are managing your diabetes. This test is different from the blood sugar checks that people with diabetes do for themselves every day.
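Because the A1C reflects average glucose over roughly three months, it maps onto an “estimated average glucose” (eAG) via the widely used ADAG study formula, eAG (mg/dl) = 28.7 × A1C − 46.7. A minimal sketch:

```python
def estimated_average_glucose(a1c_percent):
    """Convert an A1C percentage to estimated average glucose in mg/dl,
    using the standard ADAG linear formula."""
    return 28.7 * a1c_percent - 46.7

# An A1C of 7% corresponds to an average glucose of about 154 mg/dl.
print(round(estimated_average_glucose(7.0), 1))
```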

In the condition called hypoglycemia, blood sugar falls too low. People with this disorder need treatment when blood sugar drops to 70 mg/dl or below. According to the Mayo Clinic, symptoms of hypoglycemia can be:

Tingling sensation around the mouth
Shakiness
Sweating
An irregular heart rhythm
Fatigue
Pale skin
Crying out during sleep
Anxiety
Hunger
Irritability


Keeping blood sugar in control

Stephenson-Laws said healthy individuals can keep their blood sugar at the appropriate levels using the following methods:

Maintaining a healthy weight

Talk with a competent health care professional about what an ideal weight for you should be before starting any kind of weight loss program.

Improving diet

Look for and select whole, unprocessed foods, like fruits and vegetables, instead of highly processed or prepared foods. Foods that have a lot of simple carbohydrates, like cookies and crackers, that your body can digest quickly tend to spike insulin levels and put additional stress on the pancreas. Also, avoid saturated fats and instead opt for unsaturated fats and high-fiber foods. Consider adding nuts, vegetables, herbs and spices to your diet.

Getting physical

A brisk walk for 30 minutes a day can greatly reduce blood sugar levels and increase insulin sensitivity.

Getting mineral levels checked

Research also shows that magnesium plays a vital role in helping insulin do its job. So, in addition to the other health benefits it provides, an adequate magnesium level can also reduce the chances of becoming insulin-resistant.

Get insulin levels checked

Many doctors simply test for blood sugar and perform an A1C test, which primarily detects prediabetes or type 2 diabetes. Make sure you also get insulin checks.

https://www.livescience.com/62673-what-is-blood-sugar.html#?utm_source=ls-newsletter&utm_medium=email&utm_campaign=05272018-ls

Rapamycin lotion reduces facial tumors caused by tuberous sclerosis


Researching tuberous sclerosis from the left are Adelaide Hebert, M.D.; John Slopis, M.D.; Mary Kay Koenig, M.D.; Joshua Samuels, M.D., M.P.H.; and Hope Northrup, M.D. PHOTO CREDIT Maricruz Kwon, UTHealth

Addressing a critical issue for people with a genetic disorder called tuberous sclerosis complex (TSC), doctors at The University of Texas Health Science Center at Houston (UTHealth) reported that a skin cream containing rapamycin significantly reduced the disfiguring facial tumors affecting more than 90 percent of people with the condition.

Findings of the multicenter, international study involving 179 people with tuberous sclerosis complex appear in the journal JAMA Dermatology.

“People with tuberous sclerosis complex want to look like everyone else,” said Mary Kay Koenig, M.D., the study’s lead author, co-director of the Tuberous Sclerosis Center of Excellence and holder of the Endowed Chair of Mitochondrial Medicine at McGovern Medical School at UTHealth. “And, they can with this treatment.”

Tuberous sclerosis complex affects about 50,000 people in the United States and is characterized by the uncontrolled growth of non-cancerous tumors throughout the body.

While benign tumors in the kidney, brain and other organs pose the greater health risk, the tumors on the face produce a greater impact on a patient’s daily life by making them look different from everyone else, Koenig said.

Koenig’s team tested two formulations of facial cream containing rapamycin and a third with no rapamycin. Patients applied the cream at bedtime for six months.

“Eighty percent of patients getting the study drug experienced a significant improvement compared to 25 percent of those getting the mixture with no rapamycin,” she said.

“Angiofibromas on the face can be disfiguring, they can bleed and they can negatively impact quality of life for individuals with TSC,” said Kari Luther Rosbeck, president and CEO of the Tuberous Sclerosis Alliance.

“Previous treatments, including laser surgery, have painful after effects. This pivotal study and publication are a huge step toward understanding the effectiveness of topical rapamycin as a treatment option. Further, it is funded by the TSC Research Program at the Department of Defense. We are so proud of this research,” Rosbeck said.

Rapamycin is typically given to patients undergoing an organ transplant. When administered by mouth, rapamycin suppresses the immune system to make sure the organ is not rejected.

Rapamycin and tuberous sclerosis complex are linked by a protein called mTOR. When mTOR malfunctions, tuberous sclerosis complex occurs. Rapamycin corrects this malfunction.

Rapamycin was initially used successfully to treat brain tumors caused by tuberous sclerosis complex, so researchers decided to try it on TSC-related facial tumors. Building on a 2010 pilot study on the use of rapamycin to treat TSC-related facial tumors, this study confirmed that a cream containing rapamycin shrinks these tumors.

As the drug’s toxicity is a concern when taken by mouth, researchers were careful to check for problems tied to its use on the skin. “It looks like the medication stays on the surface of the skin. We didn’t see any appreciable levels in the bloodstreams of those participating in the study,” Koenig said.

The Topical Rapamycin to Erase Angiofibromas in TSC – Multicenter Evaluation of Novel Therapy or TREATMENT trial involved 10 test sites including one in Australia.

Koenig said additional studies are needed to gauge the long-term impact of the drug, the optimal dosage and whether the facial cream should be combined with an oral treatment.

Koenig’s coauthors include Adelaide Hebert, M.D.; Joshua Samuels, M.D., M.P.H.; John Slopis, M.D.; Cynthia S. Bell; Joan Roberson, R.N.; Patti Tate; and Hope Northrup, M.D. All are from McGovern Medical School at UTHealth with the exception of Slopis, who is with The University of Texas MD Anderson Cancer Center. Hebert is also on the faculty of the MD Anderson Cancer Center and Northrup on the faculty of The University of Texas MD Anderson Cancer Center UTHealth Graduate School of Biomedical Sciences.

The study was supported in part by the United States Department of Defense grant DOD TSCRP CDMRP W81XWH-11-1-0240 and by the Tuberous Sclerosis Alliance of Australia.

“The face is our window to the world and when you look different from everyone else, it impacts your confidence and your ability to interact with others. This treatment will help those with TSC become more like everyone else,” Koenig said.

https://www.uth.edu/media/story.htm?id=37af25df-14a2-4c5e-b1ee-ac9585946aa0

New test can reliably predict the risk of preterm birth

By Laura Kurtzman

Scientists at UC San Francisco have developed a test to predict a woman’s risk of preterm birth when she is between 15 and 20 weeks pregnant, which may enable doctors to treat her early and thereby prevent severe complications later in the pregnancy.

Preterm birth is the leading cause of death for children under five in the United States, and rates are increasing both in the U.S. and around the world. It is often associated with inflammation and has many potential causes, including an acute infection in the mother, exposure to environmental toxins, or chronic conditions like hypertension and diabetes.

The new test screens for 25 biomarkers of inflammation and immune system activation, as well as for levels of proteins that are important for placenta development. Combined with information on other risk factors, such as the mother’s age and income, the test can predict whether a woman is at risk for preterm birth with more than 80 percent accuracy. In the highest risk pregnancies—preterm births occurring before 32 weeks or in women with preeclampsia, a potentially fatal pregnancy complication marked by high blood pressure in the mother—the test predicted nearly 90 percent of cases.

In the study, published Thursday, May 24, 2018, in the Journal of Perinatology, the researchers built a comprehensive test that would capture both spontaneous preterm births, which occur when the amniotic sac breaks or contractions begin spontaneously, and “indicated” preterm births, in which a physician induces labor or performs a cesarean section because the health of the mother or baby is in jeopardy. The researchers also wanted to be able to identify risk for preeclampsia, which is not included in current tests for preterm birth.

“There are multifactorial causes of preterm birth, and that’s why we felt like we needed to build a model that took into account multiple biological pathways,” said first author Laura Jelliffe-Pawlowski, PhD, director of Precision Health and Discovery with the UCSF California Preterm Birth Initiative and associate professor of epidemiology and biostatistics at UCSF. “The model works especially well for early preterm births and preeclampsia, which suggests that we’re effectively capturing severe types of preterm birth.”

The researchers developed the screen using blood samples taken from 400 women as part of routine prenatal care during the second trimester, comparing women who went on to give birth before 32 weeks, between 32 and 36 weeks, and after 38 weeks (full-term). The researchers first tested the samples for more than 60 different immune and growth factors, ultimately narrowing the test down to 25 factors that together could help predict risk for preterm birth. Adding other data, including whether the mother was over 34 years old or qualified as low income (indicated by Medicaid eligibility), improved the accuracy of the test by an additional 6 percent.
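The article describes combining 25 biomarker levels with demographic flags into a single risk score; a logistic-style model is one common way to do that. The sketch below is purely illustrative – the biomarker names, weights, and bias are invented for demonstration and are not the study’s actual model:

```python
import math

def preterm_risk(biomarkers, age_over_34, low_income, weights, bias):
    """Toy logistic risk score: a weighted sum of biomarker levels plus
    demographic flags, squashed into (0, 1) with the logistic function.
    All coefficients here are hypothetical, not from the study."""
    z = bias
    for name, level in biomarkers.items():
        z += weights.get(name, 0.0) * level
    z += weights.get("age_over_34", 0.0) * (1 if age_over_34 else 0)
    z += weights.get("low_income", 0.0) * (1 if low_income else 0)
    return 1 / (1 + math.exp(-z))

# Hypothetical example: two inflammation markers plus demographic flags.
w = {"IL-6": 0.8, "CRP": 0.5, "age_over_34": 0.3, "low_income": 0.4}
risk = preterm_risk({"IL-6": 1.2, "CRP": 0.7}, True, False, w, bias=-2.0)
print(round(risk, 3))
```

In a real model the weights would be fit to outcome data, but the structure – many biological signals plus demographic factors feeding one probability – mirrors the multi-pathway approach the researchers describe.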

Researchers said the test could help prevent some cases of preterm birth. Based on a woman’s probability of preterm birth derived from the test, she could discuss with her clinician how best to follow up and try to lower her risk. Some cases of preterm birth, including those caused by preeclampsia, can be prevented or delayed by taking aspirin, but treatment is most helpful if started before 16 weeks. Physicians could also evaluate high-risk women for underlying infections that may have gone undetected but could be treated. For others, close monitoring by their doctor could help flag early signs of labor, like cervical shortening, that can be staved off with progesterone treatment.

“We hope that this test could lead to more education and counseling of women about their level of risk so that they know about preterm birth and know what preeclampsia or early signs of labor look like,” said Jelliffe-Pawlowski. “If we can get women to the hospital as soon as possible, even if they’ve gone into labor, we can use medications to stave off contractions. This might give her some additional days before she delivers, which can be really important for the baby.”

A test for preterm birth is currently available, but it is expensive and only screens for spontaneous preterm birth, not for signs that could lead to indicated preterm births or for preeclampsia. Jelliffe-Pawlowski said that the new screen would likely be a fraction of the cost, making it more accessible to women who need it the most.

“One of the reasons we’re most excited about this test is that we see some potential for it addressing preterm birth in those most at risk, including low-income women, women of color, and women living in low-income countries,” she said. “We want to make sure that we’re developing something that has the potential to help all women, including those most in need.”

Other authors on the study were Larry Rand, Scott Oltman, and Mary Norton of UCSF; Bruce Bedell, Jeffrey Murray, and Kelli Ryckman of the University of Iowa; Rebecca Baer of UC San Diego; and Gary Shaw and David Stevenson of Stanford University.

https://www.ucsf.edu/news/2018/05/410456/risk-preterm-birth-reliably-predicted-new-test?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+ucsf_press_releases+%28UCSF+Press+Releases%29

Researchers 3D Print on Skin for Breakthrough Applications


Researchers at the University of Minnesota use a customized 3D printer to print electronics on a real hand. Image: McAlpine group, University of Minnesota

Soldiers are commonly thrust into situations where the danger is the unknown: Where is the enemy, how many are there, what weaponry is being used? The military already uses a mix of technology to help answer those questions quickly, and another may be on its way. Researchers at the University of Minnesota have developed a low-cost 3D printer that prints sensors and electronics directly on skin. The development could allow soldiers to directly print temporary, disposable sensors on their hands to detect such things as chemical or biological agents in the field.

The technology also could be used in medicine. The Minnesota researchers successfully used bioink with the device to print cells directly on the wounds of a mouse. Researchers believe it could eventually provide new methods of faster and more efficient treatment, or direct printing of grafts for skin wounds or conditions.

“The concept was to go beyond smart materials, to integrate them directly on to skin,” says Michael McAlpine, professor of mechanical engineering whose research group focuses on 3D printing functional materials and devices. “It is a biological merger with electronics. We wanted to push the limits of what a 3D printer can do.”

McAlpine calls it a very simple idea, “One of those ideas so simple, it turns out no one has done it.”

Others have used 3D printers to print electronics and biological cells. But printing on skin presented a few challenges. No matter how hard a person tries to remain still, there always will be some movement during the printing process. “If you put a hand under the printer, it is going to move,” he says.

To adjust for that, the printer the Minnesota team developed uses a machine vision algorithm written by Ph.D. student Zhijie Zhu to track the motion of the hand in real time while printing. Temporary markers are placed on the skin, which then is scanned. The printer tracks the hand using the markers and adjusts in real time to any movement. That allows the printed electronics to maintain a circuit shape. The printed device can be peeled off the skin when it is no longer needed.
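The real-time adjustment described above amounts to re-registering the print path to the tracked markers on each camera frame. A stripped-down sketch of the idea (pure translation only; the actual system tracks full hand motion and is far more sophisticated):

```python
def centroid(points):
    """Average position of a set of 2D marker coordinates."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def adjust_toolpath(toolpath, markers_ref, markers_now):
    """Shift every toolpath point by the displacement of the tracked
    skin markers since the reference frame was captured."""
    cx0, cy0 = centroid(markers_ref)
    cx1, cy1 = centroid(markers_now)
    dx, dy = cx1 - cx0, cy1 - cy0
    return [(x + dx, y + dy) for x, y in toolpath]

# Hand drifts 2 mm right and 1 mm up between frames:
ref = [(0, 0), (10, 0), (0, 10), (10, 10)]
now = [(2, 1), (12, 1), (2, 11), (12, 11)]
print(adjust_toolpath([(5, 5)], ref, now))  # [(7.0, 6.0)]
```

Re-running this correction for every frame keeps the deposited traces aligned with the skin, so the printed circuit holds its shape even as the hand moves.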

The team also needed to develop a special ink that was not only conductive but could also print and cure at room temperature. Standard 3D printing inks cure at temperatures as high as 212 °F (100 °C) and would burn skin.

In a paper recently published in Advanced Materials, the team identified three criteria for conductive inks: The viscosity of the ink should be tunable while maintaining self-supporting structures; the ink solvent should evaporate quickly so the device becomes functional on the same timescale as the printing process; and the printed electrodes should become highly conductive under ambient conditions.

The solution was an ink that uses silver flakes to provide conductivity, rather than the particles more commonly used in other applications. Silver fibers were found to be too large and to cure only at high temperatures. The flakes align under shear forces during printing, and the addition of ethanol to the mix speeds evaporation, allowing the ink to cure quickly at room temperature.

“Printing electronics directly on skin would have been a breakthrough in itself, but when you add all of these other components, this is big,” McAlpine says.

The printer is portable, lightweight and costs less than $400. It consists of a delta robot, monitor cameras for long-distance observation of printing states and tracking cameras mounted for precise localization of the surface. The team added a syringe-type nozzle to squeeze and deliver the ink.

Furthering the printer’s versatility, McAlpine’s team worked with staff from the university’s medical school and hospital to print skin cells directly on a skin wound of a mouse. The mouse was anesthetized, but still moved slightly during the procedure, he says. The initial success makes the team optimistic that it could open up a new method of treating skin diseases.

“Think about what the applications could be,” McAlpine says. “A soldier in the field could take the printer out of a pack and print a solar panel. On the cellular side, you could bring a printer to the site of an accident and print cells directly on wounds, speeding the treatment. Eventually, you may be able to print biomedical devices within the body.”

In its paper, the team suggests that devices can be “autonomously fabricated without the need for microfabrication facilities in freeform geometries that are actively adaptive to target surfaces in real time, driven by advances in multifunctional 3D printing technologies.”

Besides the ability to print directly on skin, McAlpine says the work may offer advantages over other skin electronic devices. For example, soft, thin, stretchable patches that stick to the skin have been fitted with off-the-shelf chip-based electronics for monitoring a patient’s health. They stick to skin like a temporary tattoo and send updates wirelessly to a computer.

“The advantage of our approach is that you don’t have to start with electronic wafers made in a clean room,” McAlpine says. “This is a completely new paradigm for printing electronics using 3D printing.”

http://www.asme.org/engineering-topics/articles/bioengineering/researchers-3d-print-skin-breakthrough

Bursts of brain activity linked to memory reactivation

By Hilary Hurd Anyaso

Leading theories propose that sleep presents an opportune time for important, new memories to become stabilized. And it’s long been known which brain waves are produced during sleep. But in a new study, researchers set out to better understand the brain mechanisms that secure memory storage.

The team from Northwestern and Princeton universities set out to find more direct and precisely timed evidence for the involvement of one particular sleep wave — known as the “sleep spindle.”

In the study, sleep spindles, described as bursts of brain activity typically lasting around one second, were linked to memory reactivation. The paper, “Sleep spindle refractoriness segregates periods of memory reactivation,” was published today in the journal Current Biology.

“The most novel aspect of our study is that we found these spindles occur rhythmically — about every three to six seconds — and this rhythm is related to memory,” said James W. Antony, first author of the study and a postdoctoral fellow in Princeton’s Computational Memory Lab.

Three experiments explored how recent memories are reactivated during sleep. While volunteers took an afternoon nap, sound cues were surreptitiously played. Each was linked to a specific memory. The researchers’ final experiment showed that if cues were presented at opportune times such that spindles could follow them, the linked memories were more likely to be retained. If they were presented when a spindle was unlikely to follow, the linked memories were more likely to be forgotten.

“One particularly remarkable aspect of the study was that we were able to monitor spindles moment by moment while people slept,” said Ken A. Paller, senior author of the study and professor of psychology at Northwestern’s Weinberg College of Arts and Sciences. “Therefore, we could know when the brain was most ready for us to prompt memory reactivation.”

If the researchers reminded people of a recently learned fact, a spindle would likely be evident in the cerebral cortex, and memory for that information would be improved, added Paller, also director of Northwestern’s Cognitive Neuroscience Program.
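The moment-by-moment monitoring Paller describes can be illustrated with a toy burst detector: flag any stretch of signal whose short-time power rises well above baseline. The sampling rate, window length and threshold factor below are illustrative assumptions, and the real study analyzed EEG filtered to the spindle (sigma) band rather than the synthetic trace used here.

```python
import numpy as np

def detect_bursts(signal, fs, window_s=0.5, k=3.0):
    """Flag samples whose short-time RMS power exceeds k times the median RMS.

    Toy moment-by-moment burst detection in the spirit of the spindle
    monitoring described above; window_s and k are illustrative choices.
    """
    win = int(window_s * fs)
    # Rolling mean of squared signal -> short-time power estimate.
    mean_sq = np.convolve(np.asarray(signal, float) ** 2,
                          np.ones(win) / win, mode="same")
    rms = np.sqrt(mean_sq)
    return rms > k * np.median(rms)

# Synthetic trace: low-amplitude noise plus a one-second, 13 Hz burst
# (a spindle-like frequency) starting at t = 4 s.
fs = 100
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
sig = 0.1 * rng.standard_normal(t.size)
burst = (t >= 4) & (t < 5)
sig[burst] += np.sin(2 * np.pi * 13 * t[burst])

mask = detect_bursts(sig, fs)
print(mask[int(4.5 * fs)], mask[int(2.0 * fs)])  # inside burst vs. quiet baseline
```

A closed-loop experiment like the one in the study would run such a detector in real time and time its sound cues relative to the detected spindles and the three-to-six-second refractory rhythm between them.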

“In memory research, we know it’s important to segregate experiences while you’re awake so that everything doesn’t just blend together,” said Antony, who worked in Paller’s lab at Northwestern as a doctoral student. “If that happens, you may have difficulty retrieving information because so many things will come to mind at once. We believe the spindle rhythmicity shown here might play a role in segregating successive memory reactivations from each other, preventing overlap that might cause later interference between memories.”

Ultimately, the researchers’ goal is to understand how sleep affects memory under natural conditions and how aging or disease can impact these functions.

“With that goal in mind, we’ve helped elucidate the importance of sleep spindles more generally,” Antony said.

Paller said they are on the trail of the physiology of memory reactivation.

“Future work will be needed to see how spindles fit together with other aspects of the physiology of memory and will involve other types of memory testing and other species,” Paller said.

In addition to Antony and Paller, co-authors are Luis Piloto, Margaret Wang, Paula Pacheco and Kenneth A. Norman, all of Princeton.

https://news.northwestern.edu/stories/2018/may/bursts-of-brain-activity-linked-to-memory-reactivation/

How brown fat keeps us warm


Adipose Connective Tissue Stores Fat in Our Body. Credit: Berkshire Community College Bioscience Image Library

A new technique to study fat stores in the body could aid efforts to find treatments to tackle obesity.

The approach focuses on energy-burning tissues found deep inside the body – called brown fat – that help to keep us warm when temperatures drop.

Experts are aiming to find out whether this calorie-burning power can be harnessed to stop weight gain, but little is known about how the process works.

Previous studies have mainly relied on a medical imaging technique called PET/CT to watch brown fat in action deep inside the body. But the method is unable to directly measure the chemical factors in the tissue.

Scientists at the University of Edinburgh developed a technique called microdialysis to measure how brown fat generates heat in people.

The approach involves inserting a small tube into an area of brown fat in the body and flushing it with fluid to collect a snapshot of the tissue’s chemical make-up.

The team tested the technique in six healthy volunteers, using PET/CT to guide the tube to the right location.

They discovered that in cold conditions, brown fat uses its own energy stores and other substances to generate heat.

Brown fat was active under warm conditions too, when the body does not need to generate its own heat, an outcome that had not been seen before.

Researchers hope the technique will help them to analyse the specific chemicals involved, so that they can better understand how brown fat works.

Most of the fat in our body is white fat, which is found under the skin and surrounding internal organs. It stores excess energy when we consume more calories than we burn.

Brown fat is mainly found in babies and helps them to stay warm. Levels can decrease with age but adults can still have substantial amounts of it, mainly in the neck and upper back region. People who are lean tend to have more brown fat.

The study, published in Cell Metabolism, was funded by the Medical Research Council and Wellcome.

Lead researcher Dr Roland Stimson, of the British Heart Foundation Centre for Cardiovascular Science at the University of Edinburgh, said: “Understanding how brown fat is activated could reveal potential targets for therapies that boost its energy-burning power, which could help with weight loss.”

This article has been republished from materials provided by the University of Edinburgh. Note: material may have been edited for length and content. For further information, please contact the cited source.

Reference: Weir, G., Ramage, L. E., Akyol, M., Rhodes, J. K., Kyle, C. J., Fletcher, A. M., … Stimson, R. H. (2018). Substantial Metabolic Activity of Human Brown Adipose Tissue during Warm Conditions and Cold-Induced Lipolysis of Local Triglycerides. Cell Metabolism, 0(0). https://doi.org/10.1016/j.cmet.2018.04.020

https://www.technologynetworks.com/proteomics/news/how-brown-fat-keeps-us-warm-304351

UMass Amherst Chemists Develop New Blood Test to Detect Liver Damage in Under an Hour

Chemist Vincent Rotello at the University of Massachusetts Amherst, with colleagues at University College London (UCL), U.K., announce today that they have developed a “quick and robust” blood test that can detect liver damage before symptoms appear, offering what they hope is a significant advance in early detection of liver disease. Details appear in Advanced Materials.

Their new method can detect liver fibrosis, the first stage of liver scarring that can lead to fatal disease if left unchecked, from a blood sample in 30-45 minutes, the authors note. They point out that liver disease is a leading cause of premature mortality in the United States and U.K., and is rising. It often goes unnoticed until late stages of the disease when the damage is irreversible.

For this work, Rotello and his team at UMass Amherst’s Institute of Applied Life Sciences (IALS) designed a sensor that uses polymers coated with fluorescent dyes that bind to blood proteins based on their chemical properties. The dyes change in brightness and color, yielding a distinctive signature, or blood-protein pattern.

He says, “This platform provides a simple and inexpensive way of diagnosing disease with potential for both personal health monitoring and applications in developing parts of the world.” Rotello and colleagues hope the new test can be used routinely in medical offices, clinics and hospitals to screen people with elevated liver disease risk so they can be treated “before it’s too late.”

The UCL team tested the sensor by comparing results from small blood samples, equivalent to finger-prick checks, from 65 people in three balanced groups: healthy patients, those with early-stage fibrosis and those with late-stage fibrosis. Fibrosis status was determined using the Enhanced Liver Fibrosis (ELF) test, the existing benchmark for liver fibrosis detection, which requires samples to be sent away to a lab. They found that the sensor identified different protein-level patterns in the blood of people in the three groups.

Co-author William Peveler, a chemist now at the University of Glasgow, adds, “By comparing the different samples, the sensor array identified a ‘fingerprint’ of liver damage. It’s the first time this approach has been validated in something as complex as blood, to detect something as important as liver disease.”
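Peveler’s “fingerprint” idea can be sketched as a simple pattern classifier: represent each blood sample as a vector of dye responses and assign a new sample to the closest group average. The numbers and the nearest-centroid method below are purely illustrative assumptions, not the statistics used in the paper.

```python
import numpy as np

def nearest_centroid(train_X, train_y, sample):
    """Classify a dye-response vector by its closest class centroid."""
    train_X = np.asarray(train_X, dtype=float)
    train_y = np.asarray(train_y)
    # Average signature ("fingerprint") per group.
    centroids = {c: train_X[train_y == c].mean(axis=0) for c in np.unique(train_y)}
    # Assign the sample to whichever fingerprint it most resembles.
    return min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))

# Hypothetical three-dye signatures for two groups (invented values):
X = [[1.0, 0.2, 0.1], [0.9, 0.3, 0.2],   # healthy
     [0.2, 0.9, 0.8], [0.3, 1.0, 0.7]]   # fibrotic
y = ["healthy", "healthy", "fibrotic", "fibrotic"]
print(nearest_centroid(X, y, np.array([0.25, 0.95, 0.75])))  # → fibrotic
```

The appeal of signature-based sensing, as Rotello notes later, is that the same machinery works for any condition that shifts the protein pattern: only the training groups change, not the sensor.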

The investigators report that the test distinguished fibrotic samples from healthy blood 80 percent of the time, reaching the standard threshold of clinical relevance on a widely used metric and performing comparably to existing methods of diagnosing and monitoring fibrosis. The test distinguished between mild-to-moderate fibrosis and severe fibrosis 60 percent of the time. The researchers plan further tests with larger samples to refine the method’s effectiveness.

Peter Reinhart, director of UMass Amherst’s IALS, says, “These exciting findings epitomize the mission of IALS to translate excellent basic science into diagnostics, therapeutic candidates and personalized health monitoring devices to improve human health and well-being.”

Peveler adds, “This may open the door to a cost-effective regular screening program thanks to its simplicity, low cost and robustness. We’re addressing a vital need for point-of-care diagnostics and monitoring, which could help millions of people access the care they need to prevent fatal liver disease.”

Rotello explains that the sensing strategy uses a “signature-based” approach that is highly versatile and should be useful in other areas. “A key feature of this sensing strategy is that it is not disease-specific, so it is applicable to a wide spectrum of conditions, which opens up the possibility of diagnostic systems that can track health status, providing both disease detection and monitoring wellness.”

In addition to UMass Amherst, UCL and the University of Glasgow, the U.K.-based research and development firm iQur Ltd. took part in the study. The work was supported by the U.K. Royal Society, the U.K. Engineering and Physical Sciences Research Council, the U.S. National Institutes of Health and the U.K. National Institute for Health Research UCLH Biomedical Research Centre.

http://www.umass.edu/newsoffice/article/umass-amherst-chemists-international-team

New research shows that heavy marijuana users may hold on more strongly to negative feelings

By Rachael Rettner

Many people tend to look back on the past with rose-colored glasses, remembering the good times and the good feelings while forgetting the bad.

But a new study suggests that heavy marijuana users may have some trouble letting go of negative emotions tied to memories — a phenomenon that’s also seen in people with depression. Earlier research has also linked marijuana use with depression.

Although the new results are very preliminary, the findings, presented here on Friday (May 25) at the annual meeting of the Association for Psychological Science, may offer clues about the link between marijuana use and depression.

Rose-colored memories

The study explored a psychological phenomenon called “fading affect bias,” in which people tend to hold on to positive feelings tied to their memories more than they hold on to negative feelings. In other words, negative feelings related to our memories fade faster than positive ones.

Psychologists have hypothesized that this phenomenon, which is generally seen in people without mental health conditions, may serve as a sort of “psychological immune system,” said study lead author Daniel Pillersdorf, a graduate student in psychology at the University of Windsor in Ontario. This may be “so that we think more pleasantly in general, and don’t have that cognitive burden of holding on to negative emotions associated with memories,” Pillersdorf said.

Some previous studies have suggested that this fading affect bias may be different for people who use drugs, but no studies have looked at whether marijuana use could affect this phenomenon.

In the new study, the researchers analyzed information from 46 heavy marijuana users — most of whom used the drug at least four times a week — and 51 people who didn’t use marijuana. Participants were asked to recall, and provide written descriptions of, three pleasant memories and three unpleasant memories from the past year. The participants were then asked to rate the intensity of emotion tied to those memories, on a scale of negative 10, meaning extremely unpleasant, to positive 10, or extremely pleasant. They rated their emotions both at the time the memory was made, and at the current time. (Marijuana users were not under the influence at the time the researchers asked them the questions.)
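The fading measure implied by this procedure is straightforward arithmetic: for each memory, compare how much emotional intensity the rating has lost between the time of the event and now. A minimal sketch with hypothetical ratings on the study's -10 to +10 scale (the numbers are invented for illustration, not taken from the study):

```python
def fading(initial, current):
    """How much emotional intensity a memory has lost (positive = faded)."""
    return abs(initial) - abs(current)

# Each pair is (rating at the time the memory was made, rating now).
# Fading affect bias: negative feelings typically fade more than positive ones.
pleasant = [(8, 7), (6, 5)]
unpleasant = [(-8, -4), (-6, -3)]

pos_fade = sum(fading(a, b) for a, b in pleasant) / len(pleasant)
neg_fade = sum(fading(a, b) for a, b in unpleasant) / len(unpleasant)
print(pos_fade, neg_fade)  # → 1.0 3.5
```

In these illustrative numbers the unpleasant memories fade much more than the pleasant ones, which is the normal bias; the study's finding was that in heavy marijuana users this gap was much smaller.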

The researchers found that both marijuana users and non-users showed fading affect bias, but for marijuana users, the fading was a lot less.

“They were hanging on to that unpleasant affect over time, much more” than non-users, Pillersdorf told Live Science. “They were less able … to shed that unpleasantness associated with their memories.”

The study also found that marijuana users tended to recall life events in more general terms than specific ones. For example, when asked about a happy event in the past year, marijuana users were more likely to respond with general or broad answers such as “I went on vacation,” rather than recalling a specific event or day, such as “I attended my college graduation.” This phenomenon is known as over-general autobiographical memory, and it’s also linked with depression, Pillersdorf said.

It’s important to note that the new study found only an association and cannot determine why marijuana users show less fading affect bias, and more over-general memory, than non-users.

Link with depression?

Even so, the new findings agree with previous research that has found a link between heavy marijuana use and depression. However, researchers don’t know why marijuana and depression are linked — it could be that marijuana use plays a role in developing depression, or that people who are already depressed are more likely to use the drug. [7 Ways Marijuana May Affect the Brain]

Based on the new findings, one hypothesis is that the decreased “fading” of negative memories in marijuana users could be contributing to the development or continuing of depression, Pillersdorf said. “It may be that chronic or frequent cannabis use is putting [a person] more at risk for the development or continuing of depression,” he said. However, Pillersdorf stressed that this is just a hypothesis that would need to be investigated with future research.

To further investigate the link, researchers will need to study marijuana users and non-users over long periods of time. For example, researchers could start with people in their late teens or early 20s, who don’t have depression, and see if those who use marijuana frequently are more likely to eventually develop depression than non-users.

Additional studies could also investigate whether other substances have an effect on fading affect bias, Pillersdorf said.

The study has not yet been published in a peer-reviewed journal.

https://www.livescience.com/62679-marijuana-negative-memories.html