Posts Tagged ‘death’

By Alice Park

While death is inevitable, knowing when it will come is not, and scientists have been trying to develop a test that could reliably and easily predict how long a person will live — or, more technically, how healthy a person is and therefore how vulnerable they might be to major mortality risk factors. Blood tests are the most likely avenue to such a test, since blood samples are easy to obtain and labs equipped to handle them are common.

The latest effort is described in a new paper published in Nature Communications, by a team led by Joris Deelen, a postdoctoral researcher at the Max Planck Institute for the Biology of Aging, and P. Eline Slagboom, head of molecular epidemiology at Leiden University Medical Center. The researchers report that, in a group of more than 44,000 patients, their blood test was around 80% accurate in predicting mortality risk within five to 10 years.

The patients, who ranged in age from 18 to 109 years, provided blood samples and had their health events tracked for up to 16 years. The researchers analyzed a group of 226 so-called metabolites, by-products of metabolic processes that various cells and tissues in the body release into the bloodstream for circulation and removal. From this collection of markers, the team narrowed the list down to 14 that, together with the person’s sex, could provide a reasonably good picture of each person’s health risk and, by association, their risk of dying in the next five to 10 years. They accomplished this by comparing those who died during the study to those who did not and isolating which agents in their blood differed to a statistically significant degree. The link between the final 14 factors and mortality remained strong even after the scientists accounted for potential confounding factors that also affect survival, such as age, sex, and cause of death.
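The paper’s own code is not reproduced here, but the kind of analysis the article describes, combining the 14 selected metabolite levels with sex into a single risk score and checking how well that score separates people who died during follow-up from those who did not, can be sketched in a few lines. The sketch below is illustrative only: the data are simulated, the column names are invented, and a plain logistic regression stands in for the survival models the researchers actually used.

```python
# Hypothetical sketch of a metabolite-based mortality-risk score.
# Column names and data are invented; the published study used
# survival models on real cohorts, not this toy setup.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
metabolites = [f"metabolite_{i}" for i in range(1, 15)]  # the 14 selected markers

# Simulated cohort: standardized metabolite levels, sex, and a death flag.
df = pd.DataFrame(rng.normal(size=(n, 14)), columns=metabolites)
df["sex_male"] = rng.integers(0, 2, size=n)
risk = 0.6 * df["metabolite_1"] - 0.4 * df["metabolite_2"] + 0.3 * df["sex_male"]
df["died_within_10y"] = (risk + rng.normal(size=n) > 1.0).astype(int)

X = df[metabolites + ["sex_male"]]
y = df["died_within_10y"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]  # per-person risk score

# Discrimination is usually summarized as the area under the ROC curve;
# a value near 0.8 is the kind of figure behind the article's
# "around 80% accurate" claim.
print("AUC:", roc_auc_score(y_test, scores))
```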

“We want to tackle the vulnerability of people’s health that is hidden and that doctors cannot see from the outside,” says Slagboom. “I am still surprised by the fact that in a group of people you can take one blood sample at one point of time in their life, and that would say anything meaningful about their five to 10 year mortality risk.”

Both Deelen and Slagboom stress that the test is not yet ready for doctors to use in the clinic with their patients, but that it does establish a foundation for one down the road. An eventual test could be most useful at first in assessing older patients and guiding treatment decisions, since the 14 metabolites represent a range of processes, including the breakdown of fat and glucose, inflammation, and fluid balance in the body, that affect many chronic ailments as well as a person’s ability to recover from illness or injury.

Researchers at Leiden University are currently studying the test to see if it can help doctors predict which patients with hip fractures are more likely to develop complications during their recovery after surgery. Another study is looking at whether the test can predict which people with kidney failure are more likely to develop dementia or side effects like delirium as a result of their treatment; this information could help doctors to better adjust dosage and treatment decisions.

The researchers are also hoping to work with large databanks around the world to further validate the findings. “We see this as a foundation,” says Slagboom. “We do not see this test as an endpoint.”

https://time.com/5656767/blood-test-longevity/


Peter Mayhew, the towering actor who donned a huge, furry costume to give life to the rugged-and-beloved character of Chewbacca in the original “Star Wars” trilogy and two other films, has died, his family said Thursday.

Mayhew died at his home in north Texas on Tuesday, according to a family statement. He was 74. No cause was given.

As Chewbacca, known to his friends as Chewie, the 7-foot-3 Mayhew was a fierce warrior with a soft heart, loyal sidekick to Harrison Ford’s Han Solo, and co-pilot of the Millennium Falcon.

Mayhew went on to appear as the Wookiee in the 2005 prequel “Revenge of the Sith” and shared the part in 2015’s “The Force Awakens” with actor Joonas Suotamo, who took over the role in subsequent films.

“Peter Mayhew was a kind and gentle man, possessed of great dignity and noble character,” Ford said in a statement Thursday. “These aspects of his own personality, plus his wit and grace, he brought to Chewbacca. We were partners in film and friends in life for over 30 years and I loved him… My thoughts are with his dear wife Angie and his children. Rest easy, my dear friend.”

Mayhew defined the incredibly well-known Wookiee and remained a world-famous actor for most of his life without ever speaking a word on screen or even making a sound — Chewbacca’s famous roar was the creation of sound designers.

“He put his heart and soul into the role of Chewbacca and it showed in every frame of the films,” the family statement said. “But, to him, the ‘Star Wars’ family meant so much more to him than a role in a film.”

Mark Hamill, who played Luke Skywalker alongside Mayhew, wrote on Twitter that he was “the gentlest of giants — A big man with an even bigger heart who never failed to make me smile & a loyal friend who I loved dearly. I’m grateful for the memories we shared & I’m a better man for just having known him.”

Born and raised in England, Mayhew had appeared in just one film and was working as a hospital orderly in London when George Lucas, who shot the first film in England, found him and cast him in 1977’s “Star Wars.”

Lucas chose quickly when he saw Mayhew, who liked to say all he had to do to land the role was stand up.

“Peter was a wonderful man,” Lucas said in a statement Thursday. “He was the closest any human being could be to a Wookiee: big heart, gentle nature … and I learned to always let him win. He was a good friend and I’m saddened by his passing.”

From then on, “Star Wars” would become Mayhew’s life. He made constant appearances in the costume in commercials, on TV specials and at public events. The long, frizzy hair he kept for most of his adult life, along with his stature, convinced those who saw him in real life that he was Chewbacca.

His height, the result of a genetic disorder known as Marfan syndrome, was the source of constant health complications late in his life. He had respiratory problems, his speech grew limited and he often had to use scooters and wheelchairs instead of walking.

His family said his fighting through that to play the role one last time in “The Force Awakens” was a triumph.

Even after he retired, Mayhew served as an adviser to his successor Suotamo, a former Finnish basketball player who told The Associated Press last year that Mayhew put him through “Wookiee boot camp” before he played the role in “Solo.”

Mayhew spent much of the last decades of his life in the United States, and he became a U.S. citizen in 2005.

The 200-plus-year-old character, whose suit has been compared to an ape, a bear and Bigfoot, and who wore a bandolier of ammunition for his laser rifle, was considered by many to be one of the hokier elements in the original “Star Wars,” something out of a more low-budget sci-fi offering.

The films themselves seemed to acknowledge this.

“Will somebody get this big walking carpet out of my way?!” Carrie Fisher, as Princess Leia, says in the original “Star Wars.” It was one of the big laugh lines of the film, as was Ford calling Chewie a “fuzzball” in “The Empire Strikes Back.”

But Chewbacca would become as enduring an element of the “Star Wars” galaxy as any other character, his roar — which according to the Atlantic magazine was made up of field recordings of bears, lions, badgers and other animals — as famous as any sound in the universe.

“Chewbacca was an important part of the success of the films we made together,” Ford said in his statement.

Mayhew is the third major member of the original cast to die in recent years. Fisher and R2-D2 actor Kenny Baker died in 2016.

Mayhew’s family said he was active with various nonprofit groups and established the Peter Mayhew Foundation, which is devoted to alleviating disease, pain, suffering and the financial toll from traumatic events. The family asked that in lieu of flowers, friends and fans donate to the foundation.

Mayhew is survived by his wife, Angie, and three children. A private service will be held June 29, followed by a public memorial in early December at a Los Angeles “Star Wars” convention.

https://www.post-gazette.com/news/obituaries/2019/05/02/Peter-Mayhew-original-Chewbacca-Star-Wars-died-74/stories/201905020200

Sydney Brenner was one of the first to view James Watson and Francis Crick’s double helix model of DNA in April 1953. The 26-year-old biologist from South Africa was then a graduate student at the University of Oxford, UK. So enthralled was he by the insights from the structure that he determined on the spot to devote his life to understanding genes.

Iconoclastic and provocative, he became one of the leading biologists of the twentieth century. Brenner shared in the 2002 Nobel Prize in Physiology or Medicine for deciphering the genetics of programmed cell death and animal development, including how the nervous system forms. He was at the forefront of the 1975 Asilomar meeting to discuss the appropriate use of emerging abilities to alter DNA, was a key proponent of the Human Genome Project, and much more. He died on 5 April.

Brenner was born in 1927 in Germiston, South Africa, to poor immigrant parents. Bored by school, he preferred to read books borrowed (sometimes permanently) from the public library, or to dabble with a self-assembled chemistry set. His extraordinary intellect — he was reading newspapers by the age of four — did not go unnoticed. His teachers secured an award from the town council to send him to medical school.

Brenner entered the University of the Witwatersrand in Johannesburg at the age of 15 (alongside Aaron Klug, another science-giant-in-training). Here, certain faculty members, notably the anatomist Raymond Dart, and fellow research-oriented medical students enriched his interest in science. On finishing his six-year course, his youth legally precluded him from practising medicine, so he devoted two years to learning cell biology at the bench. His passion for research was such that he rarely set foot on the wards — and he initially failed his final examination in internal medicine.


Sydney Brenner (right) with John Sulston; the two shared the Nobel Prize in Physiology or Medicine with Robert Horvitz in 2002. Credit: Steve Russell/Toronto Star/Getty

In 1952 Brenner won a scholarship to the Department of Physical Chemistry at Oxford. His adviser, Cyril Hinshelwood, wanted to pursue the idea that the environment altered observable characteristics of bacteria. Brenner tried to convince him of the role of genetic mutation. Two years later, with doctorate in hand, Brenner spent the summer of 1954 in the United States visiting labs, including Cold Spring Harbor in New York state. Here he caught up with Watson and Crick again.

Impressed, Crick recruited the young South African to the University of Cambridge, UK, in 1956. In the early 1960s, using just bacteria and bacteriophages, Crick and Brenner deciphered many of the essentials of gene function in a breathtaking series of studies.

Brenner had proved theoretically in the mid-1950s that the genetic code is ‘non-overlapping’ — each nucleotide is part of only one triplet (three nucleotides specify each amino acid in a protein) and successive ‘triplet codons’ are read in order. In 1961, Brenner and Crick confirmed this in the lab. The same year, Brenner, with François Jacob and Matthew Meselson, published their demonstration of the existence of messenger RNA. Over the next two years, often with Crick, Brenner showed how the synthesis of proteins encoded by DNA sequences is terminated.
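For readers unfamiliar with the term, “non-overlapping” simply means that each base in a coding sequence belongs to exactly one codon, so the sequence is read in successive three-base steps rather than through every possible three-base window. The toy sketch below makes the distinction concrete; the sequence is invented and the tiny codon table covers only the four codons it uses, not the full genetic code.

```python
# Minimal illustration of non-overlapping triplet reading of a coding sequence.
# The sequence and the tiny codon table are illustrative, not the full genetic code.
CODON_TABLE = {"ATG": "Met", "GCT": "Ala", "TTC": "Phe", "TAA": "STOP"}

def read_codons(seq: str) -> list[str]:
    """Read successive, non-overlapping triplets: each base is used exactly once."""
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

def overlapping_windows(seq: str) -> list[str]:
    """What an overlapping code would imply: every 3-base window, sharing bases."""
    return [seq[i:i + 3] for i in range(len(seq) - 2)]

seq = "ATGGCTTTCTAA"
print(read_codons(seq))          # ['ATG', 'GCT', 'TTC', 'TAA']
print([CODON_TABLE.get(c, "?") for c in read_codons(seq)])  # Met-Ala-Phe-STOP
print(overlapping_windows(seq))  # 10 overlapping windows, the reading Brenner's argument ruled out
```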

This intellectual partnership dissolved when Brenner began to focus on whole organisms in the mid-1960s. He finally alighted on Caenorhabditis elegans. Studies of this tiny worm in Brenner’s arm of the legendary Laboratory of Molecular Biology (LMB) in Cambridge led to the Nobel for Brenner, Robert Horvitz and John Sulston.


Maxine Singer, Norton Zinder, Sydney Brenner and Paul Berg (left to right) at the 1975 meeting on recombinant DNA technology in Asilomar, California. Credit: NAS

And his contributions went well beyond the lab. In 1975, with Paul Berg and others, he organized a meeting at Asilomar, California, to draft a position paper on the United States’ use of recombinant DNA technology — introducing genes from one species into another, usually bacteria. Brenner was influential in persuading attendees to treat ethical and societal concerns seriously. He stressed the importance of thoughtful guidelines for deploying the technology to avoid overly restrictive regulation.

He served as director of the LMB for about a decade. Despite describing the experience as the biggest mistake of his life, he took the lab (with its stable of Nobel laureates and distinguished staff) to unprecedented prominence. In 1986, he moved to a new Medical Research Council (MRC) unit of molecular genetics at the city’s Addenbrooke’s Hospital, and began work in the emerging discipline of evolutionary genomics. Brenner also orchestrated Britain’s involvement in the Human Genome Project in the early 1990s.

From the late 1980s, Brenner steered the development of biomedical research in Singapore. Here he masterminded Biopolis, a spectacular conglomerate of chrome and glass buildings dedicated to biomedical research. He also helped to guide the Janelia Farm campus of the Howard Hughes Medical Institute in Ashburn, Virginia, and to restructure molecular biology in Japan.

Brenner dazzled, amused and sometimes offended audiences with his humour, irony and disdain of authority and dogma — prompting someone to describe him as “one of biology’s mischievous children; the witty trickster who delights in stirring things up.” His popular columns in Current Biology (titled ‘Loose Ends’ and, later, ‘False Starts’) in the mid-1990s led some seminar hosts to introduce him as Uncle Syd, a pen name he ultimately adopted.

Sydney was aware of the debt he owed to being in the right place at the right time. He attributed his successes to having to learn scientific independence in a remote part of the world, with few role models and even fewer mentors. He recounted the importance of arriving in Oxford with few scientific biases, and leaving with the conviction that seeing the double helix model one chilly April morning would be a defining moment in his life.

The Brenner laboratories (he often operated more than one) spawned a generation of outstanding protégés, including five Nobel laureates. Those who dedicated their careers to understanding the workings of C. elegans now number in the thousands. Science will be considerably poorer without Sydney. But his name will live forever in the annals of biology.

https://www.nature.com/articles/d41586-019-01192-9

by Dave Mosher

When a person dies, cremation is an increasingly popular option. The practice eclipsed burials in the US in 2015 and is expected to make up more than half of all body disposals by 2020, according to the Cremation Association of North America.

But instead of storing a loved one’s cremains in an urn or sprinkling them outside, a growing number of bereaved consumers are doing something more adventurous: forging the ashes into diamonds.

This is possible because carbon is the second-most abundant element in the human body by mass, and diamonds are made of crystallised carbon. Researchers have also improved ways to grow diamonds in the lab in recent years.

While at least five companies offer a “memorial diamond” service, Algordanza in Switzerland is one of the industry leaders — its services are available in 33 countries, and the company told Business Insider it sold nearly 1,000 corporeal gems in 2016. Algordanza also claims to be the only company of its kind that operates its own diamond-growing lab for cremains — one of two in the world. (The other is in Russia.)

“It allows someone to keep their loved one with them forever,” Christina Martoia, a spokesperson for Algordanza US, told Business Insider. “We’re bringing joy out of something that is, for a lot of people, a lot of pain.”

Here’s how the company uses extreme heat and pressure to turn dead people — and sometimes animals — into sparkling gems of all sizes, cuts, and colours.

Read more at https://www.businessinsider.com.au/turn-human-ashes-diamonds-carbon-algordanza-2017-7#14TJLUlcEiVFwIPR.99

Estimated age based on exercise stress testing performance may be a better predictor of mortality than chronological age, according to a study published online Feb. 13 in the European Journal of Preventive Cardiology.

Serge C. Harb, M.D., from the Cleveland Clinic, and colleagues evaluated whether age based on stress testing exercise performance (A-BEST) would be a better predictor of mortality than chronological age among 126,356 consecutive patients (mean age, 53.5 years) referred for exercise (electrocardiography, echocardiography, or myocardial perfusion imaging) stress testing between Jan. 1, 1991, and Feb. 27, 2015. Exercise capacity (peak estimated metabolic equivalents of task), chronotropic reserve index, and heart rate recovery were used to compute an estimated age, taking into account the patient’s gender and medications that affect heart rate.

The researchers found that after adjustment for clinical comorbidities, improved survival was associated with higher metabolic equivalents of task (adjusted hazard ratio [aHR] for mortality, 0.71) and higher chronotropic reserve index (aHR for mortality, 0.97). Higher mortality was associated with abnormal heart rate recovery (aHR for mortality, 1.53) and higher A-BEST (aHR for mortality, 1.05). There was a significant increase in the area under the curve when A-BEST rather than chronological age was used in prediction models (0.82 versus 0.79). The overall net reclassification improvement was significant.
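A-BEST is, in effect, a “physiological age” read off a regression of exercise performance, though the exact model is not spelled out in this summary. The sketch below only illustrates the general idea with simulated data: regress chronological age on peak METs, chronotropic reserve index and heart rate recovery (plus sex), then treat each patient’s predicted value as their estimated age. All variable names, coefficients and numbers are hypothetical, and the study’s adjustment for heart-rate-affecting medications is omitted for brevity.

```python
# Hypothetical sketch of an "age based on exercise stress testing" (A-BEST)-style estimate.
# The real study fit its own models on >126,000 patients; everything here is simulated.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.uniform(25, 85, n),        # chronological age, years
    "sex_male": rng.integers(0, 2, n),
})
# Simulated exercise variables that worsen with age.
df["peak_mets"] = 14 - 0.08 * df["age"] + rng.normal(0, 1.5, n)            # exercise capacity
df["chronotropic_reserve"] = 1.0 - 0.004 * df["age"] + rng.normal(0, 0.1, n)
df["hr_recovery"] = 30 - 0.15 * df["age"] + rng.normal(0, 4, n)            # beats in first minute

features = ["peak_mets", "chronotropic_reserve", "hr_recovery", "sex_male"]
model = LinearRegression().fit(df[features], df["age"])

# A-BEST-style estimated age: the age predicted from exercise performance.
df["estimated_age"] = model.predict(df[features])

# A patient whose exercise performance looks "older" than their years gets an
# estimated age above their chronological age, and vice versa.
patient = df.iloc[0]
print(f"chronological age: {patient['age']:.0f}, estimated age: {patient['estimated_age']:.0f}")
```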

“For the first time we can quantify the impact of your performance level on a treadmill test in adding or subtracting years from your actual age,” Harb said in a statement.

https://www.physiciansbriefing.com/cardiology-2/age-health-news-7/stress-test-based-physiological-age-may-be-superior-mortality-predictor-742824.html


Dr. Lewis L. Judd led the National Institute of Mental Health from 1988 to 1990. (National Library of Medicine)

By Emily Langer

Lewis L. Judd, a nationally known psychiatrist who helped turn the focus of his profession from psychoanalysis to neuroscience, an approach that sought to destigmatize mental illness by treating it like cancer, heart disease or any other medical problem, died Dec. 16 in La Jolla, Calif. He was 88.

The cause was cardiac arrest, said his wife, Pat Judd.

For decades, psychiatrists were schooled in the theories of Sigmund Freud, the founder of psychoanalysis, who posited that mental disturbances could be treated through dialogue with a therapist. Practitioners sought to interpret their patients’ dreams, giving little attention to the physical functioning of the brain or the chemicals that regulate it.

Dr. Judd agreed, he once told the Associated Press, that a physician must look at patients as a “whole individual,” with all their “worries, concerns, aspirations and needs,” and not resort to simply “popping a pill in their mouth.” But he found the long-prevailing psychoanalytic approach too limiting to explain or treat afflictions such as depression, bipolar disorder, severe anxiety and schizophrenia — “these serious mental disorders that have defied our understanding for centuries,” he once told the Chicago Tribune.

Instead, he advocated a biological approach, starting at the molecular level of the brain. As director of the National Institute of Mental Health in Bethesda, Md. — a post he held from 1988 to 1990, during a hiatus from his decades-long chairmanship of the psychiatry department at the University of California at San Diego — he helped launch a federal research initiative known as the “Decade of the Brain.”

“He was obsessed with educating the public and the profession . . . that mental illnesses were biological illnesses, that schizophrenia and depression were diseases of the brain,” Alan I. Leshner, Dr. Judd’s deputy at NIMH and later chief executive of the American Association for the Advancement of Science, said in an interview. “At the time, that was a heretical thought.”

Today, the biological component of many mental illnesses is widely accepted. When Dr. Judd led NIMH, it was not; he once cited a survey in which 71 percent of respondents said mental illness was a result of personal weakness and a third attributed it to sinful behavior. Poor parenting was another common alleged culprit.

Dr. Judd argued that the biological approach to psychiatry held the promise not only of deepening understanding of the body’s most complex organ but of improving lives: If mental disorders could be shown to be a result of brain chemistry or of physical dysfunction, patients might feel less stigmatized and therefore more willing to seek treatment.

“We look at the homeless and feel that if they only got their act together, they could lift themselves up,” Dr. Judd told the Los Angeles Times in 1988, discussing the prevalence of mental illness among homeless people. “We would never believe that about someone who has cancer or some other physical disease.”

As head of NIMH, which is an arm of the National Institutes of Health and the chief federal agency for research on mental illness, Dr. Judd oversaw more than $500 million in research money. He described the Decade of the Brain — a designation conferred by Congress and President George H.W. Bush — as a “research plan designed to bring a precise and detailed understanding of all the elements of brain function within our own lifetimes.”

During his tenure at NIMH, scientists for the first time successfully grew brain tissue in a laboratory. Dr. Judd was among those scientists who touted the potential of medical imaging, such as MRIs and PET scans, to reveal the inner workings of the brain and the potential causes of diseases such as schizophrenia.

Almost 30 years after the Decade of the Brain began, much about the organ remains elusive. Leshner credited the initiative with helping bring attention to the importance of brain research as well as inspiring the Brain Initiative, a public-private research effort advanced by the Obama administration.

“The brain is really the last frontier for scientists,” Dr. Judd said.

Lewis Lund Judd was born in Los Angeles on Feb. 10, 1930. His father was an obstetrician-gynecologist, and his mother was a homemaker. Dr. Judd’s brother, Howard Judd, also became an OB/GYN and a noted researcher in women’s health at the University of California at Los Angeles.

Dr. Judd received a bachelor’s degree in psychology from the University of Utah in 1954 and a medical degree from UCLA in 1958. In the early years of his career, he served in the Air Force as a base psychiatrist.

He joined UC-San Diego in 1970 and became department chairman in 1977, helping grow his faculty into one of the most respected in the country. He stepped down as chairman in 2013 and retired in 2015.

Dr. Judd’s first marriage, to Anne Nealy, ended in divorce. Survivors include his wife of 45 years, the former Patricia Hoffman, who is also a psychiatry professor at UC-San Diego, of La Jolla; three daughters from his first marriage, Allison Fee of Whidbey Island, Wash., Catherine Judd of Miami and Stephanie Judd of Chevy Chase, Md.; and four grandchildren.

Ever exploring the outer reaches of his field, Dr. Judd participated in a dialogue with the Dalai Lama in 1989 about life and the mind.

“Our model of mental health is mostly defined in terms of the absence of mental illness,” Dr. Judd told the New York Times, reflecting on the Tibetan Buddhist leader’s discussion of wisdom and compassion. “They may have more positive ones that might be worth our study.”

https://www.washingtonpost.com/local/obituaries/lewis-judd-psychiatrist-who-probed-the-science-of-the-brain-dies-at-88/2019/01/11/271e1f48-1549-11e9-b6ad-9cfd62dbb0a8_story.html?noredirect=on&utm_term=.18ed788ae8b3

Walter Mischel in 2004. “If we have the skills to allow us to make discriminations about when we do or don’t do something,” Dr. Mischel said, “we are no longer victims of our desires.” (David Dini/Columbia University)

By Emily Langer

The experiment was “simplicity itself,” its creator, psychologist Walter Mischel, would later recall. The principal ingredient was a cookie or a pretzel stick or — most intriguingly to the popular imagination — a marshmallow.

In what became known as “the marshmallow test,” a child was placed in a room with a treat and presented with a choice. She could eat the treat right away. Or she could wait unaccompanied in the room, for up to 20 minutes, and then receive two treats in reward for her forbearance.

Conducting their work at a nursery school on the campus of Stanford University in the 1960s, Dr. Mischel and his colleagues observed responses that were as enlightening as they are enduringly adorable. Some children distracted themselves by putting their fingers in their ears or nose. At least one child caressed the marshmallow as he hungered for it. Only about 30 percent of the children managed to wait for the double reward.

Dr. Mischel, who continued his career at Columbia University and died Sept. 12 at 88, followed a cohort of the children for decades and presented his findings to mainstream readers in his 2014 book “The Marshmallow Test: Why Self-Control is the Engine of Success.”

His observations, widely noted and hotly debated, were striking: Children who had found ways to delay gratification, he found, had greater success in school, made more money and were less prone to obesity and drug addiction.

“What emerged from those studies is a different view of self-control, one that sees it as a matter of skill” and not a matter of “gritting your teeth,” said Yuichi Shoda, a professor of psychology at the University of Washington who worked with Dr. Mischel as a graduate student.

As worried parents conducted marshmallow tests at home, policymakers, educators and motivational speakers found a compelling catchphrase: “Don’t eat the marshmallow!” Even the ravenous Cookie Monster, a mainstay of the children’s TV show “Sesame Street,” was coaxed to resist a cookie.

Meanwhile, some psychologists challenged Dr. Mischel’s findings, arguing that a study group drawn from the privileged environs of Stanford could hardly yield reliable results. Skeptics noted that while affluent families might teach their children to delay gratification, in an effort to encourage financial and other forms of responsibility, children from disadvantaged homes learn that waiting to eat might mean not eating at all.

Dr. Mischel defended his research, emphasizing that in no way did he wish to suggest a laboratory performance — particularly by a preschooler — was destiny. The question, he said, is “how can you regulate yourself and control yourself in ways that make your life better?”

Walter Mischel was born Feb. 22, 1930, to a Jewish family in Vienna. His home was not far from that of Sigmund Freud, the founder of psychoanalysis. “Even as a young child I was aware of his presence,” Dr. Mischel once told the British Psychological Society, “and I suspect at some level I became quite interested in what makes people tick.”

Dr. Mischel’s family enjoyed a comfortable life until the rise of Nazism. His father, a businessman who had suffered from polio, was made to limp through the streets without his cane. Dr. Mischel recalled being humiliated by members of the Hitler Youth who trod on his new shoes. The experience, he told the Guardian, planted in him a desire to understand “the enabling conditions that allow people to go from being victims to being victors.”

After the Nazi annexation of Austria in 1938, the family fled the country and settled eventually in New York City, where they ran a five-and-dime store. Dr. Mischel, who became a U.S. citizen in the 1950s, helped support the family by working in an umbrella factory and as an elevator operator.

He was a 1951 psychology graduate of New York University and received a master’s degree from the City College of New York in 1953 and a PhD from Ohio State University in 1956, both in clinical psychology. He taught at Harvard University before settling at Stanford.

He said he became fascinated by the development of self-control in children by watching his daughters emerge from infancy into toddlerhood and girlhood.

“I began with a truly burning question,” he told the Guardian. “I wanted to know how my three young daughters developed, in a remarkably short period of time, from being howling, screaming, often impossible kids to people who were actually able to sit and do something that required them to concentrate. I wanted to understand this miraculous transformation.”

The subjects of the Stanford nursery-school tests were his daughters’ classmates. As the children grew up and he noticed correlations between their childhood self-control and future success, he decided to pursue the question more rigorously, through longitudinal study.

He conceded the limitations of his study group at Stanford. “It was an unbelievably elitist subset of the human race, which was one of the concerns that motivated me to study children in the South Bronx — kids in high-stress, poverty conditions,” he told the Atlantic in 2014, “and yet we saw many of the same phenomena as the marshmallow studies were revealing.”

Dr. Mischel proposed strategies for delaying gratification, such as putting the object at a physical distance by removing it from view, or at a symbolic distance by imagining it to be something else. A marshmallow is not a sugary treat, for example, but rather a cotton ball.

In his own life, he reported success at resisting chocolate mousse by imagining the dessert to be covered in roaches. A self-described “three-packs-a-day smoker, supplemented by a pipe . . . supplemented by a cigar,” he said he conquered his addiction by recalling the image of a lung-cancer patient he had seen at Stanford, branded with X’s where he would be treated by radiation.

In addition to “The Marshmallow Test,” Dr. Mischel wrote and co-authored numerous texts on personality, child development and other fields of psychological research. He retired last year after more than three decades at Columbia.

His marriages to Frances Henry and Harriet Nerlove ended in divorce. Survivors include his partner of nearly two decades, Michele Myers of New York; three daughters from his second marriage, Judy Mischel of Chicago, Rebecca Mischel of Portland, Ore., and Linda Mischel Eisner of New York City; and six grandchildren.

Linda Mischel Eisner confirmed the death and said her father died at his home of pancreatic cancer.

Dr. Mischel professed to have found hope in his life’s work. “If we have the skills to allow us to make discriminations about when we do or don’t do something,” he told the New Yorker magazine, “we are no longer victims of our desires.”

“It’s not,” he said, “just about marshmallows.”

https://www.washingtonpost.com/local/obituaries/walter-mischel-psychologist-who-created-marshmallow-test-dies-at-88/2018/09/14/dcf24008-b782-11e8-94eb-3bd52dfe917b_story.html?utm_term=.bc74b74cf416