Posts Tagged ‘suicide’

The 51-year-old University of Cape Town researcher had been suffering from depression, and his death has prompted reflection on being a black academic in South Africa.

Bongani Mayosi, a prominent cardiologist and dean of the Faculty of Health Sciences at the University of Cape Town in South Africa, died of suicide on July 27. He was 51.

“In the last two years he has battled with depression and on that day [Friday] took the desperate decision to end his life,” his family said in a statement at the time, News24 reports. “We are still struggling to come to terms with this devastating loss.”

Born in 1967, Mayosi grew up under apartheid in the Transkei region of South Africa. Homeschooled by his mother as a child, he later studied medicine at the University of KwaZulu-Natal, incorporating a year of research to qualify for a BMedSci degree. In 1998, he won a fellowship to join the PhD program in the department of cardiovascular medicine at the University of Oxford.

Upon returning to South Africa a few years later, Mayosi worked on a number of projects, ranging from searching for the genetic mutations underpinning arrhythmogenic cardiomyopathy to identifying risk factors for cardiovascular disease. In 2006, at 38 years old, he became the first black person to chair the Department of Medicine at the University of Cape Town (UCT).

His career over the next decade would be marked by several awards recognizing his contributions to cardiology. In 2007, he was named one of the top 25 “influential leaders in healthcare in South Africa,” and, two years later, received the Order of Mapungubwe, South Africa’s highest honor. In 2017, he was elected to the US National Academy of Medicine.

Becoming dean in 2016, Mayosi was responsible for handling part of the university’s response to a tumultuous period of student unrest across the country. In a letter published on News24, the university’s vice chancellor Mamokgethi Phakeng writes that during that period, Mayosi’s “office was occupied for about two weeks in 2016. He had to manage pressure coming from many different directions, including from staff and students.” Over the next two years, Mayosi suffered from depression and took time off from his position; he resigned twice, but was persuaded to change his mind.

Mayosi’s death has led colleagues to examine the external forces that might have contributed to his desperation. In early August, Johannesburg’s City Press and other outlets reported that UCT had instigated an inquiry into the circumstances surrounding Mayosi’s death following calls from concerned colleagues and the university’s Black Academic Caucus. In a statement on Facebook on August 2, the Caucus wrote that “it is hard for us to exclude the UCT working environment from the tragic death of our colleague, and indeed others, including students.” Many researchers and activists also highlighted challenges Mayosi faced as a black academic in South Africa.

Matshidiso Moeti, the African regional director for the World Health Organization—where Mayosi had chaired the African Advisory Committee on Health Research & Development—was one of many health officials and researchers to send condolences after news of Mayosi’s death. “We will always cherish him for his diligence and immense contribution to the development of the WHO strategy for strengthening the use of evidence, information and research for policy-making in the African Region,” she wrote.

Cardiologists Hugh Watkins of the University of Oxford and Ntobeko Ntusi of UCT write in a memorial published yesterday (September 11) in Circulation that “one of the most striking impressions from his funeral, attended by thousands of mourners who remembered him with awe and love, was the abundant evidence of his commitment to bring others with him, nurture talent, and provide the sorts of opportunity from which he had benefited. . . . We speak for many in saying that we are in awe of what Bongani achieved.”

https://www.the-scientist.com/news-opinion/celebrated-cardiologist-bongani-mayosi-dies-64787

Suicide rates and temperatures are both on the rise, but are these two occurrences connected? A new study suggests maybe so. The research revealed that hotter-than-average months corresponded to more deaths by suicide, and the effect isn’t limited to the summer; even warmer winters show the trend.

In the study, published in Nature Climate Change, the investigators looked at all of the suicides that occurred in the U.S. and Mexico over several decades (1968 to 2004 for the U.S. and 1990 to 2010 for Mexico), comprising 851,088 and 611,366 deaths, respectively. They then examined how monthly temperature fluctuations over these periods in every county or municipality in both countries correlated with the suicide rates for that region. They discovered that for every 1-degree Celsius (1.8-degree Fahrenheit) rise in temperature, there was a 0.7 percent increase in suicide rates in the U.S. and a 2.1 percent increase in Mexico, averaging a 1.4 percent increase across both countries. That is, over the years, a given county would see more deaths by suicide in warmer-than-average months.
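The core of this approach can be sketched in a few lines: within each county, regress the suicide rate on temperature and express the slope as a percent of that county's average rate. The data and numbers below are invented for illustration; the study's actual models are far more elaborate, controlling for seasonality and location-specific factors.

```python
# Hypothetical sketch of the study's county-level approach (not the
# authors' code): compare suicide rates in warmer vs. cooler months
# within each county, then express the fitted slope as a percent
# change in the rate per degree Celsius.

from statistics import mean

# toy panel: per county, (month_temp_C, suicides_per_100k) observations
panel = {
    "county_a": [(20.0, 10.0), (21.0, 10.07), (22.0, 10.14), (19.0, 9.93)],
    "county_b": [(5.0, 12.0), (6.0, 12.08), (7.0, 12.17), (4.0, 11.92)],
}

def pct_change_per_degree(obs):
    """OLS slope of rate on temperature, as a percent of the mean rate."""
    temps = [t for t, _ in obs]
    rates = [r for _, r in obs]
    t_bar, r_bar = mean(temps), mean(rates)
    slope = sum((t - t_bar) * (r - r_bar) for t, r in obs) / \
            sum((t - t_bar) ** 2 for t in temps)
    return 100.0 * slope / r_bar

effects = [pct_change_per_degree(obs) for obs in panel.values()]
print(round(mean(effects), 2))  # average % rise in rate per +1 degree C
```

Note that the warm county and the cold county show a similar percent effect in this toy data, mirroring the study's finding that a county's baseline temperature did not matter.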

Notably, the average temperature of the county did not matter; for example, Dallas and Minneapolis saw a similar rise in suicide rates. The effect did not depend on the month either—it made no difference whether it was January or July. Nor did it vary with gender, socioeconomic status, gun access, air-conditioning prevalence, or whether the region was urban or rural. Across the board, when temperatures rose in a given place, so did the number of suicides.

“A lot of times when you hear about climate change and climate change impacts, you hear this catch phrase ‘climate change is going to generate winners and losers,’” says study author Marshall Burke. “Some people could benefit from climate change, the idea being if you live in a really cold location, sometimes things improve when you warm it up a little bit. We do not find that for suicide.” He continues, “Climate change in terms of suicide is not going to generate winners and losers, it’s just going to generate losers. Everyone, as far as we can tell—no matter whether you live in a cold place or live in a hot place—everyone is going to be harmed in terms of suicide risk when we increase the temperature.”

If climate change continues on its current trajectory with an estimated temperature increase of 2.5 degrees C (4.5 degrees F) by 2050, Burke, who is an assistant professor of earth system science at Stanford University, projects suicide rates would rise by 1.85 percent, resulting in an additional 21,770 deaths by suicide across the U.S. and Mexico. For comparison, economic recession is thought to increase suicide rates by 0.8 percent whereas news of celebrity suicides accounts for a 4.6 percent bump in rates.

Not everyone is convinced by these projections, though. Jill Harkavy-Friedman, vice president of research at the American Foundation for Suicide Prevention, says, “I think it’s an interesting and provocative idea. These two things may be co-occurring. You know, it’s possible that the rate of suicide is going up as the temperature is going up. But we don’t know that there’s anything causal about that.”

In their study the researchers speculate there could be some biological effect linked to temperature regulation in the brain that alters mental health and could underlie the correlation. In an attempt to connect mental well-being with temperature change more generally, they examined more than 600 million Twitter posts for depressive language over a 14-month period. The researchers again found hotter months corresponded to a higher probability of using depressive language. Prior work by the researchers also saw a similar trend in interpersonal conflict, with a 4 percent rise in violence attributed to climate change.

Burke acknowledged suicide is a complex phenomenon and temperature is certainly not the only or even the most important factor affecting mental health: “What studies like ours contribute is just saying on average, as you increase temperature, what’s going to happen to suicide rates? So that won’t tell you with utmost certainty what’s going to happen in specific locations, but it will tell you okay on average this is what we should expect. Our view is it would be foolhardy to ignore the evidence,” he notes.

Radley Horton, an associate research professor at Columbia University who was not involved in the research, says the study is a good reminder of how fundamental temperature is and how widespread its impacts are. “The deeper we look, the more likely we are to uncover ways that temperature directly impacts things we care about,” he says. “Climate uncertainty is not our friend. The further we push things, the greater the risk.”

https://www.scientificamerican.com/article/global-warming-linked-to-higher-suicide-rates-across-north-america/

By the time Logan Paul arrived at Aokigahara forest, colloquially known as Japan’s “suicide forest,” the YouTube star had already confused Mount Fuji with the country Fiji. His more than 15 million (mostly underage) subscribers like this sort of comedic aloofness—it serves to make Paul more relatable.

After hiking only a couple hundred yards into Aokigahara—where 247 people attempted to take their own lives in 2010 alone, according to police statistics cited in The Japan Times—Paul encountered a suicide victim’s body hanging from a tree. Instead of turning the camera off, he continued filming, and later uploaded close-up shots of the corpse, with the person’s face blurred out.

“Did we just find a dead person in the suicide forest?” Paul said to the camera. “This was supposed to be a fun vlog.” He went on to make several jokes about the victim, while wearing a large, fluffy green hat.

Within a day, over 6.5 million people had viewed the footage, and Twitter flooded with outrage. Even though the video violated YouTube’s community standards, in the end it was Paul himself who deleted it.

“I should have never posted the video, I should have put the cameras down,” Paul said in a video posted Tuesday, which followed an earlier written apology. “I’ve made a huge mistake, I don’t expect to be forgiven.” He didn’t respond to two follow-up requests for comment.

YouTube, which failed to act on Paul’s video, has now found itself wrapped in another controversy over how and when it should police offensive and disturbing content on its platform—and, just as importantly, over the culture it foments that leads to such content. YouTube encourages stars like Paul to garner views by any means necessary, while largely deciding how and when to censor their videos behind closed doors.

‘Absolutely Complicit’

Before uploading the video, which was titled “We found a dead body in the Japanese Suicide Forest…” Paul halfheartedly attempted to censor himself for his mostly tween viewers. He issued a warning at the beginning of the video, blurred the victim’s face, and included the number of several suicide hotlines, including one in Japan. He also chose to demonetize the video, meaning he wouldn’t make money from it. His efforts weren’t enough.

“The mechanisms that Logan Paul came up with fell flat,” says Jessa Lingel, an assistant professor at the University of Pennsylvania’s Annenberg School for Communication, where she studies digital culture. “Despite them, you see a video that nonetheless is very disturbing. You have to ask yourself: Are those efforts really enough to frame this content in a way that’s not just hollowly or superficially aware of damage, but that is meaningfully aware of damage?”

The video still included shots of a corpse, including the victim’s hands, which had turned blue. At one point, Paul referred to the victim as “it.” One of the first things he said to the camera after the encounter was, “This is a first for me,” turning the conversation back to himself.

There’s no excuse for what Paul did. His video was disturbing and offensive to the victim, their family, and to those who have struggled with mental illness. But blaming the YouTube star alone seems insufficient. Both he, and his equally famous brother Jake Paul, earn their living from YouTube, a platform that rewards creators for being outrageous, and often fails to adequately police its own content.

“I think that any analysis that continues to focus on these incidents at the level of the content creator is only really covering part of the structural issues at play,” says Sarah T. Roberts, an assistant professor of information studies at UCLA and an expert in internet culture and content moderation. “Of course YouTube is absolutely complicit in these kinds of things, in the sense that their entire economic model, their entire model for revenue creation is created fundamentally on people like Logan Paul.”

YouTube takes 45 percent of the advertising money generated by Paul’s and every other creator’s videos. According to SocialBlade, an analytics company that tracks the estimated revenue of YouTube channels, Paul could make as much as $14 million per year. While YouTube might not explicitly encourage Paul to pull ever-more insane stunts, it stands to benefit financially when he and creators like him gain millions of views from outlandish episodes.

“[YouTube] knows for these people to maintain their following and gain new followers they have to keep pushing the boundaries of what is bearable,” says Roberts.

YouTube presents its platform as democratic; anyone can upload and contribute to it. But it simultaneously treats enormously popular creators like Paul differently, because they command such massive audiences. (Last year, the company even chose Paul to star in The Thinning, the first full-length thriller distributed via its streaming subscription service YouTube Red, as well as Foursome, a romantic comedy series also offered via the service.)

“There’s a fantasy that he’s just a dude with a GoPro on a stick,” says Roberts. “You have to actually examine the motivations of the platform.”

For example, major YouTube creators I have spoken to in the past said they often work with a representative from the company who helps them navigate the platform, a luxury not afforded to the average person posting cat videos. YouTube didn’t respond to a follow-up request about whether Paul had a rep assigned to his channel.

All Things in Moderation

It’s unclear exactly why YouTube let the video stay up so long; it may have been the result of the platform’s murky community guidelines. YouTube’s comment doesn’t shed much light either.

“Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated,” a Google spokesperson said in an emailed statement. “We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.”

YouTube may have initially decided that Paul’s video didn’t violate its policy on violent and graphic content. But those guidelines consist of only a few short sentences, making it impossible to know.

“The policy is vague, and requires a bunch of value judgements on the part of the censor,” says Kyle Langvardt, an associate law professor at the University of Detroit Mercy Law School and an expert on First Amendment and internet law. “Basically, this policy reads well as an editorial guideline… But it reads terribly as a law, or even a pseudo-law. Part of the problem is the vagueness.”

What might constitute a meaningful step toward transparency would be for YouTube to implement a moderation or edit log, says Lingel. On it, YouTube could theoretically disclose what team screened a video and when. If the moderators choose to remove or age-restrict a video, the log could disclose what community standard violation resulted in that decision. It could be modeled on something like Wikipedia’s edit logs, which show all of the changes made to a specific page.

“When you flag content, you have no idea what happens in that process,” Lingel says. “There’s no reason we can’t have that sort of visibility, to see that content has a history. The metadata exists, it’s just not made visible to the average user.”
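A minimal version of the log Lingel describes could look like the following sketch, loosely modeled on Wikipedia's edit histories. Every name here (ReviewEvent, ModerationLog, the field names) is invented for illustration; none of it is YouTube's actual API.

```python
# Toy sketch of a public moderation log: an append-only record of
# review events that viewers could inspect per video. All names are
# invented; this is one way the metadata could be surfaced, not a
# description of any real system.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ReviewEvent:
    video_id: str
    reviewed_by: str           # e.g. "trust-and-safety-team" or "automated"
    action: str                # "kept", "age-restricted", "removed"
    guideline: Optional[str]   # community standard cited, if any
    timestamp: datetime

@dataclass
class ModerationLog:
    events: List[ReviewEvent] = field(default_factory=list)

    def record(self, event: ReviewEvent) -> None:
        self.events.append(event)  # append-only: history is never rewritten

    def history(self, video_id: str) -> List[ReviewEvent]:
        """Everything viewers would see for one video, oldest first."""
        return [e for e in self.events if e.video_id == video_id]

log = ModerationLog()
log.record(ReviewEvent("abc123", "trust-and-safety-team", "age-restricted",
                       "violent-or-graphic-content",
                       datetime(2018, 1, 2, tzinfo=timezone.utc)))
print(len(log.history("abc123")))  # → 1
```

The design point is the append-only history: as with Wikipedia's edit logs, a decision is never silently erased, only followed by later entries.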

Fundamentally, Lingel says, we need to rethink how we envision content moderation. Right now, when a YouTube user flags a video as inappropriate, it’s often left to a low-wage worker to tick a series of boxes, making sure it doesn’t violate any community guidelines (YouTube pledged to expand its content moderation workforce to 10,000 people this year). The task is sometimes even left to an AI that quietly combs through videos looking for inappropriate content or ISIS recruiting videos. Either way, YouTube’s moderation process is mostly anonymous, and conducted behind closed doors.

It’s helpful that the platform has baseline standards for what is considered appropriate; we can all agree that certain types of graphic content depicting violence and hate should be prohibited. But a positive step forward would be to develop a more transparent process, one centered around open discussion about what should and shouldn’t be allowed, on something like a public moderation forum.

Paul’s video represents a potential turning point for YouTube, an opportunity to become more transparent about how it manages its own content. If it doesn’t take the chance, scandals like this one will only continue to happen.

As for the Paul brothers, they’re likely going to keep making similarly outrageous and offensive videos to entertain their massive audience. On Monday afternoon, just hours after his brother Logan issued an apology for the suicide forest incident, Jake Paul uploaded a new video entitled “I Lost My Virginity…”. At the time this story went live, it already had nearly two million views.

If you or someone you know is considering suicide, help is available. You can call 1-800-273-8255 to speak with someone at the National Suicide Prevention Lifeline 24 hours a day in the United States. You can also text WARM to 741741 to message with the Crisis Text Line.

https://www.wired.com/story/logan-paul-video-youtube-reckoning/

When someone dies by suicide, their family and friends can be left with the heartbreaking and answerless question of what they could have done differently. Colin Walsh, a data scientist at Vanderbilt University Medical Center, hopes his work in predicting suicide risk will give people the opportunity to ask “what can I do?” while there’s still a chance to intervene.

Walsh and his colleagues have created machine-learning algorithms that predict, with unnerving accuracy, the likelihood that a patient will attempt suicide. In trials, results have been 80-90% accurate when predicting whether someone will attempt suicide within the next two years, and 92% accurate in predicting whether someone will attempt suicide within the next week.

The prediction is based on data that’s widely available from all hospital admissions, including age, gender, zip code, medications, and prior diagnoses. Walsh and his team gathered data on 5,167 patients from Vanderbilt University Medical Center who had been admitted with signs of self-harm or suicidal ideation. They read each of these cases to identify the 3,250 instances of suicide attempts.

This set of more than 5,000 cases was used to train the machine to distinguish those at risk of attempted suicide from those who committed self-harm but showed no evidence of suicidal intent. The researchers also built algorithms to predict attempted suicide among a group of 12,695 randomly selected patients with no documented history of suicide attempts. The model proved even more accurate at making suicide risk predictions within this large general population of patients admitted to the hospital.
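As a rough illustration of the setup described above (labeled past admissions, routinely collected features, a classifier scoring new patients), here is a toy nearest-centroid sketch. The features, numbers, and decision rule are all invented; the published models are far more complex.

```python
# Toy illustration of training a risk classifier on labeled admissions,
# then scoring new patients. Everything here (features, data, the
# nearest-centroid rule) is invented for illustration and is not the
# study's actual method.

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# toy features: [age, num_prior_diagnoses, on_psychoactive_medication]
attempts  = [[34, 5, 1], [29, 6, 1], [41, 4, 1]]  # labeled suicide attempts
self_harm = [[22, 1, 0], [19, 2, 0], [25, 1, 0]]  # self-harm, no intent

c_attempt, c_harm = centroid(attempts), centroid(self_harm)

def predict(patient):
    """1 = closer to the attempt group, 0 = closer to the self-harm group."""
    return 1 if dist2(patient, c_attempt) < dist2(patient, c_harm) else 0

print(predict([38, 5, 1]))  # → 1
print(predict([21, 1, 0]))  # → 0
```

This also illustrates Walsh's point about opacity: even in this toy version the decision comes from the combination of features, not any single risk factor.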

Walsh’s paper, published in Clinical Psychological Science in April, is just the first stage of the work. He’s now working to establish whether his algorithm is effective with a completely different data set from another hospital. And, once confident that the model is sound, Walsh hopes to work with a larger team to establish a suitable method of intervening. He expects to have an intervention program in testing within the next two years. “I’d like to think it’ll be fairly quick, but fairly quick in health care tends to be in the order of months,” he adds.

Suicide is such an intensely personal act that it seems, from a human perspective, impossible to make such accurate predictions based on a crude set of data. Walsh says it’s natural for clinicians to ask how the predictions are made, but the algorithms are so complex that it’s impossible to pull out single risk factors. “It’s a combination of risk factors that gets us the answers,” he says.

That said, Walsh and his team were surprised to note that taking melatonin seemed to be a significant factor in calculating the risk. “I don’t think melatonin is causing people to have suicidal thinking. There’s no physiology that gets us there. But one thing that’s been really important to suicide risk is sleep disorders,” says Walsh. It’s possible that prescriptions for melatonin capture the risk of sleep disorders—though that’s currently a hypothesis that’s yet to be proved.

The research raises broader ethical questions about the role of computers in health care and how truly personal information could be used. “There’s always the risk of unintended consequences,” says Walsh. “We mean well and build a system to help people, but sometimes problems can result down the line.”

Researchers will also have to decide how much computer-based decisions will determine patient care. As a practicing primary care doctor, Walsh says it’s unnerving to recognize that he could effectively follow orders from a machine. “Is there a problem with the fact that I might get a prediction of high risk when that’s not part of my clinical picture?” he says. “Are you changing the way I have to deliver care because of something a computer’s telling me to do?”

For now, the machine-learning algorithms are based on data from hospital admissions. But Walsh recognizes that many people at risk of suicide do not spend time in hospital beforehand. “So much of our lives is spent outside of the health care setting. If we only rely on data that’s present in the health care setting to do this work, then we’re only going to get part of the way there,” he says.

And where else could researchers get data? The internet is one promising option. We spend so much time on Facebook and Twitter, says Walsh, that there may well be social media data that could be used to predict suicide risk. “But we need to do the work to show that’s actually true.”

Facebook announced earlier this year that it was using its own artificial intelligence to review posts for signs of self-harm. The results are reportedly already more accurate than the reports Facebook receives when users flag their friends as at-risk.

Training machines to identify warning signs of suicide is far from straightforward. And, for predictions and interventions to be done successfully, Walsh believes it’s essential to destigmatize suicide. “We’re never going to help people if we’re not comfortable talking about it,” he says.

But, with suicide leading to 800,000 deaths worldwide every year, this is a public health issue that cannot be ignored. Given that most humans, including doctors, are pretty terrible at identifying suicide risk, machine learning could provide an important solution.

https://www.doximity.com/doc_news/v2/entries/8004313


by Jolynn Tumolo

By analyzing a patient’s spoken and written words, computer tools classified with up to 93% accuracy whether the person was suicidal, in a study published online in Suicide and Life-Threatening Behavior.

“While basic sciences provide the opportunity to understand biological markers related to suicide,” researchers wrote, “computer science provides opportunities to understand suicide thought markers.”

The study included 379 patients from emergency departments, inpatient centers, and outpatient centers at 3 sites. Researchers classified 130 of the patients as suicidal, 126 as mentally ill but not suicidal, and 123 as controls with neither mental illness nor suicidality.

Patients completed standardized behavioral rating scales and participated in semi-structured interviews. Five open-ended questions were used to stimulate conversation, including “Do you have hope?” “Are you angry?” and “Does it hurt emotionally?”

Using machine learning algorithms to analyze linguistic and acoustic characteristics in patients’ responses, computers were 93% accurate in classifying a person who was suicidal and 85% accurate in identifying whether a person was suicidal, had a mental illness but was not suicidal, or was neither.
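One simple example of the kind of linguistic feature such a classifier might draw on is the rate of words from a small lexicon in a transcribed answer. The lexicon and code below are invented for illustration; the study combined many linguistic and acoustic characteristics with trained machine-learning models.

```python
# Toy sketch of a single linguistic feature: the fraction of words in a
# transcribed interview answer that come from a small lexicon. The
# lexicon is invented for illustration; a real classifier would combine
# many such features with acoustic measures.

HOPELESS_LEXICON = {"hopeless", "trapped", "burden", "never", "alone"}

def lexicon_rate(transcript: str) -> float:
    """Fraction of words in the transcript drawn from the lexicon."""
    words = transcript.lower().split()
    if not words:
        return 0.0
    hits = sum(w.strip(".,!?") in HOPELESS_LEXICON for w in words)
    return hits / len(words)

print(round(lexicon_rate("I feel hopeless and alone"), 2))  # → 0.4
```

A feature like this would be one column in the input to the learning algorithm, alongside acoustic measures such as pause length and pitch.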

“These computational approaches provide novel opportunities to apply technological innovations in suicide care and prevention, and it surely is needed,” said study lead author John Pestian, PhD, a professor in the divisions of biomedical informatics and psychiatry at Cincinnati Children’s Hospital Medical Center in Ohio.

“When you look around health care facilities, you see tremendous support from technology, but not so much for those who care for mental illness. Only now are our algorithms capable of supporting those caregivers. This methodology easily can be extended to schools, shelters, youth clubs, juvenile justice centers, and community centers, where earlier identification may help to reduce suicide attempts and deaths.”

References

Pestian JP, Sorter M, Connolly B, et al. A machine learning approach to identifying the thought markers of suicidal subjects: a prospective multicenter trial. Suicide and Life-Threatening Behavior. 2016 November 3;[Epub ahead of print].

Using a patient’s own words machine learning automatically identifies suicidal behavior [press release]. Cincinnati, OH: Cincinnati Children’s Hospital Medical Center; November 7, 2016.

It is known that people who have attempted suicide have ongoing inflammation in their blood and spinal fluid. Now, a collaborative study from research teams in Sweden, the US and Australia published in Translational Psychiatry shows that suicidal patients have a reduced activity of an enzyme that regulates inflammation and its byproducts.

The study is the result of a longstanding partnership between the research teams of Professor Sophie Erhardt, Karolinska Institutet, Professor Lena Brundin at Van Andel Research Institute in Grand Rapids, USA, and Professor Gilles Guillemin at Macquarie University in Australia. The overall aim of the research is to find ways to identify suicidal patients.

Biological factors

“Currently, there are no biomarkers for psychiatric illness, namely biological factors that can be measured and provide information about the patient’s psychiatric health. If a simple blood test can identify individuals at risk of taking their lives, that would be a huge step forward,” said Sophie Erhardt, a Professor at the Department of Physiology and Pharmacology at the Karolinska Institutet, who led the work along with Lena Brundin.

The researchers analyzed certain metabolites, byproducts formed during infection and inflammation, in the blood and cerebrospinal fluid from patients who tried to take their own lives. Previously it has been shown that such patients have ongoing inflammation in the blood and cerebrospinal fluid. This new work has succeeded in showing that patients who have attempted suicide have reduced activity of an enzyme called ACMSD, which regulates inflammation and its byproducts.

“We believe that people who have reduced activity of the enzyme are especially vulnerable to developing depression and suicidal tendencies when they suffer from various infections or inflammation. We also believe that inflammation is likely to easily become chronic in people with impaired activity of ACMSD,” said Brundin.

Important balance

The substance that the enzyme ACMSD produces, picolinic acid, is greatly reduced in both plasma and in the spinal fluid of suicidal patients. Another product, called quinolinic acid, is increased. Quinolinic acid is inflammatory and binds to and activates glutamate receptors in the brain. Normally, ACMSD produces picolinic acid at the expense of quinolinic acid, thus maintaining an important balance.

“We now want to find out if these changes are only seen in individuals with suicidal thoughts or if patients with severe depression also exhibit this. We also want to develop drugs that might activate the enzyme ACMSD and thus restore the balance between quinolinic and picolinic acid,” Erhardt said.

The study was funded with the support of the Swedish Research Council, Region Skåne and Central ALF funds. Additional support came from National Institute of Mental Health (NIMH), the American Foundation for Suicide Prevention, Van Andel Research Institute, Rocky Mountain MIRECC, the Merit Review CSR & D and the Joint Institute for Food Safety and Applied Nutrition (University of Maryland), and the Australian Research Council. Several of the researchers have indicated that they have business interests, which are recognized in the article.

Publication

An enzyme in the kynurenine pathway that governs vulnerability to suicidal behavior by regulating excitotoxicity and neuroinflammation
Lena Brundin, Carl M. Sellgren, Chai K. Lim, Jamie Grit, Erik Palsson, Mikael Landen, Martin Samuelsson, Christina Lundgren, Patrik Brundin, Dietmar Fuchs, Teodor T. Postolache, Lil Träskman-Bendz, Gilles J. Guillemin, Sophie Erhardt.
Translational Psychiatry, published online August 2, 2016, doi:10.1038/tp.2016.133.

http://ki.se/en/news/reduced-activity-of-an-important-enzyme-identified-among-suicidal-patients

by Tori Rodriguez, MA, LPC

Although there was a consistent reduction in US suicide rates from 1986 through 1999, the trend appears to have reversed during the most recent investigation period. A new report from the Centers for Disease Control and Prevention (1) reveals that suicide rates increased by 24% from 1999 to 2014, with the greatest increase observed in the latter half of that period.

The increase occurred among males and females in all age groups from 10-74. While rates for males still exceed those for females, the gap began to narrow during the most recent period. Among females, the rate increase was almost triple that of males: 45% vs 16%.

While the highest suicide rate was observed among men aged 75 and older, there was a reduction of 8% in this group from the previous report. There was a 43% increase among males in the 45-64 age group, making it the group with the greatest rate increase and the second-highest suicide rate among males. The second highest increase (37%) occurred among males aged 10–14, although this group had the lowest rate among all of the age groups.

As with males, the suicide rate also decreased among females in the 75 and over group, by 11%. The steepest increase (200%) occurred among females aged 10-14, though the actual number of suicides in this age group was relatively small (150 in 2014). The females with the highest suicide rates comprised the 45-64 age group, which had the second greatest increase (63%) since the previous period. For females in the age groups of 15-24, 25-44, and 65-74, rate increases ranged from 31% to 53%.

The most common cause of suicide in females was poisoning, which accounted for 34.1% of cases, while the use of firearms accounted for more than half of male suicides (55.4%). Cases involving some form of suffocation, including hanging and strangulation, increased among both males and females.

Though the report does not provide possible explanations for these trends, other recent findings offer clues about a host of variables that could be influencing rates in the middle age brackets in particular, with especially strong support for economic issues as a potential influence. A study published in 2015 in the American Journal of Preventive Medicine, for example, found that economic and legal problems disproportionately affected adults aged 40-64 who had committed suicide (2). Research reported in 2014 showed a robust link between suicide rates and unemployment rates in middle-aged adults but not other age groups, and according to a 2011 CDC study, suicide rates increased during periods of economic recession and declined during economic growth among people aged 25-64 years (3,4).

A co-author of the 2014 and 2015 studies, Julie A. Phillips, PhD, of the Institute for Health, Health Care Policy and Aging Research at Rutgers University, has received a grant from the American Foundation for Suicide Prevention to investigate the numerous variables that could be influencing the trend in middle-aged adults.

Additionally, a randomized controlled trial published in 2016 in PLoS Medicine found promising results with a brief, low-cost treatment designed to address the main risk factor for suicide: previous attempts (5).

An approach called the Attempted Suicide Short Intervention Program (ASSIP) was shown to reduce subsequent attempts by 80% among patients admitted to the emergency department after a suicide attempt.

If you or someone you know is experiencing suicidal thoughts, contact the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) or visit http://www.suicidepreventionlifeline.org.

References

1. Curtin SC, Warner M, Hedegaard H. Increase in suicide in the United States, 1999–2014. NCHS data brief, no 241. 2016; Hyattsville, MD: National Center for Health Statistics.

2. Hempstead KA, Phillips JA. Rising suicide among adults aged 40-64 years: the role of job and financial circumstances. Am J Prev Med. 2015; 48(5):491-500.

3. Phillips JA, Nugent CN. Suicide and the Great Recession of 2007-2009: the role of economic factors in the 50 U.S. states. Social Science & Medicine. 2014; 116:22-31.

4. Luo F, Florence CS, Quispe-Agnoli M, et al. Impact of business cycles on US suicide rates, 1928-2007. Am J Public Health. 2011; 101(6):1139-46.

5. Gysin-Maillart A, Schwab S, Soravia L, Megert M, Michel K. A novel brief therapy for patients who attempt suicide: A 24-months follow-up randomized controlled study of the Attempted Suicide Short Intervention Program (ASSIP). PLoS Medicine. 2016; 13(3): e1001968.

http://www.psychiatryadvisor.com/suicide-and-self-harm/increase-in-suicide-rates-in-united-states-cdc/article/492762/