Suicide Is the Short Goodbye: Technology's AI search for hope and prevention for thousands at risk
Dec 20
So easy. A steady, practiced hand, a quick flip of a tie, and then relief. Fast and easy relief. The room dimmed, the sounds slid off into a deepening quiet that promised sleep.
Everyone knew things were turning around for him. He had come back to town determined to make it this time. A move into a friend’s family home, a new good-paying job at an expensive restaurant where he’d begin work tomorrow. A haircut and a few fresh white shirts, and it was all ready for him.
Not giant steps, but small ones that all seemed to be indications of his being on the right track. But then there was that needle dangling out of his arm the next morning when his friend’s mom checked to be sure he was up for work.
Just a little taste, a bit to help take the edge off. A friend had assured him it was “good stuff,” so he put it in his pocket, said good night, and he was gone. His mom, who’d been through numerous rehabs with him, and his uncles, too, who had physically stuffed him into their cars on more than one ride to rehab, always thought it would end this way. But they still hoped it wouldn’t, and now here it was.
Was it suicide or an accidental, fatal overdose on junk cut with Fentanyl? Was the thought of more years of trying, trying, trying to kick it, and get his life together too much for him? Had he actively sought the long sleep?
He was bright, good-looking, and talented, and no one could figure out why he hadn’t realized the success that was in him. His biological father was, after all, a physician: a medical intern who had made a few extra dollars by donating to a sperm bank.
Suicide Statistics Loom Large
The statistics tell the stories with data, not words, the cold numbers that fail to reveal the heartache, the pain, and the unendurable. Figures on a sheet in neat little rows and columns telling stories that will never reach ears that fear hearing them.
The National Center for Health Statistics analysis of data from the National Vital Statistics System indicated that the suicide rate per 100,000 increased by about 1% per year from 2000 through 2006 and by about 2% per year from 2006 through 2016.
Suicide deaths among girls and women increased by 50% between 2000 and 2016, and by 21% among boys and men over the same period. In 2016, suicide was the 10th leading cause of death in the United States and the second leading cause of death among people ages 10 to 34, according to the APA Monitor.
Who Kills Themselves and Why Do They Do It?
Two simple questions, but with answers so elusive that only deep learning may unravel some of them. The hope is that prevention is possible, but some programs may be offering more hope than their unreleased data can support.
Depression and its accompanying sense of hopelessness are some of the most reliable indicators of suicidal behavior or thinking in college students. Colleges have been reluctant to admit they might have a problem with suicide. The outcry was heard when PBS aired the documentary, “College Can Be Killing.”
I was interviewed by a prominent TV network, where I expressed my concern that, on college campuses, suicide counseling might be a task handed to student-counselors. Calls came in demanding my license be pulled, calling me a fraud and worse; some were quite threatening. An anonymous, whispering caller told me I was being investigated. Quite unsettling. I survived to live another day.
Were they affiliated with colleges? I can’t be sure, but I suspect many were, because that’s how some identified themselves in the emails they sent.
What caused such anger regarding my statement? As a psychologist, I know that when you cut to the quick, the outrage is palpable. I must have struck a nerve.
Suicide is a topic no college wants to deal with openly, especially when they have a suicide problem. Exam time, holidays, and near graduation are when the stats begin to climb.
The emotional state of depressive hopelessness is a significant and persistent factor in suicide attempts. But college isn’t the setting where most suicides occur.
Suicide is a serious issue among university and college students, but completed suicide is more prevalent among adolescents generally. “The number of adolescent deaths that result from suicide in the United States had been increasing dramatically during recent decades until 1990 when it began to decrease modestly.
“From 1950 to 1990, the suicide rate for adolescents 15 to 19 years old increased by 300%, but from 1990 to 2013, the rate in this age group decreased by 28%. In 2013, there were 1748 suicides among people 15 to 19 years old. The exact number of deaths from suicide actually may be higher because some of these deaths are recorded as accidental.”
“Accidental death” rulings present a serious problem for gathering the suicide statistics needed to provide help before it’s too late. Because such misclassification skews the data, it does more harm than good.
Often an “accidental death” shields the school from unwanted negative publicity. Then, too, the parents are offered a degree of comfort if comfort is ever a word that can be connected to the self-constructed death of a child.
Certain professions have high rates of suicide, too. It is well known that physician deaths from suicide are hidden, both to spare the family additional grief and to preserve life insurance payouts. Insurance policies often include a clause denying payment if the insured dies by suicide within the first two to three years.
Want to check this one out? Look at the obituary notices in major medical journals and look for the year of death. And some cities participate in the ruse by insisting that a suicide note or other evidence must be present or it’s an “unexplained death.”
The CDC has spelled out what medical examiners need to do to determine the cause of death. In the case of a possible suicide, the guidelines have specific requirements:
“There is evidence (explicit and/or implicit) that at the time of injury the decedent intended to kill self or wished to die and that the decedent understood the probable consequences of his or her actions.”
How AI May Help
Artificial intelligence (AI) may provide some assistance in preventing suicides, but there are problems here, too. Not all of those “stopped” suicides, it seems, involved people who were genuinely suicidal. So how many who truly planned to kill themselves were saved? The true rate may be one of the flaws being questioned in the highly touted Facebook app to prevent suicide.
Jason Reid, who founded Chooselife.org, knows about suicide from a personal perspective. His 14-year-old son took his own life in March 2018.
Interviewed for an article on suicide, he said, “People don’t need suicide awareness. They are pretty aware of suicide. What people really want to know is what they can do. We need suicide prevention…”
One research study, published in Depression and Anxiety, described technology that runs on Alexa, Google Assistant, and Siri and can detect PTSD in veterans with 89% accuracy. The program analyzes recordings of individuals’ speech. It is believed that this method may be extended to other groups as well; adolescents are one group that might benefit from this technology.
Specifically, the research used a small sample of 52 PTSD cases and drew on clinical interviews to extract 40,526 speech features for the algorithm. The article indicates that “this study demonstrates that a speech-based algorithm can objectively differentiate PTSD cases from controls…”
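To make the idea concrete, here is a minimal sketch of how a classifier can separate groups using numeric speech features. Everything in it is invented for illustration: the three-dimensional feature vectors stand in for the study’s 40,526 extracted features, and a simple nearest-centroid rule stands in for the study’s actual algorithm, which is not described in enough detail here to reproduce.

```python
# Toy sketch: labeling speakers by comparing their feature vectors to
# group averages. Feature values are invented placeholders (think pitch
# variability, pause length, speech rate), not data from the study.

def centroid(rows):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def distance_sq(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_centroid_label(x, centroids):
    """Return the group label whose centroid is closest to vector x."""
    return min(centroids, key=lambda label: distance_sq(x, centroids[label]))

# Invented training vectors for two groups of speakers.
cases    = [[0.9, 0.8, 0.2], [0.8, 0.9, 0.3]]
controls = [[0.2, 0.1, 0.9], [0.3, 0.2, 0.8]]

centroids = {"case": centroid(cases), "control": centroid(controls)}

print(nearest_centroid_label([0.85, 0.7, 0.25], centroids))  # → case
```

The reported 89% accuracy came from a far richer model and feature set; the sketch only shows the shape of the task, mapping a vector of measured speech properties to a diagnostic label.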
Researchers at Lawrence Berkeley National Laboratory, the Department of Veterans Affairs, and the Department of Energy are working together toward developing an algorithm to identify veterans who are at risk for self-harm or suicide; veterans die by suicide at a rate of 22 per day. The database they are using is substantial, containing medical and other data on 700,000 veterans and 40,000 patients from a hospital-based sample.
Another methodology being used is fMRI, which displays nerve pathways utilized during thought processes. This method allowed researchers to evaluate these nerve patterns “to determine with 91% accuracy those who had suicidal ideation and those who did not. What’s more, among those with suicidal thoughts, the algorithm differentiated with 94% accuracy those who had made suicide attempts from those who had not.” But is fMRI practical when you need a quick way to assess someone for suicidality?
Finding the Helpers
Artificial intelligence can be used to reach out to those who need suicide prevention assistance but also to identify those who might be in “helper” positions. With that thought in mind, a team at the University of Southern California designed an algorithm to identify groups that could be trained to be helpful in this way.
The goal was not only to develop the algorithm but also a plan for positioning trained helpers in networks that would identify persons in need. “We want to ensure that a maximum number of people are being watched out for, taking into account resource limitations and uncertainties of open-world deployment. For example, if some of the people in the network are not able to make it to the gatekeeper training, we will want to have a robust support network,” one of the researchers indicated.
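One common way to frame this kind of placement problem is as coverage on a social graph: pick a limited number of people to train so that as many others as possible are connected to a trained helper. The sketch below uses an invented six-person network and a plain greedy heuristic; the USC algorithm additionally handles uncertainty, such as trainees failing to attend, which this toy version ignores.

```python
# Toy sketch: greedily choosing k "gatekeepers" so that as many people
# as possible in a small network are a trained helper or adjacent to one.
# The network and budget are invented for illustration.

network = {
    "ana": {"ben", "cal", "dee"},
    "ben": {"ana", "cal"},
    "cal": {"ana", "ben", "eli"},
    "dee": {"ana"},
    "eli": {"cal", "fay"},
    "fay": {"eli"},
}

def greedy_gatekeepers(graph, k):
    """Pick k nodes, each time taking the one that covers the most
    not-yet-covered people (itself plus its neighbors)."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(graph, key=lambda n: len(({n} | graph[n]) - covered))
        chosen.append(best)
        covered |= {best} | graph[best]
    return chosen, covered

chosen, covered = greedy_gatekeepers(network, 2)
print(chosen, len(covered))  # two gatekeepers cover all six people here
```

Greedy coverage is a standard baseline for this family of problems; its appeal is that each added gatekeeper is chosen to watch over the largest number of people still unprotected.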
They plan to deploy the algorithm in a manner that ensures fairness and transparency. Also, homeless youth were another group that they wished to help with their AI tool.
Facebook began utilizing an AI program that “scans posts and live videos for threats of suicide and self-harm and alerts a team of human reviewers, who can contact emergency responders if needed.” Facebook founder Mark Zuckerberg indicated on his site that “In the last month alone, these AI tools have helped us connect with first responders quickly more than 100 times.”
Question: How many people use Facebook daily, and is 100 a significant number of “saves”? Were they “saves” at all? The problem with Facebook’s work in this area is that the company is not revealing its true data set.
Canada has also initiated an AI program that will screen social media posts for warnings of suicide. The initiative will identify suicide-related behaviors from thoughts to threats to attempts. With those aims in mind, a research program has been designed to identify patterns of online behavior. The question remains, however, as to whether these people genuinely intend self-harm behavior.
One text-messaging-based crisis counseling hotline, Crisis Text Line, uses machine learning to collect and analyze words and emojis that can signal a high risk of suicidal ideation or self-harm. It has amassed data on the order of 30 million texts exchanged with users, and the data is available online.
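At its simplest, this kind of triage amounts to scoring incoming messages against a lexicon of signal terms and escalating the highest-scoring ones to a human counselor. The words, weights, and threshold below are invented placeholders for illustration; Crisis Text Line’s actual model is far larger and learned from its data rather than hand-written.

```python
# Toy sketch of text-based risk triage. The lexicon and weights are
# invented, not Crisis Text Line's real (learned) model.

RISK_WEIGHTS = {
    "hopeless": 3,
    "goodbye": 3,
    "alone": 2,
    "tired": 1,
}

def risk_score(message):
    """Sum the weights of lexicon words appearing in the message."""
    words = message.lower().split()
    return sum(RISK_WEIGHTS.get(w, 0) for w in words)

def triage(message, threshold=4):
    """Route a message to a human counselor if its score crosses threshold."""
    return "escalate" if risk_score(message) >= threshold else "queue"

print(triage("i feel so alone and hopeless"))  # → escalate
```

The value of the machine-learning version is precisely that it discovers which words, phrases, and emojis actually predict risk in 30 million real conversations, rather than relying on a hand-picked list like this one.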
Interesting aspects of the data have indicated, so far, that Wednesday “is the most anxiety-provoking day of the week. Crises involving self-harm often happen in the darkest hours of the night.” More to come as the data is churned further.
Although this may seem simplistic, it is exactly the knowledge we need to help those in need. The work continues as each new algorithm improves itself through deep learning and, in that way, reaches more of the people at risk of suicide.