Mental Illness and the Promise of AI: The search for relief may lie in AI, not alcohol
Mental illness, with symptoms that can include paralyzing fear, paranoid thoughts, and severe life disruption or even destruction, will afflict nearly one in five US adults, according to a National Institute of Mental Health survey completed in 2017. The numbers are significant: 46.6 million people in the US. But in the intervening years since that survey, how many more have been added to its ranks? The illnesses (yes, there are many) range from mild to moderate to severe in their manifestation.
Some will not have a full-blown, diagnosable illness, but they will suffer its pangs. Many will try introspection to seek relief, but it may prove futile, and they may resort to substances such as alcohol or street or prescription drugs.
Alcohol as a relief
As stated in a recent study of anxiety, “Co-occurring anxiety disorders and alcohol use disorders (AUDs) are of great interest to researchers and clinicians. Cumulative evidence from epidemiological and clinical studies over the past few decades has highlighted both the frequency and clinical impact of this comorbidity. Investigations into the unique connections between specific anxiety disorders and AUDs have shown that this association is multifaceted and complex, underscoring the importance of careful diagnostic scrutiny. Of clinical relevance, treatment for people with comorbid anxiety and AUDs can be complicated, and both the methods used and the timing of the interventions are relevant factors in treatment planning and delivery…”
The researchers concluded that "Additional advances and expansion of the empirical evidence are necessary to further move this area of research and clinical practice forward." Seemingly, they had not considered advances in computer technology and artificial intelligence, but that is where the advances are coming from, and the time may be right. Timing is crucial but not always possible, and the disorders, especially the common one of anxiety, linger.
When should someone seek help, and who should they approach? Not knowing when and where to find help, along with denial of the need for it, may begin early in life, as we see in college students in distress.
College students are at risk
Suffering the ills of a mild to severe mental illness, whether transitory or chronic, may impede a blossoming career and cause it to wither under the stress of its demands; sufferers will feel like failures. The feelings bombarding their self-esteem may linger for life, or they will fight in a new way, ultimately seeking help to resurrect the promise of a fulfilling life.
College students, as well as medical students, were found to be vulnerable to anxiety and stress to an alarming degree. One study found, "Among these students, 1 in 4 said they had been diagnosed with or treated for a mental health problem. Furthermore, 20 percent of all students surveyed thought about suicide, 9 percent had attempted suicide, and nearly 20 percent injured themselves.
“These problems were particularly acute among transgender students, with about two-thirds saying that they had hurt themselves and more than one-third saying they had attempted suicide.”
Another study found similarly distressing results. “An estimated 6% of first-year students at this university had current suicide ideation. Depressive symptoms, low social support, affective dysregulation, and father-child conflict were each independently associated with suicide ideation. Only 40% of individuals with suicide ideation were classified as depressed according to standard criteria.” The lack of classification may have led to not seeking help.
AI and its contribution to relief
The search for help for everyone, whether a college student, a worker, or someone with a medical condition, continues. A new "helper" may be at hand, and it lies in technology, not the usual therapist's office.
Those in the field of psychiatry have not always manifested an enthusiastic approach when it comes to artificial intelligence and their profession. “Despite lively and ongoing debate, limited attention has been paid to the views of practicing physicians on the impact of AI on medical professions. This is especially relevant in mental health care, which depends on long-term, empathic relationships between physicians and highly vulnerable patients, and in light of the flood of mental health apps available for download.”
The survey, completed in the spring and summer of 2019, was a global cross-sectional study of psychiatrists. It noted that by 2030, the mental health burden is forecast to cost the global economy some $16 trillion, and one estimate showed that 77% of US counties are underserved by psychiatrists. The World Health Organization estimated that the availability of mental health workers in low-income countries is 100 times lower than in high-income countries.
What were the general comments from the psychiatrists regarding AI and mental illness in psychiatry? Those who completed the survey cited:
1. a lack of empathy in the therapeutic process
2. feelings of antipathy toward AI due to fears of job displacement
3. AI's inability to assess a patient's mental status comprehensively
4. AI's inability to attain a human understanding of mental illness
5. the stigma associated with these illnesses
6. a machine's unawareness of, or inability to simulate, consciousness
Regarding the potential for this technology in mental health and psychiatry, other respondents noted that AI could work in any environment without fatigue or moodiness. Some estimated that it would make fewer medical errors, follow more standardized protocols, and exhibit less bias related to race or gender, resulting in a better quality of life for psychiatrists.
The inherent AI biases
Many have observed that bias regarding race and gender can be "baked in" to large data sets and go unseen in the resulting program. Particular attention, they note, must be paid to how the data set was collected, with careful inspection for potential bias in multiple areas, including race and gender.
Depression and anxiety are diagnosed at higher rates in women than in men. Psychiatry, like other areas of medical care, is not exempt from this bias. These biases may skew the data, and if that data is used, the resulting data set feeding the AI is flawed.
Despite the dangers of bias, AI is forging ahead with new methods of addressing mental illness and improving its treatment. Psychiatrists now have a new tool, and it does have a place in their treatment plans.
As described by researchers, an as-yet-unnamed AI tool has shown promise. The tool "shows that artificial intelligence (AI) can help doctors to diagnose the mental health of patients with speed and accuracy. The new app designed for mobiles has been developed to improve patient monitoring using cues picked up from their speech, and is based on machine-learning tools."
Evaluation requires 5 to 10 minutes of conversation on a phone, during which the potential patient is asked to tell a short story, listen to and then repeat a story, and perform several motor-skill tasks. The patient's speech is then analyzed and compared with previous samples to arrive at an indication of their mental health. The tool was tested with 225 participants, either diagnosed with a psychiatric illness or healthy volunteers, in two locations: rural Louisiana and Northern Norway. The results, which were favorable, were then compared for accuracy.
Further testing is needed to ensure its validity. But another app is already available, and its developers are highly confident in its use.
Tools currently in use
One tool already in use in some mental health settings and by consumers is Woebot, an app-based mood tracker and chatbot that combines AI with CBT (cognitive behavioral therapy). However, there is still some pessimism about the utility and accuracy of this app. Some psychiatrists suggest it may be 5 to 10 years before any algorithms are routinely used in clinical mental health settings. The problems they point to are the data being utilized and the biases we have outlined.
Artificial intelligence, however, is not limited to apps. Researchers at the University of Texas Health Science Center at Houston are using a machine-learning algorithm to assess functional magnetic resonance imaging for the detection and treatment of persons with schizophrenia. The algorithm inspects and measures areas of the brain and identifies those with schizophrenia with 78% accuracy.
According to the Texas researchers, they can also predict, with 82% accuracy, a person's response to specific antipsychotic pharmacologic treatments. At the current juncture, pharmacologic treatment of schizophrenia requires repeated trials with multiple medications (polypharmacy) or single medications separated by washout periods. Treatment is therefore a trial-and-error process that can be extremely trying for the patient and can stretch on for weeks without remission or change in symptoms. The AI program could make this a one-visit diagnosis with effective treatment medication.
The success of AI to date gives hope to all who practice any form of mental healthcare, as well as to patients. AI can provide quick and seemingly accurate information, leading to diagnostic success and pinpointing treatments with greater precision. If the promise of AI comes true, the human frailties now present in healthcare could become a thing of the past.
Viewed as a technological workhorse with great potential, AI will enhance medicine to the benefit of all.