How SMS and applications of artificial intelligence are working to prevent suicide
AI-powered SMS warnings
Trigger warning: This content discusses self-harming behavior such as suicide and self-inflicted injuries.
For years, tech giants such as Facebook and Instagram have invested in artificial intelligence and machine learning largely in order to sell us things more effectively than anyone else. But earlier this year, Facebook turned that same technological power to something more serious: helping to prevent suicide.
It is not just major brands that are taking note of the power of AI for preventing suicide and self-harm. Doctors at research hospitals, and even the US Department of Veterans Affairs, are piloting new AI-driven suicide prevention platforms to capture useful data. Their goal is to build predictive models that allow earlier intervention, because prevention is the best option when it comes to mental health.
How are SMS and applications of artificial intelligence being used to prevent suicide?
Companies have long used applications of artificial intelligence to analyze retail purchase data, but some are now applying the same techniques to predicting and preventing suicide and self-harm. This includes purpose-built services geared towards counseling people who might be going through a difficult time in their lives.
Crisis Text Line
Crisis Text Line is a text-based counseling service in the United States. Anyone in distress who needs to chat with a counselor can simply send an SMS to the Crisis Text Line short code and be connected, via SMS, with a trained crisis counselor. The service currently uses machine learning to look at the words and emojis people use that can signal a high risk of suicidal ideation or self-harm. Counselors are then told which texters need to jump to the front of the queue.
Crisis Text Line is collecting massive amounts of data from the nearly 30 million SMS messages it has received. This data has surfaced some striking insights, such as the fact that Wednesday is the most anxiety-provoking day of the week and that crises involving self-harm often happen late at night. Crisis Text Line analyzes both the conversation between counselor and texter and the metadata around these conversations.
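In spirit, this kind of triage can be sketched as scoring each incoming message for risk signals and reordering the queue so the highest-risk texters are seen first. The sketch below is purely illustrative: the word and emoji weights are hypothetical, whereas Crisis Text Line's real system learns its signals from millions of conversations.

```python
import heapq

# Hypothetical risk weights for words and emojis -- illustrative only;
# a real system would learn these signals from data, not hand-code them.
RISK_WEIGHTS = {
    "pills": 3.0,
    "goodbye": 2.5,
    "alone": 1.0,
    "💊": 3.0,
    "😢": 0.5,
}

def risk_score(message: str) -> float:
    """Sum the weights of any high-risk tokens found in the message."""
    text = message.lower()
    return sum(weight for token, weight in RISK_WEIGHTS.items() if token in text)

def triage(messages: list[str]) -> list[str]:
    """Return messages ordered highest-risk first, so counselors
    see the most urgent texters at the front of the queue."""
    # heapq is a min-heap, so negate the score for highest-first ordering;
    # the index breaks ties in arrival order.
    heap = [(-risk_score(m), i, m) for i, m in enumerate(messages)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

Even this toy version shows the key design choice: the model does not replace counselors, it only reorders their queue.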
Companion by Cogito
Cogito is a DARPA-funded (Defense Advanced Research Projects Agency) company that is testing an app that builds a picture of your mental health simply by listening to the sound (and tone) of your voice. The app, called Companion, is opt-in software that passively collects what users say during the day and uses this data to pick up vocal cues that can signal depression and mood changes.
Companion does not look at the content of the words, but rather at inflection, tone, energy, fluidity of speech, and the speaker's level of engagement in a conversation. This makes it one of the more distinctive applications of artificial intelligence, as most apps analyze the content of conversations rather than the tone. The US Department of Veterans Affairs has been testing the platform with about one hundred veterans, and while the data will only be available later this year, Companion has already been able to detect major life changes that can lead to self-harm, such as homelessness.
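The idea of content-free vocal analysis can be illustrated with a minimal sketch: split a recorded waveform into short frames and measure its energy and how much that energy varies. The features below are illustrative stand-ins for the kinds of signals an app like Companion might track; its actual feature set is not public.

```python
import numpy as np

def vocal_features(samples: np.ndarray, rate: int = 16000,
                   frame_ms: int = 50) -> dict:
    """Compute simple content-free vocal features from a mono waveform.

    Illustrative only: real systems use far richer acoustic features
    (pitch, speaking rate, pauses), but the principle is the same --
    no words are inspected, only how the voice sounds.
    """
    frame_len = rate * frame_ms // 1000
    n_frames = len(samples) // frame_len
    frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
    # Root-mean-square energy per frame: persistently low, flat energy
    # across a day of speech could be one cue of reduced engagement.
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    return {
        "mean_energy": float(rms.mean()),
        "energy_variability": float(rms.std()),
    }
```

Because only summary statistics like these leave the device, an approach of this kind can, in principle, flag mood changes without ever transcribing what was said.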
Facebook’s suicide prevention AI
When it comes to applications of artificial intelligence, Facebook is ahead of the game. Its software, termed “proactive detection”, scans all posts for patterns of suicidal thinking. The results are analyzed and used to send mental health resources to at-risk users, and to contact first responders if necessary.
The thinking behind this process is that by using AI to flag worrying posts to human moderators, rather than waiting for user reports to come in, Facebook can shorten the time it takes to get help to someone having suicidal thoughts or engaging in self-harm. Facebook previously used this AI only in the USA, but it is now rolling out tests around the world. However, it cannot do so in the European Union, where the GDPR (General Data Protection Regulation) prevents profiling users based on sensitive information.
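The flagging step described above amounts to mapping a model's risk score for a post onto an action tier. The thresholds and tier names below are hypothetical, used only to show the shape of such a pipeline; Facebook's actual model and cutoffs are not public.

```python
# Hypothetical cutoffs -- illustrative only, not Facebook's real values.
RESOURCE_THRESHOLD = 0.3   # surface support resources to the poster
REVIEW_THRESHOLD = 0.6     # queue the post for a human moderator

def route_post(model_score: float) -> str:
    """Map a post's model risk score (0.0-1.0) to an action tier.

    Note that the AI never contacts first responders directly:
    per the process described, that decision rests with the human
    moderators who review flagged posts.
    """
    if model_score >= REVIEW_THRESHOLD:
        return "flag_for_human_review"
    if model_score >= RESOURCE_THRESHOLD:
        return "surface_support_resources"
    return "no_action"
```

Keeping a human moderator between the model and any emergency escalation is the design choice that makes proactive detection a triage aid rather than an automated decision-maker.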
Prediction before prevention
By combining applications of artificial intelligence, machine learning, and SMS, companies such as Cogito, Crisis Text Line, and even Facebook can predict suicidal and self-harming behavior before it escalates, rather than reacting after the fact. This technology can detect behavioral changes that might not be obvious to a primary caregiver unless self-reported by the individual.
This rich data offers more than just a snapshot of someone’s mental health. To a user, their own behavior might just look like a few missed gym trips, a few missed calls to family and friends, and a few days spent in bed. But to these AI-driven apps, the same pattern raises red flags, and the right people can be contacted to provide the right help.
While many of these innovations are still in their early stages and may seem like a far-off future, the fact is that AI can be used for far more than gathering consumer data. To find out more about the possibilities of combining applications of artificial intelligence with SMS and other technologies, read our recent article on the topic.