Healthcare

Can the smartphone help predict suicides?

In March, Katelin Cruz left her most recent psychiatric hospitalization with a familiar mix of feelings. She was relieved to be out of the ward, where staff hid her shoelaces and sometimes followed her to the shower to make sure she didn’t try to injure herself.

But she said her life outside was as unstable as ever, with a pile of unpaid bills and no permanent home. It was easy to slip back into suicidal thoughts. For fragile patients, the weeks after discharge from a psychiatric institution are a notoriously difficult time, with a suicide rate about 15 times the U.S. national average, according to one study.

This time, however, Cruz, 29, left the hospital as part of a research project that is trying to use advances in artificial intelligence to do something psychiatrists have been trying to do for centuries: predict who is likely to attempt suicide and when, so that an intervention is possible.

On her wrist, she wore a Fitbit programmed to track her sleep and physical activity. On her smartphone, an app collected data about her mood, her movements, and her social interactions. Each device provided a stream of information to a team of researchers at Harvard University.

In the field of mental health, few new areas generate as much excitement as machine learning, which uses computer algorithms to better predict human behavior. At the same time, there is great interest in biosensors that can track a person’s mood in real time, drawing on music choices, social media posts, facial expressions, and tone of voice.

Matthew K. Nock, a Harvard psychologist who is one of the country’s top suicide researchers, hopes to unite these technologies into a kind of early warning system that could be used when an at-risk patient is discharged from the hospital. He gave an example of how it might work: a sensor reports that a patient is sleeping poorly, the patient reports a low mood on questionnaires, and GPS shows that the person is not leaving the house. An accelerometer on the phone shows that the person is moving around a lot, which suggests agitation. The algorithm flags the patient. An alert appears on a dashboard. And, just in time, a clinician reaches out with a phone call or a message.
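To make the logic of that example concrete, here is a minimal, purely illustrative sketch in Python of the kind of rule Nock describes. The field names and thresholds are invented for illustration; a deployed system would use a model trained on patient data rather than hand-set cutoffs.

```python
from dataclasses import dataclass


@dataclass
class DailySignals:
    """One day of collected signals (hypothetical schema, not the study's)."""
    sleep_hours: float    # from the wearable
    mood_score: int       # self-reported, 0 (worst) to 10 (best)
    km_traveled: float    # from GPS
    restlessness: float   # accelerometer-based agitation proxy, 0 to 1


def should_flag(day: DailySignals) -> bool:
    """Return True when the signals match the risk pattern in the article:
    disturbed sleep, low mood, staying home, and high agitation.
    All thresholds here are invented for illustration."""
    disturbed_sleep = day.sleep_hours < 5.0
    low_mood = day.mood_score <= 3
    homebound = day.km_traveled < 0.5
    agitated = day.restlessness > 0.7
    return disturbed_sleep and low_mood and homebound and agitated


if __name__ == "__main__":
    today = DailySignals(sleep_hours=3.5, mood_score=2,
                         km_traveled=0.1, restlessness=0.85)
    if should_flag(today):
        print("Alert: clinician follow-up suggested")  # the "alert on a dashboard"
```

Even this toy version shows where the hard questions live: every threshold is a trade-off between false positives and false negatives, which is exactly the tension the researchers describe below.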

There are many reasons to doubt that an algorithm can achieve this level of precision. Suicide is such a rare event, even among those most at risk, that any effort at prediction is bound to produce false positives, forcing interventions on people who may not need them. False negatives, meanwhile, could expose doctors to legal liability.

Algorithms need granular, long-term data from large numbers of people, and it is nearly impossible to observe large numbers of suicides. Finally, the data needed for this kind of monitoring raises concerns about invading the privacy of some of society’s most vulnerable members.

Nock is aware of all these arguments but persists, partly out of frustration. “With all due respect to those who have been doing this work for decades, for a century, we haven’t learned much about identifying people at risk and intervening. The suicide rate now is the same as it was a hundred years ago. So, frankly, we’re not getting better.”

The data

On an August afternoon at Harvard, a data scientist named Adam Bear was sitting at a monitor in Nock’s lab, looking at zigzag graphs of a subject’s stress levels over the course of a week.

When moods are mapped as data, patterns emerge, and looking for them is Bear’s job. He spent several months this summer analyzing the days and hours of 571 participants who, after seeking medical attention for suicidal thoughts, agreed to be monitored for six months. During that period, two died by suicide and between 50 and 100 made attempts.

The team is most interested in the days before a suicide attempt. Patterns have already emerged: although suicidal impulses often do not intensify in the period before an attempt, the ability to resist them appears to decline. Sleep deprivation seems to contribute to this.

Nock has been looking for ways to study these patients since 1994, when he had an experience that shocked him. During an undergraduate internship in the UK, he was assigned to a locked unit for violent and self-harming patients. There he saw things he had never seen before: patients with cuts on their arms. One of them tore out his own eyeball. A young man he befriended, who appeared to be getting better, was later found in the River Thames.

He got another shock when he began asking doctors how these patients were treated and realized how little they knew. He remembers one of them replying, “We prescribe some medication, talk to them, and hope they get better.”

Nock concluded that one reason was that it had never been possible to study large numbers of people with suicidal thoughts the way we can study patients with heart disease or tuberculosis: “Psychology has not advanced as far as other sciences because we have been doing it wrong: we didn’t go out, find important behavior, and observe it. But with the advent of smartphone-based apps and wearable sensors, we have data from many different channels and, increasingly, the ability to analyze it and watch people as they go about their lives.”

Tell the truth to a computer

It was around 9 pm, a few weeks into the six-month study, that the question popped up on Cruz’s phone: “How strong is your desire to kill yourself?” Without thinking, she dragged her finger to the end of the bar: 10. A few seconds later, she was asked to choose between two statements: “I’m certainly not going to kill myself today” and “I’m certainly going to kill myself today.” She opted for the second.

Fifteen minutes later, her phone rang. It was a member of the research team, who had called 911 and kept Cruz on the line until the police arrived; then she passed out. Later, when she regained consciousness, a medical team was massaging her sternum, a painful procedure used to revive people who have overdosed.

Cruz has a pale, angelic face and wears a fringe of dark curls. She was studying for a nursing course when a series of mental crises caused her life to change direction. She maintains her geeky interest in science, joking that the ribcage drawn on her T-shirt is “anatomically correct”.

She soon became interested in the experiment and dutifully answered the questions six times a day when the app on her phone asked about her suicidal thoughts. The notifications were intrusive but also comforting: “It felt like I wasn’t being ignored. Having someone know how I feel takes some of the weight off.”

On the night of her attempt, she was alone in a hotel room in Concord, Massachusetts. She didn’t have enough money for another night there, and her belongings were in garbage bags on the floor.

She confessed that she was tired “of feeling that I had nobody and nothing.” She said she thought technology, with its anonymity and lack of judgment, made it easier to ask for help: “I think it’s easier to tell the truth to a computer.”

Recently, as the six-month study came to an end, Cruz filled out her final questionnaire with a pang of sadness. She would lose the dollar she received for each answer. And she would miss the feeling that someone was watching over her, even if it was a faceless someone, at a distance, through a device.

“Honestly, I feel a little safer knowing that someone cares enough to read this data every day, you know? I’ll be kind of sad when it’s over.”


WHERE TO LOOK FOR HELP IN BRAZIL

Mental Health Map

A website that brings together several support initiatives: www.mapasaudemental.com.br

CVV (Life Appreciation Center)

Volunteers answer toll-free calls 24 hours a day at number 188: www.cvv.org.br.

BE ALERT IF SOMEONE CLOSE TO YOU IS…

  • Showing hopelessness or an excessive preoccupation with their own death

  • Expressing suicidal thoughts or intentions

  • Withdrawing from social activities and cutting off contact with others

  • Also: losing a job, facing discrimination based on sexual orientation or gender identity, suffering psychological or physical aggression, or neglecting self-care

