In suicide prevention literature, “gatekeepers” are community members who may be able to offer help when someone expresses suicidal thoughts. It’s a loose designation, but it generally includes teachers, parents, coaches, and older coworkers—anyone with some form of authority and ability to intervene when they see something troubling.
Could it also include Google? When users search certain key phrases related to suicide methods, Google’s results prominently feature the number for the National Suicide Prevention Lifeline. But it’s not a foolproof system. Google can’t edit webpages themselves, just search results, meaning an internet user looking for information about how to kill herself could easily find it through linked pages or on forums, never having used a search engine at all. At the same time, on the 2019 internet, “run me over” is more likely to be a macabre expression of fandom than a sincere cry for help—a nuance a machine might not understand. Google’s AI is also much less effective at detecting suicidal ideation when the person searches in languages other than English.
Ultimately, search results are a useful, but very broad, area to apply prevention strategies. After all, anyone could be looking for anything for any reason. Google’s latest foray into algorithmic suicide prevention is more targeted, aimed at people who are already asking for help. In May, the tech giant granted $1.5 million to the Trevor Project, a California-based nonprofit that offers crisis counseling to LGBT teenagers via a phone line (TrevorLifeline), a texting service (TrevorText), and an instant-messaging platform (TrevorChat). The project’s leaders want to improve TrevorText and TrevorChat by using machine learning to automatically assess suicide risk. It all centers on the initial question that begins every session with a Trevor counselor: “What’s going on?”
“We want to make sure that, in a nonjudgmental way, we’ll talk suicide with them if it’s something that’s on their mind,” said Sam Dorison, Trevor’s chief of staff. “And really let them guide the conversation. Do [they] want to talk about coming out [or] resources by LGBT communities within their community? We really let them guide the conversation through what would be most helpful to them.”
Currently, those who reach out enter a first-come-first-served queue. Trevor’s average wait time is less than five minutes, but in some cases, every second counts. Trevor’s leadership hopes that eventually, the AI will be able to identify high-risk callers via their response to …
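The change Trevor’s leaders describe, from a first-come-first-served line to one where flagged high-risk contacts are seen sooner, can be sketched as a priority queue. This is a hypothetical illustration of the general idea, not Trevor’s actual system; the names and risk scores are invented:

```python
import heapq
import itertools

class TriageQueue:
    """Illustrative risk-based triage: higher risk scores are served first;
    ties fall back to arrival order (first come, first served)."""

    def __init__(self):
        self._heap = []
        self._arrival = itertools.count()  # tiebreaker preserving FIFO order

    def add(self, name, risk_score):
        # heapq is a min-heap, so negate the score to pop the highest risk first
        heapq.heappush(self._heap, (-risk_score, next(self._arrival), name))

    def next_person(self):
        _, _, name = heapq.heappop(self._heap)
        return name

queue = TriageQueue()
queue.add("contact A", risk_score=0.2)
queue.add("contact B", risk_score=0.9)  # hypothetically flagged by the model
queue.add("contact C", risk_score=0.2)

print(queue.next_person())  # contact B jumps the line
print(queue.next_person())  # contact A: arrived before C at equal risk
```

In a plain queue, contact B would wait behind A; here, the model’s risk score moves B to the front while equally scored contacts keep their arrival order.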
Source: The Atlantic – Health