Google Using Advanced AI for Suicide Prevention


Google has been developing advanced AI systems, and says it is now putting them to work keeping people safe.

The search giant shared more details on Wednesday about how it is using AI to help prevent suicide and domestic violence, and to ensure that people don’t see graphic content when they aren’t looking for it.

When users search for phrases like “suicide” or “domestic violence,” Google displays a box with information on how to get help. It fills these boxes with phone numbers and other resources developed in partnership with local organizations and experts.

However, Google has found that not all search terms connected to personal crises are explicit, and many are geographically specific. Certain locations, for example, become known as “suicide hot spots.” Previously, Google had to manually tag queries about known hot spots so that harm-reduction information boxes would appear. Thanks to newer AI techniques, though, the search engine can detect that a query relates to suicide, and surface those boxes, without explicit human tagging.

People contemplating suicide in Australia, for example, might search Google for “Sydney suicide hot spots.” Google says its new language-processing models let it recognize that such a user is really looking for jumping spots, and may need help.
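
Google hasn’t published how this detection works internally, but the general pattern it describes, scoring a free-text query for crisis intent rather than matching a hand-maintained keyword list, can be sketched with an off-the-shelf zero-shot classifier. Everything below, from the labels to the 0.8 threshold, is a hypothetical illustration, not Google’s implementation:

```python
# Hypothetical sketch: flag crisis-intent queries with a zero-shot classifier
# instead of a hand-tagged keyword list. Not Google's system.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

CRISIS_LABELS = ["suicide or self-harm crisis",
                 "domestic violence",
                 "ordinary information seeking"]

def needs_help_box(query: str, threshold: float = 0.8) -> bool:
    """Return True when the query most likely reflects a personal crisis."""
    result = classifier(query, candidate_labels=CRISIS_LABELS)
    top_label, top_score = result["labels"][0], result["scores"][0]
    return top_label != "ordinary information seeking" and top_score >= threshold

# "Sydney suicide hot spots" never says "help me", but a model that reads
# the whole query in context can still route it to the crisis-resources box.
print(needs_help_box("Sydney suicide hot spots"))
```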

“Not all crisis language is obvious, particularly across languages and cultures,” said Anne Merritt, a Google product manager who worked on the harm-reduction effort.

Similarly, longer and more complex searches about relationships can indicate that a person is being abused. Earlier systems could struggle to separate the critical signal from the surrounding noise, but MUM, Google’s newer model, is better at understanding lengthier queries, so it can surface domestic-violence information boxes in these cases, the company says.

Google’s other goal is to ensure that people don’t stumble across graphic content when it isn’t what they’re looking for. The company claims that putting a new AI system to work on this task has cut unwanted graphic search results by 30 percent, even without the SafeSearch setting enabled.

Google used the example of a user looking for a music video. Many music videos involve at least partial nudity, so unless a user specifically asks for it, Google will surface versions that don’t contain graphic content such as nudity or violence. This may seem prudish, and it may be unfair to musicians who incorporate nudity or other graphic content into their work, but as the arbiter of search, Google appears to have chosen to be safe rather than sorry, lest it accidentally traumatize someone.
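
The gating Google describes, demoting flagged results unless the query itself explicitly asks for explicit material, might look roughly like the sketch below. The `is_graphic` flag and the `explicitly_requests_graphic` helper are hypothetical stand-ins for whatever upstream classifiers Google actually runs:

```python
# Hypothetical sketch of the gating described above: results flagged as
# graphic are pushed down unless the query explicitly asks for them.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    is_graphic: bool  # assumed to come from an upstream content classifier

def explicitly_requests_graphic(query: str) -> bool:
    """Stand-in for a real intent classifier; here, a crude keyword check."""
    return any(term in query.lower() for term in ("explicit", "uncensored", "nsfw"))

def rank(query: str, results: list[Result]) -> list[Result]:
    if explicitly_requests_graphic(query):
        return results  # the user asked for it; leave the ordering alone
    # Otherwise demote graphic results behind everything else.
    return sorted(results, key=lambda r: r.is_graphic)

hits = [Result("uncut music video", True), Result("official clean edit", False)]
print([r.title for r in rank("band name music video", hits)])
# -> ['official clean edit', 'uncut music video']
```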

“We really want to be very sure that a user is seeking something out before we return it,” said Emma Higham, a product manager who works on SafeSearch.

The new AI engines driving these changes are called MUM and BERT. MUM, the newer of the two, stands for Multitask Unified Model. According to Google, MUM is better at understanding the intent behind a person’s search, and so produces more nuanced responses than earlier models. It is also trained across 75 languages, allowing it to answer queries using sources written in languages other than the one the user is searching in. Google will begin using MUM for crisis detection in the “coming weeks.”

MUM, according to Google, is 1,000 times more powerful than BERT, which stands for Bidirectional Encoder Representations from Transformers. But don’t dismiss BERT: it understands each word in the context of the words around it, rather than in isolation, which is what makes it such a good tool for filtering out unwanted graphic results.
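
That “words in context” property is easy to see with the publicly released BERT: the same word gets a different vector depending on its neighbors. A minimal sketch using the Hugging Face transformers library (the two sentences and the word “bank” are just an illustration, not anything from Google’s pipeline):

```python
# Minimal demo: BERT gives the same word different vectors in different
# contexts, unlike static word embeddings. Uses the public bert-base model.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Contextual embedding of `word`'s first occurrence in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (num_tokens, 768)
    position = enc.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[position]

river = vector_for("she sat down on the river bank", "bank")
money = vector_for("he deposited the cash at the bank", "bank")
# Well below 1.0: context changed the meaning, and the vector followed.
print(f"{torch.cosine_similarity(river, money, dim=0):.2f}")
```
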
An information box and cleaner search results can only do so much against the stream of stress and trauma that pervades modern life, especially on the internet. But applications like these give Big Tech all the more reason to invest in such tools.

