People increasingly share depressive or suicidal thoughts while live streaming on Facebook, and the company is now taking a significant step to address the problem and help those at risk. From now on, if users see something worrying on a Live broadcast, it will be easier for them to get help.

The company is introducing new suicide prevention measures, including crisis support and streamlined reporting. It’s aiming to help at-risk people on Facebook Live and Messenger.

If users see something concerning on a Live broadcast, they will see an option to contact an organization to get help for themselves or the broadcaster. Participating organizations are Crisis Text Line, the National Eating Disorders Association, and the National Suicide Prevention Lifeline.

The company rolled out earlier versions of such tools last year, enabling users to flag worrying content for review by Facebook. Now users can instead chat directly with the organizations via Messenger.

Content is also monitored by artificial intelligence, which uses pattern recognition to identify potentially suicidal people and reach out to them even if no one has reported them yet.

Earlier this year, a 12-year-old girl died by suicide on a live stream. Videos of her death then found their way onto Facebook, and it took the company two weeks to purge them. With these new measures, Facebook aims to prevent such incidents from happening again.

Facebook Crisis Support

For U.S. users: if you or anyone you know is considering suicide or experiencing intense feelings of anxiety or depression, please call the National Suicide Prevention Lifeline at 1-800-273-8255. For a similar helpline in your country, please visit the International Association for Suicide Prevention.