Facebook, Google, Instagram, and Snapchat will work alongside Samaritans to try to limit harmful content on their platforms.
The initiative was announced at a roundtable on Monday (29 April) by the health secretary, Matt Hancock, the Guardian reported.
The scheme is part of the government’s ongoing effort to make social media firms more accountable for the amount of harmful content that is posted on their websites.
Facebook recently responded by updating its policies on suicide, self-injury, and eating disorders to make sure that such content is removed.
Ruth Sutherland, chief executive of the Samaritans, said: “There is no black and white solution that protects the public from content on self-harm and suicide, as they are such specific and complex issues.”
“That is why we need to work together with tech platforms to identify and remove harmful content while being extremely mindful that sharing certain content can be an important source of support for some.”
Earlier this year, Facebook-owned Instagram received intense criticism after the father of 14-year-old Molly Russell, who took her own life, accused the site of allowing self-harm content to proliferate.
In February, Instagram said it would ban all graphic self-harm images as part of a series of changes in response to Molly Russell's death.
As part of this wider effort, the UK government has threatened to impose fines on social media firms that fail to protect their users from harmful content.