
Instagram, Facebook, Snapchat and other apps face being forced by law to remove illegal content and sign up to a code of conduct protecting vulnerable users, including children.
The UK's culture and digital minister, Margot James, is expected to announce a compulsory code of conduct for tech giants on Tuesday (5 February), following a BBC investigation into teenager Molly Russell, who took her own life after viewing distressing material about depression and suicide on Instagram.
Details of the code have yet to be disclosed; however, several reports say James will use her speech at a Safer Internet Day conference to launch a policy paper and consultation before introducing the new regulatory regime.
Advertisers have reacted with alarm to the news that their own content was being surfaced alongside suicide-related imagery. Household names including M&S, the Post Office and the British Heart Foundation were found last month to have been inadvertently associated with such inappropriate material.
Trade body Isba has already demanded that an independent, industry-funded body be established to certify content as acceptable for advertising.
A spokesman for the Department for Digital, Culture, Media and Sport said: “We have heard calls for an internet regulator and to place a statutory ‘duty of care’ on platforms, and are seriously considering all options.
“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”
In an open letter published in the Telegraph this week, Adam Mosseri, head of Instagram, admitted: “We are not yet where we want to be on the issues of suicide and self-harm. We have to do everything we can to keep the most vulnerable people who use our platform safe. To be very clear, we do not allow posts that promote or encourage suicide or self-harm.”
Instagram already uses engineers and content reviewers to make it harder for people to find self-harm images. More recently, it has been applying ‘sensitivity screens’ to blur such photos.
The Facebook-owned app is stopping short of removing ‘cutting’ pictures entirely, though.
“We still allow people to share that they are struggling, even if that content no longer shows up in search, hashtags or account recommendations,” wrote Mosseri.
However, it does plan to offer greater support to people who are struggling with self-harm or suicide by connecting them with organizations like the Samaritans.
Last month, the firm’s newly installed head of communications, Nick Clegg, said Facebook would “look [at the issue] from top to bottom and change everything we’re doing if necessary, to get it right.”
“We’re already taking steps shortly to blur pictures, block a number of hashtags that have come to light and, thirdly, to continue to work with the Samaritans and other organizations,” he said.