Instagram wants to make its app safer for underage users
Instagram wants to launch a version for children under the age of 13, an age group currently barred from the photo-sharing app.
The move comes as the social network promised to clean up its app following criticism of predatory behaviour against teenage users.
An Instagram spokesman said: "Increasingly kids are asking their parents if they can join apps that help them keep up with their friends.
"Right now there aren’t many options for parents, so we’re working on building additional products that are suitable for kids, managed by parents.
"We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”
According to an internal company announcement on Thursday, first reported by BuzzFeed, Instagram is keen to launch the new version as quickly as possible.
Vishal Shah, Instagram’s vice president of product, wrote on an employee message board on Thursday that the company had “identified youth work as a priority”.
He said: “We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.”
Pavni Diwanji, an executive who joined Instagram parent Facebook last year, will help create the new app. Ms Diwanji moved from Google, where she oversaw YouTube Kids, the video site's locked-down, advertising-free sister site for young children.
Regulators may be an obstacle for a launch in the UK and Europe, where Facebook has struggled to gain approval to launch a Messenger app for children as young as six.
Earlier this week, Instagram announced commitments to making children safer on the app, including blocking adults from sending messages to children.
Under the new rules, adults will be unable to directly message any users under the age of 18 who do not follow them.
Alerts will be sent to young users encouraging them to be cautious in conversations with adults they have connected with but who may have demonstrated suspicious behaviour. Suspicious behaviour could include sending a high number of friend or message requests to teenage users, Instagram said.
The company has been criticised for enabling bullying and abuse and exposing teenagers to suicide-encouraging material on the app.
Adam Mosseri, head of Instagram, told the Telegraph in 2019 that it was "overwhelming" to have Instagram blamed by Ian Russell for the death of his daughter Molly, 14, who took her own life after viewing self-harm images online.
“I focused a lot on safety issues and integrity issues and well being issues at Instagram and before at Facebook,” he said.
“They are issues I have always taken very seriously but it becomes so much more real so much more intense when you have the story of an individual particularly if something tragic has happened.”