By: Marcy Gordon
Instagram on Tuesday launched a feature that urges teenagers to take breaks from the photo-sharing platform and announced other tools aimed at protecting young users from harmful content on the Facebook-owned service.
The previously announced “Take A Break” feature encourages teens to stop scrolling if they have been on the social media platform for a while, Instagram head Adam Mosseri said in a blog post. It rolled out to the U.S., United Kingdom, Ireland, Canada, New Zealand and Australia on Tuesday and would reach the rest of the world early next year, he said.
Young users will see notifications about the feature and be urged to set reminders to take more breaks going forward, the post said. It’s one of the efforts that Facebook, renamed Meta Platforms, has touted on its platforms as it weathers backlash about not doing enough to rein in harmful content and faces new legislation looking to impose restrictions on tech giants.
Former Facebook product manager turned whistleblower Frances Haugen has testified to U.S. and European lawmakers working on those measures, citing internal company research suggesting that peer pressure generated by Instagram has led to mental health and body-image problems in young users, especially girls, and in some cases, eating disorders and suicidal thoughts.
She spoke again last week to Congress, urging U.S. lawmakers to move forward with proposals introduced after her first appearance in October. Those include restrictions on the long-standing legal protections for speech posted on social media platforms.
Haugen has also offered guidance on new online rules that are much further along in the U.K. and the European Union, which have pioneered efforts to rein in big technology companies.
On Tuesday, Instagram also announced that its first tools for parents will roll out early next year, allowing them to see how much time their teens spend on Instagram and set time limits.
The social media platform also said it is developing features that will stop people from tagging or mentioning teens who don't follow them, nudge young users toward other topics if they have been focused on one subject for a while, and be stricter about the posts, hashtags and accounts it recommends, in an effort to cut down on potentially harmful or sensitive content.
On Wednesday, however, AP reported that the head of a Senate panel examining social media’s negative effects on young people has dismissed as “a public relations tactic” some safety measures announced by Facebook’s popular Instagram platform.
Mosseri on Wednesday faced off with senators who were angry over revelations of how the photo-sharing platform can harm some young users and who demanded that the company commit to making changes.
Under sharp questioning by senators of both parties, Mosseri defended the company’s conduct and the efficacy of its new safety measures. He challenged the assertion that Instagram has been shown by research to be addictive for young people. Instagram has an estimated 1 billion users of all ages.
The parental oversight tools “could have been announced years ago,” Sen. Richard Blumenthal, D-Conn., told Mosseri. The newly announced measures fall short and many of them are still being tested, he said.
A pause that Instagram imposed in September on its work on a kids’ version of the platform “looks more like a public relations tactic brought on by our hearings,” Blumenthal said.
“I believe that the time for self-policing and self-regulation is over,” said Blumenthal. “Self-policing depends on trust. Trust is over.”
Mosseri testified as Facebook has been roiled by public and political outrage over Haugen's disclosures.
The Senate panel has examined Facebook's use of its own researchers' findings, which indicated potential harm to some young users, especially girls, even as the company publicly downplayed the negative effects. For some Instagram-devoted teens, the research detailed in the Facebook documents showed, peer pressure generated by the visually focused app led to mental-health and body-image problems and, in some cases, eating disorders and suicidal thoughts.
The revelations in a report by The Wall Street Journal, based on the documents leaked by Haugen, set off a wave of recriminations from lawmakers, critics of Big Tech, child-development experts and parents.
“As Head of Instagram, I am especially focused on the safety of the youngest people who use our services,” Mosseri testified. “This work includes keeping underage users off our platform, designing age-appropriate experiences for people ages 13 to 18, and building parental controls. Instagram is built for people 13 and older. If a child is under the age of 13, they are not permitted on Instagram.”
Mosseri outlined the suite of measures he said Instagram has taken to protect young people on the platform. They include keeping kids under 13 off it, restricting direct messaging between kids and adults, and prohibiting posts that encourage suicide and self-harm.
But, as researchers both internal and external to Meta have documented, the reality is different. Kids under 13 often sign up for Instagram with or without their parents’ knowledge by lying about their age. And posts about suicide and self-harm still reach children and teens, sometimes with disastrous effects.
Senators of both parties were united in condemnation of the social network giant and Instagram, the photo-sharing juggernaut valued at some $100 billion that Facebook acquired for $1 billion in 2012.
In July, Facebook had already said it was working with parents, experts and policymakers when it introduced safety measures for teens on Instagram. The company has likewise worked with experts and other advisers on another product aimed at children, its Messenger Kids app, which launched in late 2017.
Senators pressed Mosseri to support legislative remedies for social media.
Among the legislative proposals put forward by Blumenthal and others, one bill proposes an “eraser button” that would let parents instantly delete all personal information collected from their children or teens.
Another proposal would ban specific features for kids under 16, such as video auto-play, push alerts, "like" buttons and follower counts. Also being floated are a prohibition on collecting personal data from anyone aged 13 to 15 without their consent and a new digital "bill of rights" for minors that would similarly limit the gathering of teens' personal data. (AP)