Harmful content: Facebook official tells Senate that Instagram has protections in place for teens
A Facebook executive told Congress on Thursday that the company is working to protect young people on its platforms, disputing the way a recent newspaper story characterized what its internal research shows.
The company has recently faced outrage over its handling of internal research on Instagram's harm to teens.
“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.
From June to August of this year, Facebook removed more than 600,000 Instagram accounts that didn't meet the minimum age requirement of 13, Davis said.
Davis was summoned by the panel amid mounting scrutiny of how Facebook handled internal findings indicating potential harm to some of its users, especially girls, while publicly downplaying the negative impacts.
The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.
For some teens devoted to Instagram, the peer pressure generated by the visually focused app led to mental-health and body-image problems and, in some cases, eating disorders and suicidal thoughts. It was Facebook's own researchers who alerted the social network giant's executives to Instagram's destructive potential.
SOURCE: AP