WASHINGTON - The CEO of Facebook’s Instagram is facing lawmakers who are angry over revelations of how the popular photo-sharing platform can harm some young users and who are demanding that the company commit to making changes.
Adam Mosseri is testifying Wednesday at a Senate hearing as Facebook, whose parent company is now named Meta Platforms, has been roiled by public and political outrage over the disclosures by former Facebook employee Frances Haugen. She has made the case before lawmakers in the U.S., Britain and Europe that Facebook’s systems amplify online hate and extremism and that the company elevates profits over the safety of users.
Haugen, a data scientist who had worked in Facebook’s civic integrity unit, buttressed her assertions with a massive trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
A Senate Commerce Committee panel has examined how Facebook handled its own researchers’ findings of potential harm to some of its young users, especially girls, while the company publicly downplayed the negative impacts. For some Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases to eating disorders and suicidal thoughts, the research detailed in the Facebook documents showed.
The revelations in a report by The Wall Street Journal, based on the documents leaked by Haugen, set off a wave of recriminations from lawmakers, critics of Big Tech, child-development experts and parents.
At a subcommittee hearing in September, senators of both parties were united in condemnation of the social network giant and Instagram, the photo-sharing juggernaut valued at some $100 billion that Facebook acquired for $1 billion in 2012.
The lawmakers accused Facebook of concealing the negative findings on Instagram. The panel grilled Antigone Davis, Facebook’s head of global safety, who defended Instagram’s efforts to protect young people using its platform. She disputed the way the Wall Street Journal outlined the research.
Sen. Richard Blumenthal, D-Conn., the subcommittee’s chairman, had called for Meta CEO Mark Zuckerberg to appear before the panel to testify on the Instagram situation. For now, it will be Mosseri fielding those questions.
"After bombshell reports about Instagram’s toxic impacts, we want to hear straight from the company’s leadership why it uses powerful algorithms that push poisonous content to children driving them down rabbit holes to dark places, and what it will do to make its platform safer," Blumenthal said in a prepared statement Wednesday.
Facebook’s public response in September to the outcry over Instagram was to put on hold its work on a kids’ version of the platform, which the company says is meant mainly for children aged 10 to 12.
On Tuesday, Instagram introduced a previously announced feature that urges teenagers to take breaks from the platform. The company also announced other tools that it says are aimed at protecting young users from harmful content.
As far back as July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. The company has also worked with experts and other advisers on another product aimed at children, its Messenger Kids app, which launched in late 2017.
Beyond changes by the company, senators are pressing Mosseri to support legislative remedies for social media.
Among the legislative proposals put forward by Blumenthal and others, one bill proposes an "eraser button" that would let parents instantly delete all personal information collected from their children or teens. Another would ban specific features for kids under 16, such as video auto-play, push alerts, "like" buttons and follower counts. Also being floated is a prohibition on collecting personal data from anyone aged 13 to 15 without their consent, along with a new digital "bill of rights" for minors that would similarly limit the gathering of personal data from teens.