Instagram head Adam Mosseri has urged the creation of an industry body to determine best practices to help keep young people safe online, in his first appearance before Congress.
Mosseri, in written testimony before a Senate panel, said the industry body should address "how to verify age, how to design age-appropriate experiences, and how to build parental controls".
Photo-sharing app Instagram and its parent company Meta Platforms, formerly Facebook, have come under intense scrutiny over the potential impact of their services on the mental health and online safety of young users.
Mosseri said companies like Instagram "should have to adhere to these standards to earn some of our Section 230 protections", referring to a key US internet law which offers tech platforms protections from liability over content posted by users.
Lawmakers, who have held a series of hearings on children's online safety, said they want to discuss legislative reforms and solutions to protect kids online from harmful content, abuse and exploitative practices, including concerns around the algorithms used by tech platforms.
In a blog post earlier this week outlining changes to the app, Mosseri said Instagram was switching off the ability for people to tag or mention teens who do not follow them on the app.
He said that starting in January, teen Instagram users would be able to bulk delete their content and previous likes and comments.
Instagram was also exploring controls to limit potentially harmful or sensitive material suggested to teens through its search function, hashtags, its short-form video feature Reels and its 'Suggested Accounts' feature, as well as on its curated 'Explore' page.
The blog post also said that Instagram was launching its 'Take a Break' feature, which reminds people to take a brief pause after using the app for a certain amount of time, in the United States, United Kingdom, Canada and Australia.
Republican Senator Marsha Blackburn criticised the company's product announcement as "hollow," saying in a statement: "Meta is attempting to shift attention from their mistakes by rolling out parental guides, use timers, and content control features that consumers should have had all along."
In September Instagram suspended plans for a version of the app for kids, amid growing opposition to the project.
That pause followed a Wall Street Journal report that said internal documents, leaked by former Facebook employee Frances Haugen, showed the company knew Instagram could have harmful mental health effects on teens.
In his written testimony, Mosseri echoed the company's previous statements that public reporting mischaracterised the internal research.
State attorneys general and lawmakers had also raised concerns about the kids-focused app.
Last month, a bipartisan coalition of US state attorneys general said it had opened a probe of Meta for promoting Instagram to children despite potential harms.