Meta estimated 100,000 children on its platforms receive online sexual harassment daily, according to internal company documents made public earlier in the week.
Children across Facebook and Instagram are said to have been subjected to crude messaging, harassment and, in some cases, pictures of adult genitalia.
The unsealed legal filings include several allegations against the company made by internal Meta staff.
It's the latest development in a lawsuit filed by the New Mexico Attorney General's office on December 5, which alleges Meta's social networks have become marketplaces for child predators.
Raúl Torrez, the state's attorney general, has accused the company of enabling child grooming, which Meta has vehemently denied, saying the suit "mischaracterizes our work using selective quotes and cherry-picked documents".
The documents include the story of a 12-year-old daughter of an Apple executive who was solicited via Instagram direct messages.
"This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store," a Meta employee said in the documented communications.
In another case, a senior Meta employee testified to the US Congress late last year that his own child had been solicited on Instagram.
He claimed his efforts to fix the problem were ignored.
The company has since released a statement in response to Wednesday's filings, claiming its platforms are well monitored.
"We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents. We've spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online."
The lawsuit follows a Guardian investigation that found Meta had failed to detect or report the use of its platforms for child trafficking, including the use of Messenger, a Facebook service, by child sex traffickers.
Meta has again maintained that all of its services are monitored for child trafficking and grooming.
Yet in an internal email from 2017, staff opposed the idea of scanning Facebook Messenger for "harmful content", arguing that offering less privacy than competing services would put Messenger at a disadvantage.