By Cat Zakrzewski, Rachel Lerman
More than 40 state attorneys general are pressuring Facebook to drop its controversial plans to launch a version of Instagram for children under the age of 13.
But Facebook is plowing ahead anyway, confident in its assertion that a separate service will actually make social media safer for preteens.
In a letter to CEO Mark Zuckerberg released Monday, the attorneys general argued that social media can be detrimental to children's physical and mental health. Citing Facebook's checkered history of privacy incidents, they raised concerns that the platform would not be able to protect young children online or adequately comply with existing federal children's privacy law.
"It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account," they wrote. "In short, an Instagram platform for young children is harmful for myriad reasons."
Facebook insists that its plan to build an Instagram for preteens will give parents more control than they have now, since many kids under 13 already use social media anyway.
"As every parent knows, kids are already online," Facebook spokesman Andy Stone said in a statement. "We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing."
Facebook will not show ads "in any Instagram experience we develop for people under the age of 13," Stone added.
The push and pull highlights the challenge confronting regulators who are eager to check large tech companies' power.
State attorneys general might not prevent Facebook from charging ahead with its plans. But by putting a stake in the ground before the service even launches, they are ensuring there's greater public scrutiny of the product. That could push Facebook to make sure it doesn't make some of the same privacy missteps it has in the past.
It's also a politically popular position for the attorneys general to take, as parents and grandparents worry about the negative effects of more social media time for children during the pandemic.
The letter's signatories ranged from Washington, D.C., attorney general Karl A. Racine (D) to Texas attorney general Ken Paxton (R), highlighting the growing bipartisan interest in checking the tech industry's influence on children.
Facebook pushed back against the attorneys general's concerns Monday, saying it is designing its social media features for kids in consultation with experts in child safety, privacy and mental health.
Facebook has not said when it could release an Instagram kids' app, but Instagram head Adam Mosseri told Bloomberg this month that making a separate app for kids will be a "safer and better and more sustainable outcome" than kids just using the main version.
Facebook will eventually create one place for parents to control kids' activity on both Messenger Kids, Facebook's chat service for children, and the new Instagram kids' service, according to Mosseri.
Facebook's next move could put it further in the crosshairs of the federal government, which has increasingly been weighing more stringent tech regulation.
"If Facebook insists on plowing ahead, it's the clearest sign yet that the company views itself as accountable to no one, even when it comes to the well-being of children, and must be regulated much more rigorously," Josh Golin, executive director of Campaign for a Commercial-Free Childhood said in an email.
Lawmakers from both parties have sought to elevate concerns about children's privacy and tech addiction in congressional hearings, and it was a key focus during the House grilling in March of Zuckerberg, Twitter CEO Jack Dorsey and Google CEO Sundar Pichai. Both Democrats and Republicans in Congress have expressed interest in expanding the Children's Online Privacy Protection Act, a 1998 law known by the acronym COPPA that restricts the tracking and targeting of those younger than 13.
Monday's letter signals that these concerns have trickled down to the states, at a time when they're increasingly taking up the mantle of checking Silicon Valley's power following years of inaction in Washington. Many of the same state attorneys general who signed also brought an antitrust lawsuit against Facebook last year.
"Without a doubt, this is a dangerous idea that risks the safety of our children and puts them directly in harm's way," New York attorney general Letitia James, D, said in a statement. "Not only is social media an influential tool that can be detrimental to children who are not of appropriate age, but this plan could place children directly in the paths of predators."
Most social media services currently require users to be 13 or older to use the mainstream versions of their apps, but there are easy ways to get around these age restrictions: Kids can browse using an adult's account or simply lie about their birth dates.
Advocates say the Instagram app meant specifically for kids would just be another way to reel in users early, even if ads aren't shown until they are older.
"This is bad for kids because it hooks kids early," said Jim Steyer, CEO of Common Sense Media, which advocates for kids' safety online. "It's basically the classic brand marketing approach - get kids from cradle to grave."
Facebook has already launched Messenger Kids, a service designed to let children talk to users approved by their parents. But in 2019, reports surfaced of a design flaw that allowed children to enter group chats with unapproved strangers.
"Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls," the state attorneys general wrote.
YouTube has also created a children's version of its service, sparking concerns among politicians and child safety advocates. A House committee last month began probing YouTube Kids after accusing the company of serving up "inappropriate, low-education, highly commercial content." YouTube made significant changes to content for children last year as part of an effort to satisfy the Federal Trade Commission, which in 2019 fined the company $170 million over alleged children's privacy violations.
Zuckerberg told lawmakers in March that the company was still considering how to deal with parental controls when inviting kids online.
"I think helping people stay connected with friends and learn about different content online is broadly positive," he said. "There were clearly issues that need to be thought through and work out, including how parents can control the experience of kids, especially kids under the age of 13. And we haven't worked through all of that yet."
In the March Congressional hearing, lawmakers accused the CEOs of making money off kids younger than 13 even when they weren't allowed to.
Rep. Kathy Castor (D-Fla.) acknowledged that parents know their kids are using social media before they turn 13.
"The problem is that you know it," she said. "And you know that the brain and social development is still evolving at a young age. There are reasons in the law that we said that cutoff is at 13."
The Washington Post