“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” according to an open letter signed by attorneys general from 44 U.S. states and territories, addressed to Facebook CEO Mark Zuckerberg.
Facebook in March confirmed that it’s in the early stages of developing a restricted, special-purpose version of Instagram managed by parents for children under 13.
The attorneys general who cosigned the letter raised several key concerns about the project. They cited research showing that social media can be harmful to the emotional and mental well-being of children; argued that children “do not have a developed understanding of privacy”; and pointed to “alarming rates of cyberbullying” on Instagram in particular.
The AGs also cited a U.K. study that found more cases of “sexual grooming on Instagram than any other platform,” and noted that in 2020 Facebook and Instagram reported 20 million child sexual abuse images. In addition, the attorneys general wrote, “Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls.”
“In short, an Instagram platform for young children is harmful for myriad reasons,” the AGs’ letter concluded.
In response to the AGs’ letter, a Facebook spokesperson said in a statement, “We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it. We also look forward to working with legislators and regulators, including the nation’s attorneys general. In addition, we commit today to not showing ads in any Instagram experience we develop for people under the age of 13.”
The “Instagram for Kids” app would fall under the U.S.’s Children’s Online Privacy Protection Act, a federal law that bars internet services from collecting personal data from kids under 13 without verifiable parental consent.
Facebook launched the Messenger Kids app for kids under 13 in 2017 — and the product immediately drew concern from consumer-privacy advocates. In 2019, a bug in Messenger Kids let children join group chats with strangers, The Verge reported; Facebook at the time said the glitch affected only a “small number of group chats.”
According to Instagram, it’s developing new artificial intelligence and machine learning technology to help it detect individual users’ ages. Despite its requirement that users be at least 13, “we know that young people can lie about their date of birth,” Instagram said in a recent blog post. “We want to do more to stop this from happening, but verifying people’s age online is complex and something many in our industry are grappling with.”