Meta Platforms, the parent company of Facebook and Instagram, has taken a strong stance, advocating for a new U.S. law that would obligate app stores to obtain parental approval for downloads by users under 16 years of age. The move comes as Meta and other social media giants face increasing scrutiny worldwide regarding the potential harm their platforms can cause to children, as well as criticism surrounding their handling of these concerns.
In a blog post on Wednesday, Meta’s global head of safety, Antigone Davis, emphasized the need for a unified approach rather than the current patchwork of laws implemented by different U.S. states. Davis stated, “Parents should approve their teen’s app downloads, and we support federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps.”
While Meta’s call for such legislation aims to address the issue directly, it has not been without criticism. The U.K.’s National Society for the Prevention of Cruelty to Children (NSPCC) accused Meta of attempting to shift responsibility. Richard Collard, an official from the NSPCC, remarked, “Meta has sat on its hands while knowing the harm their platforms can cause children and is now seeking to pass the buck when they should be getting their own house in order.”
As pressure mounts on Meta, a bipartisan group of U.S. senators has requested that CEO Mark Zuckerberg provide them, by the end of November, with documents regarding Meta’s research on the mental and physical health risks associated with its platforms. This request follows recent testimony from a Facebook whistleblower, Arturo Béjar, who highlighted the platform’s failure to address reports of harassment and harm faced by teenagers.
The discussion surrounding the safety of young users on social media has gained significant traction this year. In March, Utah became the first state to pass a law limiting teens’ access to social media platforms. Other states are now considering similar legislation, and the issue has even entered the realm of the 2024 presidential race, with Republican candidate Vivek Ramaswamy proposing a ban on individuals aged 16 and under from using social media platforms.
Despite the ongoing debate and differing opinions on the matter, it is evident that the protection of children online remains a crucial concern, prompting both regulatory actions and calls for increased accountability from social media companies.
What is Meta?
Meta Platforms is the parent company of the popular social media platforms Facebook and Instagram. It is responsible for overseeing the operations and policies of these platforms.
Why is Meta advocating for a new law?
Meta is advocating for a new U.S. law that would require app stores to obtain parental approval for app downloads by users under the age of 16. This initiative is driven by ongoing concerns about the potential harm social media platforms can cause to children and the need for a unified approach to address this issue.
What criticism has Meta faced regarding their stance?
The U.K.’s National Society for the Prevention of Cruelty to Children (NSPCC) criticized Meta for attempting to shift responsibility. The organization suggested that Meta has not taken sufficient action to address the harms caused by its platforms and is now trying to avoid accountability.
What are the proposed regulations surrounding children’s access to social media?
Utah has already implemented a law limiting teens’ access to social media platforms. Other states are considering similar legislation, and there are ongoing discussions in the U.S. House and Senate. Republican presidential candidate Vivek Ramaswamy has even proposed banning individuals under the age of 16 from using social media platforms. These efforts aim to protect young users and address the potential risks associated with social media use.