The recently passed Online Safety Act has highlighted an important distinction in regulating social media platforms. Baroness Beeban Kidron, a prominent advocate for social media regulation and child protection, emphasized the significance of recognizing that the design of these platforms is responsible for the harms inflicted upon young users. The Act addresses not only the harmful content posted by individual users but also the product design itself.
This distinction becomes more pertinent when considering the recent lawsuit brought against Meta, the parent company of Facebook, by 42 US states. The lawsuit accuses Meta of intentionally designing its social media products to addict children, drawing parallels to the practices of the Big Tobacco industry.
While social media companies often claim to prioritize safety, Kidron raised concerns about their resistance to regulation. She recounted a conversation with a senior figure in the tech industry who dismissed her efforts, suggesting that she simply disliked advertising. In response, Kidron highlighted the fundamental difference between social media and traditional media: social media platforms have the ability to monitor users, target them with personalized advertisements, and influence their behavior by sharing their attention with other users.
Although the Online Safety Act primarily focuses on protecting children, Kidron expressed disappointment that adults are not afforded the same protections, as they too are susceptible to the negative impacts of social media.
The Conscious Thinking Live event, hosted by the Conscious Advertising Network (CAN), aimed to address the ethical challenges faced by the advertising and media industry. The event raised concerns about brands supporting social media platforms that promote hate speech and misinformation. While some agency representatives acknowledged the risks associated with certain platforms, they admitted that client preferences often dictate advertising decisions, even when evidence suggests those decisions may not be safe or effective.
The event also emphasized the importance of economic influencers, such as advertisers and investors, taking responsibility for holding social media platforms accountable. Alexandra Pardal, the campaigns director at Digital Action, encouraged the audience to lobby for more responsible practices. Fully addressing the issue, however, requires comprehensive regulation and political leaders willing to take responsibility and act.
In conclusion, examining the design of social media platforms and its impact on young users is essential for ensuring their safety. It is crucial for industry stakeholders, advertisers, and political leaders to work together to establish effective regulations that protect users from the harmful consequences of social media engagement.
What is the Online Safety Act?
The Online Safety Act is a piece of legislation aimed at addressing the harms caused by social media platforms. It emphasizes the responsibility of these platforms' product design, not just the content posted by individual users.
What is the distinction made by the Online Safety Act?
The Online Safety Act recognizes that the design of social media platforms plays a significant role in causing harm to young users, not solely the actions of individual users.
Why are social media platforms compared to Big Tobacco?
Social media platforms have been compared to Big Tobacco because they are accused of intentionally designing their products to addict users, particularly children.
What concerns were raised at the Conscious Thinking Live event?
The event raised concerns about the advertising and media industry supporting social media platforms that promote hate speech and misinformation, as well as the difficulties in convincing brands to avoid advertising on such platforms.
What role do economic influencers play in holding social media platforms accountable?
Economic influencers, such as advertisers and investors, have the power to hold social media platforms accountable by advocating for responsible practices and supporting regulations that prioritize user safety.