In the wake of the tragic death of his daughter Molly, Ian Russell has become an advocate for holding social media companies accountable for the harmful content that is being circulated on their platforms. Molly, only 14 years old at the time of her death, was exposed to over 2,000 suicide and self-harm posts on Instagram in the last six months of her life.
Discovering the extent of the suffering Molly endured at the hands of social media algorithms prompted Ian Russell to become a vocal supporter of the newly passed Online Safety Act. This legislation grants Ofcom the power to regulate and fine social media companies that fail to fulfil their duty of care towards their users.
The Online Safety Act is an important first step in addressing the issue of harmful material online. However, this is just the beginning of a long process, as tech companies have a significant advantage over regulators with their two decades of experience. There is a risk that these powerful companies will attempt to manipulate evidence and obstruct the regulator in order to protect their business models.
Ofcom needs to act swiftly and decisively to ensure that social media platforms are fundamentally safe by design. The regulator must be guided by the public interest and not swayed by pressure from Silicon Valley. Algorithms, which play a crucial role in determining what content users see, must be closely scrutinized and monitored to prevent the dissemination of harmful material.
The ultimate goal of the Online Safety Act is to protect children and teenagers from harm. It is imperative that young people are not drawn into distorted realities where self-harm and suicide are normalized. The act must succeed in preventing children from being led down dangerous paths, just as Molly was.