The recent Senate hearings on social media were both acrimonious and compelling. Senators confronted the CEOs of major companies, including Meta, X, TikTok, Snap, and Discord, posing tough questions and demanding accountability for the platforms’ impact on young users. Adding a poignant backdrop to the proceedings were parents whose children’s deaths have been linked to social media use, seated directly behind the tech leaders. Their heart-wrenching stories lent a deeply personal and tragic dimension to the discussions.
Social media companies are under fire for their perceived indifference to the harm they inflict. The consequences of their operations include bullying, teen suicide, eating disorders, violent behavior, and radicalization, among other serious harms.
In response to these pressing concerns, the Senate has been proactive in crafting the Kids’ Online Safety Act (KOSA), a comprehensive piece of legislation aimed at addressing the myriad dangers children face online. This act, the result of years of deliberation and numerous revisions, represents a legislative effort to compel social media companies to take more responsibility for the safety and well-being of their youngest users.
But is this enough?
Not transformational, but a creditable first step
Without micro-analyzing KOSA, it’s clear the act introduces innovative measures, notably a “duty of care” that requires platforms to mitigate risks to minors. However, KOSA’s reach is limited.
Should Congress enact KOSA and take no further action, its deficiencies might allow the adverse effects of social media on young users to proliferate. Notably, KOSA does not prevent adults from targeting children through these platforms: it only restricts adult content for users already identified as minors, and it stops short of mandatory age verification, a requirement that would likely stir significant controversy.
Despite these limitations, KOSA represents a positive initial step towards safeguarding children online. Its flaws are not irreparable. Importantly, the legislation should be seen not as a final solution but as the beginning of a sustained, multi-year effort to reform social media practices and diminish their harmful impacts on children. The journey towards a safer online environment for minors requires more than a one-off legislative effort; it demands ongoing commitment and adaptation.
Strong opposition comes with the territory
KOSA began in the right place: the US Congress. But given the global reach of these platforms, effective regulation will require both federal action and transnational support, such as parallel measures from the European Union, to ensure comprehensive oversight. Without such legislative backing, it’s unlikely that social platforms will voluntarily implement changes that could diminish their engagement metrics among younger demographics.
Federal legislation, even on a modest scale, offers a more unified approach compared to a disparate collection of state laws, which could enable attorneys general to further political objectives. A federal framework ensures a level playing field for all platforms across states, preventing compliant companies from facing competitive disadvantages. However, crafting such legislation is a delicate process, as it must withstand legal challenges from various quarters, including rights activists, major social media companies, and providers of adult content, all of whom are prepared to defend their interests vigorously.
The challenge of preempting legal pushback is compounded by the reluctance of stakeholders to compromise. A radical, though potentially effective, strategy might involve forcing a dialogue among diverse parties, such as the ACLU, rights activists, constitutional lawyers, and child safety advocates, with a directive that no one leaves until a consensus is reached.
The question of how legislation should govern the use of technology for age or identity verification is pivotal. Comparing social media to utilities underscores the argument for stringent regulation: like utilities, the platforms provide essential services, yet they also pose significant risks. The analogy invites a reevaluation of social media’s role and functionality, especially given how recommendation algorithms, in pursuit of higher engagement and advertising revenue, can drive users toward increasingly extreme content. That dynamic can lead children into online echo chambers that amplify hate and discontent, further alienating them from healthier perspectives.
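To make that dynamic concrete, consider a deliberately simplified sketch. The data model and numbers below are illustrative assumptions, not any platform’s actual system; the point is only that a ranker optimizing for engagement alone never consults how extreme a piece of content is.

```python
# Hypothetical sketch of an engagement-only feed ranker.
# All fields and numbers are illustrative, not any real platform's system.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # e.g., expected watch time or click rate
    extremity: float             # 0.0 = benign, 1.0 = highly extreme

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Optimizing for engagement alone: extremity is never consulted,
    # so content that provokes the strongest reactions rises to the top.
    return sorted(candidates, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post("study tips", predicted_engagement=0.21, extremity=0.05),
    Post("outrage clip", predicted_engagement=0.68, extremity=0.90),
    Post("hobby tutorial", predicted_engagement=0.30, extremity=0.10),
])
print([p.title for p in feed])  # the most provocative item ranks first
```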
But sweeping change in social media won’t happen in a single event. KOSA represents an important initial step, yet it is just one piece in a complex puzzle. It has the potential to bring about change, but it will happen in stages.
It’s a marathon, not a sprint
Ensuring online safety while upholding constitutional freedoms is an intricate task. Success will likely come through incremental, thoughtful progress over several years.
Collaboration, compromise, and consensus-building will be critical to KOSA’s success. It’s an admirable goal, but achieving consensus in one fell swoop is unlikely. A more practical expectation is for KOSA to undergo continuous refinement through annual updates. Each round would be informed by the previous year’s experience, adapting to shifts in technology and patterns of misuse while giving the industry adequate time to adjust to new regulations.
Ideally, the first round, KOSA 2024, would encompass content ratings, age verification, opt-in/opt-out controls, warnings, and content blocking, by specifying the following (the sketch after this list illustrates how such rules might map to platform-side checks):
- What content is unacceptable and/or illegal;
- What content can and must be blocked by platforms;
- Precisely how to label content that is toxic but cannot be blocked;
- How to warn users and parents, and what barriers to put around sensitive content;
- Default settings for opting out of content blocks.
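As a thought experiment, the categories above map fairly directly onto a platform-side policy check. The sketch below is a minimal, hypothetical illustration; the category names, defaults, and parental opt-out are assumptions made for the example, not provisions of the actual bill.

```python
# Hypothetical sketch of how KOSA-style rule categories might translate
# into a platform-side moderation check. Categories and defaults are
# illustrative assumptions, not the bill's actual text.
from enum import Enum

class Action(Enum):
    BLOCK = "block"   # illegal or must-block content
    LABEL = "label"   # toxic content that cannot be blocked outright
    WARN = "warn"     # sensitive content: warn users/parents, add barriers
    ALLOW = "allow"

ILLEGAL = {"csam", "drug_sales"}          # blocked for everyone
TOXIC = {"self_harm", "eating_disorder"}  # blocked for minors by default
SENSITIVE = {"violence", "gambling"}      # warning interstitial for minors

def moderate(categories: set, user_is_minor: bool,
             opted_out_of_blocks: bool = False) -> Action:
    if categories & ILLEGAL:
        return Action.BLOCK
    if categories & TOXIC:
        # Blocked by default for minors unless a parent opts out of blocks,
        # in which case the content is labeled instead.
        if user_is_minor and not opted_out_of_blocks:
            return Action.BLOCK
        return Action.LABEL
    if categories & SENSITIVE and user_is_minor:
        return Action.WARN
    return Action.ALLOW

print(moderate({"eating_disorder"}, user_is_minor=True))  # Action.BLOCK
print(moderate({"violence"}, user_is_minor=True))         # Action.WARN
```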
Algorithm reform: controversial yet potentially transformational
A next phase, KOSA 2025, would focus on enhancing accountability and establishing stricter penalties for platforms and individuals who engage in or facilitate illegal activities. The aim is to curb not just the spread of illegal content but also the behaviors that contribute to the mental health crisis among youth, such as excessive doom-scrolling and descent into harmful online environments.
Looking further ahead, subsequent iterations could mark a pivotal shift in the very operation of social media platforms, potentially centering around “reversing the algorithms” that currently guide users, especially young ones, towards negative and risky online spaces. The ambition here is to not just prevent exposure to harmful content but to actively steer users towards safer, more positive interactions online.
While potentially contentious, reversing the algorithms opens up an avenue for platforms to reinvent themselves. By anticipating these changes, social media companies can prepare to adapt their business models. The goal is to remain profitable while fostering an environment that prioritizes the well-being of its users, especially the younger demographic. This forward-thinking strategy suggests a win-win scenario: safeguarding users’ mental health and ensuring the long-term viability of social platforms by cultivating a healthier, more engaging online community.
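What might a reversed ranker look like? One hypothetical way to express the idea is sketched below: predicted engagement still contributes to a post’s score, but risk is actively penalized and well-being signals are rewarded. The fields and weights are illustrative assumptions, not a prescription.

```python
# Hypothetical sketch of a "reversed" ranker: engagement still counts,
# but risk is penalized and well-being signals are rewarded.
# All fields and weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float
    risk: float        # 0..1, estimated likelihood of harm to minors
    wellbeing: float   # 0..1, e.g., supportive-interaction signal

def safe_score(p: Post, risk_weight: float = 1.5,
               wellbeing_weight: float = 0.5) -> float:
    # Subtracting weighted risk and adding weighted well-being reverses
    # the incentive gradient away from provocative content.
    return (p.predicted_engagement
            - risk_weight * p.risk
            + wellbeing_weight * p.wellbeing)

def rank_feed(candidates: list[Post]) -> list[Post]:
    return sorted(candidates, key=safe_score, reverse=True)

feed = rank_feed([
    Post("outrage clip", 0.68, risk=0.90, wellbeing=0.05),
    Post("peer support thread", 0.35, risk=0.05, wellbeing=0.80),
])
print([p.title for p in feed])  # the supportive post now outranks the outrage clip
```

Who sets those weights, whether regulators, independent auditors, or the platforms themselves, is exactly the kind of question successive KOSA rounds would need to settle.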
Change is long overdue
The testimony of families at the Senate hearings underscores the need for more than incremental changes to social media regulation. A robust overhaul, starting with KOSA 2024, is essential to guard against evolving threats, from artificial intelligence to external influence campaigns. The process will require ongoing adjustment, akin to the iterative rule-making of the SEC and the FDA.
But inaction is not an option.
A focused, long-term strategy is critical to ensuring the safety of our youth on social media platforms. By initiating comprehensive reforms and continually refining these measures, we can mitigate harm and finally deliver on social media’s original promise — to better our lives through connection.