After more than a year of preparation, and under pressure from the European Union, Ireland has adopted a new set of binding rules, the “Online Safety Code,” to strengthen regulation of video-sharing platforms and better protect the public, especially minors, from harmful online content such as pornography, violence, and hate speech.
The newly established Irish media and internet regulatory body, the Media Commission (Coimisiún na Meán), released the final version of the “Online Safety Code” on October 21. This regulation will apply to major tech companies with European headquarters in Ireland, including Meta and ByteDance.
According to the Media Commission’s announcement, the new rules chiefly strengthen protections for minors, aiming to reduce children’s exposure to harms such as pornography and violence. Social media platforms, for instance, can no longer simply ask users whether they are over 18; they must adopt effective age-verification measures, for example checking a user’s passport photo, to keep children away from adult websites.
The “Online Safety Code” requires video-sharing platforms to prohibit the uploading or sharing of harmful content, including content that promotes cyberbullying, self-harm or suicide, encourages eating disorders such as anorexia, or incites hatred, violence, terrorism, child sexual abuse, racism, or xenophobia.
The new rules also require video platforms to offer parental controls for any content that could impair the physical, mental, or moral development of children under 16.
Video-sharing platforms found to violate the new regulations could face fines of up to €20 million or 10% of their annual turnover, whichever is higher.
Ireland began drafting the “Online Safety Code” last year and released a draft for public feedback in December. In February of this year, the EU’s Court of Justice fined Ireland €2.5 million over the government’s delay in putting the rules into law. On top of that one-time fine, the judgment required Ireland to pay the EU €10,000 per day from February 29 until the infringement is brought to an end.
Ten services were designated as video-sharing platforms required to comply with the rules. The list covers major global social media platforms: Meta’s Facebook and Instagram, TikTok, owned by the Chinese company ByteDance, X (formerly Twitter), Google’s YouTube, and Microsoft’s LinkedIn. It also includes several video- and text-based platforms that have grown popular with Chinese users in recent years: the online learning platform Udemy, the image-sharing service Pinterest, the blogging site Tumblr, and the social news forum Reddit.
Several companies have tried to challenge the designation in court. In January of this year, Reddit filed a lawsuit in the High Court, and weeks later Tumblr, headquartered in New York, formally lodged a complaint of its own, with both arguing that they should not be designated as video-sharing platforms.
In June, the High Court dismissed both cases, holding that the Media Commission had the authority to include the two platforms on the regulatory list. The Media Commission welcomed the decision, but Reddit subsequently argued that it does not fall under the new rules because its European operations are centered in the Netherlands: since registering there in 2021, its Dutch subsidiary has served as Reddit’s operational hub within the EU. According to local media reports, the regulator has not yet made a decision on Reddit.
Activists, however, have pointed out that the new rules say nothing about recommendation algorithms, which platforms use to push content to users based on personal data such as search history, shopping habits, age, and location. Researchers have previously raised concerns that algorithms on platforms such as TikTok can accelerate the spread of harmful content.
Dr. Johnny Ryan, a senior researcher at the Irish Council for Civil Liberties (ICCL), expressed disappointment with the new rules. Regulators, he said, should have clear powers to intervene and stop “toxic algorithms” from recommending self-harm and suicide content to children, but the Code contains no such provisions.
The Media Commission said it is aware of the potential harms recommendation systems can cause users, especially children, but that it plans to address those risks through its enforcement of the EU’s Digital Services Act.
“With the ‘Online Safety Code’ now in force, all elements of our online safety framework are in place,” Media Commission Chairperson Jeremy Godfrey said in a statement. “Our focus now is on implementing this framework in full and driving positive change online.”
Godfrey emphasized that this framework sends a clear message to users, stating, “If you encounter something you believe to be illegal or a violation of platform rules, you should report it directly to the platform.”