Starting Monday, tech companies operating in the UK must implement stricter measures to prevent the spread of illegal content, including child sexual abuse images, as authorities step up enforcement of online safety regulations.
The UK's media regulator, Ofcom, has mandated that major platforms such as Meta's Facebook, ByteDance's TikTok, and Alphabet's YouTube enhance moderation, streamline content reporting processes, and introduce built-in safety features to curb criminal activity and improve platform security.
"Platforms must act swiftly to comply with their legal responsibilities, and our codes are designed to support them in doing so," stated Suzanne Cater, Ofcom's enforcement director.
The Online Safety Act, which became law in 2023, imposes stricter regulations on online platforms, prioritizing child protection and the removal of illegal content.
In December, Ofcom released its first set of codes of practice under the new law, giving companies until March 16 to evaluate the risks posed by illegal content on their platforms.
Failure to comply could result in fines of up to £18 million ($23.31 million) or 10% of a company's global annual revenue, whichever is greater.
Ofcom has also identified file-sharing and storage services as particularly vulnerable to misuse for distributing child sexual abuse material. As a result, the regulator has launched a separate enforcement initiative to assess these services' safety measures.
Companies providing file-storage services have been asked to submit their risk assessments by March 31. Those failing to comply may face the same financial penalties as other tech firms under the new regulations.

Photo by Solen Feyissa, Wikimedia Commons.