Breaking News
In a significant step toward regulating the digital landscape, Britain has announced the introduction of its first-ever codes of practice for technology companies under its new online safety regime. These guidelines aim to create a safer and more accountable online environment by addressing harmful content, protecting users, and ensuring compliance with safety standards.
The UK’s Office of Communications (Ofcom), which oversees the enforcement of these rules, stated that the codes focus on curbing illegal online content such as child sexual exploitation, terrorism, and hate speech. They also emphasize the protection of children from exposure to harmful material like cyberbullying, explicit content, and self-harm material.
Tech companies, including social media giants like Meta's Facebook and ByteDance's TikTok, must take action to tackle criminal activity on their platforms and make them safer by design. The guidelines also require search engines and video-sharing platforms to take proactive steps to remove harmful content. These companies must implement robust systems to identify and address risks while demonstrating transparency in their safety practices. Failure to comply could result in hefty fines of up to £18 million or 10% of global annual turnover, whichever is higher.
The introduction of these codes is part of the UK’s broader Online Safety Act, which aims to hold tech firms accountable for the content they host. It sets a global benchmark, as the UK becomes one of the first nations to implement such stringent measures for digital safety.
Media regulator Ofcom has released its first codes of practice aimed at addressing illegal harms, including child sexual abuse and content encouraging suicide. The new guidelines require websites and apps to assess the risks that illegal content poses to both children and adults on their platforms, with a compliance deadline of March 16, 2025.