UK websites and apps that host pornography and adult material – such as OnlyFans and PocketStars – must put in place strict age-verification processes or face severe financial penalties, the communications watchdog has said.

Video-sharing platforms (VSPs) established in the UK – including TikTok, Snapchat, Vimeo and Twitch – face fines of £250,000 or 5% of applicable turnover, whichever is greater, for breaches of the regulations, fresh guidance from Ofcom states.

Platforms that have under-18 users but do not specialise in pornographic material, or that ban adult content under their terms of service – such as TikTok and Snapchat – will still be expected to put measures in place to protect younger users from harmful content, such as age-estimation techniques. Age estimation refers to methods that estimate a person’s age, usually by algorithmic means.

Melanie Dawes, Ofcom’s chief executive, said the likes of TikTok and Snapchat could not address the new rules by setting up youth-specific platforms and had to focus on their main services. She said: “It is not, in our view, any good to introduce a youth site like a young Instagram … you have got to address the issues on the main site.”

The government has stated its intention for the VSP regime in the UK to be superseded by rules in the online safety bill, which is undergoing pre-legislative scrutiny in parliament. Ofcom enforces the current regulations but, unlike in its broadcasting work, it cannot assess individual videos. Instead, the laws focus on the measures providers must take to protect their users – and afford companies flexibility in how they do that.

The guidance says providers should:

- Have clear, visible terms and conditions that prohibit uploading content relating to terrorism, child sexual abuse material or racism, and enforce them effectively.
- Implement tools that allow users to flag harmful videos easily. Providers should signpost how quickly they will respond, and be open about any action taken.
- Restrict access to adult sites. VSPs that host pornographic material should have robust age verification in place to protect under-18s from accessing such material.

Ofcom said it also expected VSPs to put in place registration processes, and subsequent checks, strong enough to significantly reduce the risk of child sexual abuse material being uploaded and shared on their platforms.

Dawes said: “Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them.

“The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”

YouTube and Facebook are expected to fall under the Irish regulatory regime, which will regulate on behalf of EU member states. But those sites will come under the scope of the online safety bill once it becomes law. The bill will impose a duty of care on internet companies to protect users from harmful content.