Government gives Ofcom the ambitious task of regulating the internet

The government is to appoint broadcasting regulator Ofcom as a new internet watchdog, with the ability to fine social media companies that do not protect users from harmful content.

Culture Secretary Baroness Nicky Morgan and Home Secretary Priti Patel said Ofcom’s existing position as a regulator made it suitable to enforce rules to keep the internet safe.

The decision was published as part of an initial response to a consultation on the government’s Online Harms White Paper, which was released last year and called for a statutory duty of care requiring internet companies to protect users from potentially harmful content.

Those proposals suggested allowing the regulator to issue fines against platforms and websites it judges to have failed to protect users from seeing harmful videos such as those depicting violence or child abuse.

The government’s response said the regulator would be responsible for making sure online companies have the systems and processes in place to fulfil their duty of care and keep their users safe.

Baroness Morgan said: ‘With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK.

‘We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve.’

Ofcom’s interim chief executive, Jonathan Oxley, said: ‘We share the government’s ambition to keep people safe online and welcome that it is minded to appoint Ofcom as the online harms regulator.

‘We will work with the government to help ensure that regulation provides effective protection for people online and, if appointed, will consider what voluntary steps can be taken in advance of legislation.’

The regulator has also announced the appointment of a new chief executive, civil servant Dame Melanie Dawes, as part of its preparation for a new, wider role.

The government’s response says platforms will need to ensure that illegal content is removed quickly and minimise the risk of it appearing, with particularly strong action needed on terrorist content and online child sexual abuse.

It also says future legislation will protect freedom of expression by not targeting or punishing individuals who access content which is legal, but may be offensive.

Instead, the proposals suggest that ‘companies will be required to explicitly state what content and behaviour is acceptable on their sites in clear and accessible terms and conditions, and enforce these effectively, consistently and transparently’.

The government said the proposed legislation will only apply to companies that allow the sharing of user-generated content – such as images, videos and comments.

Internet giants including Facebook, YouTube and Instagram are seen as the main targets of the proposed new rules, which are part of government plans to make the UK the ‘safest place in the world to be online’.

Responding to the proposals, Facebook’s head of UK public policy, Rebecca Stimson, said: ‘Keeping people safe online is something we take extremely seriously. We have clear rules about what is and isn’t allowed on our platforms and are investing billions in safety.

‘Over the last few years we’ve tripled the size of our safety and security team to 35,000 and built artificial intelligence technology to proactively find and remove harmful content.

‘While we recognise we have more to do, our regular transparency reports show we are removing more and more harmful content before anyone sees it and reports it to us.

‘We look forward to carrying on the discussion with Government, Parliament and the rest of the industry as this process continues.’

However, the proposals have been criticised by some in Parliament.

MP Julian Knight, chairman-elect of the Commons Digital, Culture, Media and Sport (DCMS) Committee, said Wednesday’s development on the online harms white paper ‘fails to demonstrate the urgency that is required’.

‘The DCMS Committee in the last parliament led calls for urgent legislation to prevent tech companies walking away from their responsibilities to tackle harmful content on their sites,’ he said. ‘Today’s statement fails to demonstrate the urgency that is required. We called for the new regulator to be completely independent from Government, which is why we demanded a right of veto over the appointment.

‘The regulator must take a muscular approach and be able to enforce change through sanctions that bite. That means more than a hefty fine – it means having the clout to disrupt the activities of businesses that fail to comply and, ultimately, the threat of a prison sentence for breaking the law.

‘I would expect the DCMS Committee to be given the opportunity to scrutinise all aspects of the forthcoming Bill before it becomes law.’