In its report, Regulating in a digital world, published in March, the Committee recommended that online services that host user-generated content should be subject to a statutory duty of care.
The duty of care would require providers to take account of safety in the design of their services so as to prevent harm, including by providing appropriate moderation processes to handle complaints about content. The Committee recommended that the duty of care should be overseen by Ofcom in the first instance.
Yesterday, the Government published its Online Harms White Paper in which it states that it will “establish a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services. Compliance with this duty of care will be overseen and enforced by an independent regulator”.
Lord Gilbert of Panteg, Chairman of the Committee, said: “I welcome the Government’s White Paper and its recognition of the need to establish regulation to protect users from online harms using a statutory duty of care. While the internet has clearly created huge benefits, self-regulation by online platforms has not gone far enough to address abuses such as hate speech, the dissemination of fake news and extremist content.
“Major platforms have failed to invest in their moderation systems, leaving moderators overstretched and inadequately trained. There is little recourse for a user to seek to reverse a moderation decision against them. A duty of care is therefore needed to implement minimum standards and to give effect to human rights including freedom of expression.
“The need for further regulation of the digital world goes beyond online harms, however. A comprehensive new approach to regulation is needed to address the diverse range of challenges that the internet presents, such as misuse of personal data and the concentration of digital markets.”