MPs and peers have launched a joint parliamentary committee to scrutinize the government’s forthcoming Online Safety Bill. The new committee is already seeking public views on the legislation, which the government claims will safeguard freedom of expression online, increase the accountability of tech giants and protect users from online harm.
Under the Bill’s statutory “duty of care”, tech companies that host user-generated content or allow people to communicate will be legally obliged to proactively identify, remove and limit the spread of both illegal and legal but harmful content, such as child sexual abuse, terrorism and suicide material. Companies that fail to do so could be fined up to 10% of their turnover by the online harms regulator, now confirmed to be Ofcom.
The joint committee is chaired by MP Damian Collins, the former chair of the House of Commons DCMS Select Committee, who previously led an inquiry into disinformation and “fake news” that concluded by calling for an end to the self-regulation of social media firms.
“The Online Safety Bill is about finally putting a legal framework around hate speech and harmful content and ultimately holding the tech giants to account for their technology’s role in promoting it,” said Collins.
“The next step in this process is the detailed scrutiny of the draft Bill. This is a once-in-a-generation piece of legislation that will update our laws for the digital age,” he said.
“We now have a super committee of MPs and peers, highly experienced in this area, who will work together to go through this Bill line by line to make sure it’s fit for purpose. Freedom of speech is at the heart of our democracy, but so is fighting against movements that seek to harm and dehumanize people. In the social media age, we have not yet got that balance right, and now is the time to fix it.”
The committee is set to report its findings to the government on 10 December 2021. It will also seek views on how the draft Bill compares to online safety legislation in other countries.
On 22 July, a report from the House of Lords Communications and Digital Committee said that although it welcomes the Bill’s proposals to oblige tech platforms to remove illegal content and protect children from harm, it does not support the government’s plan to make companies moderate content that is legal but may be objectionable to some.
Instead, the Lords argued that existing laws, such as those on harassment or grossly offensive publications, should be adequately enforced, and that any severe harm not already made illegal should be criminalized.
“We are not convinced they are workable or could be implemented without unjustifiable and unprecedented interference in freedom of expression. If a type of content is seriously harmful, it should be defined and criminalized through primary legislation,” peers wrote.
“It would be more effective – and more consistent with the value which has historically been attached to freedom of expression in the UK – to address content that is legal but which some may find distressing through strong regulation of the design of platforms, digital citizenship education, and competition regulation.”
Joint committee
The Communications and Digital Committee chair, Lord Gilbert, is also a member of the new joint committee. At the end of June 2021, the newly formed campaign group Legal to Say, Legal to Type also critiqued the Bill for being overly simplistic and for ceding too much power to Silicon Valley firms over freedom of speech in the UK.
Speaking at a press conference launching the group, Conservative MP David Davis, who characterized the Bill as a “censor’s charter”, said: “Silicon Valley providers are being asked to adjudicate and censor ‘legal but harmful’ content. Because of the vagueness of the criteria and the size of the fines, we know what they’re going to do – they’re going to lean heavily on the side of caution.
“Anything that can be characterized as misinformation will be censored. Silicon Valley mega-corporations are going to be the arbiters of truth online. The effect on free speech will be terrible.”