
DCMS final report: Disinformation and “fake news” – some initial thoughts by Zoe McCallum


The final report of the Commons Digital, Culture, Media and Sport Committee on Disinformation and Fake News [pdf] was published on 18 February 2019. It is the product of an 18-month process – far in excess of the normal timeframe for Select Committee inquiries – involving 170 written submissions and 73 oral witnesses.

The Committee’s interim report generated worldwide media attention (see Des Freedman’s Inforrm post here) and became the most viewed HTML page on the parliamentary website this calendar year. The DCMS Committee also formed what it (imperiously) described as an “International Grand Committee” of 24 MPs from nine different countries, which came together last October to proclaim a set of “International Principles on the Regulation of Disinformation and Fake News” (appended to the Report).

Criticism of Facebook

The DCMS Committee of 11 Parliamentarians – five Conservative, five Labour, one SNP and no Lib Dems – was praised for its interim report, which pulled no punches against Facebook over its role in scandals involving election campaigning. That theme continues in the final report, which includes:

  • A swipe at Mark Zuckerberg personally for failing to respond to an invitation to appear in front of the International Grand Committee, accusing him of contempt: [29]. (Murdoch, it points out in a rare moment of praise, was much more cooperative during the Phone Hacking Inquiry);
  • The accusation that Facebook used its “opaque business structure” to “conceal knowledge and responsibility for specific decisions” and frustrated the Committee’s work by deliberately sending representatives it had “not properly briefed on crucial issues”, then by failing to follow up in writing: [14];
  • The expression of doubt over the legitimacy of Facebook’s participation in the UK Council for Internet Safety, a new government-sponsored organisation to keep children safe online, and in particular “concerns about the good faith of the business and its capacity to participate in the work…in the public interest, as opposed to its own interests”: [35].

The Report’s key themes

The Final Report is not limited to fake news and disinformation (the inquiry was, incidentally, rebranded to cover “disinformation” as well as “fake news” owing to the politicisation of the latter term, not least by the US President in reference to the mainstream media). It covers:

  • User control over data online (emphasising the need for access and control over the advertising profiles that Facebook creates on its users);
  • Social media’s role in criminality (including hate speech, harassment and revenge porn) and the need for greater legal liability for tech companies which fail to respond to takedown requests;
  • The need for greater regulation to tackle criminality and other “online harms”, defined to include mental health issues associated with the use of social media by children and vulnerable adults. The Report recommends a regulator for tech companies along the lines of Ofcom and IPSO, and an enhanced role for the ICO, paid for by a levy on the tech giants;
  • “Digital literacy”, which the Report defines principally in terms of the ability of social media users to make informed judgments about the adverts and information presented to them via algorithms. The report stresses the need for technological solutions to enable users to know the source of what they are reading, who has paid for it and why the information has been sent to them.

None of these are new ideas – but the hardline stance is, particularly regarding the regulation of tech giants. The Committee’s call for greater funding and greater teeth follows a description of Germany’s Network Enforcement Act 2018, under which tech companies can be fined up to €20 million for failing to respond to requests to take down hate speech on their platforms. The Committee heard evidence that one in six of Facebook’s moderators now works in Germany and described the Act as “practical evidence that legislation can work”: [24]. The Report also commends for consideration France’s new law allowing judges to order the immediate removal of articles constituting disinformation during election campaigns, accompanied by sanctions for intentional breach of up to a year in prison and a fine of €75,000: [25].

The government is due to publish a White Paper on Online Harms very soon. It remains to be seen how many of the Final Report’s recommendations it will adopt – but it has already accepted a number of the interim report’s recommendations. These will require careful thought and analysis.

Zoe McCallum is a barrister at Matrix Chambers, practising in the field of media and information law.