Harms Reduction: A Six-Step Program to Protect Democratic Expression Online
Executive Summary
Following nine months of study and deliberation, the Canadian Commission on Democratic Expression has settled on a series of principles and recommendations that can lead to a practical course of action. What we set forth is a series of functional steps to enable citizens, governments and platforms to deal with harmful speech in a free, democratic and rights-based society like Canada. We recognize the complexity of the issues at play and offer these steps as a path forward, knowing they will be subject to further debate and refinement.
PRINCIPLES
- Free speech is fundamental to a democratic society, and the internet enables more people to participate in public discussions and debates.
- The rise of hatred, disinformation, conspiracies, bullying and other harmful communications online is undermining these gains and having a corrosive impact on democratic expression in Canada.
- The status quo of leaving content moderation to the sole discretion of platforms has failed to stem the spread of these harms, and platform companies can find themselves in conflict between their private interests and the public good.
- We find fault with the notion that platforms are neutral disseminators of information. Platforms curate content to serve their commercial interests and so must assume greater responsibility for the harms they amplify and spread.
- Government must play a more active role in furthering the cause of democratic expression and protecting Canadians from online harms.
- Any policy response must put citizens first, reduce online harms and, in putting forth remedies, guard against the potential for over-censorship of content. This requires a balanced and multi-pronged approach.
These principles have led the Commission to an integrated program of six scaffolding recommendations.
RECOMMENDATIONS
1. A new legislated duty on platforms to act responsibly.
Establishment by Parliament of a statutory Duty to Act Responsibly, imposing an affirmative requirement under legislation and regulation on platforms, including social media companies, large messaging groups, search engines and other internet operators involved in the dissemination of user-generated and third-party content. In addressing harms, the details of this duty must take account of principles such as the fundamental nature of free speech.
2. A new regulator to oversee and enforce the Duty to Act Responsibly.
Creation of a new regulatory body, operating within legislated guidelines, that represents the public interest and moves content moderation and platform governance beyond the exclusive preserve of private-sector companies. The regulator would oversee a Code of Conduct to guide the actions of parties under its supervision, while recognizing that not all platforms can be treated in precisely the same manner. Regulatory decisions will be made judicially, grounded in the rule of law and subject to a process of review.
3. A Social Media Council to serve as an accessible forum in reducing harms and improving democratic expression on the internet.
Ensuring an inclusive dialogue on ongoing platform governance policies and practices, including content moderation, through a broadly based social media council that places platforms, civil society, citizens and other interested parties around the same table.
4. A world-leading transparency regime to provide the flow of necessary information to the regulator and Social Media Council.
Embedding significant, world-leading transparency mechanisms – on data, ads, bots and the right to compel information – at the core of the mandate for the regulator and Social Media Council. This will also give researchers, journalists and members of the public access to the information required for a publicly accountable system.
5. Avenues to enable individuals and groups to deal with complaints of harmful content expeditiously: an e-tribunal to facilitate and expedite dispute resolution, and a process for addressing complaints swiftly and lightly before they become disputes.
Creating rapid and accessible recourse to content-based dispute settlement by a dedicated e-tribunal charged with addressing online content disputes in a timely manner, and creating a process that enables targets of harm to compel platforms to make content creators aware of a complaint.
6. A mechanism to quickly remove content that presents an imminent threat to a person.
Development of a quick-response system under the authority of the regulator to ensure the rapid removal, even if only temporary, of content that creates a reasonable apprehension of an imminent threat to the health and safety of the targeted party.
▪ ▪ ▪
The Commission considered imposing takedown requirements on platforms, as some nations have done. Such systems generally identify offending categories of content, provide a fixed window, such as 24 hours, for its removal, and may levy significant penalties. We are concerned that they could create incentives for over-censorship by platform companies. Our recommendations do less to circumscribe speech, in recognition that harmful speech and the unjustified denial of freedom of expression are both problematic.
The standards of the Duty to Act Responsibly are purposely left vague at this point to give government, the regulator and the Social Media Council an opportunity to flesh them out as part of a Code of Conduct. To be sure, we are not recommending the creation of a self-standing new tort as the basis for a cause of action, but rather imposing affirmative requirements on the platforms to be developed under legislation and regulation. We expect the Code will evolve, given the recency of the problem and the rapid evolution of the internet.
For an even wider perspective, read the views of the Citizens’ Assembly on Democratic Expression.