
Taylor Owen is the director of The Centre for Media, Technology and Democracy at McGill University and the host of the Big Tech podcast.

The recent hate crime in London, Ont., has sparked renewed calls for the government to do more about Islamophobia and other forms of hate. As part of this response, a wide range of civil society organizations, including prominent anti-hate leaders, are calling for the government to introduce the online harms legislation it has long promised. This week the government took an initial step, introducing amendments to the Criminal Code to better address online hate.

Canada is not alone. Governments around the world are stepping – albeit slowly and cumbersomely – into the perilous space of governing online speech.

For understandable historical reasons, Germany has banned online hate speech (including Nazi speech) and forces platforms to take it down within 24 hours or face fines of up to €50-million. In response to the livestreamed Christchurch mass shooting, Australia banned the sharing of violent material. Addressing concerns about content that targets and exploits children, Britain has created new rules aimed at protecting minors that mandate the removal of material depicting child sexual abuse or promoting suicide. Even in the U.S., the right to free speech online is not absolute. While platforms are broadly protected from responsibility for the behaviour of their users, they are liable for publishing child pornography and terrorist content.

At its core, the problem these laws all seek to address is relatively straightforward: There are a lot of awful things on the internet. And while some of this is a result of there being lots of awful people, the problem is magnified by the very way social-media platforms are designed. The problem of online hate is a difference in kind, not just degree.

By deciding who is seen and heard, by shaping what we read and watch, and by controlling the means of collective action, social-media platforms don’t just facilitate free speech, they shape the character of it. And as such, they bear some responsibility for the ensuing harms.

What’s more, because these companies now look and behave like traditional monopolists, they have little incentive to self-regulate. So citizens are left with little choice other than to accept the harms embedded in the system. Put simply, the free market has failed.

Fortunately, this is not a new problem. When a market leads to social or economic harm and the private sector is unwilling or unable to self-regulate, then that is precisely when we have traditionally looked to governments to govern. We do so with the financial sector, the pharmaceutical and petrochemical industries, and for food and aviation safety. In each we have developed intricate and at times highly invasive means to minimize the downside risks of these industries while maximizing their benefits.

Platforms are no different. And they can be regulated in a wide range of ways that address both the root causes (the business model and abuses of data) and the symptoms (the harmful speech itself). Enter the Canadian government.

Bill C-10 began as an effort to update our regime of cultural protectionism. Whether or not one agrees with these policies, what is undeniable is that our current broadcast and CanCon regulations were built for a different era and need to be either updated or scrapped. The government chose the former. Reasonable people can disagree on this.

What drew C-10 into the online speech debate were last-minute changes lifting the exemption on user-generated content (to ensure YouTube music would be included) and empowering the CRTC to regulate discoverability (to prioritize Canadian content in our feeds). What’s worse, the government has shamefully tried to shut down debate and force through a bill that at least ostensibly touches on free speech.

The unfortunate irony, however, is that by clumsily and unnecessarily expanding C-10 into the domain of free speech, the government has shown itself ill-prepared to defend – and has put in jeopardy – the passage of separate legislation that explicitly deals with speech: its planned online harms bill.

Learning from similar efforts in Europe, the government is thought to be developing plans to force platforms to remove already-illegal speech, and is considering a regulator to enforce this new takedown policy as well as to potentially implement a range of additional accountability measures, such as mandatory transparency reports and algorithmic audits. Other measures, such as dispute-resolution mechanisms that would allow citizens to challenge takedown decisions, are also being discussed.

All of this is important, but it is easier said than done. Regulating speech is far more difficult than updating competition policy or reforming data privacy (neither of which this government has delivered on to date).

The difficulty arises in part because each country weighs the balance between the right to speech and the right to be protected from speech differently. The world will not uniformly adopt Silicon Valley’s, or America’s, definition of free speech. This means that as much as platforms might like to have one set of rules for the whole planet, different countries will take different approaches. It’s going to be messy.

It is also not always clear what counts as an act of online speech. Is it the comment typed by the user, the algorithm that amplifies that speech to an audience, or the recommendation that a user join a group where that speech is posted? All of these are arguably speech acts, but they demand very different governance tools.

And once you have decided what counts as online speech, you need to determine who should be liable for it. Should platforms be viewed as neutral hosts for the speech of their users (like a utility), or should they be liable for the content that they distribute and profit from on their sites (like a publisher)? The answer is a bit of both.

Perhaps most worryingly, speech regulation is further complicated by the reality that illiberal-leaning regimes around the world, including Poland, Hungary, the Philippines, Turkey, Brazil and India, are increasingly using very similarly worded laws to crack down on the media, civil society and the free expression of their citizens.

For example, the Indian government has recently imposed a sweeping new set of regulations, which it calls “IT Rules.” These rules sound familiar: they require, among other things, that platforms remove content deemed defamatory, obscene or harmful to children. But they are widely seen as a means of cracking down on the speech and political activities of Indian civil society.

This raises the question: Should democratic governments abandon their efforts to govern online harms because their initiatives may be instrumentalized by illiberal regimes?

I would argue that the opposite is true.

Those of us lucky enough to live in democratic societies have a responsibility to show how online speech can be governed in a manner that both minimizes harm and prioritizes and reinforces core democratic principles.

It is undeniable that social-media platforms increasingly govern like states, some more responsibly than others. But they invariably do so in a manner driven by private rather than public interests, without the norms and institutions of democratic accountability we have rightly placed on our governments. And this has come at a significant cost.

Few people are calling for a completely deregulated platform ecosystem, where terrorists and child pornographers have free rein. So the debate is not really whether online content on social media should be regulated, but rather what the extent of that regulation should be.

While the question of how we govern online speech will rightly spark important and heated debates, doing it in a manner that prioritizes free expression and accountability will make the internet more democratic, not less.

