

    Should User-Generated Content (UGC) be Exempt from Law and Regulation? Should Internet Platforms Bear any Responsibility for UGC they Distribute?

    • 24.05.2021
    • By Hugh Stephens

    Should user-generated content (UGC) on social media platforms be free from any regulation and the rule of law, simply because it is user-generated? Should social media platforms be given a pass when it comes to any responsibility for the UGC that they distribute? That seems to be the message from those busy attacking Canadian Heritage Minister Steven Guilbeault for proposing legislative amendments to broadcasting regulation (Bill C-10), and for promising future legislation that will require the platforms to control “online harms”, another form of UGC. Bill C-10 (in its current form) would subject the platforms that host UGC (YouTube, owned by Google, being the prime example) to some regulation with respect to that content. The “online harms” legislation has yet to be introduced, although Guilbeault has made clear that it is coming this spring. (Online harms refers to child sexual exploitation, hate speech, revenge porn and incitement to violence.) That Bill’s exact provisions remain to be determined.

    Bill C-10

    With respect to Bill C-10, the issue is whether the online platforms will be considered “broadcasters” when disseminating video content posted by users. If so, and if Guilbeault’s proposed legislation is enacted, that content would be subject to “discoverability” criteria established by Canada’s broadcast and telecommunications regulator, the CRTC (“the Commission”) to ensure that Canadian content is promoted. The legislation has run into a buzz-saw of opposition from various quarters and has quickly become politicized. Guilbeault has been accused of wanting to “censor” the internet.

    Strangely, considering the focus of the criticism, the primary objective of the Bill is not to regulate user-generated content. Rather, it is to bring online streaming services under the purview of the broadcasting regulator to ensure that Canadian content is promoted and made “discoverable”, among other obligations.

    Should there be a UGC Carve-out?

    The original version of the Bill included an explicit carve-out for user-generated content in order to reassure consumers that they were not being targeted, but once review by the Parliamentary Committee began, it quickly became clear that this would create a massive loophole that could be exploited by the social media platforms. They could have used the UGC exception to avoid obligations being imposed on other streaming services, such as Spotify, with respect to Canadian music. An amendment was therefore proposed to drop the explicit exclusion for UGC. This prompted critics to charge the government with interfering with free speech and dictating what Canadians can post on social media. This is total hyperbole, and the critics from the main opposition party, the Conservatives, are surely aware of it, but politics is politics.

    Intense Criticism

    It has not helped that Guilbeault has struggled to explain clearly the intent of the legislation, which is targeted at the platforms, not consumers. Some of the criticism has come close to becoming a personal vendetta, with Michael Geist of the University of Ottawa leading the charge, accusing Guilbeault of giving “disastrous” interviews that should lead to him being fired. Geist has been on a campaign for weeks to discredit the legislation, Guilbeault, and the government’s agenda to confront the web giants, publishing almost daily attacks on his blog. Geist is particularly unhappy that Guilbeault and the Heritage Ministry have been given the file to run with, rather than the usually more Silicon Valley-friendly Ministry of Innovation, Science and Economic Development. In other words, the “culture” mavens seem to have priority over the techies who guard “industry” interests.

    What’s the Real Issue?

    With regard to the policy intent of Bill C-10 (amendments to the Broadcasting Act), one can legitimately question whether Canadian content “discoverability” requirements are needed, or indeed whether streaming services should be treated as broadcasters. Even the whole question of Canadian content quotas can be debated. I, for one, remain to be convinced that enhanced discoverability requirements are needed to get Canadians to watch more Canadian content (Cancon). And then there is the question of what constitutes Cancon, but that is another topic entirely. To give just one example of the arcane rules that govern Canadian content, a project produced by Netflix with a Canadian story, Canadian actors and Canadian writers will not qualify as Cancon if it is fully financed by Netflix. Why? Somehow, the money is not “Canadian” enough. Go figure. Since establishing its Canadian operation in 2017, Netflix has spent over $2.5 billion on production in Canada, but much of that does not count toward content quotas. (An earlier blog I wrote on this topic, “Netflix in Canada: Let No Good Deed Go Unpunished”, explains how difficult it is for companies like Netflix to qualify.)

    In my view, the answer to getting Canadians to watch more Cancon is to produce more good-quality Canadian content. (Schitt’s Creek is a prime example of successful Canadian programming that does not need to be “discovered”.) However, putting the Cancon question aside for a moment, the issue now is whether a level playing field will be established for all streaming services. If discoverability requirements are going to be applied to streaming services, then social media platforms should not be given a pass simply because they host user-generated content.

    Is UGC Sacrosanct?

    There is nothing sacrosanct about UGC that puts it into a separate universe. For the most part, it should be left alone as it forms part of the free expression of society, but where and when it crosses the line of the law, or falls into an area subject to regulation, there is no reason why UGC should be treated differently from any other content. The killer of 51 people at two mosques in Christchurch, New Zealand, live-streamed the shootings on Facebook. That live-stream was 100% UGC. Some critics claim that subjecting UGC appearing on YouTube to CRTC oversight will impair free speech rights and would be contrary to the Canadian Charter of Rights and Freedoms. This, despite an opinion from the Department of Justice, backed up by testimony from the Minister of Justice (himself a distinguished legal scholar), explicitly dismissing claims that any provisions of the Bill would violate Charter freedoms.

    Why Include YouTube?

    Why extend the content discoverability requirements to YouTube? Because YouTube is a major distributor of music and video, and in fact acts as an online broadcaster, even though the content is user-generated. (There are more than 35 million YouTube channels, most of them with an admittedly small following.) According to a Ryerson University study (quoted in the Toronto Star), 160,000 Canadians post content on YouTube, with 40,000 of them earning revenue. Would subjecting this “broadcast content” to discoverability requirements be an impairment of free speech rights? Why would it be? Nothing is censored; nothing is “taken down” or “buried”. Users are free to post what they wish. Indeed, that is part of the problem. Sometimes what they post is illegal, infringing or libellous.

    The fact that content is user-generated is no reason to exempt it from regulation deemed to be in the public interest (although there may be different viewpoints as to what constitutes the public interest). Where it falls within regulation, user-generated content, especially content produced for commercial purposes such as ad-supported YouTube channels, should not have an unfair advantage over other forms of content.

    Net Neutrality

    Another argument against applying any regulation to the distribution of UGC is that CRTC oversight will undermine net neutrality. Vocal C-10 critic Michael Geist claims that Guilbeault’s bill shows the Canadian government has abandoned its support for this principle. This is an old canard regularly trotted out by opponents of any internet regulation. By Geist’s own admission, net neutrality requires ISPs to avoid practices that would unfairly give preference to certain content over other content through discriminatory charges. In particular, they are required not to favour content in which they have a financial interest over competing content. Net neutrality is rooted in the common carrier concept that emerged from the telegraph era, when companies like Western Union prevented rival news services from using their telegraph lines to file stories. The principle is the same today. But net neutrality has never meant that there should be no regulation of internet content. The best example of the need for regulation is the question of “online harms”, the next Guilbeault shoe set to drop.

    Expected “Online Harms” Legislation

    Right now, Bill C-10 is the target of the critics, but I am sure that when the “online harms” legislation is tabled (shortly), we will hear the same complaints about how it interferes with freedom of expression on the internet. This raises yet again the fundamental question of whether government has any role in regulating what appears on social media. The answer, surely, must be yes, subject to the normal protections regarding freedom of expression. In Canada this is done through the Charter of Rights and Freedoms. Section 2(b) of the Charter protects “freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication”. But that is subject to limitations. The Canadian government’s explanation of the Charter says, right up front with respect to the freedoms it guarantees, “The rights and freedoms in the Charter are not absolute. They can be limited to protect other rights or important national values. For example, freedom of expression may be limited by laws against hate propaganda or child pornography.”

    That is apparently what the online harms legislation will do. Michael Geist doesn’t like that legislation either. In an opinion piece in Maclean’s (once described as Canada’s national news magazine), Dr. Geist attacked the online harms legislation because it will likely include a mechanism to block illegal content hosted by websites outside Canada that are beyond the reach of Canadian law. According to him, this would “dispense with net neutrality”. If net neutrality means protecting the rights of offshore websites to disseminate hate speech, material that sexually exploits children, and incitements to violence and terrorism (most of it UGC, by the way), then net neutrality is not worth protecting. But of course, this has nothing to do with the meaning of net neutrality. Net neutrality as a huge umbrella protecting everything on the internet exists only in the minds of the cyber-libertarian claque.

    An “Internet Firewall”?

    Disabling access by consumers to illegal content hosted offshore is not some Orwellian plot. It is a reasonable application of the law to rogue sites that thumb their noses at national legislation because they are hosted somewhere in cyberspace. Opponents of any form of site-blocking claim that it creates an “internet firewall”, with obvious comparisons to the “Great Firewall of China”. What China is doing to limit access to online content by Chinese citizens parallels other censorship and behaviour-control measures instituted by the authorities in China. But China is China. Canada is Canada. To equate targeted blocking of content that is illegal under the Canadian Criminal Code with the kind of thought-control techniques exercised by the Communist Party in China is fanciful. Another potential use for targeted site-blocking, subject to all the requirements of due process (application, hearing, appeal and so on), is to disable access by consumers to offshore sites hosting illegal, pirated, copyright-infringing content. See my recent blog, “Site-blocking for ‘Online Harms’ is Coming to Canada: Similar Measures to Fight Copyright Infringement Should Follow”.

    Expeditious Takedown

    In the same op-ed, Prof. Geist also objects to the fact that platforms will likely be required to take down illegal content within 24 hours. This is similar to Australian legislation passed after the Christchurch killings that requires platforms to “expeditiously” take down “abhorrent violent material” when notified to do so. Geist claims this approach substitutes speed for due process. But sometimes speed is precisely what is needed when the harm is so egregious that action must be taken immediately. One would expect the platforms to exercise their own oversight in such cases, but experience has shown that they often will not act unless required to do so.

    Holding Big Tech Accountable

    At the end of the day, the key question comes down to whether UGC has some special place as a form of speech that cannot be regulated or subjected to lawful oversight, and to what extent the social media platforms that host and thrive on UGC should bear any responsibility for the content they allow to be posted. For far too long, the platforms have hidden behind the pretence that they are just neutral “bulletin boards” with no responsibility to vet what goes up on those boards. They invoke terms such as “net neutrality” and “freedom of speech” to duck any responsibility for offensive and illegal content that they are happy to monetize, and on occasion even encourage. Some of this is copyright-infringing content, which is why I am writing about UGC on this copyright blog. By sprinkling magic dust on UGC to make it “different”, the big tech platforms have tried to evade their share of responsibility for allowing and exploiting infringing content, shifting all of the burden onto the users they enable.

    One thing is certain. Change is coming. Platforms are increasingly being held to account for the content they carry in Australia, the EU and Canada. In the US, Section 230 of the 1996 Communications Decency Act, the “get out of jail free” card that the internet platforms have used for years to avoid any responsibility for online content that they host and distribute, is coming under serious scrutiny. Those opposed to any change in the status quo are fighting a furious rear-guard action, invoking hallowed and sacrosanct concepts such as free expression (the First Amendment in the US, the Charter in Canada), net neutrality and lack of due process, all in a vain attempt to fend off any restrictions on big tech and any effort to hold it more accountable.

    Conclusion: UGC Must Comply with Laws and Regulation

    I cannot predict at this stage what the final shape of Bill C-10 will be, or whether Steven Guilbeault will be able to withstand the furious attacks by opponents seeking to strip user-generated content (UGC) out of the legislation. As for the online harms legislation, we will have to wait to see how it deals with harmful and illegal content on the internet, much of it generated by users. If it requires platforms to expeditiously take down harmful material, that will be a good thing. If it provides a mechanism to prevent consumers from accessing purveyors of illegal content who avoid Canadian law by locating their servers offshore, that would also be a good outcome.

    With regard to C-10, although you can question the necessity of bringing streaming services under the broadcasting regulator and applying Canadian content and discoverability requirements to them, if that is the policy direction, then there is no reason to give YouTube a pass simply because it commercializes user-generated content. Laws and regulation must apply to UGC, subject to constitutional limitations, just as they do to other forms of content. To act otherwise creates a massive loophole that undermines policy delivery, is unfair to other content services, and tilts the playing field against fair market competition.

    This article was originally published on Hugh Stephens Blog.

    ABOUT THE AUTHOR

    Hugh Stephens

    Distinguished Fellow, Asia Pacific Foundation of Canada