As I noted in my year-end wrap-up a couple of weeks ago, some of the copyright- and content-related issues under discussion in Canada in 2021 will likely move forward more aggressively this year. The federal election last fall put a number of copyright-related issues on hold. Parliament lost several months of work, and all pending legislation died when the election was called and must be re-introduced into the new (44th) Parliament. So far, the current Parliament has met for just a few weeks, sitting from November 22 to December 17, 2021, primarily to outline new legislative priorities. Mandate letters for ministers were released on December 16, and among the issues tasked to Pablo Rodriguez, the Minister for Canadian Heritage, are four big files concerning the content industries: amending the Copyright Act to further protect creators, reintroducing reforms to the Broadcasting Act, introducing legislation requiring digital platforms to compensate Canadian news providers, and bringing forward online harms legislation.
That is a lot to have on one’s plate and one cannot help but wonder how much of this will actually get done. These issues have all been around for some time, and most have already been the subject of online consultation and in some cases, Parliamentary review through committee. Let’s look at each in turn.
On the first item, a Copyright Act update is overdue. In theory the Act is supposed to be reviewed every five years; the last significant legislative update was in 2012. In 2019 two Parliamentary committees reviewed the Act and issued somewhat conflicting recommendations, but to date no changes have been introduced. Last year several public consultation documents were issued regarding copyright: the first on implementation of Canada's commitment under the USMCA/CUSMA to extend the term of copyright protection, a second discussing a modern copyright framework for online intermediaries, and a third on copyright and artificial intelligence and the Internet of Things.
The copyright term extension question is the most pressing, as Canada is required to implement the twenty-year extension agreed to in the USMCA trade agreement no later than December 31, 2022. The consultation paper addressed a number of implementation issues, such as orphan and out-of-commerce works, while also seeming to dismiss proposals, advocated by some opponents of extending copyright duration in Canada, to institute an additional registration requirement as a condition of accessing the longer period of protection. The implementation of Canada's USMCA/CUSMA obligation, hopefully done in a straightforward way without the imposition of additional registration barriers, could be rolled into a broader copyright reform bill, or could be bundled into some other omnibus legislation.
The second consultation paper dealt with issues such as safe harbours for internet intermediaries and possible regulations regarding site-blocking of pirate websites, a measure already upheld on appeal by the courts in Canada. The AI paper raises questions such as ownership of works created by AI and the addition of possible additional copyright exceptions to address data mining, among other topics.
Another copyright issue that needs to be addressed, in addition to the introduction of an Artist's Resale Right mentioned in the mandate letter, is the question of mandatory tariffs. A fix is needed for the disastrous decision of the Supreme Court in July 2021 upholding the Federal Court of Appeal's (FCA) ruling that tariffs covering unlicensed use of copyrighted content, while binding on rights-holders, are merely optional for users, thus undermining one of the pillars of Canada's collective licensing regime. Parliament needs to fix this. The wording in the mandate letter instructing the minister to amend the Copyright Act "to further protect artists, creators and copyright holders" offers some hope. The pendulum has swung so far in favour of unlicensed uses that some rebalancing is badly needed.
The next item on Mr. Rodriguez’s “to do” list is reintroduction of “reforms” to the Broadcasting Act, known as Bill C-10 in the last Parliament. This is a hot potato for a couple of reasons. In the last Parliamentary session, the legislation passed in the House but failed to get through the Senate before the election was called. In the parlance of the mandate letter, it targets “foreign web giants” to ensure that they “contribute” to the production of “Canadian stories and music”. Put more bluntly, it is designed to extract funding from foreign players like Netflix, Amazon Prime, Disney Plus, Spotify and other online streaming platforms to support Canadian production. Not production in Canada, of which there is plenty, some of it supported by these same platforms, but “Canadian production”.
The question of what qualifies as “Canadian production” is, to say the least, arcane. Currently, a production using a Canadian story with a Canadian director and Canadian actors does not qualify as Canadian content if the financing is not Canadian controlled (i.e. it is not produced by a Canadian-owned production company) and, under the draft legislation, if the copyright of the production is not held by a Canadian. What is the goal? Is it to tell more Canadian stories in Canadian settings to Canadian and global audiences or is it to ensure that more money is put into the hands of “qualified” (i.e. Canadian) producers by extracting from streaming services a “tax” similar to that which is imposed on conventional broadcasters? Those broadcasters must, as a condition of licence, spend at least 30 percent of their aggregate revenue in the previous year on Canadian content programming. While full foreign funding automatically disqualifies a production from meeting Canadian content requirements, there seem to be no qualms about requiring foreign producers to pay into a fund that would then be used to finance Canadian production. In effect the foreign funds are laundered through a Canadian production house. Finding the right balance to promote the creation of Canadian content (and sensibly defining what that is) while incentivizing the telling and distribution of Canadian stories by international production houses is a major challenge.
The other controversial aspect of the previous Bill C-10 was its application to platforms like YouTube, Twitter, Facebook and Instagram through inclusion of user-generated content (UGC) in “discoverability” requirements imposed on the platforms. To have omitted UGC would have created a massive loophole. YouTube, for example, is one of the primary music and video distribution platforms in the country. There is no reason to grant UGC an exemption from the application of law and regulation, as long as individual expression is subject to the normal protections afforded by the courts and the Charter. Critics accused the government of empowering the regulator, the CRTC, to censor content posted to social media by individual Canadians. This was a canard and a misunderstanding of the intent of the legislation since the obligations would have applied only to the platforms and would have had no impact on individual freedom of expression. We will have to wait to see whether and in what form the UGC issue is addressed in the 2022 version of the legislation.
Rodriguez's instruction on the third item, compensation for news publishers, could not be clearer: "Swiftly introduce legislation" to require payment by digital platforms to Canadian news providers when the platforms generate revenue from that content, "modelled on the Australian approach". This is clearly an idea whose time has come. One option the government had been considering was following the EU model of granting news publishers an ancillary copyright in their content, but it has now opted for the successful Australian model of using competition law to deal with the issue of free-riding by the platforms. With the Australians having taken on Google and Facebook and brought them to heel, the task should be considerably easier for the Canadian government. (The US government is also studying the issue.) Just the threat of legislation has inspired Google to reach content deals with many Canadian news providers. This legislation will provide the needed legal backstop.
The fourth item, online harms, will be a big issue in 2022. There is no question that social media platforms and online services (Facebook, Instagram, Twitter, YouTube, TikTok, and Pornhub are specifically identified) need to be held to greater account for the socially harmful content posted by users that they knowingly host (and from which they often profit). The trick is to define "harms" in a way that is clear and legally sustainable. This is easier to do for some harms than for others. In this regard, the offline world can provide precedents. A consultation paper was released in the fall of 2021 outlining the five categories of harms that would be regulated: terrorist content; content that incites violence; hate speech; non-consensual sharing of intimate images; and child sexual exploitation content.
The consultation paper proposes that a Digital Safety Commission be established to regulate platforms, with strong enforcement powers. Platforms would be required to establish reasonable monitoring mechanisms, assess flagged content, and remove harmful content within 24 hours, subject to appeal. They would also be required to establish a flagging and appeal process. Legislation would require greater transparency from platforms and impose an obligation to notify law enforcement in the case of “imminent serious harm” or potential criminal conduct. The paper elicited a number of comments, some negative, many of them from “internet freedom” advocates traditionally opposed to any meaningful regulation of the internet. Objections range from the requirement for a takedown within 24 hours to the obligation to share information with law enforcement to opposition to site-blocking powers. While one must be careful to target only behaviour and content that is truly harmful and illegal, it is also time to increase the pressure on platforms to exercise greater responsibility.
A recent, egregious example of the kind of harmful material found on the internet was this report in the New York Times about a "how to" suicide website. Site-blocking for online harms would be one effective way to deal with such outrageous content, given that the search engines refuse to delist the website. Those who oppose Canada extending its regulatory reach to international internet platforms, like University of Ottawa law professor Michael Geist, cite Article 19.17 of the USMCA/CUSMA as an obstacle. Geist claims that Canada agreed to provisions in the USMCA/CUSMA that "look very similar" to Section 230 of the 1996 US Communications Decency Act and makes the dubious claim that if Canada enacts online harms legislation that creates new liability for the platforms, the US might take retaliatory trade action. Section 230 is the much-criticized US legislation that absolves internet platforms of civil liability for user content on their platforms. It has been much abused over the years by platforms that have used it as a shield to avoid taking action to moderate or remove harmful content.
Dr. Geist's conclusion is inaccurate for several reasons. First, Article 19.17 may contain some phrases that are similar to parts of Section 230, but it is quite different in its effect. As I wrote in an earlier blog posting, it imposes no obligation on Canada to enact any laws that would entrench Section 230 immunities in Canadian law, because Canada protected its ability to implement Article 19.17 through its "laws, regulations, or application of existing legal doctrines as applied through judicial decisions". Second, Article 19.17 is subject to a "public morals and public order" exception (embedded in Annex 19-A), an exception that the US has itself invoked in the past (the Antigua online gambling case). If tackling online harms such as terrorism, child sexual exploitation, and incitement to violence doesn't fall within the ambit of protecting public morals or maintaining public order, then I don't know what does. Third, Section 230, and indeed Article 19.17, deal only with civil liability. Article 19.17 has an additional provision, subsection 4(c), which states that:
"Nothing in this Article shall…be construed to prevent…(i) a Party from enforcing any criminal law; or (ii) a supplier or user of an interactive computer service from complying with a specific, lawful order of a law enforcement authority".
The online harms legislation in Canada will involve the Criminal Code. Raising Article 19.17 of the USMCA as a potential obstacle to introducing online harms legislation that would hold the platforms more accountable for the harmful content they allow to be distributed to users is just one more red herring dangled by opponents of the legislation.
The online harms bill is to be introduced “as soon as possible” while reflecting feedback received during the recent consultation. That suggests that it may not have as high a priority as some of the other items on Minister Rodriguez’s task list. We shall see.
In addition to the four “to do” items included in the Heritage Minister’s mandate letter, there are some other issues in the content field on the 2022 agenda, the primary one being the proposed acquisition of Shaw Communications by Rogers, a subject I plan to write about in a subsequent blog posting. This merger has both significant content and telecommunications impacts.
It promises to be a busy year as Justin Trudeau’s minority government tries to steer several key pieces of legislation through a Parliament where it will require the support of at least one major opposition party to get anything done. The stakes are high, particularly for rights-holders and copyright industries, and there are hopes–and expectations—that the new 44th Parliament will achieve more than the unfinished business of the last one.
This article was first published on the Hugh Stephens Blog.