
Tightening The Screws On Pirate Websites Through Dynamic Website Blocking Injunctions

A pirate site is blocked through a court order, yet, like a chameleon changing its colour, it changes its IP address or URL and is back up again tomorrow under a different guise. This is the reality that rights-holders face repeatedly in dealing with slippery pirate operators. But relief is coming.

In an important new development in India, the Delhi High Court recently issued a decision that allows rights-holders to seek “dynamic injunctions” against Indian ISPs, requiring the ISPs to block access to the spin-off “mirror” websites that typically appear once a primary offshore site providing copyright-infringing content has been blocked. Dynamic injunctions avoid the classic “whack-a-mole” problem in which, no sooner has a court issued an injunction against a specified website, than a clone hosted in some other unreachable jurisdiction pops up offering the same pirated content. Sometimes users seeking the original “free” content are even redirected to the mirror site. The Washington, DC-based Information Technology and Innovation Foundation (ITIF) has prepared a detailed report on the Indian decision and its impact on India’s important film industry, focussing particularly on the dynamic injunction aspect. According to the ITIF:

“Just as website blocking is a pragmatic reflection of a country’s efforts to use injunction orders to get local ISPs to block access to piracy websites hosted overseas (and outside its jurisdiction), dynamic injunctions reflect the fact these same operators can subvert a court’s decisions by shifting targeted piracy operations to alternative websites. The goal of using dynamic injunctions as part of a website blocking system is not just to combat online piracy, but also to change consumers’ behavior by raising the cost—in terms of time and willingness to find alternatives sites and circumvention tools—to make the legal sources of content more appealing.”

The intellectual property website IP Kat has also covered the case and notes that:

“The judgement marks a significant advancement in curbing the menace of online piracy. It introduces certain novel ways of tackling the problem (such as) grant(ing) the power to the plaintiffs, with the approval of the Joint Registrar, to update the list of blocked websites by adding the names of mirror/redirect/alphanumeric websites.”

IP Kat continues:

“This is a very practical solution, as one of the most apparent difficulties in tackling online piracy is the ability of pirated websites to produce mirror websites within seconds. As the power to update the list of blocked websites is now available without extensive procedures required for a new application, this will make blocking mirror websites easier and more effective. This is the most important aspect of the judgment as it substantially reduces the resources required for blocking every mirror infringing website.”

The Indian court decision builds on precedents in Australia, the UK and Singapore. In Australia, new legislation, the Copyright Amendment (Online Infringement) Act 2018, came into force in December of last year. It does a couple of important things to tighten up Australia’s already quite effective site-blocking regime under Section 115A of the Copyright Act, introduced in 2015. That provision enables copyright owners to seek an injunction from the Federal Court requiring ISPs (known as Carriage Service Providers in Australia) to block access to offshore pirate websites that have the primary purpose of infringing or facilitating the infringement of copyright. Since that time, Australian content owners and carriage service providers have worked out a modus vivendi: a nominal fee was agreed upon for the blocking process, and the providers ceased to oppose the orders.

The amendments introduced last year carried this a step further by adding “primary effect” to “primary purpose”, by extending the provision to search engines and by allowing for more responsive orders to be issued by the Court. What does “more responsive orders” mean exactly?

One Australian legal website explains it thus:

“As noted in the explanatory memorandum to the Copyright Amendment (Online Infringement) Bill 2018, one of the limitations of the earlier legislation is that operators of online locations could attempt to avoid injunctions under section 115A by using another domain name, creating a new URL for the same content or obtaining a new IP address. To address this, the Act includes new provisions which allow the Court to make more responsive orders as part of an injunction application.”

In other words, the court will widen the application of the injunctions to capture mirror websites without rights-holders having to go back to court to initiate a new application each time the injunction is modified (subject to overall court oversight). This streamlining, as in India, is the key to effective disabling of pirate sites. Recently the Court also agreed to allow content providers to reduce the lead time required when notifying ISPs of renewals of blocking orders.

Another element of the new 2018 legislation was to extend the application of the law to search engines, requiring them to de-index sites blocked by court order. Google fought long and hard against this measure, arguing that it was unnecessary. However, with the passage of the law (and the re-election of the government that enacted it), Google has undergone a sudden conversion (or is it a tactical shift?) and is voluntarily getting with the program. Better to do it voluntarily than to be forced to do it by law seems to have been Google’s calculation.

The UK has also experimented successfully with flexible application of site blocking, particularly with regard to sports broadcasting, where rights-holders can seek court orders requiring ISPs to block pirated streaming feeds of games, like English Premier League soccer, in real time. This requires a broad blocking order that covers proxy and primary sites as well as the servers streaming the pirated content. ITIF has an excellent blog post explaining exactly how this works technically.
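For readers who want a feel for the mechanics before reading the ITIF post, here is a minimal, purely illustrative sketch in Python. It is not the system ITIF describes or anything a UK ISP actually runs; it simply assumes a hypothetical JSON feed listing the IP addresses of infringing streaming servers identified during a match window, validates them, and prints firewall-style blocking rules. The file name, feed format and rule syntax are all assumptions made for illustration.

```python
# Purely illustrative sketch (not ITIF's description or any real ISP system):
# it assumes a hypothetical local JSON feed of server IP addresses supplied
# under a live-blocking order and turns them into firewall-style drop rules.

import ipaddress
import json
from pathlib import Path

# Hypothetical feed file published by the rights-holder's monitoring agent
# during a match window, e.g. {"window": "15:00-17:00", "ips": ["203.0.113.7"]}
FEED_PATH = Path("blocklist_feed.json")


def load_feed(path):
    """Read the assumed JSON feed and return the raw list of candidate IPs."""
    data = json.loads(path.read_text())
    return data.get("ips", [])


def validate_ips(candidates):
    """Keep only syntactically valid, deduplicated addresses (avoid over-blocking)."""
    valid = set()
    for item in candidates:
        try:
            valid.add(str(ipaddress.ip_address(item)))
        except ValueError:
            continue  # skip malformed entries rather than risk blocking the wrong host
    return sorted(valid)


def to_rules(ips):
    """Render one illustrative iptables-style drop rule per address."""
    return [f"iptables -A FORWARD -d {ip} -j DROP" for ip in ips]


if __name__ == "__main__":
    for rule in to_rules(validate_ips(load_feed(FEED_PATH))):
        print(rule)
```

The point of the sketch is simply that, once an order is in place, updating the block is a data problem (refresh the feed, regenerate the rules) rather than a fresh court application each time the pirates move.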

Does this frustrate soccer fans who have tried to avoid paying their local content provider to watch the big game? Absolutely, and that is the point. It is very frustrating to see the screen go dark just as the winning goal is about to be scored. The same principle applies to dynamic site blocking injunctions. The more often the proxy site a user goes to returns a “not found” result, the more likely that user is to accept the inevitable solution, the one that is the default in the offline world: pay for the content you consume.

While dynamic injunctions are a relatively new phenomenon, the principle of disabling access to offshore pirate websites is well established, with more than 40 countries having implemented an administrative or legal process to enable this to happen. Most recently, the Parliamentary Committee holding hearings on the review of the Copyright Act in Canada recommended that:

“the Government of Canada consider evaluating tools to provide injunctive relief in a court of law for deliberate online copyright infringement”.

As I commented in a blog posting on the Committee’s report, this is a positive step forward, albeit not as concrete as advocates of site blocking in Canada would have preferred. A broad coalition of content providers (and some ISPs) had earlier proposed that an administrative mechanism for piracy site blocking be established under the auspices of the broadcasting and telecommunications regulator, the CRTC. The CRTC, however, declined to accept the proposal, arguing that it did not have jurisdiction, and punted the issue to Parliament to deal with as part of the Copyright Act review. Now the Committee reviewing the legislation has taken a position, recognizing that there is a problem that needs to be addressed:

“The Committee…agrees that there is value in clarifying within the Act that rights-holders can seek injunctions to deny services to persons demonstrably and egregiously engaged in online piracy, provided there are appropriate procedural checks in place.”

 

If such a process is established, it is unlikely to include dynamic injunctions, at least not initially. But the experience of other countries is that once a site-blocking mechanism is in place, it will not only prove its worth but also demonstrate that the various arguments deployed against it (that it will undermine net neutrality, damage the functioning of the internet, be too expensive to implement, lead to unwarranted censorship, and so on) are unfounded. As the use of site blocking becomes more routine, fine-tuning it to counter the evasive tactics of pirate content providers is the natural next step, which is why dynamic blocking injunctions are becoming more widely accepted.

Maybe one day, Canada (and the US for that matter) will catch up with India, Australia and others in the application of reasonable legal remedies to combat the tactics of distributors of pirated, infringing, unlicensed content. These offshore operators have long played a cat-and-mouse game with providers of legitimate content. It is now time for the cat to sharpen its claws.

© Hugh Stephens, 2019. All Rights Reserved.

This article was originally published in Hugh Stephens Blog.

Featured Photo by Matt Artz on Unsplash


Google Is Monetizing Human Tragedy: Why Aren’t They Held Accountable?

My wife and I had just been visiting our daughter in her new home when we turned on the car radio. It was a CBC interview with Andy Parker, whose daughter Alison had been murdered, live on TV, by a deranged former employee back in 2015. The killer recorded and later uploaded video of Alison Parker’s death to the internet, in addition to the live broadcast of the tragedy. The radio story was about the trials of a father who is being trolled by hate-mongers and conspiracy theorists, and about his ongoing efforts to get the videos of his daughter’s murder taken down by YouTube. My heart went out to him. I understood what drives him, what helps him get up each morning: giving some meaning to his daughter’s death by trying to make the world a slightly better place. One of the ways he is doing that, in addition to pushing for better gun control, is trying to bring Google, owner of YouTube, to account for its actions, or rather its inaction.

One wonders why a corporation of this size and influence, one with such reach and such power to change people’s lives for the better, doesn’t get it. When Parker first learned that videos of Alison’s death were circulating on YouTube, he contacted the company and was informed that, under its “moment of death” policy, the content could be removed. There is an online form available that states:

“If you’ve identified content showing your family member during moment of death or critical injury, and you wish to request removal of this content, please fill in the information below. We carefully review each request, but please note that we take public interest and newsworthiness into account when determining if content will be removed. Once you’ve submitted your report, we’ll investigate and take action as appropriate.”

 

So far, so good. But then Parker found out that he would have to flag each and every posting of the atrocity in order to get YouTube to act. Videos taken down today could be up again tomorrow, posted by people ranging from conspiracy theorists to plain vindictive sociopaths. YouTube refused to institute a blanket ban on the video, even though it had the technical means to do so. Moreover, the algorithms that recommend content to viewers continue to serve up content related to the video. In frustration, Parker is now bringing a lawsuit against YouTube.

One has to ask why YouTube does not take the necessary steps to police its own content. Under pressure from copyright owners, it has instituted a system of sorts that will take down all videos of a proven copyrighted work. While the system is unsatisfactory to many, at least there is a functioning process to remove copyright-infringing works, as YouTube is required to do under the DMCA in order to keep its safe harbour. There is other content that YouTube is required by law to block, such as child pornography and sex-trafficking material, and by and large it manages to do so. In addition, there are other forms of undesirable content that the platforms, YouTube among them, ought to block as a matter of common sense, but here they do a poor job. Facebook’s slow-off-the-mark response in blocking the dissemination of the filmed violence against the mosque and worshippers in Christchurch, New Zealand, is but one example, as is the ongoing issue of hate speech and incitement to violence and terrorism as witnessed on the website 8Chan.

What really upsets Mr. Parker is that not only does YouTube require him to constantly police its site to identify postings of his daughter’s death (just as copyright owners have to spend time notifying YouTube of infringements, although some of this is automated through Content ID), but the clicks the video attracts also enable YouTube to monetize it. In effect, YouTube is monetizing misery. Moreover, each time a takedown request is submitted to YouTube, the requester must cite the reason for the requested removal. Should a bereaved father have to do this on a daily basis? Parker, understandably, refuses to contemplate ever watching the video and has enlisted the support of others who have been in a similar position to identify the postings and request the removals. (I have not watched it, nor will I ever do so).

While YouTube’s own Terms of Service indicate it will remove videos showing a moment of death (subject to the onerous and repetitive procedures described above), Parker has found that one of the more effective tools for getting the videos removed is copyright. The footage of Alison’s murder on YouTube comes from two sources: the actual footage of the atrocity broadcast on the day of the murder, and the videocam footage shot by the killer. In the case of the former, although the station, WDBJ in Roanoke, Va., tried to limit broadcast of the footage, it has spread on the internet. Nonetheless, WDBJ owns the copyright in that footage and has assigned its digital rights to Parker. As the rights-holder, Parker asserts his DMCA takedown rights, and YouTube will comply, although, as noted, it is a thankless and repetitive task to have to continually flag offending content. With regard to the footage shot by the killer, the copyright strategy doesn’t work, yet YouTube is unwilling to enforce its own policies on highly offensive content that has been brought to its attention multiple times. There is really something wrong here.

In the face of this obduracy, or just plain shirking of normal moral responsibility as happened in the case of the mosque shooting in Christchurch, governments worldwide have started to re-examine very carefully the safe harbour protection that the platforms hide behind. In the US, the shield of choice, Section 230 of the Communications Decency Act (1996), has come under close scrutiny for the abuses it has allowed by giving platforms carte blanche to host just about any content. Other countries, notably Australia, have taken robust action in the aftermath of Christchurch. In April, Australia passed a law holding senior executives of internet platforms personally criminally liable (in addition to the corporation itself being liable and subject to massive fines) if the platform fails to act expeditiously to take down “abhorrent violent material” once so directed by the authorities. The definition of such material includes “videos depicting terrorist acts, murders, attempted murders, torture, rape or kidnap”.

Google claims that it is using machine technology to catch problematic content but is putting the onus on government to make it clearer to tech companies where the boundaries lie, for example, what constitutes unacceptable online hate speech and violent content. Google Canada’s head of government relations and public affairs is quoted as saying, in testimony to the House of Commons Justice Committee, that:

“I think the first step is to actually have a clear idea of what the boundaries are for terms like hate speech and violent extremist content…Because we’re still as a company interpreting and trying to define our perception of what society finds acceptable and what you as legislators and government find acceptable. The first step for us would rather be what is a clear definition, so we can act upon it.”

 

That sounds to me like passing the buck. While there may be grey areas when it comes to what constitutes illegal hate speech, surely that excuse doesn’t fly when we look at Alison Parker’s case. If Google wants to know what constitutes unacceptable violent content, it need only look at the Australian legislation. No responsible broadcaster would disseminate that kind of material. Why should YouTube, Facebook or other platforms be able to get away with it? The videos that Andy Parker is trying to get permanently blocked on YouTube clearly violate the company’s Terms of Service, apart from anything else, and just as clearly constitute unacceptable violent content. Yet he is getting nothing but the runaround.

As noted above, Parker is taking legal action against Google, with the assistance of the Georgetown Law Civil Rights Clinic. He is also taking aim at Section 230, because more recently Google has cited this provision as the reason why it is not legally required to remove the content. Maybe this lawsuit, and the publicity he is generating by speaking out, will prompt some action on the part of this multi-billion-dollar company. Perhaps, but I am not holding my breath. Above all, it seems that the most important consideration for Google is maximizing profit regardless of the human cost. Google performs an important role for consumers with its various businesses, notably online search and access to content. It has been given a largely free ride by government with respect to regulation and oversight, with the results that we now see. Some accountability is long overdue.

© Hugh Stephens, 2019. All Rights Reserved.

This article was first published in Hugh Stephens Blog.

Featured Photo by Rajeshwar Bachu on Unsplash