A recent Wired article details the algorithmic tweaks that YouTube implemented after the 2017 mass shooting in Las Vegas sparked a wave of false-flag videos on its platform. The terrible behavior of some of its users following a national tragedy compelled YouTube to embark on nothing less than a “grand… project to teach its recommendation AI how to recognize the conspiratorial mindset and demote it.”
The difficulty of this mission cannot be overstated. “Demoting” content is a vastly different endeavor from outright removing content that violates YouTube’s terms of service. The project involved defining what counts as “borderline” – bad enough to be pushed down in search results and recommendations, yet not bad enough to be eliminated outright. It’s a hugely subjective distinction, and one on which it is virtually impossible to find general agreement.
But YouTube went through with its plan anyway – and by early 2019, following a years-long, Herculean effort, the frequency with which its algorithm recommended conspiracy videos began to fall significantly.
Unfortunately, the triumph was short-lived. Since then, conspiracy theorists have learned how to game the system – and they continue to reach millions of viewers with garbage that spreads harmful misinformation and undermines democracy. “If YouTube completely took away the recommendations algorithm tomorrow, I don’t think the extremist problem would be solved,” one researcher told Wired. “Because they’re just entrenched.”
It is tempting to look at this problem and think, “You made your bed, YouTube, now you have to lie in it.” But this is a problem that affects all of us. Misinformation, hate speech, pedophilia, and all the other toxic content riddling YouTube pose harm to individuals, communities, and even entire ethnic populations in some countries. We should all be rooting for YouTube to score victories over this plague wherever it can get them.
Now that YouTube has invested enormous amounts of time and resources in developing an algorithm that can detect extremely subjective borderline content, it should be a relatively simple matter to extend this processing power to remove content that is plainly illegal: pirated copyright-protected works. In fact, YouTube already implemented a suite of anti-piracy tools years ago – but has dithered over who should be granted access to them.
Its Content ID system, for instance, is a robust and effective content protection tool: it matches every upload against a database of copyrighted works and alerts copyright owners when their works surface without permission. The owner can then decide whether they want the asset removed, monetized, or left as is. They also have the powerful option of automatically flagging future uploads that match previously identified copyrighted work.
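YouTube’s actual matching pipeline is proprietary, but the general shape of such a system can be illustrated with a minimal sketch. Everything below – the fingerprinting scheme, the policy names, the match threshold – is a hypothetical stand-in, not YouTube’s real API:

```python
# Hypothetical sketch of a Content ID-style matching flow.
# All names and the fingerprinting scheme are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class Policy(Enum):
    BLOCK = "block"        # remove the matching upload
    MONETIZE = "monetize"  # leave it up, but route its ad revenue to the owner
    TRACK = "track"        # leave it as is and report viewing statistics

@dataclass
class Reference:
    owner: str
    fingerprint: frozenset  # fingerprint of the original work
    policy: Policy          # what the owner chose in advance

def fingerprint(video_bytes: bytes) -> frozenset:
    """Toy stand-in for a real perceptual fingerprint, which would have to
    survive re-encoding, cropping, and pitch-shifting."""
    return frozenset(video_bytes[i:i + 4] for i in range(0, len(video_bytes), 4))

def check_upload(video_bytes: bytes, references: list[Reference],
                 threshold: float = 0.8) -> Policy | None:
    """Compare an upload against the reference database and, on a strong
    enough match, apply the owner's pre-selected policy."""
    fp = fingerprint(video_bytes)
    for ref in references:
        overlap = len(fp & ref.fingerprint) / max(len(ref.fingerprint), 1)
        if overlap >= threshold:
            return ref.policy  # block, monetize, or track, per the owner
    return None  # no match: publish normally
```

The hard part of a real system is the fingerprinting itself, which must recognize a work through re-encoding and editing at enormous scale – which is precisely the engineering problem YouTube has already solved.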
Unfortunately, the tool is available to just a small number of companies and individuals. YouTube has the sole power to decide who has access to it, and its selection process is anything but transparent.
Smaller-scale rightsholders might instead be granted access to YouTube’s Content Verification Program, which provides a navigation panel in which they can search for their copyrighted works and quickly act on any pirated versions that turn up. But, again, there is no transparency about whom YouTube grants or denies access, or why.
Another rung down the ladder, some copyright owners are offered access to Copyright Match, which can only locate full uploads of their videos (and does not help at all with music piracy). Even this tool is available only to creatives whose channels have amassed “more than 4,000 valid public watch hours in the last 12 months” and have more than 1,000 subscribers. The millions of creatives who fall below those thresholds get no access to it at all.
All of these restrictions ensure that the vast majority of creatives have no access to YouTube’s content protection tools whatsoever. Instead, they are left to locate misappropriations of their work themselves, on their own time, and then file individual takedown notices through a cumbersome webform. The process is arduous and subject to counter-notices that can delay a takedown while the illegal upload continues to rack up views. And if the offending video does come down, it can be immediately re-uploaded somewhere else on YouTube – at which point the creative must file a takedown notice all over again. On a platform that receives more than 500 hours of new video every minute, this task is untenable for any human being.
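Some back-of-the-envelope arithmetic makes the scale problem concrete. The upload rate is YouTube’s widely reported figure; the reviewer math below is purely illustrative:

```python
# Rough arithmetic on YouTube's upload volume (illustrative only).
UPLOAD_HOURS_PER_MINUTE = 500  # YouTube's widely reported upload rate

hours_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24  # 720,000 hours of new video daily
workday_hours = 8
reviewers_needed = hours_per_day // workday_hours  # people needed just to watch one day's uploads once

print(f"{hours_per_day:,} hours uploaded per day")                               # 720,000
print(f"{reviewers_needed:,} full-time reviewers to screen one day's uploads")   # 90,000
```

No individual creative – and no plausible human workforce – can keep pace with that firehose; only automated matching can.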
At a recent hearing reassessing the Digital Millennium Copyright Act (DMCA) – the decades-old law that set the template for this highly inefficient takedown process – filmmaker Jonathan Yunger, co-president of Millennium Media, testified that he had found more than 200 pirated versions of his company’s movies on YouTube, including franchises such as The Expendables and Hellboy. “These films had been viewed more than 110 million times. In just one month!” he said. At the time, Millennium Media had no access to any of YouTube’s anti-piracy offerings and had been left to police its own copyrighted works on the platform.
Yunger’s experience is vivid evidence that YouTube is rife with stolen content, adding to a streaming piracy problem that already costs the U.S. economy at least $29.2 billion per year, along with at least 230,000 jobs. Moreover, YouTube has the means to put a major dent in that dire statistic immediately. It could have made its content protection tools widely available years ago – but for some reason, it has not. Why?
It could be because the safe harbors afforded by the DMCA simply give YouTube no incentive to do so. Under the law, as long as YouTube responds to each individual takedown notice, it has no obligation to take preemptive measures to clean up piracy for good.
Or, it could be something more sinister. After all, YouTube collects ad revenue from pirated videos for as long as they remain up, and Google’s own executives have called the platform a “pirate” site. Perhaps the money generated by stolen content is too important to YouTube’s bottom line for it to take any meaningful action.
Whatever the reason for YouTube’s neglect of this problem, it does not change the fact that piracy is low-hanging fruit compared to its vastly more difficult battle against toxic content. If YouTube would just reach out and pluck it, that would give an immediate boost to millions of Americans who work in the creative industries (film and television alone employs more than 2.6 million people), and who see their works pirated every day.
With a growing pile of challenges bearing down on YouTube and its parent, Google, the platform could use an easy win before moving on to its harder content moderation battles. With piracy, it has one. Will YouTube take the W?