

    Google Is Monetizing Human Tragedy: Why Aren’t They Held Accountable?

    • 12.08.2019
    • By Hugh Stephens

My wife and I had just been visiting our daughter in her new home when we turned on the car radio. It was an interview on CBC with Andy Parker, whose daughter Alison had been murdered, live on TV, by a deranged former employee back in 2015. The killer recorded video of Alison Parker’s death and later uploaded it to the internet, in addition to the live broadcast of the tragedy. The radio story was about the trials of a father being trolled by hate-mongers and conspiracy theorists, and about his ongoing efforts to get the videos of his daughter’s murder taken down by YouTube. My heart went out to him. I understood what was driving him, what helps him get up each morning: to give some meaning to his daughter’s death by trying to make the world a slightly better place. And one way of doing that, in addition to pressing for better gun control, is to try to bring Google, owner of YouTube, to account for its actions, or rather its non-action.

One wonders why a corporation of this size and influence, one with such reach and the power to influence people’s lives for the better, doesn’t get it. When Parker first learned that videos of Alison’s death were circulating on YouTube, he contacted the company and was informed that, under its “moment of death” policy, the content could be removed. There is an online form available that states:


If you’ve identified content showing your family member during moment of death or critical injury, and you wish to request removal of this content, please fill in the information below. We carefully review each request, but please note that we take public interest and newsworthiness into account when determining if content will be removed. Once you’ve submitted your report, we’ll investigate and take action as appropriate.


So far, so good. But then Parker found out that he would have to flag each and every posting of the atrocity in order to get YouTube to act. Videos taken down today could be up again tomorrow, posted by people ranging from conspiracy theorists to plain vindictive sociopaths. YouTube refused to institute a blanket ban on the video, even though it had the technical means to do so. Moreover, the algorithms that recommend content to viewers continue to serve up content related to the video. In frustration, Parker is now bringing a lawsuit against YouTube.

One has to ask why YouTube could not take the necessary steps to police its own content. Under pressure from copyright owners, it has instituted a system of sorts that will take down all videos of a proven copyrighted work. While the system is unsatisfactory to many, at least there is a functioning mechanism to take down copyright-infringing works, as YouTube is required to do under the DMCA in order to keep its safe harbour. There is other content that YouTube is required by law to block, such as child pornography and sex-trafficking material, and by and large it manages to do so. In addition, there are other forms of undesirable content that the platforms, YouTube among them, ought to block as a matter of common sense, but here they do a poor job. Facebook’s slow-off-the-mark response in blocking the dissemination of the filmed violence against the mosque and worshippers in Christchurch, New Zealand, is but one example, as is the ongoing issue of hate speech and incitement to violence and terrorism as witnessed on the website 8chan.

What really upsets Mr. Parker is that not only does YouTube require him to constantly police its site to identify postings of his daughter’s death (just as copyright owners have to spend the time to notify YouTube of infractions, although some of this is automated through Content ID), but the clicks the video attracts also enable YouTube to monetize it. In effect, YouTube is monetizing misery. Moreover, each time a takedown request is submitted to YouTube, the requester must cite the reason for the requested removal. Should a bereaved father have to do this on a daily basis? Parker, understandably, refuses to contemplate ever watching the video and has enlisted the support of others who have been in a similar position to identify the postings and request their removal. (I have not watched it, nor will I ever do so.)

While YouTube’s own Terms of Service indicate it will remove videos showing a moment-of-death scene (subject to the onerous and repetitive procedures described above), Parker has found that one of the more effective tools for removal is copyright. The footage of Alison’s murder on YouTube comes from two sources: the actual footage of the atrocity broadcast on the day of the murder and the videocam footage shot by the killer. In the case of the former, although the station, WDBJ in Roanoke, Va., tried to limit broadcast of the footage, it has spread on the internet. Nonetheless, WDBJ owns the copyright in that footage and has assigned its digital rights to Parker. As the rights-holder, Parker can assert his DMCA takedown rights, and YouTube will comply, although as noted it is a thankless and repetitive task to have to continually flag offending content. With regard to the footage shot by the killer, the copyright strategy doesn’t work, yet YouTube is unwilling to enforce its own policies on highly offensive content that has been brought to its attention multiple times. There is really something wrong here.

In the face of this obduracy, or just plain shirking of normal moral responsibility as happened in the case of the mosque shooting in Christchurch, governments worldwide have started to re-examine very carefully the safe harbour protection that the platforms hide behind. In the US, the shield of choice, Section 230 of the Communications Decency Act (1996), has come under close scrutiny for the abuses it has allowed by giving platforms carte blanche to host just about any content. Other countries, notably Australia, have taken robust action in the aftermath of Christchurch. In April, Australia passed a law holding senior executives of internet platforms personally criminally responsible (in addition to the corporation itself being liable and subject to massive fines) if the platform fails to act expeditiously to take down footage depicting “abhorrent violent content” once so directed by the authorities. The definition of such content includes “videos depicting terrorist acts, murders, attempted murders, torture, rape or kidnap”.

Google claims that it is using machine-learning technology to catch problematic content but is putting the onus on government to make it clearer to tech companies where the boundaries lie, for example, what constitutes unacceptable online hate speech and violent content. Google Canada’s head of government relations and public affairs is quoted as stating, in testimony to the House of Commons Justice Committee:


    “I think the first step is to actually have a clear idea of what the boundaries are for terms like hate speech and violent extremist content…Because we’re still as a company interpreting and trying to define our perception of what society finds acceptable and what you as legislators and government find acceptable. The first step for us would rather be what is a clear definition, so we can act upon it.”


That sounds to me like passing the buck. While there may be grey areas when it comes to what constitutes illegal hate speech, surely that excuse doesn’t fly when we look at Alison Parker’s case. If Google wants to know what constitutes unacceptable violent content, it need only look at the Australian legislation. No responsible broadcaster would disseminate that kind of material. Why should YouTube, Facebook or other platforms be able to get away with it? The videos that Andy Parker is trying to get permanently blocked on YouTube clearly violate the company’s Terms of Service, apart from anything else, and clearly constitute unacceptable violent content. Yet he is getting nothing but the runaround.

As noted above, Parker is taking legal action against Google, with the assistance of the Georgetown Law Civil Rights Clinic. He is also taking aim at Section 230, because more recently Google has cited this provision as the reason why it is not legally required to remove the content. Maybe this, and the publicity he is generating by speaking out, will prompt some action on the part of this multi-billion-dollar company. Perhaps, but I am not holding my breath. Above all, it seems that the most important consideration for Google is maximizing profit regardless of the human cost. Google performs an important role for consumers with its various businesses, above all online search and access to content. It has been given a largely free ride by government with respect to regulation and oversight, with the results that we now see. The time for some accountability is long overdue.

    © Hugh Stephens, 2019. All Rights Reserved.

This article was first published on the Hugh Stephens Blog.

    Featured Photo by Rajeshwar Bachu on Unsplash