How Slamdance Winner Nienke Deutz Used Innovative Animation To Create A Powerful Short Film On Female Friendship

Puberty is not for the faint of heart. We know this. We remember the hormones and the emotions, and the changes in our bodies that sometimes made us feel like strangers to ourselves. It’s a challenging, discomforting rite of passage that we all go through – and yet rarely does puberty get its proper due in the cinematic arts.

That’s why Dutch filmmaker Nienke Deutz’ animated short film, “Bloeistraat 11,” is such a revelation. Telling a wordless tale of two young female friends on the brink of adolescence, Deutz renders her girl characters with a shimmering stop-motion translucence that quite literally exposes their fraught vulnerability. It’s a deceptively languid, sneakily heart-wrenching series of vignettes set in a single location, packing more emotion, history, and character development into 10 minutes than many feature films do in 90.

Deutz’ potent combination of nuanced storytelling and artful craft was more than enough to earn “Bloeistraat 11” the CreativeFuture Innovation Award at this year’s Slamdance Film Festival. It was one of many honors the film has picked up on the festival circuit, including a coveted Cristal for best short film at France’s renowned Annecy animation fest. This fall, it will screen in select theaters across the U.S. as part of the GLAS Animation Festival’s ANIMATION NEXT compilation.

“Making this film very physical was a way to explore how we can empathize with animated characters,” Deutz told CreativeFuture from her apartment in Rotterdam. It was but one nugget of many in a lengthy interview about how she developed and funded the fascinating, beautiful technique that helps make “Bloeistraat 11” feel like “the epitome of what you can do as a child when you’re crafting.”

Bloeistraat 11 teaser from Nienke Deutz on Vimeo.


JUSTIN SANDERS: First of all, I must ask, how on earth do you pronounce “Bloeistraat 11”?

NIENKE DEUTZ: [Laughs] It’s “blow-ee strawt” and then “eleven” in Dutch is “elf.” I like diversity of languages so I thought, let’s go for a Dutch name that nobody can pronounce.

JS: And what does it mean?

ND: The full English translation is simply, “Bloom Street 11.” It’s the made-up address of the house where the film takes place. I liked it because there is a very obscure Dutch dialect in which “Bloei” also means “to bleed.” And blood, as you know, features in the film at key junctures.

JS: Yes, the characters in “Bloeistraat 11” are rendered as transparent figures, and we sometimes see emotional beats manifest as literal flesh and blood. You have another, much older short on your Vimeo page that is also called, perhaps not coincidentally, “Bloom,” and that also revolves around a similar anatomical motif. Where do you think this interest in visually “opening up” the inner workings of the human body comes from?

ND: Good question. I guess I am very much fascinated by our relation towards our bodies – that we are not really connected to them even though we would like to think we are. Even the most body-conscious people can grow sick without noticing. I can connect to my skin and my outside, but how often do I actually think about my liver or any other organ? I find this very fascinating.

In the case of “Bloeistraat 11,” I had an extra reason to use the body so explicitly: I think that it can sometimes be a bit hard to feel empathy for an animated character. We do not easily see the animated character as a representation of a human, so things can feel a bit far off.

I wanted to see if there are ways to feel more engaged with an animated figure. I had this idea that by showing the physicality of a character very explicitly and even making them go through unpleasant stuff, you can create a way to connect more easily with them. The unpleasantness can almost cause a mirroring reaction in our own body, which makes us understand that what we see on screen is not just lines or a puppet, but the representation of a human body.

Making this film very physical was a way to explore how we can empathize with animated characters.

JS: And that physicality takes on an extra layer of vulnerability because you are presenting two women on the brink of adolescence.

ND: For me, the friendship between two girls at the onset of puberty is like a symbol for a relationship between two people and how we are very unable to communicate sometimes. There is an under-the-skin feeling that you have sometimes when you are very close to someone at that stage of life. It is kind of the first moment when you make a relationship with another human being entirely on your own. Before that, relationships are not something you actively choose. And these first friendships that you make in puberty are kind of like the tryout for other relationships that you have later in life.

One of the main reasons why I chose this specific time period was because I knew from the beginning that I wanted to work with crafting materials in the animation. I think the aesthetics of very simple crafting materials fits very well with the beginning of puberty – with these moments at the end of childhood. I realized that this film should almost symbolize the epitome of what a child might be able to achieve when crafting.

JS: You bring those “crafty” materials to life in a very unusual way in “Bloeistraat 11.” How did you develop the animation technique utilized here?

ND: What I really love to do is build sets and models. It was very important for me to make the space of the house where this friendship would take place feel very present – for the viewer to understand the physical space of the house.

So, I was designing this set, building it from cardboard, and at the same time, I was drawing, making puppets, trying to figure out the look and feel of the characters. I started doing some very simple black line drawings on a piece of plastic, in pencil, and put them inside one of the sets. I animated the figures on just a two-second loop and noticed that light would flicker behind the clear plastic. It was beautiful, but very inconsistent. The body during puberty is different almost every day. You develop breasts, hair grows in strange places. Your body doesn’t have a fixed form, and this animated plastic, with the light behind it, didn’t have a fixed form either. That’s when I thought, “This is great. I can work with this.”

making of Bloeistraat 11 from Nienke Deutz on Vimeo.

JS: When you say “animated the figures,” what do you mean in technical terms? Is it stop-motion? 3D animation? A combination?

ND: Everything you see in the final version is done in camera – there is no 3D used. A common way to work in stop-motion is by using puppets. You change their position every frame and, when played at speed, this forms the illusion of movement. But another way of working in stop-motion is by using “replacement.” This means you have, depending on the frame rate of your animation, 12 or 24 “versions” of a physical character or object where each version differs slightly from the previous one.

I used the replacement technique, which meant that, at 12 frames per second, we had around 6500 cut-out plastic frames representing the characters across all 10 minutes of the film. These frames were first drawn in an animation program called TVPaint, and as a reference for the sets, I used simple models made in a 3D program called Blender. But this was just to get an idea of how the characters related to the space.

When it was time to do the in-camera 2D animation, all of the 6500 separate TVPaint drawings were exported and placed on a big image file to print onto 100x140cm plastic sheets.

Once printed out, these images were then cut out from the plastic, painted, and sometimes even outfitted with another object or piece of fabric. Then they were placed in the set and filmed, one by one, frame by frame.

JS: Here is where my mind goes when you describe such a fascinating process: How the heck did you print and cut out 6500 plastic figures?

ND: They were cut out by a machine. At first, I was very purist about it: “Everything has to be done by hand.” But the very first test that I cut by hand was insanely difficult, so then I thought, “Actually, I don’t want to kill myself. What can we do with automation?”

I found a place in this tiny Belgian village, a company that just does cutting and printing with plastic. I had to search for it for a long time!

JS: Speaking of time, how long did it take you to make this film overall?

ND: From the start, it took me three years. One year of development in between doing other stuff to make money, and then one year waiting to get all the funding ready. And then the last year was a full year of production.

It’s crazy, right? This is how animation works. If someone had said to me beforehand, “Hey, you’re going to spend three years of your life making a 10-minute short about two girls in the beginning of puberty,” I would have laughed and said, “Never!”

JS: How did you get the funding to sustain such a long production window – and for an experimental animation short, no less?

ND: I was actually really lucky. When I finished my master’s degree in animation, I sent my thesis film to a festival where it was automatically placed into what’s known as the “Wild Card” competition, which includes entries from all the Flemish film schools, in categories such as live-action, documentary, and of course, animation. The winner in each category wins a subsidy package for a future project.

My film, which was about kids playing a Belgian game similar to “Red Light Green Light” in America, was very experimental. But I was lucky to have a jury that was interested in something a little different – and I won!

Then of course I thought, “Oh, this is so much money,” but when I went to a producer about a year later, they said, [laughs] “This is not nearly enough money.”

So, we then successfully applied for subsidies from the Netherlands Film Fund and the Wallonia-Brussels Federation’s cinema and audiovisual center. It took a while for the money to come in, but still, I feel very lucky to live and work in countries that support filmmakers in this way!

JS: How did you go about finding a producer for the project?

ND: I just thought about which people I would like to work with, and there was a company called Lunanime that I thought would be a nice fit. Their main producer, Annemie Degryse, had been on the Wild Card jury, so they were aware of my work, and I just wrote them an email.

I had never worked with a production company before so I did not know what to expect. But Lunanime, and Annemie, gave me a lot of trust. She let me do what I wanted to do even though it was not the most efficient or easy or safe way to go.

JS: What is in it for Lunanime to partner with you on a project like this? Obviously, it’s an interesting project with a cool artist like yourself, but creative satisfaction aside, how can they justify devoting time and financial resources to an experimental short?

ND: This is a good question because for sure there is very little financial benefit in it for them. It can give them some good exposure at festivals if the film does well, but mostly, short films are a way to give new filmmakers the time and experience to develop. I think some countries in Europe still believe in nurturing this kind of non-commercial filmmaking. A lot of bigger production studios also do smaller, more experimental stuff on the side. And who knows? Maybe in the future the up-and-coming directors they partner with will develop a bigger project with them, like a series or feature film.

JS: For a feature film backed by a studio or production company, there is generally a release date they are trying to finish in time for. But how does it work for an experimental short? Did Lunanime give you a hard deadline for the completed film?

ND: The only deadline I had was that we had booked studio time to do the stop-motion part, but it took a lot more time than we had anticipated. Another production was coming in and we were going to have to leave, so I had to work long days to finish in time. We managed in the end, but it was very stressful. Also, I was just so tired, working around the clock, and it’s hard to be creative in that state.

JS: Were you working a day job to boot?

ND: No! I was able to do this full time. Very nice! Even though I was dying from exhaustion every day, I felt very privileged.

JS: How about now that you have completed the film? Do you get to work on your next one full time?

ND: I will once we get our funding and start production. But making short films is not really going to make you a living, so that’s why in between productions I also work as a freelance animator, director and teacher of animated film.

JS: Where are you from originally?

ND: I’m from a small city in the middle of Holland called ‘s-Hertogenbosch. It’s a funny name, even in Dutch!

JS: Did you always want to be a filmmaker growing up?

ND: I wanted to be either an inventor or a writer, so I’m kind of close now!

But actually, I was never someone who really had a specific aim of what I wanted to be. After high school, I did a one-year theater study where I took classes in everything from set design to costume making, to acting and singing. I loved all of it – except the acting and singing – but mostly I loved coming up with stories and thinking of how to execute them using sets and costumes.

JS: Did you think you would have a career in theater?

ND: Yes, I wanted to go to theater school to become a director. But I had to do entrance auditions and was extremely nervous for the acting bit, and I did not get in. I was devastated actually. But then I got accepted to art school and decided to try that. When I learned that there was a video department at my school, things fell more into place. I really feel, working as an animated film director, that I am precisely where I want to be and doing what I really love.

JS: How so?

ND: With animation, you get to think on a completely different time scale. The outcome of “Bloeistraat 11” is just one 10-minute film, but for me it felt almost like I did six different projects, each one requiring a different set of skills. I researched a lot, and then there’s the writing, the storyboarding, the directing, the working with other people. There are always more layers than I am expecting in animation, from painting a lot of frames to the narrative content. It is never boring, and it’s very nice to pursue each different layer at a slow pace.

I worked for a newspaper last summer, making animated gifs and other content for their website – and the pace is so fast. It was interesting to do but I really craved the luxury of being able to work slower and really dissect something completely. Sometimes I will see how much content other creatives generate, and I feel like [laughs], “cool, I’m doing this 10-minute film that will be done in three years.” But that’s what animation is. You make 12 or 24 frames a second, and it’s all super fascinating and a very big challenge. And I really love doing it.

JS: What advice would you have for other aspiring filmmakers who are thinking about testing the animation waters?

ND: Do it! Experiment! Try different stuff and find your own voice. For inspiration, look around you, talk to people, watch movies, go to museums and read books, look at the world and not just at other animated films. Stay close to yourself and what brought you your initial enthusiasm. Animation can have a long learning curve because of its slow pace, so have patience. Also surround yourself every once in a while with people who share your interest, and geek out together!

This article was originally published by CreativeFuture.


Tightening The Screws On Pirate Websites Through Dynamic Website Blocking Injunctions

A pirate site is blocked through a court order yet, like a chameleon, it changes its colour (and IP address or URL) and is back up again tomorrow under a different guise. This is the reality that rights-holders have to face repeatedly in dealing with slippery pirate operators. But relief is coming.

In an important new development in India, the Delhi High Court recently issued a decision that allows rights-holders to seek “dynamic injunctions” against Indian ISPs. This requires them to block access to the spin-off “mirror” websites that typically appear as a result of the blocking of a primary offshore site that is providing copyright infringing content. Dynamic injunctions avoid the classic “whack-a-mole” problem where no sooner has a court issued an injunction against a specified website, than a clone hosted in some other unreachable jurisdiction pops up providing the same pirated content. Sometimes users seeking the original “free” content are even redirected to the mirror site. The Washington DC-based Information Technology and Innovation Foundation (ITIF) has prepared a detailed report of the Indian decision and its impact on India’s important film industry, focussing particularly on the dynamic injunction aspect. According to the ITIF:


“Just as website blocking is a pragmatic reflection of a country’s efforts to use injunction orders to get local ISPs to block access to piracy websites hosted overseas (and outside its jurisdiction), dynamic injunctions reflect the fact these same operators can subvert a court’s decisions by shifting targeted piracy operations to alternative websites. The goal of using dynamic injunctions as part of a website blocking system is not just to combat online piracy, but also to change consumers’ behavior by raising the cost—in terms of time and willingness to find alternative sites and circumvention tools—to make the legal sources of content more appealing.”


The intellectual property website IP Kat has also covered the case and notes that:


“The judgement marks a significant advancement in curbing the menace of online piracy. It introduces certain novel ways of tackling the problem (such as) grant(ing) the power to the plaintiffs, with the approval of the Joint Registrar, to update the list of blocked websites by adding the names of mirror/redirect/alphanumeric websites.”


IP Kat continues by saying:


“This is a very practical solution, as one of the most apparent difficulties in tackling online piracy is the ability of pirated websites to produce mirror websites within seconds. As the power to update the list of blocked websites is now available without extensive procedures required for a new application, this will make blocking mirror websites easier and more effective. This is the most important aspect of the judgment as it substantially reduces the resources required for blocking every mirror infringing website.”


The Indian court decision builds on precedents in Australia, the UK and Singapore. In Australia, new legislation, the Copyright Amendment (Online Infringement) Act, 2018, came into force in December of last year. It does a couple of important things to tighten up Australia’s already quite effective site-blocking legislation, Section 115A of the Copyright Act, introduced in 2015. That legislation introduced measures to enable copyright owners to seek an injunction from the Federal Court to require ISPs (known as Carriage Service Providers in Australia) to block access to offshore pirate websites that have the primary purpose of infringing or facilitating the infringement of copyright. Since that time, Australian content owners and carriage providers have worked out a modus vivendi that saw a nominal fee agreed upon for the blocking process, while the providers ceased to oppose the orders.

The amendments introduced last year carried this a step further by adding “primary effect” to “primary purpose”, by extending the provision to search engines and by allowing for more responsive orders to be issued by the Court. What does “more responsive orders” mean exactly?

One Australian legal website explains it thus:


“As noted in the explanatory memorandum to the Copyright Amendment (Online Infringement) Bill 2018, one of the limitations of the earlier legislation is that operators of online locations could attempt to avoid injunctions under section 115A by using another domain name, creating a new URL for the same content or obtaining a new IP address. To address this, the Act includes new provisions which allow the Court to make more responsive orders as part of an injunction application.”


In other words, the court will widen the application of the injunctions to capture mirror websites without the rights-holders having to go back to the court to initiate a new application each time the injunction is modified (subject to overall court oversight). This streamlining, as in India, is the key to effective disabling of pirate sites. Recently the Court also agreed to allow content providers to reduce the lead time required when they notify ISPs of renewals of blocking orders.

Another element of the new 2018 legislation was to extend the application of the law to search engines, requiring them to de-index sites blocked by court order. Google fought long and hard against this measure, arguing that it was unnecessary. However, with the passage of the law (and the re-election of the government that enacted it), Google has undergone a sudden conversion (or is it a tactical shift?) to voluntarily get with the program. Better to do it voluntarily rather than be forced to do it by law seems to have been Google’s calculation.

The UK has also experimented successfully with flexible application of site blocking, particularly with regard to sports broadcasting, where rights-holders can seek court orders requiring ISPs to block pirated streaming feeds of games, like English Premier League soccer, in real-time. This requires a broad blocking order that will cover proxy and primary sites as well as servers streaming pirate content. ITIF has an excellent blogpost explaining exactly how this works technically.

Does this frustrate soccer fans who have tried to avoid paying their local content provider to watch the big game? Absolutely, and that is the point. It is very frustrating to see the screen go dark just as the winning goal is about to be scored. The same principle applies to dynamic site blocking injunctions. As the frustration level of users rises each time the proxy site they go to returns a “not found” result, the more likely they are to accept the inevitable solution—one that is the default in the offline world. Pay for the content that you consume.

While dynamic injunctions are a relatively new phenomenon, the principle of disabling access to offshore pirate websites is well established, with more than 40 countries having implemented an administrative or legal process to enable this to happen. Most recently, the Parliamentary Committee holding hearings on the review of the Copyright Act in Canada recommended that:


“the Government of Canada consider evaluating tools to provide injunctive relief in a court of law for deliberate online copyright infringement.”


As I commented in a blog posting on the Committee’s report, this is a positive step forward, albeit not as concrete as advocates of site blocking in Canada would have preferred. A broad coalition of content providers (and some ISPs) had earlier made a proposal that an administrative mechanism for piracy site blocking be established under the auspices of the Broadcasting and Telecommunications regulator, the CRTC. The CRTC however declined to accept the proposal, arguing that it did not have jurisdiction, and punted the issue to Parliament to deal with under the Copyright Act review. Now the Committee reviewing the legislation has taken a position, recognizing that there is a problem that needs to be addressed:


“The Committee…agrees that there is value in clarifying within the Act that rights-holders can seek injunctions to deny services to persons demonstrably and egregiously engaged in online piracy, provided there are appropriate procedural checks in place.”


If such a process is established, it is unlikely to include dynamic injunctions, at least not initially, but the experience of other countries is that once a site blocking mechanism is in place, it will not only prove its worth, it will demonstrate that the various arguments deployed against it (e.g. it will undermine net neutrality, it will damage the functioning of the internet, it will be too expensive to implement, it will lead to unwarranted censorship etc.) are unfounded. As the use of site blocking becomes more routine, fine-tuning it to avoid the evasive tactics of the pirate content providers is the next step, which is why dynamic blocking injunctions are becoming more widely accepted.

Maybe one day, Canada (and the US for that matter) will catch up with India, Australia and others in the application of reasonable legal remedies to combat the tactics of distributors of pirated, infringing, unlicensed content. These offshore operators have long played a cat-and-mouse game with providers of legitimate content. It is now time for the cat to sharpen its claws.

© Hugh Stephens, 2019. All Rights Reserved.

This article was originally published on the Hugh Stephens Blog.

Featured Photo by Matt Artz on Unsplash

Platform Accountability

An Open Letter Correcting Five Passages From Facebook’s ‘Community Standards’

Dear Facebook Human Resources,

We were recently partaking in some light summer reading by combing through the Community Standards section of your website – and were surprised to discover five key passages that must have been written by someone who doesn’t know your company. We wanted to bring the problem to your attention immediately.

We know that you are currently very busy dealing with data leaks and other scandals, negotiating multi-billion-dollar fines (congratulations on getting that one done!), and fending off criminal investigations, so it’s understandable that you may have missed these important details. Fortunately, we have a top-notch Communications team (albeit teeny-tiny) here at CreativeFuture that has taken the liberty of editing these passages for you. You’ll see them below as red-lined corrections to the original document. We are happy to help you put forward a much more accurate portrayal of the Facebook “ethos” at this critical juncture.


Every day, people come to Facebook to share their stories, see the world through the eyes of others, and connect with friends and causes, spread hatred and misinformation, peddle illegal goods, bully and troll other users, and corrode the foundations of democracy and civil discourse. The What passes for conversations that happen on Facebook reflects the diversity chaos of an uncontrollable community of more than two billion people communicating across countries and cultures and in dozens of languages, posting everything from text to violent and disturbing and infringing and otherwise troubling photos and videos.

We recognize how important it is for Facebook to present ourselves be as a place where people feel “empowered to communicate” (please stop asking us what that means – the PR experts said it sounded good), and we take our role in keeping pretending like there’s a chance in hell of ever keeping abuse off our service seriously. That’s why we have developed a set of Community Standards that outline what is and is not allowed on Facebook. Our Standards apply around the world to all types of content. They’re designed to be comprehensive – for example because it would be really hard to formulate a separate, specific set of guidelines on, say, content that might not be considered hate speech may still be removed for violating our bullying policies in every single country.

Sure, the goal of our Community Standards is to encourage expression and create a safe environment, but do you know how hard it would be to account for all the world’s linguistic and cultural nuances so that Facebook truly is safe for everyone? (Seriously, have you tried to figure out what passes for offensive diatribe in, like, Myanmar? Impossible. That’s why we’ve just translated these Standards that were written by Americans to whatever language it is they speak there – and it’s working out great. Don’t even worry about it. Seriously, don’t look it up – DO NOT.) Just know that we base our policies on input from our community and from experts in fields such as technology and public safety. Who are these experts, you ask? Who cares! They are EXPERTS. What else do you need to know?

2.) Privacy Violations and Image Privacy Rights
Pretending to care about privacy while the protection of madly harvesting personal information so that we can churn it into billions of dollars in ad revenue are fundamentally important values for Facebook. We work hard to convince you that we are keeping your account secure and safeguarding your personal information in order to protect you from potential physical or financial harm and not just shopping it around to the highest bidder. Sure, we utterly failed to protect your data that one time. Oh, right, and that other time, too. Okay, yeah, there was also that one other time. And then, sure, if you want to get nitpicky, then yes, this also happened. And also this. And, fine, yes there was this. Plus, there was… hey look! It’s one of your friend’s birthdays! Better go send them one of our world-famous Facebook birthday greetings! Did you know you can send them a birthday story now? How cool is that!?

3.) Misrepresentation
Authenticity Misrepresentation is the cornerstone of our community, and it starts at the top: Our own fearless leader makes grand proclamations about “building a global community” even though his percentage of Facebook voting shares actually makes him a kind of dictator who doesn’t have to listen to anyone. Why do we let him get away with it? Because we believe that people are more multibillion-dollar internet companies should not be held accountable for their statements and actions when they use their authentic identities that occur on their platforms. That’s why we make a big deal out of our requirement that people to connect on Facebook using the name they go by in everyday life. Even though we’ve actually removed billions of fake accounts – a problem that is only getting worse – having a “real name”-policy makes it look like we’re a trustworthy friend who genuinely cares about authenticity, and who doesn’t spend millions of dollars every year to shape policies that erode your privacy rights and preserve the safe harbor laws that let us off the hook for most of the terrible, toxic things that happen on our platform every minute of every day. The truth is, our authenticity misrepresentation policies are intended to create a safe environment for Facebook, not you! where People can trust and hold one another accountable on their own time.

4.) False News
Reducing the spread of false news on Facebook is a responsibility that we take seriously. We also recognize that this is a challenging and sensitive issue. We want to help people stay informed without stifling productive public discourse, but our business model depends on viral content that foments outrage and controversy. There is also a fine line between false news – such as articles that spread misinformation about vaccinations for children – and satire or opinion – such as all those hilarious “thought pieces” alleging that the government is forcing you to vaccinate your kids so they can “control them,” whatever that means.

What’s a giant corporation to do whose shareholders don’t like it when our traffic dips down? For these reasons It’s simple: we don’t remove false news from Facebook but instead, significantly reduce its distribution by showing it lower in the News Feed. That way, we can still make lots of money from it, but you won’t have it shoved in your face all the time – just sometimes. It’s a solution that benefits everyone: The trolls can keep on publishing hate speech and misinformation; our users can turn a blind eye to the rot and decay at the heart of our platform; and our investors can still swim in blood money. Win-win-win!

5.) Intellectual Property
Facebook takes intellectual property rights seriously and believes they are important to promoting expression, creativity, and innovation in our community. You own all of the content and information you post on Facebook, and you control how it is shared through your privacy and application settings. What’s that you say? There are numerous groups on our platform with tens of thousands of members who are freely sharing illegally downloaded films and music? However, before sharing content on Facebook, please be sure you have the right to do so. We also ask that you respect other people’s copyrights, trademarks, and other legal rights… Facebook’s Terms of Service do not allow people to post content that violates someone else’s intellectual property rights, including copyright and trademark. Okay? Happy now? Look, you’re lucky we even offer that – thanks to the Digital Millennium Copyright Act, we actually don’t have to take any proactive measures to seek out pirated content, if we don’t want to… But yes, upon receipt of a report from a rights holder or an authorized representative, we are, lucky for you, obligated against our will to remove or restrict content – eventually. Like, when we have a free moment (we’re very busy, so don’t hold your breath), or when the illegal link has stopped making us boatloads of unearned money. You’re welcome.

Thank you for reviewing our suggested changes to Facebook’s Community Standards. We are excited to see them implemented so that your company can show the world how much you really do value transparency and community input. (Maybe don’t tell Zuck though. Just update everything when he’s out of the office – probably in front of some legislative body somewhere in the world, apologizing yet again for what Facebook does and promising to “do better.” He won’t notice the changes. When was the last time he even looked at these anyway, right?)



Platform Accountability

Google Is Monetizing Human Tragedy: Why Aren’t They Held Accountable?

My wife and I had just been visiting our daughter in her new home when we turned on the car radio. It was an interview on CBC with Andy Parker, whose daughter Alison had been murdered, live on TV, by a deranged former employee, back in 2015. The killer recorded and later uploaded video of Alison Parker’s death to the internet, in addition to the live broadcast of the tragedy. The radio story was about the trials of a father who was being trolled by hate-mongers and conspiracy theorists, and about his ongoing efforts to get the videos of his daughter’s murder taken down by YouTube. My heart went out to him. I understood what was driving him, what helps him get up each morning: to give some meaning to his daughter’s death by trying to make the world a slightly better place. One of those efforts, in addition to advocating for better gun control, is to bring Google, owner of YouTube, to account for its actions – or rather, its inaction.

One wonders why a corporation of this size and influence, one with such reach and the power to influence people’s lives for the better, doesn’t get it. When Parker first learned that videos of Alison’s death were circulating on YouTube, he contacted them and was informed that according to the company’s “moment of death” policy, the content could be removed. There is an online form available that states:


“If you’ve identified content showing your family member during moment of death or critical injury, and you wish to request removal of this content, please fill in the information below. We carefully review each request, but please note that we take public interest and newsworthiness into account when determining if content will be removed. Once you’ve submitted your report, we’ll investigate and take action as appropriate.”


So far, so good. But then Parker found out that he would have to flag each and every posting of the atrocity in order to get YouTube to act. Videos taken down today could be up again tomorrow, posted by people ranging from conspiracy theorists to plain vindictive sociopaths. YouTube refused to institute a blanket ban on the video, even though it had the technical means to do so. Moreover the algorithms that recommend content to viewers continue to serve up content related to the video. In frustration, Parker is now bringing a lawsuit against YouTube.

One has to ask why YouTube could not take the necessary steps to police its own content. Under pressure from copyright owners it has instituted a system of sorts that will take down all videos of a proven copyrighted work. While the system is unsatisfactory to many, at least there is a functioning system to take down copyright-infringing works, as YouTube is required to do under the DMCA in order to keep its safe harbour. And there is other content that YouTube is required by law to block, such as child pornography and sex trafficking, and by and large it manages to do so. In addition, there are other forms of undesirable content that the platforms, YouTube among them, ought to block as a matter of common sense, but here they do a poor job. Facebook’s slow-off-the-mark response to block the dissemination of the filmed violence against the mosque and worshippers in Christchurch, New Zealand, is but one example, as is the ongoing issue of hate speech and incitement to violence and terrorism as witnessed on the website 8Chan.

What really upsets Mr. Parker is that YouTube not only requires him to constantly police its site to identify postings of his daughter’s death (just as copyright owners have to spend the time to notify YouTube of infractions, although some of this is automated through ContentID), but also monetizes the hateful video through the clicks it attracts. In effect, YouTube is monetizing misery. Moreover, each time a takedown request is submitted to YouTube, the requester must cite the reason for the requested removal. Should a bereaved father have to do this on a daily basis? Parker, understandably, refuses to contemplate ever watching the video and has enlisted the support of others who have been in a similar position to identify and request the removals. (I have not watched it, nor will I ever do so.)

While YouTube’s own Terms of Service indicate it will remove videos showing a moment of death scene (subject to the onerous and repetitive procedures described above), Parker has found that one of the more effective tools for removal is copyright. The footage of Alison’s murder on YouTube comes from two sources: the actual footage of the atrocity broadcast on the day of the murder and the videocam footage shot by the killer. In the case of the former, although the station, WDBJ in Roanoke, Va., tried to limit broadcast of the footage, it has spread on the internet. Nonetheless, WDBJ owns the copyright in that footage and has assigned its digital rights to Parker. As the rights-holder, Parker asserts his DMCA right to takedown, and YouTube will comply – although as noted it is a thankless and repetitive task to have to continually flag offending content. With regard to the footage shot by the killer, the copyright strategy doesn’t work, yet YouTube is unwilling to enforce its own policies on highly offensive content that has been brought to its attention multiple times. There is really something wrong here.

In the face of this obduracy, or just plain shirking of normal moral responsibility as happened in the case of the mosque shooting in Christchurch, governments world-wide have started to re-examine very carefully the safe harbour protection that the platforms hide behind. In the US, the shield of choice, Section 230 of the Communications Decency Act (1996), has come under close scrutiny for the abuses it has allowed by giving platforms carte blanche to host just about any content. Other countries, notably Australia, have taken robust action in the aftermath of Christchurch. In April, Australia passed a law holding senior executives of internet platforms personally criminally responsible (in addition to the corporation being corporately responsible and subject to massive fines) if the platform fails to act expeditiously to take down footage depicting “abhorrent violent content” once so directed by the authorities. The definition of such content includes “videos depicting terrorist acts, murders, attempted murders, torture, rape or kidnap”.

Google claims that it is using machine technology to catch problematic content but is putting the onus on government to make it clearer to tech companies where the boundaries lie – for example, what constitutes unacceptable online hate speech and violent content. Google Canada’s head of government relations and public affairs is quoted as stating, in testimony to the House of Commons Justice Committee, that:


“I think the first step is to actually have a clear idea of what the boundaries are for terms like hate speech and violent extremist content…Because we’re still as a company interpreting and trying to define our perception of what society finds acceptable and what you as legislators and government find acceptable. The first step for us would rather be what is a clear definition, so we can act upon it.”


That sounds to me like passing the buck. While there may be grey areas when it comes to what constitutes illegal hate speech, surely that excuse doesn’t fly when we look at Alison Parker’s case. If Google wants to know what constitutes unacceptable violent content, look at the Australian legislation. No responsible broadcaster would disseminate that kind of material. Why should YouTube, Facebook or other platforms be able to get away with it? The videos that Andy Parker is trying to get permanently blocked on YouTube clearly violate the company’s Terms of Service, apart from anything else, and clearly constitute unacceptable violent content. Yet he is getting nothing but the runaround.

As noted above, Parker is taking legal action against Google, with the assistance of the Georgetown Law Civil Rights Clinic. He is also taking aim at Section 230 because more recently Google has cited this provision as the reason why it is not legally required to remove the content. Maybe this, and the publicity he is generating by speaking out, will prompt some action on the part of this multi-billion dollar company. Perhaps, but I am not holding my breath. Above all, it seems that the most important consideration for Google is maximizing profit regardless of the human cost. Google performs an important role for consumers with its various businesses, above all online search and access to content. It has been given a largely free ride by government with respect to regulation and oversight, with the results that we now see. The time for some accountability is long overdue.

© Hugh Stephens, 2019. All Rights Reserved.

This article was first published in Hugh Stephens Blog.

Featured Photo by Rajeshwar Bachu on Unsplash


“Like A Toddler With A Book Of Matches”: 13 More Reasons We Need #PlatformAccountability

Throughout their relatively brief yet terrible reigns, both Facebook and Google have compiled enough scandals to fell most companies many times over. And yet, following both companies’ most recent earnings reports, their stocks only continue to rise.

This is disheartening, frustrating, enraging, and terrifying. What does it say about us as a society that we keep rewarding internet giants that have proven time and again to have virtually no regard for our collective well-being? It can feel pretty hopeless, but there are glimmers of hope cracking through their seemingly impenetrable armor.

The U.S. government is finally ramping up the pressure, opening antitrust investigations into the biggest technology companies, and dishing out fines left and right. The FTC penalized Facebook an unprecedented $5 BILLION over privacy breaches – but, in a clear sign of just how much people fear Facebook’s power to harm, many experts have complained it’s not enough. One group, the Electronic Privacy Information Center (EPIC), has even filed a motion to block the settlement on the grounds that it “fails to ensure consumer privacy”.

A new era of platform scrutiny is upon us, and though we’re still a long way from true #PlatformAccountability, we get a little bit closer every day. Here are 13 more reasons why it can’t come soon enough, culled from across the spectrum of political, cultural, and sociological discourse.



1. Because deception and corruption have come to rule the day on our biggest internet platforms.
“The internet was supposed to not only democratize information but also rationalize it—to create markets where impartial metrics would automatically surface the truest ideas and best products, at a vast and incorruptible scale. But deception and corruption, as we’ve all seen by now, scale pretty fantastically too.”   – Wired

2. Because human dignity and universal rights are what should be ruling the day.

“Instead of aspiring, unrealistically, to make [platforms] value-neutral meeting places—worldwide coffee shops, streaming town squares—we could see them as forums analogous to the United Nations: arenas for discussion and negotiation that have also committed to agreed-upon principles of human dignity and universal rights.”

– The New Yorker

3. Because how can we expect dignity when these platforms are letting this happen?
“For the six months after he was hired, Speagle would moderate 100 to 200 posts a day. He watched people throw puppies into a raging river, and put lit fireworks in dogs’ mouths. He watched people mutilate the genitals of a live mouse, and chop off a cat’s face with a hatchet. He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook ‘as long as the skin is translucent.’ He found that he could no longer sleep for more than two or three hours a night. He would frequently wake up in a cold sweat, crying.”

– The Verge

4. Because this guy literally penned YouTube’s community guidelines – and he’s not impressed.
“Bulls**t is infinitely more difficult to combat than it is to spread. YouTube should have course-corrected a long time ago.”

– Micah Schaffer, technology adviser who wrote YouTube’s first community guidelines

5. Because the old internet rules don’t apply anymore.
“The rules governing the internet made sense in the dot-com era. They don’t anymore.”

– Margaret O’Mara, author of The Code: Silicon Valley and the Remaking of America

6. Because the time for platform self-regulation has passed.
“Zuckerberg’s announcement that he plans to alter Facebook to focus on groups – and also launch a cryptocurrency – are downright alarming. There’s no reason to believe these changes will make user data any more secure. After all, Facebook hasn’t changed its fundamental business model. But the changes will make it harder for authorities and civil society groups to track and counter illegal activity on the platform. Groups are already the epicenter of black-market activity on Facebook. The firm’s continued negligence in the moderation of criminal content makes clear that the time for self-regulation has passed.”

– Alliance to Counter Crime Online

7. Because the agency that is supposed to be regulating platforms is terrifyingly overmatched.
“The [FTC] has a budget of $300 million a year and around 1,100 full-time staffers, 40 of whom are dedicated to privacy enforcement. That’s dwarfed by Facebook’s resources: The company is worth a half-trillion dollars and has nearly 30 employees for every one of the FTC’s.”

– Politico

8. Because even a $5 billion privacy settlement with the FTC can’t fix Facebook’s problems:
“It doesn’t fix the incentives causing these repeat privacy abuses. It doesn’t stop [Facebook] from engaging in surveillance or integrating platforms. There are no restrictions on data harvesting tactics — just paperwork… Mark Zuckerberg, Sheryl Sandberg, and other executives get blanket immunity for their role in the violations. This is wrong and sets a terrible precedent. The law doesn’t give them a special exemption.”

– Rohit Chopra, FTC Commissioner

9. Because if you’re so big that $5 billion is just a tap on the wrist, then something needs to change.
“The terrible message sent by this tap on the wrist is that enforcement of privacy protections is a hollow paper tiger in our nation… It has to be structural and behavioral and not just monetary, and this amount of money is way too low.”

– Senator Richard Blumenthal (D-Connecticut)

10. Because we live in a surveillance state, owned and operated by Google.
“Seen from the inside, [Google’s] Chrome browser looks a lot like surveillance software. My tests of Chrome vs. Firefox unearthed a personal data caper of absurd proportions. In a week of Web surfing on my desktop, I discovered 11,189 requests for tracker “cookies” that Chrome would have ushered right onto my computer but were automatically blocked by Firefox. These little files are the hooks that data firms, including Google itself, use to follow what websites you visit so they can build profiles of your interests, income and personality. Chrome welcomed trackers even at websites you would think would be private.”

– Geoffrey A. Fowler, Washington Post technology columnist

11. Because Facebook’s proposed cryptocurrency, Libra, is anti-democratic. 
“A permissionless currency system based on a consensus of large private actors across open protocols sounds nice, but it’s not democracy. Today, American bank regulators and central bankers are hired and fired by publicly elected leaders. Libra payments regulators would be hired and fired by a self-selected council of corporations. There are ways to characterize such a system, but democratic is not one of them.”

– Matt Stoller, fellow at Open Markets

12. Because they’re still finding novel ways to put our children in danger.

“We found a technical error that allowed [CHILD]’s friend [FRIEND] to create a group chat with [CHILD] and one or more of [FRIEND]’s parent-approved friends. We want you to know that we’ve turned off this group chat and are making sure that group chats like this won’t be allowed in the future. If you have questions about Messenger Kids and online safety, please visit our Help Center and Messenger Kids parental controls. We’d also appreciate your feedback.”

– Facebook user alert about Messenger Kids, where a design flaw let thousands of children join chats with unauthorized users

13. Because it’s time to take the matches out of the toddler’s hands.
“Now Facebook may not intend to be dangerous, but surely they don’t respect the power of the technologies they’re playing with… Like a toddler who has gotten his hands on a book of matches, Facebook has burned down the house over and over and called every arson a learning experience.”

– Senator Sherrod Brown (D-Ohio), at a House Judiciary Committee antitrust hearing

It is this last quote from Senator Brown that stings most of all – not because he compares Facebook’s childish recklessness to a toddler (that association was already readily apparent), but because he reminds us that one of the world’s most powerful internet companies operates from a place of disrespect.

Brown’s claim that “they don’t respect the power of the technologies they’re playing with” doesn’t only apply to Facebook, but to Google, as well. Both companies disrespect the power their technologies wield, and they disrespect the people who use them.

They disrespect creatives – and the creative works that get relentlessly pirated across their platforms. They disrespect children – and the families that must cope with the online harms that befall them. They disrespect regulators – by fighting to preserve their dominance at every turn. And, they disrespect the moderators they hire to endure unfathomable horrors for poverty-level wages.

But what else should we have expected from a business model that treats every human being as little more than a data point to be monetized? Why should we be surprised when Facebook’s early, unofficial motto was “Move fast and break things”? How was it not obvious to EVERYONE, after Google literally removed its “Don’t Be Evil” slogan from its code of conduct, that in both attitude and action, these companies have never shown anything but a carefully glossed-over disdain for their users?

They have never respected us – any of us – and until #PlatformAccountability truly becomes a reality, they never will.

This article was first published in Creative Future.