YouTube CEO Susan Wojcicki: ‘Where Do You Draw the Lines of Free Speech’

In this Tuesday, Feb. 28, 2017, file photo, YouTube CEO Susan Wojcicki speaks. (AP Photo/Reed Saxon)

In a recent wide-ranging interview with The Guardian, YouTube CEO Susan Wojcicki discussed many of the issues faced by the Google-owned platform, including the question of where to "draw the lines of free speech."

Speaking with Emine Saner, a feature writer for the Guardian, Wojcicki addressed the many issues that Google's video-sharing platform faces. A key issue for the firm is the restriction of extremist content on the platform, which includes everything from footage of the New Zealand mosque attacks to violent or sexual videos disguised to look like content for children, mimicking animated cartoons such as Peppa Pig.

Breitbart News has reported extensively that legitimate conservative content has been caught up in these efforts to restrict extremist content, such as videos created by PragerU, which have been placed behind age-restriction warnings and have resulted in a lawsuit. YouTube announced new anti-hate speech policies in June in an attempt to address the issues the platform was facing. Wojcicki commented on this, stating:

We’ve always had an anti-hate speech policy. YouTube has had community guidelines from the very beginning and a number of guidelines involving hate and [incitement to] violence. What we have found is that, with every policy we make, there is content that will become borderline, or will find ways to skirt around those policies. What we were doing was tightening [rules] we already had.

When asked about individuals such as Alex Jones being allowed on YouTube's platform, Wojcicki noted that Jones was removed from the platform last year, stating: "I think it's important to remember that news or news commentary [is] a very small percentage of the number of views we have. The vast majority of YouTube is a combination of influencers who are focused in areas like comedy, beauty, how-to, gaming… If you look at some of the content you referenced, that is an extremely small part of the platform."

The Guardian asked if such content has been “dangerously influential” to which Wojcicki replied:

Look, it’s a very small percentage of our views, and the way that we think about it is: ‘Is this content violating one of our policies? Has it violated anything in terms of hate, harassment?’ If it has, we remove that content. We keep tightening and tightening the policies. We also get criticism, just to be clear, [about] where do you draw the lines of free speech and, if you draw it too tightly, are you removing voices of society that should be heard? We’re trying to strike a balance of enabling a broad set of voices, but also making sure that those voices play by a set of rules that are healthy conversations for society

Wojcicki is a mother of children ranging in age from four to their late teens. She notes that while they do watch YouTube, she and her husband restrict the amount of screen time the children get. Her children were not given phones or tablet devices at an extremely young age; she waited until middle school to trust them with such technology. Wojcicki stated that her daughter was only given a cell phone after getting stuck somewhere without a way to contact her parents:

There are moments when it becomes important for them to have a phone. I think middle school [from about the age of 11] is a reasonable point to start educating them about it, but also a lot of times you can take it away. High school is harder – you’re dealing with children who are getting close to going to college, and you have zero control when that happens.

Wojcicki says she feels a commitment to solving the challenges that YouTube faces and cares about the legacy the firm leaves behind:

I care about the legacy that we leave and about how history will view this point in time. Here’s this new technology, we’ve enabled all these new voices. What did we do? Did we decide to shut it down and say only a small set of people will have their voice? Who will decide that, and how will it be decided? Or do we find a way to enable all these different voices and perspectives, but find a way to manage the abuse of it? I’m focused on making sure we can manage the challenges of having an open platform in a responsible way.

Read the full interview in The Guardian here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com
