
Borealis was ‘algorithmed’ out by YouTube: is this ok?

Are we ok with companies like YouTube using algorithms to decide what is ok to view and what is not on their platforms?

Unless you have been asleep for the last few months, you will know that many social media companies and platforms – YouTube, Twitter, Facebook, Telegram, etc. – have been coming under increasing pressure to ensure that ‘unwanted’ material is not posted on their sites. This pressure comes from governments and the public alike.

So, what constitutes ‘unwanted’ material? Well, there is the obvious, of course: child pornography (though not all pornography, depending on your location), graphic and disturbing images, calls for violence, etc. Then there is the less obvious: opposition to government, criticism of ruling parties, etc. There is, unfortunately, a tendency among despotic regimes to shut down sites and platforms where anyone who disagrees with the rulers tries, in at least a small way, to make their voices heard.

And the list goes on and on and on… (Photo: Shutterstock)

I am going to go out on a limb and say that most of us are ok with removing the former (child pornography) but not the latter (protest movements). But then there is a huge middle ground. Companies are deciding – probably through algorithms, as there is far too much material for human eyes and ears to keep up with – what is ok and what is not.

Are we ok with this?

A recent podcast on this very site, episode 139, ‘What is next for QAnon?’, was taken off YouTube. Here is an excerpt from the automated email that was generated and sent to me:

  • Our team has reviewed your content, and, unfortunately, we think it violates our Community Guidelines…We know that this might be disappointing, but it’s important to us that YouTube is a safe place for all. If content breaks our rules, we remove it. If you think we’ve made a mistake, you can appeal and we’ll take another look…Content glorifying or inciting acts of violence is not allowed on YouTube.

So, let us get this straight. A podcast making fun of terrorism and terrorists, and which tries to bring a level of analysis to the threat, was removed because it was “glorifying or inciting acts of violence”? Am I reading this right?

Why was my content removed?

One of my colleagues pointed out that the episode may have been flagged as objectionable because it took issue with, and lampooned, one of the QAnon front men, Jake Angeli, who dressed ridiculously in buffalo horns with his chest painted like the US flag. Yes, he is a moron, and I called him so. For the record, he calls himself a QAnon ‘shaman’.

Go figure.

For this sin on my part – perhaps – my podcast was removed. I apparently was guilty of hurting Mr. Angeli’s feelings or something along those lines. By the way, he was picked up by US police and is currently in custody on charges including violent entry and disorderly conduct. I would think that makes him a public figure and hence open to criticism. Maybe I am wrong?

OK, so maybe my words could be construed as an insult and a slur. If so, mea maxima culpa. But was that enough to justify removing the entire podcast? Could the offending part not have been excised, or is that technically impossible?

What worries me, among other things, is the following:

a) how can any of us talk about terrorism and violent extremism if some algorithm is going to interpret this as ‘supportive’ of these phenomena?

b) who is writing the algorithms and who reviews the results?

c) who gave these platforms the power to decide what passes and what does not? Is it any wonder that China, Russia and North Korea take liberties with what to allow, and what not to, on their own internal platforms?

The Takeaway

I do not want to make a mountain out of a molehill. If YouTube does not like my material and has concluded that a person who worked in counter-terrorism for 15 years, and who has written six peer-reviewed books on the subject, is a sympathiser of these actors, so be it. There are other platforms (Podbean, Buzzsprout, etc.) which may be easier to use.

For those who follow me because you like the content, I have good news. Irrespective of what YouTube or others think, all my podcasts will continue to appear on this website. If you cannot find them on YouTube, you will be able to find them here. Furthermore, if you think YouTube has erred in this regard, you might want to let them know.

If it happened to me it can happen to you.


By Phil Gurski

Phil Gurski is the President and CEO of Borealis Threat and Risk Consulting Ltd. Phil is a 32-year veteran of CSE and CSIS and the author of six books on terrorism.

6 replies on “Borealis was ‘algorithmed’ out by YouTube: is this ok?”

Phil – was the algorithm machine-generated, based on a “threat dictionary”? Regardless, it is nonsense to pull the podcast off YouTube. Keep up the great work.

It is a commonplace that free speech is the most vital ingredient of our democracy. It is essential, therefore, that any threat to it be countered immediately. Extremist groups like QAnon and the Proud Boys represent a serious threat, not just to peace and good order, but to our democratic rights and freedoms. We must advocate for perfect freedom to criticize their words and deeds as the best antidote to their poison. Let us all flood YouTube with our views on their reaction to Phil’s piece. Could you provide us with the correct contact information? Thank you. John Carrick Greene, Ottawa.

Thanks. I use YouTube a lot, mainly for the lectures. But I could not find any obvious place for leaving comments. Should we use Facebook to do it? Sincerely, John
