Unless you have been asleep for the last few months you will know that many social media companies and platforms – YouTube, Twitter, Facebook, Telegram, etc. – have been coming under increasing pressure to ensure that ‘unwanted’ material is not posted on their sites. This pressure is coming from governments and the public alike.
So, what constitutes ‘unwanted’ material? Well, there is the obvious of course: child pornography (but not all pornography, depending on your location obviously), graphic and disturbing images, calls for violence, etc. Then there is the less obvious: opposition to government, criticism of ruling parties, etc. There is unfortunately a tendency among despotic regimes to shut down sites and platforms where anyone who disagrees with the rulers tries, in at least a small way, to make their voice heard.
I am going to go out on a limb and say that most of us are ok with removing the former (child pornography) but not the latter (protest movements). But then there is a huge middle ground. Companies are deciding, probably through algorithms as there is far too much material for human eyes and ears to keep up with, what is ok and what is not ok.
Are we ok with this?
- Our team has reviewed your content, and, unfortunately, we think it violates our Community Guidelines…We know that this might be disappointing, but it’s important to us that YouTube is a safe place for all. If content breaks our rules, we remove it. If you think we’ve made a mistake, you can appeal and we’ll take another look…Content glorifying or inciting acts of violence is not allowed on YouTube.
So, let us get this straight. A podcast making fun of terrorism and terrorists, one which tries to bring a level of analysis to the threat, was removed because it was “glorifying or inciting acts of violence”? Am I reading this right?
Why was my content removed?
One of my colleagues pointed out that the episode may have been identified as objectionable because it took issue with, and lampooned, one of the QAnon front men, Jake Angeli, who dressed ridiculously in buffalo horns with his chest painted like the US flag. Yes, he is a moron, and I called him so. For the record, he calls himself a QAnon ‘shaman’.
For this sin on my part – perhaps – my podcast was removed. I apparently was guilty of hurting Mr. Angeli’s feelings or something along those lines. By the way, he was picked up by US police and is currently in custody on charges including violent entry and disorderly conduct. I would think that makes him a public figure and hence open to criticism. Maybe I am wrong?
OK, so maybe my words could be construed as an insult and a slur. If so, mea maxima culpa. But was that enough to justify removing the entire podcast? Could the offending part not have been excised, or is that technically impossible?
What worries me, among other things, is the following:
a) who is writing the algorithms, and who reviews the results?
b) who gave these platforms the power to decide what passes and what does not? Is it any puzzle why China, Russia and North Korea take liberties with what to allow and what not to allow on their own internal platforms?
I do not want to make a mountain out of a molehill. If YouTube does not like my material and has concluded that a person who worked in counter terrorism for 15 years and who has written six peer-reviewed books on the subject is a sympathiser of these actors, so be it. There are other platforms (Podbean, Buzzsprout, etc.) which may be easier to use.
For those who follow me because you like the content, I have good news. Irrespective of what YouTube or others think, all my podcasts will continue to appear on this website. If you cannot find them elsewhere, you will be able to find them here. Furthermore, if you think YouTube has erred in this regard, you might want to let them know.
If it happened to me it can happen to you.