Terrorism by the numbers

I have decided to reread the Sherlock Holmes collection of stories (you can make those kinds of decisions on a whim once you have retired).  Sir Arthur Conan Doyle’s books and some of the series/movies based (sometimes loosely) on them have always appealed to me (NB I do prefer the Jeremy Brett interpretation over all others) and it is high time I got re-acquainted with them.

Imagine my surprise, and delight, then, to come across the following pronouncement by the great English detective in A Scandal in Bohemia, the first of the short stories:

  • “It is a capital mistake to theorise before one has data.  Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”

There is a lot of wisdom in those words and a particularly important message for our understanding of terrorism in our societies.  There is an awful lot of theorising about terrorism that is based on very small data sets.  Generally speaking, the smaller the sample, the less reliable the theory.

Terrorism as a field of study does not lend itself to large data collections for the simple reason that it is, and will most likely remain, a rare event.  At least that is certainly true in Canada, the US and most of the West.  It does happen with alarming frequency in other parts of the world, but even there it is usually not an everyday occurrence.

As I noted, when you theorise without having bothered to look at the data available, or extrapolate from small samples to make large generalisations, you are in dangerous waters.  We see this when we read that “lost souls” – whatever that means – are susceptible to radicalisation (n=0).  Or that Michael Zehaf-Bibeau, the National War Memorial/Parliament attacker back in 2014, was mentally ill and is therefore the prototypical terrorist (n=1).  Or that rampant Islamophobia in Quebec played an important role in the decision of a few Muslims in that province to go join IS in Syria (n=11).  Compare that to a study referenced in today’s Globe and Mail that hot weather makes us cranky (yes, yes, I know that this is stating the obvious, but the findings were based on decades of research and an n=1.9 MILLION).  Now that is some serious data!
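Why the contrast between n=11 and n=1.9 million matters can be put in plain statistical terms.  As a rough illustration (my own sketch, not from any of the studies mentioned above), here is the standard margin of error for an estimated proportion at those sample sizes, using a hypothetical 50 per cent proportion, which is the worst case for uncertainty:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the ~95% confidence interval for a proportion
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Compare the sample sizes discussed above (p=0.5 is a hypothetical
# worst-case proportion, chosen only to illustrate the scaling).
for n in (1, 11, 1_900_000):
    moe = margin_of_error(0.5, n)
    print(f"n = {n:>9}: estimate could be off by +/- {moe:.3f}")
```

With n=1 the uncertainty swamps the estimate entirely (roughly plus or minus 98 percentage points); with n=11 it is still about plus or minus 30 points; with n=1.9 million it shrinks to a rounding error.  The uncertainty falls only with the square root of the sample size, which is why tiny samples support almost any theory you care to twist them toward.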

We also learned today that 42 percent of Americans think that the threat of terrorism in the US is higher than it was before 9/11.  True, it is not easy to dismiss fear, one of the most powerful emotions we as humans have, but this “theory” is not based on any “facts”.  Even if we count recent terrorist attacks such as Orlando and San Bernardino, terrorism remains statistically inconsequential in the US.  The fear is not linked to reality.  Then again, crime rates are the lowest on record and people still rank fear of crime at the top of their lists of worries, so go figure.

There is merit in coming up with a theory or hypothesis, gathering data with which to test it, and then confirming, tweaking or discarding it if the data show it to be hopelessly inaccurate.  At least that is the way it works in scientific fields.  I am not sure that this is how it is being applied to terrorism, but then again perhaps I am not reading the right literature.

So my appeal is once again to be careful with terrorism “theories”.  If you are going to advance them, be sure you have robust and statistically significant data to support them.  If not, you are not making any real contribution to our understanding of violent extremism and you are only adding to all the junk science out there.  And, as we have seen, junk science, whether it is the bogus link between vaccinations and autism or the denial of climate change, does not make us better – or safer.


By Phil Gurski

Phil Gurski is the President and CEO of Borealis Threat and Risk Consulting Ltd. Phil is a 32-year veteran of CSE and CSIS and the author of six books on terrorism.
