The notion that Facebook was unable to detect the Christchurch mosque gunman's livestream because its contents were not "particularly gruesome" will be abhorrent to many of us.
Facebook's policy director for counter-terrorism this week reportedly told US Congress members that its algorithm did not detect the massacre livestream because there was "not enough gore".
The video, since declared an objectionable publication in New Zealand, was streamed for 17 minutes.
An algorithm is, strictly speaking, a set of instructions a computer follows automatically; in this case, one designed to detect objectionable material and shut it down. In a way, it allows a computer to "think".
In simpler terms, it is like the safe-search setting on your computer, which stops certain material from appearing when you or your kids run a Google search.
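To make the idea concrete, here is a drastically simplified sketch of such a "trigger" in Python. The word list and function name are hypothetical; real moderation systems, including Facebook's, rely on trained machine-learning classifiers analysing video and audio, not simple keyword lists.

```python
# Toy illustration only: a keyword-based filter, loosely analogous to the
# "preloaded trigger" described above. Real systems use machine-learning
# classifiers, not word lists.

BLOCKED_TERMS = {"blocked_word_one", "blocked_word_two"}  # hypothetical list


def should_block(text: str) -> bool:
    """Return True if any blocked term appears in the text."""
    words = text.lower().split()
    return any(word in BLOCKED_TERMS for word in words)


print(should_block("a harmless sentence"))            # False
print(should_block("contains blocked_word_one here"))  # True
```

The hard part, as the congressional testimony suggests, is not running such a check but deciding what the trigger should look for; a classifier tuned to one kind of graphic content can miss another.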