Many of the sites tried to remove the videos as they were uploaded but were overwhelmed. Facebook said it deleted 1.5 million videos within 24 hours of the incident, although many evaded detection. On Reddit, a post containing the video was viewed more than a million times before it was removed. According to the New Zealand government report, Google said the video spread faster than after any tragedy it had previously seen.
In the days that followed, some people began discussing ways to circumvent the platforms’ automated systems and keep the Christchurch video online. On Telegram on March 16, 2019, members of a group tied to white supremacy sought ways to manipulate the video so that it would not be removed, according to discussions seen by The Times.
“Only change the aperture,” one user wrote. “Accelerate by 2x and the [expletive] cannot find it.”
Within days, clips of the shooting were posted on 4chan, a fringe online message board. In July 2019, a 24-second clip of the murders also appeared on Rumble, according to a review by The Times.
In the months that followed, the New Zealand government identified more than 800 variations of the original video. Officials asked Facebook, Twitter, Reddit and other sites to devote more resources to removing them, according to the government report.
New copies or links to the video were uploaded online whenever the Christchurch shooting was in the news or on anniversaries of the event. In March 2020, about a year after the shooting, nearly a dozen tweets appeared on Twitter with links to variations of the video. More videos surfaced when the gunman was sentenced to life in prison in August 2020.
Other groups stepped in to pressure tech companies to take down the video. Tech Against Terrorism, a United Nations-backed initiative that develops technology to detect extremist content, sent 59 alerts about Christchurch content to tech companies and file hosting services from December 2020 to November 2021, said Adam Hadley, the group’s founder and director. That represented about 51 percent of the right-wing terrorist content the group was trying to remove from the internet, he said.