The ripple effect of the New Zealand mosque shootings has continued, with internet and telecommunications giants facing questions about the role their platforms played in spreading footage of the attack.
The horrific video of the mosque massacre in Christchurch was viewed live fewer than 200 times on Facebook, but that was enough to unleash it across the internet.
Now New Zealand, other governments and business leaders are calling for Facebook, Google and Twitter to do much more to rid their platforms of extremist content.
Vodafone and two other telecommunications operators, which provide internet access for most New Zealanders, said on Tuesday they want Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey and Google CEO Sundar Pichai to take part in an “urgent discussion” on how to keep harmful content off their platforms.
The three US tech companies have faced heavy criticism after they failed to identify and stop the spread of a video of Friday’s attack in which 50 people at two mosques were killed.
The CEOs of Vodafone New Zealand, Spark and 2degrees said they had taken the unprecedented step of jointly identifying and suspending access to sites that were hosting video footage taken by the attacker.
They called on authorities to require tech companies to take down terrorist-linked content within a specific period of time and fine them if they fail to do so.
“Although we recognize the speed with which social network companies sought to remove Friday’s video once they were made aware of it, this was still a response to material that was rapidly spreading globally and should never have been made available online,” they said in an open letter published on their company websites.
Germany introduced a law in 2018 that gives authorities the power to fine social media platforms if they fail to quickly remove hate speech.
And the European Commission is considering rules that would require platforms to remove terror content within an hour of it being flagged by authorities, or risk fines of up to 4% of global revenue.
More world leaders are demanding that the tech companies step up their game.
New Zealand Prime Minister Jacinda Ardern said Tuesday that her government will investigate the role social media played in the deadly attack.
“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published,” Ardern said in a speech to parliament.
“They are the publisher. Not just the postman,” she said. “There cannot be a case of all profit no responsibility.”
Australian Prime Minister Scott Morrison on Tuesday criticized the “continuing and unrestricted role” played by internet technology in the New Zealand shooting and other terrorist attacks.
He laid out his concerns in an open letter to Japanese Prime Minister Shinzo Abe, who this year holds the presidency of the G20, an organization that brings together the world’s biggest economies.
“It is unacceptable to treat the internet as an ungoverned space,” Morrison said in the letter, calling for the issue to be discussed at the G20 summit in Osaka in June.
Governments around the world need to ensure that tech firms filter out and remove content linked to terrorism, and are transparent about how they do so, he said.