The French Council of the Muslim Faith (CFCM) has filed legal papers against Facebook and YouTube over their response to a video of the terror attack in the city of Christchurch, in which 50 people were killed.
CFCM President Ahmet Ogras told CNN that the organization is taking legal action against Facebook for not removing the video fast enough.
"This not admissible, Facebook must take their part of responsibility in this and must do everything to anticipate these livestreams, as much as [they do with] hate messages and Islamophobia on their networks," Ogras told CNN.
Abdallah Zekri, president of the Observation Center Against Islamophobia, which is part of the CFCM, confirmed the legal action targets the French offices of both Facebook and YouTube.
"We can't have these videos online just like movies of shootings ... YouTube and Facebook must take measures to avoid this in the future," Zekri told CNN.
The council has filed a complaint with prosecutors in Paris and said it is suing Facebook and YouTube for "broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor," according to the AFP news agency, which received a copy of the complaint.
Under French law this is punishable by up to three years' jail time and a €75,000 ($85,000) fine.
Facebook didn't immediately respond to CNN's request for comment Monday. In a statement in the wake of the attack, Mia Garlick, Facebook's director of policy for Australia and New Zealand, said: "New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video."
A spokesman for YouTube declined to comment on the complaint and referred CNN to its previous statements. Following the attack, a Google spokesperson told CNN that YouTube removes "shocking, violent and graphic content" as soon as it is made aware of it. YouTube declined to comment at the time on how long it took to first remove the video.
A spokesman for the CFCM told CNN that if the tech companies do pay a fine as a result of the complaint, they would like families of the victims of the Christchurch attack to share the money.
The footage was widely shared on social media and tech companies came in for criticism over their handling of the video.
In a statement on its website, Facebook said it had removed 1.5 million videos of the attack in the first 24 hours after the shooting. It blocked 1.2 million of them at upload, meaning they would not have been seen by users. Facebook didn't say how many people had watched the remaining 300,000 videos.
On March 18, New Zealand's Prime Minister, Jacinda Ardern, said tech companies have "a lot of work" to do to curb the proliferation of content that incites hate and violence.