The use of social networks like Facebook, Twitter and Google’s YouTube by terrorist groups for propaganda, recruitment, fundraising and other activities has come into sharp focus recently.
It seemed inevitable that these companies would at some point be blamed for the misuse of these forums and become targets of lawsuits from families of victims.
A lawsuit filed in a federal court in California by the father of Nohemi Gonzalez, a victim of the Paris terror attack in November, charges that Twitter, Facebook and Google “have knowingly permitted the terrorist group ISIS [Islamic State group] to use their social networks as a tool for spreading extremist propaganda, raising funds and attracting new recruits.”
Gonzalez, a U.S. citizen, was killed in November when militants sprayed bullets into the bistro where she was dining with friends. A similar suit was brought against Twitter in January by the widow of a man killed in Jordan.
In that lawsuit, as in the one filed by Nohemi's father, Reynaldo Gonzalez, the companies are likely to invoke Section 230 of the Communications Decency Act, which states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
That provision protects online intermediaries from being held responsible for content hosted on their websites by third parties, which in this case would include ISIS.
Congress has "broadly immunized entities like Twitter against lawsuits that seek to hold them liable for harmful or unlawful third-party content, including suits alleging that such entities failed to block, remove, or alter such content," Twitter said in its defense in the January lawsuit.
The January complaint does not allege any direct connection between Twitter and the killer, Anwar Abu Zaid, or the attack on the police compound in Jordan that killed Lloyd Fields, Twitter said. "Nor does it allege that Twitter itself created any of the Tweets, messages, or other content that the Complaint strains to link, even indirectly, to that attack," it added.
The Gonzalez complaint alleges that the three companies provided material support to terrorists under 18 U.S. Code 2339A and 2339B. “The services and support that Defendants purposefully, knowingly or with willful blindness provided to ISIS constitute material support to the preparation and carrying out of acts of international terrorism, including the attack in which Nohemi Gonzalez was killed,” according to the complaint filed Tuesday.
The complaint cites numerous instances of the misuse of the networks by ISIS and also points to the revenue the companies earn by placing advertisements on posts by terrorist groups. “Astonishingly, Defendants routinely profit from ISIS. Each Defendant places ads on ISIS postings and derives revenue for the ad placement,” according to the filing.
The social networks, for their part, say they are doing their best to weed out terrorist content, though the effort has turned into a game of whack-a-mole, with proscribed content or new content resurfacing elsewhere.
YouTube has a strong track record of taking swift action against terrorist content, said a Google spokesman, who said the company would not comment on pending litigation. “We have clear policies prohibiting terrorist recruitment and content intending to incite violence and quickly remove videos violating these policies when flagged by our users. We also terminate accounts run by terrorist organizations or those that repeatedly violate our policies," he wrote in an email.
A Facebook spokeswoman wrote that “there is no place for terrorists or content that promotes or supports terrorism on Facebook, and we work aggressively to remove such content as soon as we become aware of it." A Twitter spokesman said “violent threats and the promotion of terrorism deserve no place on Twitter and, like other social networks, our rules make that clear.”
In a post on combating violent extremism in February, Twitter said that as noted by many experts and other companies, “there is no ‘magic algorithm’ for identifying terrorist content on the internet, so global online platforms are forced to make challenging judgment calls based on very limited information and guidance.”
Gonzalez is asking the court for compensatory damages to be decided in a trial. The lawsuit is likely to add to other pressures the social networking companies are already facing on the terror issue from various quarters including Congress. Senators Dianne Feinstein, a Democrat from California, and Richard Burr, a Republican from North Carolina, for example, proposed legislation in December that would require tech companies to report online terrorist activity to law enforcement.