A new lawsuit argues that Facebook allowed anti-Semitic and anti-Israel content to spread across its platform, putting the lives of Israeli citizens in danger during a wave of violent attacks on Israelis by Palestinians in 2015.
The Verge reports that after a vehicle attack against Israeli soldiers in October of 2015 and a subsequent wave of shootings and stabbings, some Palestinian groups began encouraging violence against Jews via Facebook. Accounts linked to Hamas shared cartoons and memes depicting violence against Jewish people, and over the next 15 months, 47 people were killed and 675 wounded in anti-Israeli attacks.
As some began referring to the anti-Semitic Facebook groups as the “Facebook Intifada,” Israeli Prime Minister Benjamin Netanyahu spoke out against the groups. “What we are seeing here is a combination of radical Islam and the internet,” Netanyahu said, speaking to the Likud Party that October. “Osama bin Laden meets Mark Zuckerberg. The incitement in the social networks is moving the murders.”
Facebook is now being brought to court over the groups. The company faced plaintiffs in the US District Court for the Eastern District of New York yesterday who claimed that the social media platform fueled anti-Semitic violence and put thousands of Israeli lives in danger. If the suit succeeds, Facebook would be forced to stop providing services to known terrorists and pay out as much as $1 billion in damages.
In general, platforms such as Facebook and Twitter are protected under law from the actions of their users; however, given a trend of recent terrorism lawsuits brought against social media companies and Facebook’s recent dedication to social change, the companies’ level of responsibility is being tested.
In court, Facebook referred to a clause known as Section 230 of the Communications Decency Act, which protects a provider of an interactive computer service from being held liable as a publisher of its users’ content. This clause has shielded both Facebook and Twitter in the past from liability for the posts of their users.
Facebook believes these protections extend to the anti-Israeli groups using the platform. “This case, while it does certainly raise complex and important social issues, as a legal matter is a straightforward application of the CDA,” said Facebook attorney Craig Primis. “Decisions Facebook makes about how to operate its service are just as protected as any individual piece of content.”
In their rebuttal, the plaintiffs cited the Anti-Terrorism Act, which forbids providing material support to terrorist groups. Attorney Robert Tolchin argued that the concept of “material support” could include providing a Facebook account to an individual on the Treasury Department’s list of Specially Designated Nationals.
“I’m not saying Facebook has to look at everybody’s Facebook page and make an editorial decision,” said Tolchin. “All they have to do is understand their legal obligation not to provide services to the people on a list.”
To make the point that Facebook was failing in its duty to prevent violent anti-Israel users from utilizing the platform, Tolchin pointed the court to an account registered under the name Mousa Abu Marzook, a senior leader of Hamas in Palestine. The account had been registered on Monday by an Israeli friend of Tolchin’s in order to demonstrate Facebook’s failure to properly police its userbase. The account has since been suspended.
District Judge Nicholas Garaufis, who oversaw the proceedings, asked, “Is it possible to eliminate the accounts of, say, people who live in the seven countries designated by the recent executive order on immigration?”
“I’m interested in knowing whether Facebook could focus on certain areas and eliminate the ability of those areas to disseminate content,” he continued.
Facebook has since filed a motion to dismiss, which is currently awaiting a ruling.
When reached for comment by The Verge, a Facebook spokesperson said, “Our community standards make clear that there is no place on Facebook for groups that engage in terrorist activity or for content that expresses support for such activity and we take swift action to remove this content when it’s reported to us. We sympathize with the victims of these horrible crimes.”