- cross-posted to:
- technology@lemmy.zip
A former senior Israeli government official now working as Meta’s Israel policy chief personally pushed for the censorship of Instagram accounts belonging to Students for Justice in Palestine — a group that has played a leading role in organizing campus protests against Israel’s ongoing war in Gaza.
Internal policy discussions reviewed by The Intercept show Jordana Cutler, Meta’s Israel & the Jewish Diaspora policy chief, used the company’s content escalation channels to flag for review at least four SJP posts, as well as other content expressing stances contrary to Israel’s foreign policy. When flagging SJP posts, Cutler repeatedly invoked Meta’s Dangerous Organizations and Individuals policy, which bars users from freely discussing a secret list of thousands of blacklisted entities. The Dangerous Organizations policy restricts “glorification” of those on the blacklist, but is supposed to allow for “social and political discourse” and “commentary.”
It’s unclear if Cutler’s attempts to use Meta’s internal censorship system were successful; the company declined to say what ultimately happened to posts that Cutler flagged. It’s not Cutler’s decision whether flagged content is ultimately censored; another team is responsible for moderation decisions. But experts who spoke to The Intercept expressed alarm over a senior employee tasked with representing the interests of any government advocating for restricting user content that runs contrary to those interests.
“It screams bias,” said Marwa Fatafta, a policy adviser with the digital rights organization Access Now, which consults with Meta on content moderation issues. “It doesn’t really require that much intelligence to conclude what this person is up to.”
Meta did not respond to a detailed list of questions about Cutler’s flagging of posts but argued that writing an article about her was “dangerous and irresponsible.” In a statement, spokesperson Dani Lever wrote, “Who flags a particular piece of content for review is irrelevant because our policies govern what is and isn’t allowed on platform. In fact, the expectation of many teams at Meta, including Public Policy, is to escalate content that might violate our policies when they become aware of it, and they do so across regions and issue areas. Whenever any piece of content is flagged, a separate team of experts then reviews whether it violates our policies.”
As the public square moves further online, there is an increasing effort to assert control and dominance over it, so that the old institutions that once dominated the media can continue to dominate public discourse as it goes off script.