According to AJC’s State of Antisemitism in America 2023 Report, 62% of Jewish adults have seen antisemitic content online or on social media at least once in the past 12 months; this number increases to 67% for young American Jews ages 18-29. Nearly one in three (30%) American Jews have avoided posting content online that would identify them as Jewish or reveal their views on Jewish issues. While lawmakers from both sides of the aisle and some platforms are calling for increased regulation, social media companies must affirm that antisemitism will not be permitted or facilitated on their platforms. Please note that the suggestions offered below are not exhaustive. There is always more that can be done.
Social media companies should adopt the IHRA Working Definition of Antisemitism to strengthen hate speech policies across their platforms. This will allow artificial intelligence and human moderators to be more consistent and more effective in removing or demoting all forms of antisemitic content. Companies can also use resources such as Translate Hate, an online glossary of antisemitic tropes and phrases, to improve media literacy on antisemitism.
Responding to Antisemitism
Ensure transparency | Social media companies should be transparent in the drafting of policies, algorithms, and moderation systems and abide by a set of core principles that will earn public trust. They must correct the algorithms that allow hate to cross-pollinate and grow, and make public information on how those algorithms contribute to the proliferation of antisemitic and hateful content. Companies should also regularly publish information about the impact of their moderation systems, including the number of human moderators addressing online hate, the training those moderators receive, and the procedures for reinstating content that has been incorrectly removed.
Improve moderation systems | Moderation systems can be improved and harmonized to ensure moderators implement policies and community standards accurately and consistently. In the rapidly evolving space of online antisemitism, which relies on memes, coded language or images, and implicit speech, automated moderation alone cannot keep pace. Social media companies should integrate the IHRA Working Definition into their policies and train content moderators on it regularly, since antisemitism constantly morphs. Moderators who are not fluent in English need training in their native languages, both on company policies related to antisemitism and on recognizing antisemitism as it appears within their own historical, linguistic, political, religious, and economic contexts. Finally, safeguards should exist so that judgments deeming content antisemitic can be appealed and reviewed.
Make it easier to report antisemitism | Antisemitism is a complex prejudice: it is not just hatred of or a direct attack against Jews, but a conspiracy theory about power and control. Social media companies should consider listing antisemitism as an independent option users can select when reporting harmful content.
Promote counterspeech | Social media companies can play a powerful role in reminding users that it is incumbent on all of us to correct false narratives, drown out hateful voices, and push antisemites back to the far fringes of the Internet where they belong, far removed from mainstream platforms and from access to impressionable minds. We know, however, that counterspeech can have the adverse effect of elevating the visibility of antisemitic posts, because engagement with them increases. Social media companies should therefore partner with Jewish organizations in the fight against antisemitism on their platforms.
Today, the fight against antisemitism is primarily taking place in the digital world. Social media companies themselves have the biggest responsibility to ensure their platforms are not used as launching pads for conspiracies, antisemitism, and hatred. Freedom of speech does not absolve them of corporate responsibility.
Improve policies | Social media companies should establish community standards indicating that antisemitic speech will not be permitted on their platforms and that they will not facilitate access to services that do not prohibit it. Relatedly, they must guarantee appropriate safeguards so that initial judgments deeming content antisemitic (or not) can be appealed and reviewed. To do this effectively, the IHRA Working Definition of Antisemitism, as the global, authoritative definition of antisemitism, should be incorporated into community standards.
Strengthen education on Jews, antisemitism, and the Holocaust | Social media companies can provide accurate information, or redirect users to it, such as resources about the Holocaust. They must also address the growing challenge of inappropriate mass reporting: Jewish users and accounts have been harassed and mass-flagged even when they have done nothing wrong.
Establish new positions | Social media companies should hire a point person focused on the Jewish diaspora, someone who both listens to the concerns of Jewish communities around the world and works with senior leadership to make the structural changes needed to ensure antisemitism is understood, recognized, and properly addressed. Additionally, companies should assign user researchers to the Jewish community to better understand how Jewish users experience antisemitism and hate on their platforms, so proper changes can be made.
Enhance Jewish community outreach | A number of social media companies maintain consistent outreach to Jewish communal leaders. Those that do not should consider starting regular meetings with Jewish stakeholders. In addition, social media companies can work with Jewish community partners to host town hall-style events or trainings that educate the broader community on how their platforms address hate.