Facebook, Holocaust Denial, and Anti-Semitism 2.0
Andre Oboler - Sep 15, 2009
Jerusalem Center for Public Affairs
In May 2009, Facebook went into damage control in response to the media interest in Holocaust-denial groups it hosted. This occurred six months after Facebook was notified that such groups not only breached its Terms of Service but were illegal under national laws banning Holocaust denial in several countries.
Between receiving the complaints and responding to the media interest, Facebook rolled out new terms of use. These removed the explicit ban on content that is "harmful," "defamatory," "abusive," "inflammatory," "vulgar," "obscene," "fraudulent," "invasive of privacy or publicity rights," or "racially, ethnically or otherwise objectionable." The reference to local, regional, and national laws also vanished.
Facebook's eventual response, defending the posting of Holocaust denial, highlighted a dramatic change in direction for a company that once sought to provide a "safe place on the internet" and stated that "certain kinds of speech simply do not belong in a community like Facebook." Facebook has, through ignorance, created an anti-Semitic policy platform where the only explicitly allowed hate is that, within certain parameters, directed against Jews.
Holocaust-denial groups should be removed from Facebook because Holocaust denial is a form of anti-Semitism. Such content represents a clear expression of hate and is therefore inconsistent with basic standards of decency and even Facebook's new Terms of Service. Holocaust denial also constitutes a threat to the safety of the Jewish community. Such a ban would not be inconsistent with First Amendment rights in the United States, and would be wholly consistent with hate speech bans that exist in much of Europe.
The treatment of Holocaust denial shows that ground has been lost in the fight against Anti-Semitism 2.0 (see below) and the increasing social acceptability of racism and hate within Facebook. If service providers fail to set standards barring abusive and racist content, lawmakers must intervene. Where laws exist, such as the ban on Holocaust denial in various countries, the same rules as for copyright infringement must apply, and the company itself must be held liable if it continues to facilitate a breach of the law once the matter is brought to its attention.
Holocaust Denial and Hate on Facebook
The spread of both Holocaust denial and the social acceptability of Holocaust denial through social media platforms such as Facebook formed part of the definition of Anti-Semitism 2.0 presented at the Global Forum to Combat Anti-Semitism in February 2008: "Anti-Semitism 2.0 is the use of online social networking and content collaboration to share demonization, conspiracy theories, Holocaust denial, and classical anti-semitic motifs with a view to creating social acceptability for such content."1
An antecedent article1 on Anti-Semitism 2.0 focused in part on the Facebook group "'Israel' is not a country!... Delist it from Facebook as a country!" The rise and fall of this group became the focal point of the first campaign against anti-Semitism on Facebook to gain significant press coverage2,3,4 and brought the grassroots group, the JIDF (Jewish Internet Defense Force), to the public's attention.5
JIDF lists of problematic content have included both YouTube videos and Facebook groups promoting Holocaust denial. By asking members to report such content the JIDF achieved some of its early success in eliminating hate.6,7,8,9 Complaints soon followed. On the anti-Semitic website JewWatch,10 the JIDF was accused of censorship. In reply the JIDF pointed out the fallacy of "freedom of speech" on private services such as YouTube.11
The argument against Holocaust denial on Facebook gained ground on 18 October 2008 when the JIDF released a note "Regarding Illegal Content on Facebook."12 The note built on background research by David Eshmoili, a recent graduate of Cornell Law School, who first raised the issue of national laws that prohibit Holocaust denial in countries such as Germany and Israel. Eshmoili, who explained he is "not affiliated or aligned with the JIDF," sent JIDF the material, "knowing that the JIDF would act on the information because of their vigorous activism on the internet in the past, particularly on Facebook."13
The JIDF, attributing the information to John Cohen, their pseudonym for anonymous tip-offs,14 used the information as the basis of their letter to Facebook a few days later.15 This letter inspired blogger Brian Cuban to write to Facebook and publish a blog post in November 2008.16 The issue began to gather dust as Facebook refused to clarify its position in relation to laws outside the United States, or to take action against the Holocaust-denial groups both the JIDF and Cuban had brought to their attention.
Facebook's Holocaust-Denial Groups Gain Media Attention
The media and public interest in Holocaust denial on Facebook was triggered by a CNET News article17 by Chris Matyszczyk that appeared on 4 May 2009. The article was based on a new blog post by Brian Cuban.18 Cuban used comments made by President Obama at the National Holocaust Museum to reframe his earlier post16 about Holocaust denial on Facebook and to reiterate his complaint about a lack of meaningful response from Facebook. Matyszczyk used Cuban's celebrity status as the lawyer and brother of Mark Cuban, an American billionaire entrepreneur and owner of the NBA basketball team the Dallas Mavericks, to give the story a popular angle.
The media picked up the story with further reports from major players including the Guardian,19 CNN,20 the BBC,21 and Fox News.22 Facebook too responded with urgency. Barry Schnitt, a spokesman for Facebook, wrote to Matyszczyk the day after his CNET article was published. He opened by saying Facebook "weren't given an opportunity to participate in the story" and closed by saying "in the future, we'd really appreciate the opportunity to comment."23
What is clear is that Facebook went into damage control not in response to moral and ethical questions, but in response to the media interest. Matyszczyk received an answer to questions Facebook had dodged when they were previously asked by Brian Cuban,16,23 and before that by the JIDF, whose initial letter of 24 October 2008 to Facebook15 brought the matter to Cuban's attention.16 The questions raised related not only to the Terms of Service, which Facebook can and has since changed, but to national laws beyond Facebook's control and questions about social values in today's online world.
Challenging Holocaust Denial on Facebook
Facebook received emails from both the JIDF15 and Brian Cuban16 listing the same five Holocaust-denial groups: "Based on the facts....There was no Holocaust," "Holocaust: A series of Lies," "Holocaust is a Holohoax," "Holohoax," and "Holocaust is a Myth." Both brought to Facebook's attention the fact that Holocaust denial is a crime in some countries and that Facebook's own Terms of Use prohibit content that would "constitute, encourage or provide instructions for a criminal offense" or "violate any local, state, national or international law."
Cuban asked Facebook, "Is there anyone at Facebook I can ask for a comment on why these groups are permitted and/or do not violate Facebook TOS before I write the article?" The answers he received did not address the issue.
The JIDF letter went into further detail noting that Holocaust denial is illegal in thirteen countries: Austria, Belgium, Czech Republic, France, Germany, Israel, Liechtenstein, Lithuania, Luxembourg, Poland, Portugal, Romania, and Switzerland. They also pointed out the strictness of laws in Germany, Austria, and Romania and that "any group that denies the occurrence of the Holocaust is violating the laws of these nations."
The JIDF also argued that "German law also outlaws anything associated with Nazism. So any group that has Nazi symbols and such should be taken down." In addition to national law, the JIDF referred to European Union law, specifically Joint Action 96/443/JHA,24 which requires countries to make Holocaust denial "punishable as a criminal offence." The wording is: "public denial of the crimes defined in Article 6 of the Charter of the International Military Tribunal appended to the London Agreement of 8 April 1945 insofar as it includes behaviour which is contemptuous of, or degrading to, a group of persons defined by reference to colour, race, religion or national or ethnic origin."
The International Military Tribunal, more commonly known as the Nuremberg Trials, was established by the Allied powers to try Nazis and their collaborators. Article 6 lists the crimes that the tribunal had jurisdiction over,25 namely, the Nazi war crimes that constitute the Holocaust.
The EU Joint Action is a specific and limited prohibition, under international law, against denial of the Nazi Holocaust. The EU position makes clear that such denial is "contemptuous" or "degrading" to specific groups within society. It is not only a legal argument, but a moral one against hate speech.
Facebook's Response
On 12 November 2008, Brian Cuban received a reply to his initial email, though not to his question based on national laws prohibiting Holocaust denial.23 The reply states that Facebook takes "very seriously" their Terms of Use policy, and then outlines how they apply. Facebook claims to:
• "React quickly to take down groups that violate these terms"
• "[Be] sensitive to groups that threaten violence toward people and these groups are taken down"
• "Remove groups that express hatred toward individuals"
• "[Remove] groups that are sponsored by recognized terrorist organizations"
They go on to say that "We do not, however, take down groups that speak out against countries, political entities, or ideas." Not specifically falling into either category are groups that express hatred toward a group of people, for example, racist groups or those targeting the disabled. Such hatred is illegal in most countries; the United States stands out as an exception. Facebook has removed hateful content about groups in the recent past, for example, an anti-immigrant group from the Isle of Man.26
The email not only missed the point; it was also a canned response. The same email had been sent to German journalist Christoph Gunkel on 30 September 2008 and was described in his article "Facebook und Google Earth: Anti-Semitismus im Web 2.0," published by the German newspaper FAZ the following month.27 As Gunkel wrote, "the question of what Facebook intends to do about a group of Holocaust deniers, which has existed since July 2007, is discreetly left unanswered in the written statement by a company spokeswoman."
It was Chris Matyszczyk, who had not contacted Facebook with the question, who finally received a direct reply on Holocaust denial from Facebook spokesman Barry Schnitt on 5 May 2009.23 The reply opens with the same standard email sent to Gunkel and Cuban, then adds another section addressing the problem of national laws:
When dealing with user generated content on global websites, there are occasions where content that is illegal in one country, is not (or may even be protected) in another. For example, homosexual content is illegal in some countries, but that does not mean it should be removed from Facebook. Most companies approach this issue by preventing certain content from being shown to users in the countries where it is illegal and that is our approach as well. We have recently begun to block content by IP [the "address" of a computer on the internet] in countries where that content is illegal, including Nazi-related and holocaust denial content in certain European countries. The groups in question have been blocked in the appropriate countries.
Facebook's solution is similar to that of other companies that have localized their services to take account of such laws, particularly in Germany. There are, however, two flaws in this approach. The first is that U.S. laws governing protected speech do not apply to private spaces such as Facebook. Any concerns Facebook employees or managers have about the First Amendment are misplaced, or are being deliberately misused to confuse the public. The second flaw is that it addresses the issue in strictly legal terms. Matyszczyk received one further email when he responded to Facebook; it again focuses specifically on Holocaust denial and moves away from the purely legalistic answer (see "Debate, Defamation, and Denial" below).
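To make the mechanism Schnitt describes more concrete, the minimal sketch below (in Python) shows what country-based blocking of this kind amounts to. The group identifier, country list, and geolocation lookup are hypothetical placeholders for illustration only, not a description of Facebook's actual systems.

# Hypothetical sketch of geo-blocking: content flagged as illegal in certain
# jurisdictions is hidden from users whose IP addresses map to those countries.

# Jurisdictions (ISO country codes) in which a given item is assumed to be illegal.
BLOCKED_JURISDICTIONS = {
    "holocaust-denial-group-example": {"DE", "AT", "FR", "IL", "RO"},  # illustrative subset only
}

def country_for_ip(ip_address: str) -> str:
    """Stand-in for an IP geolocation lookup returning an ISO country code."""
    # A real system would query a geolocation database; this stub uses a toy table.
    demo_table = {"192.0.2.1": "DE", "198.51.100.7": "US"}
    return demo_table.get(ip_address, "US")

def is_visible(content_id: str, ip_address: str) -> bool:
    """Return True if the content may be shown to a user at the given IP address."""
    blocked_in = BLOCKED_JURISDICTIONS.get(content_id, set())
    return country_for_ip(ip_address) not in blocked_in

if __name__ == "__main__":
    print(is_visible("holocaust-denial-group-example", "192.0.2.1"))    # False: hidden from a German IP
    print(is_visible("holocaust-denial-group-example", "198.51.100.7")) # True: visible from a U.S. IP

As the sketch makes plain, such blocking merely hides the content from users whose addresses map to the listed countries; the content, and the community built around it, remains available everywhere else, which is why the approach addresses the issue only in strictly legal terms.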
Facebook's official response in the media has been to defend its right not to take action (unless legally required to) based on a "free speech" argument. This came at the same time as, and from the same spokesperson who made, announcements about Facebook's crackdown on pictures of breastfeeding mothers as "obscene" and therefore a violation of its Terms of Service.
Behind the explicit questions lie deeper moral and ethical questions about the nature of the Facebook community, corporate responsibility, and online social norms. Where does Facebook want to stand in the battle against online hate?
The Evolution of Facebook's Terms of Service
Facebook has helped shape modern attitudes toward sharing personal information online. Despite assurances that its Terms of Use are taken seriously, the gradual strengthening of protections in those terms was undone by a radical overhaul in May 2009. This went largely unnoticed as the media focused on content ownership rights.
The first version of the Terms of Service at Facebook.com came into effect on 3 October 2005.28 It included a section on "Member Conduct" that prohibited, among other things:
• upload, post, email, transmit or otherwise make available any content that we deem to be harmful, threatening, abusive, harassing, vulgar, obscene, hateful, or racially, ethnically or otherwise objectionable;
• intimidate or harass another;...
Facebook was established as a community environment with a set of rules that prohibited discrimination and the sort of intimidation and harassment that has since become known as "cyber-bullying."29 Facebook grew out of the U.S. college community and into the school community. In the first few years Facebook required a school email address in order to open an account. A safety-first policy made sense.
On 27 February 2006, the Terms of Service were altered as Facebook became a "service."30 The next change, on 23 October 2006,31 saw "Member Conduct" become "User Conduct." The list of prohibited behaviors was extended (additions emphasized):
upload, post, transmit, share, store or otherwise make available any content that we deem to be harmful, threatening, unlawful, defamatory, infringing, abusive, inflammatory, harassing, vulgar, obscene, fraudulent, invasive of privacy or publicity rights, hateful, or racially, ethnically or otherwise objectionable;...
Two new interesting categories of prohibition appeared in this section:
upload, post, transmit, share, store or otherwise make available content that would constitute, encourage or provide instructions for a criminal offense, violate the rights of any party, or that would otherwise create liability or violate any local, state, national or international law;
upload, post, transmit, share, store or otherwise make available content that, in the sole judgment of Company, is objectionable or which restricts or inhibits any other person from using or enjoying the Site, or which may expose Company or its Users to any harm or liability of any type.
The first point places a limit on freedom of speech in a manner more consistent with European law than with the U.S. First Amendment. The requirement not to violate any "local, state, national or international law" was introduced at a time when Facebook was expanding internationally. The second clause prohibits material that is "objectionable" or inhibits enjoyment of the site, indicating an interventionist approach aimed at controlling the nature of the Facebook community.
On 24 May 2007,32 Facebook added a new paragraph requiring that people "also agree to abide by our Facebook Code of Conduct." The Code of Conduct33 provided an explanation for some rules. Under the heading "Inappropriate Content" Facebook explained: "While we believe users should be able to express themselves and their point of view, certain kinds of speech simply do not belong in a community like Facebook."
It went on to say that users "may not post or share Content" that "is obscene, pornographic or sexually explicit," "depicts graphic or gratuitous violence," "makes threats of any kind or that intimidates, harasses, or bullies anyone," "is derogatory, demeaning, malicious, defamatory, abusive, offensive or hateful." The point was further highlighted by the section "Unlawful or Harmful Content or Conduct," which explained: "Although as an online service provider, we are not responsible for the conduct of our users, we want Facebook to be a safe place on the internet."
This strong interventionist approach was to remain the Facebook position for almost two years.
The Revolution in Facebook's Terms of Service
The 24 May 2007 version of the Terms of Service remained in force until a major change on 6 February 2009. This change generated much concern, in particular on issues of content ownership.34 Twenty-six consumer interest groups threatened to file a complaint with the U.S. Federal Trade Commission, Facebook users joined protest groups en masse, and Facebook reverted to the old terms and agreed to rewrite the new terms in consultation with the community.35,36
On 1 May 2009, the new Statement of Rights and Responsibilities replaced both the Terms of Use and Code of Conduct documents. Points previously listed under "User Conduct" are now listed under "Safety" and "Protecting Other People's Rights." The "Safety" section says, "You will not bully, intimidate, or harass any user," "You will not post content that is hateful, threatening, pornographic, or that contains nudity or graphic or gratuitous violence," "You will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory."
When compared to the earlier documents, the phrase "contains nudity" has been substituted for the previous clause "obscene, or sexually explicit." Although nudity need not be obscene or sexually explicit, this rewording explains Facebook`s sudden campaign against pictures of breastfeeding mothers. The prohibition on content that is hateful or threatening remains, while the one on content that "intimidates, harasses, or bullies" is replaced by a directive that users are not to engage in these three activities.
A number of items were dropped during the change. No longer prohibited is content that is "derogatory," "demeaning," "offensive," "harmful," "defamatory," "abusive," "inflammatory," "vulgar," "obscene," "fraudulent," "invasive of privacy or publicity rights," or "racially, ethnically or otherwise objectionable." Also gone is the clause not to "violate any local, state, national or international law." Facebook is not above the law and removing this clause changes nothing.
By dropping the ban on a whole raft of antisocial behaviors, from the "vulgar" to the "obscene," Facebook retracted its position that "certain kinds of speech simply do not belong in a community like Facebook."33 The removal of the prohibition on defamation and on racism, prohibited since the start, is particularly worrying in light of Facebook's canned response, which specifically talks about hate against individuals. Facebook has dropped its commitment to being a safe place on the internet. It has given up any pretense of being guided by morals rather than money.
Facebook`s Early Position Regarding Online Hate
Despite the initial Terms of Use and Code of Conduct, Facebook has never been eager to play a proactive role in shaping an online culture against discrimination and hate.
On Holocaust Memorial Day in January 2008, Israeli president Shimon Peres urged Jews and Israelis to use Facebook to combat anti-Semitism. This followed a meeting between Peres and Facebook founder Mark Zuckerberg at the World Economic Forum in Davos. As this author warned that February, "Facebook is not only a potentially effective tool for combating anti-Semitism, it is also a dangerously potent tool for promoting the spread of anti-Semitism."37 Zuckerberg was himself questioned on this in an interview with Nick O'Neill in March 2008:38 "I asked him about his thoughts on Facebook as a tool to fight anti-semitism and if Facebook would take proactive measures to fight against it. Mark believes...users can use these tools to connect and generate more worldly perspectives. As such Facebook does not need to be proactive about it."
More recently Zuckerberg has highlighted that Facebook sees itself taking a proactive role in the development of other aspects of online culture. Speaking on the topic of user content rights, he stated, "We're at an interesting point in the development of the open online world where these issues are being worked out...we take these issues and our responsibility to help resolve them very seriously."39 With rising anti-Semitism and racism around the world, Facebook should be taking the spread of social acceptability of online hate equally seriously.
The strategy of not being proactive on online hate could only work in an absence of public scrutiny. No one knows how many reports of inappropriate content are made to Facebook each day, on what grounds they are made, or how many of these result in action being taken. What is known from anecdotal evidence is that action, if it does occur, is usually delayed by many months. Only media attention seems able to speed this up. Of the five Holocaust-denial groups originally reported to Facebook, two were removed once CNET picked up the story.
Debate, Defamation, and Denial
When forced to take a stand on the Holocaust-denial issue, Facebook had "a lot of internal debate."40 There were public statements of support from Facebook employees for what Randi Zuckerberg, the site's marketing director (and sister of founder Mark Zuckerberg), called "Facebook's policy to not remove groups that deny the Holocaust."40
The debate on removing Holocaust denial centers on these questions: Should Facebook remove hateful content? Is Holocaust denial, by definition, hateful content? If so, why? And is counterspeech the best answer to hate speech?
The first question implies taking Facebook`s retreat on ethical issues one step further. Should Facebook provide any policing at all? At one extreme Facebook could abdicate responsibility entirely and only take action in response to requests from law enforcement. This would put Facebook on a par with unmoderated web forums.
The next two questions concern whether Holocaust denial is hate or merely ignorance. If Holocaust denial is a form of hate, then banning hate, while making a special provision for Holocaust denial, would itself be a racist action. If Holocaust denial is not hateful but only "repulsive," "repugnant," and "ignorant" (terms taken from various Facebook communications), the two policies can coexist.
The last question addresses cultural differences between the United States and most other countries. The question only becomes relevant if the prohibition on hateful content on Facebook is in danger of being dropped.
Should Facebook Remove Hateful Content?
At the Personal Democracy Forum in Manhattan, Facebook's Randi Zuckerberg explained, "Our terms of service claim that if you are saying something that is hateful [or] if you are spreading words of violence that it comes down immediately."41 She went on to explain the difficulty that occurs with other forms of offensive speech: "When you have a site with over 200 million people, [they] are going to say things that are controversial or you don't agree with or that personally may make you furious or upset.... But just because they say that doesn't mean that it's hate, it doesn't mean that we should be censoring it."
This neatly sums up the change in Facebook`s approach and explains why so many terms were dropped from the new Statement of Rights and Responsibilities. The change is not accidental, but neither is it designed to allow hate. The problem then is one of correctly identifying hate.
The final email to Chris Matyszczyk, from Facebook spokesman Barry Schnitt, states:23 "The bottom line is that, of course, we abhor Nazi ideals and find holocaust denial repulsive and ignorant. However, we believe people have a right to discuss these ideas and we want Facebook to be a place where ideas, even controversial ideas, can be discussed."
In a similar vein Facebook's chief privacy officer, Chris Kelly, wrote:42 "Holocaust denial is obviously repugnant and ignorant. Motivated by hate, it is not always clearly expressed that way. It therefore poses some of the most difficult challenges for any person or company devoted to free speech as a means to bubble up and address such repugnance and ignorance."
Kelly recognizes that Holocaust denial is motivated by hate, but like Schnitt stresses Facebook's new commitment to free speech. Facebook is making a statement as part of its new push for openness as a key value. The company maintains a façade of opposition to hateful content because abandoning this commitment would be an admission that Facebook no longer commits itself to being "a safe place on the internet."
Being "safe" and having an overriding commitment to free speech are mutually exclusive. British law recognizes this by banning material that is "threatening, abusive or insulting" and intended or likely "to stir up racial hatred."43 Note that it is hate itself the UK tries to prevent, not just the resulting violence. An admission of a major change in policy, shifting the balance between safety and openness, could be damaging, so Facebook is instead living with a fiction. That fiction is that Holocaust denial might not be hate. It must be hoped that once it is recognized as hate, Facebook will remove it immediately, as Randi Zuckerberg said it would.
Is Holocaust Denial, by Definition, Hateful Content?
The Working Definition of Anti-Semitism of the European Monitoring Centre on Racism and Xenophobia (EUMC, now the European Union Agency for Fundamental Rights) was presented by the U.S. State Department to the U.S. Congress as part of the "Contemporary Global Anti-Semitism" report in 2008. The report notes that "a widely accepted definition of anti-Semitism can be useful in setting the parameters of the issue" and adopts the EUMC definition as "a useful framework for identifying and understanding the problem."44
While it has been accepted in the U.S. context, it should be noted that this definition is an EU initiative designed as a practical tool for law enforcement. As Michael Whine of the Community Security Trust (UK) explains, "the definition must be understood by a policeman on patrol, who can use it as the basis for determining if a racist criminal act has anti-Jewish motivation."45 The definition seems a perfect tool for Facebook itself to apply to issues of anti-Semitism. Others in a similar position have already adopted it, for example, the National Union of Students (UK), whose adoption of the definition was praised in the UK Parliament.46
The EUMC definition first explains what anti-Semitism is: "Anti-Semitism is a certain perception of Jews, which may be expressed as hatred toward Jews. Rhetorical and physical manifestations of anti-Semitism are directed toward Jewish or non-Jewish individuals and/or their property, toward Jewish community institutions and religious facilities."
Note that anti-Semitism is an idea (a perception) that is expressed as hatred, and that it does not need to be directed at a Jew to count. The definition comes with an explanatory text that discusses examples of contemporary anti-Semitism. Only one of these points is needed here: "Denying the fact, scope, mechanisms (e.g., gas chambers) or intentionality of the genocide of the Jewish people at the hands of National Socialist Germany and its supporters and accomplices during World War II (the Holocaust)."
If Holocaust denial is regarded by experts and governments as anti-Semitic, and if anti-Semitism is hate against Jews, then any rule against hate must equally be applied to Holocaust denial. Whatever difficulty Facebook has in recognizing hate speech and distinguishing it from other forms of offensive but nonhateful speech, Holocaust denial, well recognized as hate speech, is clearly the wrong place to make a stand.
In light of an appreciation that Holocaust denial is, by definition, hate speech, Facebook's position can be reexamined. Facebook's Statement of Rights and Responsibilities clearly asserts: "You will not post content that is hateful." According to Randi Zuckerberg it is also policy not to remove Holocaust-denial groups, an exception for one type of anti-Semitic hate. This is a racist policy and is in fact worse than a policy that simply allows all hate on an equal footing. The current policy privileges and gives acceptability to a particular form of hate. Facebook has itself created an anti-Semitic policy platform where the only explicitly allowed hate is that directed against Jews. It may have occurred accidentally and through ignorance, but Facebook needs to rectify it immediately, if only because that is what it said it would do for any instance of hate.
Why Is Holocaust Denial Hate Speech?
It is one thing to accept the definitions of experts and the will of parliaments and other lawmakers, but it is another to understand them. What made the UN General Assembly resolve in 2007 that it "Condemns without any reservation any denial of the Holocaust"? Why did it add that it "Urges all Member States unreservedly to reject any denial of the Holocaust as a historical event, either in full or in part, or any activities to this end"?47 Finally, should a company with the global reach and influence of Facebook be taking notice of such a resolution aimed at countries?
As noted by Jeremy Jones, winner of Australia's 2007 Human Rights Medal, "Holocaust Denial is a type of racial vilification that should be covered by any sensible anti-racist legislation." He gives an in-depth view of the development and danger of Holocaust denial. The following points are sourced from his article:48
The deniers' argument: As summarized in 1985 by Dr. John Foster:49
there was no plan in Nazi Germany to exterminate the Jews; the camps served a dual function, as internment camps for Jews and others who were considered a threat to national security, and as labour camps; the gassing of Jews was a myth; Zyklon B was a disinfecting agent used exclusively for delousing prisoners; those Jews who died did so as a result of hunger and disease. The Holocaust was a myth, a deliberate hoax, contrived by an unholy alliance of Communists and Zionists in an elaborate conspiracy to create sympathy and extort money for the cause of Israel and Jewish Communism.
The defamation: As Dr. Frank Knopfelmacher puts it, this argument constituted "a group-libel against an easily identifiable and traditionally stigmatised section of the population, which exceeds in ferocity and depth of malice anything that has happened in the field of ethnic animadversion in this country at least since World War II."50
The incitement: As Knopfelmacher notes, the intent of the deniers is to imply "that the Jewish people are witting and, rarely, unwitting accomplices in a conspiracy to extort, to lie and to kill, in order to acquire a counterfeit crown of martyrdom to be used for personal and political gain."
To further elaborate the point of defamation, in an article on "Holocaust Denial in England," Deborah Butler notes that:51
Denial of the Holocaust is often accompanied by the allegation that the historical account of the Holocaust is a Jewish fabrication for financial gain.... Even where this additional allegation is not made, it can be said to be implied since a large part of the historical account of the Holocaust consists of the survivors' descriptions of their experiences. Holocaust denial therefore represents a considerable insult to the Jewish people as well as an attempt to distort history.
Butler recommends strengthening laws on racially motivated hate so "the defendant would be convicted of an offence which treated Holocaust denial as an example of unacceptable racist speech."51 This is the position now arrived at across Europe thanks to the EUMC definition. It is also the position Facebook would, by its exclusion of hateful content, have upheld had it not instituted a policy to allow Holocaust denial, most likely in ignorance of the nature of Holocaust denial as hate speech.
Is Counterspeech the Best Answer to Hate Speech?
Is counterspeech the best answer to hate speech? This question is interesting but has no bearing on whether Facebook, if it prohibits hateful speech, should allow Holocaust denial. The question only gains relevance, beyond academic interest, if Facebook were to allow all forms of speech within the law in each country, that is, if it were to drop many of the prohibitions in the current Statement of Rights and Responsibilities.
In the first place, allowing Holocaust-denial groups is not an effective way to "debate" deniers. The argument is flawed because the deniers who control such groups have the ability to remove opposing viewpoints and individuals. The end result is simply the power of Facebook as a social networking tool, with over two hundred million active users, being used to connect deniers and spread hate. True, this can be and is done on other websites such as Stormfront, but does Facebook really want to be providing free infrastructure for this activity? How would Facebook users feel about this? One group, United Against Holocaust Denial on Facebook, at the time of writing has over seventy-two thousand members and continues to grow.52
Beyond the question of debate is that of exposure. The comments by Facebook spokesman Barry Schnitt and Chief Privacy Officer Chris Kelly highlight the legitimate question of exposure as the best solution to hate. Schnitt wrote:23 "Would we rather holocaust denial was discussed behind closed doors or quietly propagated by anonymous sources? Or would we rather it was discussed in the open on Facebook where people's real names and their photo is associated with it for their friends, peers, and colleagues to see?"
The question for Facebook is not whether it can shut down Holocaust denial altogether; it is more akin to a television network deciding whether to allow denial to be broadcast on its network. Facebook, and social networking generally, is a new and powerful form of media. The media can be used to expose hate, as the BBC documentary The Secret Agent did with the British National Party (BNP) in 2004.53 On Facebook itself a group of over six hundred thousand members is now exposing the BNP's hate.54 Neither of these approaches requires the platform provider (the BBC and Facebook, respectively) to act as a means for those wishing to broadcast hate. The hate itself, Holocaust denial in this case, is damaging. The benefit in exposing racists is not enough to justify the impact on the victims.
Christopher Wolf, chair of the Anti-Defamation League's Internet Task Force and of the International Network Against Cyberhate, made a related point in 2008. Speaking about Nazi propaganda on YouTube, he explained, "If offered in an educational context, with explanation of their hateful origins and of how they glorified or played a role in the deaths of millions, perhaps such material would serve history. But they are not offered in that context; they are posted to provoke hate and to recruit haters...the purpose and effect of the videos is to inspire hate and violence."55
Chris Kelly wrote:42
In an ideal world Facebook, or any individual, could take an action that would firmly address hate once and for all. The policies in Facebook`s statement of rights and responsibilities are, however, designed to operate in the flawed real world. Cloaked hatred is not something those policies can change alone, but they may create circumstances where it is expressed and can be attacked for what it is instead of driving it underground to fester.
Kelly's argument differs from Schnitt's. He claims that cloaked hatred, that is, hatred not readily recognizable as prohibited under the Terms of Service, at least creates an opportunity for people to respond. This is a sound point in general, but irrelevant when dealing with a kind of hate that is easily recognized, such as Holocaust denial.
What if one were to adopt Schnitt's suggestion? Those who will proudly declare themselves Holocaust deniers are allowed to do so. They are allowed to assemble other like-minded people. Those who know the community they live in would find racism unacceptable are drawn into a new community where they can be proud to be racist. They build their own groups and exclude those who would disagree with them. They share not only Holocaust denial but stories of the Protocols of the Elders of Zion and other Jewish conspiracy theories. Facebook proves a great tool for building a virtual community. Perhaps they make the group private, by invitation only, with a hidden membership list, or perhaps not. How much benefit can this community gain from the Facebook platform in organizing, sharing, and spreading their message?
The far Right is rising in Europe. Nick Griffin, a Holocaust denier, was recently elected to the European Parliament. Perhaps the stigma against racists is eroding. Now is the time for them to organize, and Facebook is the most effective way for that to happen. Will Facebook become the tool that brings fascism back? Hitler had charisma, but he did not have Facebook. What could he have done with such a tool in his quest for power and his efforts to ensure the elimination of "subhumans" and dominance of the "master race"? To say people would never be drawn to such a message shows an ignorance of history.
Is counterspeech the answer? Yes. But counterspeech means saying hate, and Holocaust denial, is not welcome. Counterspeech does not mean providing a platform for hate or in any other way facilitating it.
The Limitations on Free Speech
Facebook has always recognized that there are limits to free speech. Hateful content has always been banned by the Terms of Service (at least in theory). As noted, Facebook could, theoretically, allow all legal content. This is no real solution. The right to speech needs to be balanced with the responsibility entailed.
Under international law, respect for the rights and reputations of others and the protection of public order are themselves sufficient grounds to justify limiting the right of expression. The only question is who limits this right: the community, service providers, or the state? Outside the United States, the general consensus is that the state is responsible for providing this protection on behalf of society.
Under international law the basis for freedom of expression is Article 19 of the International Covenant on Civil and Political Rights,56 which provides that: "Everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice."
The article immediately goes on to say that the exercise of these rights "carries with it special duties and responsibilities" and "may therefore be subject to certain restrictions, but these shall only be such as are provided by law and are necessary: (a) For respect of the rights or reputations of others; (b) For the protection of national security or of public order (ordre public), or of public health or morals."
The European Convention on Human Rights57 likewise states: "Everyone has the right to freedom of expression" but goes on to say this right can be limited "...in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or the rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary."
Other countries that do not specifically prohibit Holocaust denial, such as Australia and Canada, still prohibit public hate speech. Australian law prohibits the carrying out of a public act that will "offend, insult, humiliate or intimidate another person or group of people" who are targeted because of "race, colour or national or ethnic origin." The test is taken from the perspective of the victim.58 Canadian law likewise defines hate as a crime under sections 318 (Advocating Genocide) and 319 (Public Incitement of Hatred) of the Criminal Code.59 Section 319 makes "communicating statements, other than in private conversation, wilfully promot[ing] hatred against any identifiable group" an offense.60
Within the United States, where protection from the state is given a higher priority, the hope is that such restriction can be achieved without the need to create new laws, change the interpretation of the First Amendment, or allow for group defamation. According to some legal scholars, a reinterpretation of the First Amendment more consistent with international law is indeed possible;61,62 a prohibition on group defamation, which is illegal under international law and most national laws, could be introduced federally in the United States;63 and the internet, or some parts of it, such as social networks and user-generated content platforms, could be treated differently.
There is a difference between removing the right to express a message, by law, and refusing to facilitate the spread of that message. As Christopher Wolf explained in 2004,64 "We seek voluntary cooperation of the Internet community - ISPs [internet service providers] and others - to join in the campaign against hate speech. That may mean enforcement of Terms of Service to drop offensive conduct; if more ISPs in the U.S. especially block content, it will at least be more difficult for haters to gain access through respectable hosts."
He noted this immediately after stating: "we believe that the best antidote to hate speech is counter-speech - exposing hate speech for its deceitful and false content, setting the record straight, and promoting the values of tolerance and diversity."
The two statements are not contradictory. Holocaust deniers can keep their sites, in countries that allow them, but by saying loudly and clearly "You are not welcome here," service providers and online communities are making a statement. By promoting and protecting the tolerance and diversity of the online community while excluding those promoting hate, a company such as Facebook can, in Chris Kelly's words, take "an action that would firmly address hate."
More recently Wolf has stated that "as a matter of principle, society must take a stand about what is right and wrong. And, in addition, although little empirical study exists, there is no question that there is a link between hate speech online and real world violence."55 This was recently demonstrated by the deadly attack on the U.S. Holocaust Museum by James W. von Brunn, a Holocaust denier, white supremacist, and webmaster of a hate site.65
Even before the attack, Peter Breckheimer, discussing the implications of protecting internet hate speech under the First Amendment, asserted: "to minimize the likelihood of future acts of hate-related violence, the United States must engage the world and actively attempt to find a reasonable solution to Internet hate speech...it is apparent that long-term international solutions are the only way to stem the rising tide of hate."66
Wolf has himself recently noted developments in the United States where courts have been reducing the immunity of internet service providers for postings made by their users. "The day may come, if a trend continues, where the potential for legal liability for tortious speech of others may compel ISPs and web sites to more actively monitor what goes out through their service."67
Racism and hate are social values, and they spread through social networks. When acceptable in polite company, hate grows; when made unacceptable, it shrinks. Hatred cannot be eliminated from human beings, but ground rules can be set for behavior within communities, including online communities.
Hateful content should continue to be banned by Facebook in full acknowledgment that this is widely considered an acceptable compromise of free expression.
This would meet the requirements of national laws outside the United States, as well as international law. Because Facebook is a private company, the First Amendment does not apply to it under U.S. law; there is no legal requirement to allow hate speech, and there is a strong moral argument for preventing it. Those wishing to spread hate under a guise of using their First Amendment rights should be told they may do so, but somewhere else. A failing by giants such as Facebook or Google may be all that is needed to eventually cause a serious change of the law in the United States. If the companies that can stop the hate choose not to do so, the people's final recourse, under the First Amendment, is to petition the government, including through the courts.
Implications for Facebook
Beyond the risk of legal action being taken in countries where Holocaust denial is outlawed, Facebook's change in values, as most clearly demonstrated regarding the Holocaust-denial groups, may ultimately lead to a clash of cultures and a decline in support for the platform. The culture Facebook Corporate now promotes is not the culture Facebook spent years fostering and protecting, the culture that made Facebook such a success.
In a message on Facebook's fifth birthday (February 2009), founder Mark Zuckerberg noted how the "culture of the Internet has also changed pretty dramatically over the past five years" and that "Facebook has offered a safe and trusted environment for people to interact online, which has made millions of people comfortable expressing more about themselves."68
The claim was indeed true, but back in February, Facebook still had a Code of Conduct that said it wanted to "be a safe place on the internet." With the shift in Facebook's approach from one advocating a safe environment, and at least paying lip service to preventing discrimination, to a new position that deems some hate material acceptable, Facebook has lost its moral compass. In time this may lead to the loss of trust not only between Facebook and its users but within the community itself. In an environment where people no longer feel safe, will they still be willing to share so much of their information?
Facebook's effort not to take action on Holocaust denial seems based on a desire to avoid social responsibilities and to be treated as just another part of the web, rather than as a specific and influential online community. Facebook's social capital exists precisely because it is different from the rest of the web and has given users a safe environment in which to express themselves. Without that safe environment, Facebook puts not only its users but the platform itself at risk. In trying to grow, Facebook must ensure that it does not lose sight of where it came from.
Conclusions
Facebook has demonstrated once again that it is media pressure, and not its own Terms of Service or ethical deliberations, that causes action to be taken against online hate. The company has watered down the provisions against various types of hateful content and dropped its promise to provide a "safe place on the internet." Most alarmingly, despite still prohibiting hateful content, Facebook has decided as policy to allow Holocaust denial on the platform. This demonstrates a lack of understanding of anti-Semitism and Holocaust denial in particular, and a lack of engagement with the problem of Anti-Semitism 2.0.
Holocaust denial is a special case under international law. It is recognized internationally as hate speech. There are calls from the United Nations down for all efforts to be taken to eliminate Holocaust denial, which is both a serious defamation of the Jewish people and a tool for promoting new hate against them through conspiracy theories. Conspiracy theories of Jewish power contributed to the Holocaust; by allowing them, Facebook makes "never again" an empty promise.
* * *
Notes
1. Andre Oboler, "Online Anti-Semitism 2.0: 'Social Anti-Semitism on the Social Web,'" Post-Holocaust and Anti-Semitism, 67, 1 April 2008.
2. Stephanie Rubenstein, "Jewish Internet Defense Force 'Seizes Control' of Anti-Israel Facebook Group," Jerusalem Post, 29 July 2008.
3. Matthew Moore, "Facebook: 'Anti-Semitic' Group Hijacked by Jewish Force," 31 July 2008, www.telegraph.co.uk/news/2478773/Facebook-Anti-semitic-group-destroyed-by-Israeli-hackers.html.
4. Benjamin L. Hartman, "GA Special Feature: An Online Battle for Israel's Legitimacy," Haaretz, 11 November 2008.
5. Andre Oboler, "The Rise and Fall of a Facebook Hate Group," First Monday, vol. 13, no. 11 (November 2008).
6. JIDF (August 2008), The JIDF, www.thejidf.org/2008/08/more-samples-of-anti-Semitism-on.html.
7. JIDF (August 2008), The JIDF, www.thejidf.org/2008/08/another-sample-of-material-we-are.html.
8. JIDF (February 2008), The JIDF, www.thejidf.org/2008/02/problematic-youtube-channels-and-videos.html.
9. JIDF (July 2003), The JIDF, www.thejidf.org/2003/07/jidf-guide-to-hostile-facebook-groups.html.
10. Zionism On The Web, www.zionismontheweb.org/antizionism/jewwatch.htm.
11. JIDF (September 2008), The JIDF, www.thejidf.org/2008/09/freedom-of-speech-and-youtube.html.
12. JIDF (October 2008), The JIDF, www.thejidf.org/2008/10/important-regarding-content-on-facebook.html.
13. David Eshmoili, Re: Article, 9 July 2009 (personal communication).
14. JIDF, Re: Article, 9 July 2009 (personal communication).
15. JIDF (October 2008), The JIDF, www.thejidf.org/2008/10/letter-to-facebook-regarding-illegal.html.
16. Brian Cuban (November 2008), The Cuban Revolution, http://briancuban.com/the-facebook-of-holocaust-denial.
17. Chris Matyszczyk (May 2009), CNET News, http://news.cnet.com/8301-17852_3-10233245-71.html.
18. Brian Cuban (May 2009), The Cuban Revolution, www.briancuban.com/facebook-at-odds-with-obama-on-holocaust-denial.
19. The Guardian (May 2009), The Guardian, www.guardian.co.uk/technology/blog/2009/may/11/facebook-holocaust-denial.
20. Lisa Respers France (May 2009), CNN, www.cnn.com/2009/TECH/05/08/facebook.holocaust.denial/index.html.
21. Daniel Emery (June 2009), BBC News, http://news.bbc.co.uk/2/hi/technology/8097979.stm.
22. Fox News (May 2009), Fox News, www.foxnews.com/story/0,2933,519917,00.html.
23. Brian Cuban (May 2009), The Cuban Revolution, www.briancuban.com/facebook-holocaust-denial-should-be-discussed-openly.
24. European Union (July 2005), EUROPA: Summaries of EU Legislation, http://europa.eu/legislation_summaries/justice_freedom_security/combating_discrimination/l33058_en.htm.
25. International Military Tribunal (April 1945), The Avalon Project, Yale Law School, http://avalon.law.yale.edu/imt/imtconst.asp#art6.
26. "Facebook Shuts Ku Klux Klan Page," The Telegraph, May 2009.
27. Christoph Gunkel, "Facebook und Google Earth: Anti-Semitismus im Web 2.0," FAZ, 14 October 2008. [German]
28. Facebook (October 2005), Facebook via Archive.org, http://web.archive.org/web/20051126052914/www.facebook.com/terms.php.
29. Qing Li, "Gender and CMC: A Review on Conflict and Harassment," Australasian Journal of Educational Technology, vol. 21, no. 3 (2005), 382-406.
30. Facebook (February 2006), Facebook via Archive.org, http://web.archive.org/web/20060427185410/www.facebook.com/terms.php.
31. Facebook (October 2006), Facebook via Archive.org, http://web.archive.org/web/20061117113750/www.facebook.com/terms.php.
32. Facebook (May 2007), Facebook via Archive.org, http://web.archive.org/web/20070830163157/www.facebook.com/terms.php.
33. Facebook (May 2007), Facebook via Archive.org, http://web.archive.org/web/20070830163931/www.facebook.com/codeofconduct.php.
34. Brian Stelter (February 2009), The New York Times, www.nytimes.com/2009/02/17/technology/internet/17facebook.html.
35. Robin Wauters (February 2009), TechCrunch, www.techcrunch.com/2009/02/17/facebook-backtracks-under-community-pressure-goes-back-to-old-tos-for-now.
36. Brad Stone and Brian Stelter (February 2009), The New York Times, www.nytimes.com/2009/02/19/technology/internet/19facebook.html?_r=1.
37. Andre Oboler, "Facing Up to the 'Facebook' Dilemma," Jerusalem Post, 6 February 2008.
38. Nick O'Neill (March 2008), All Facebook, www.allfacebook.com/2008/03/my-interview-with-mark-zuckerberg.
39. Mark Zuckerberg (February 2009), The Facebook Blog, http://blog.facebook.com/blog.php?post=54434097130.
40. Michael Arrington (June 2009), TechCrunch, www.techcrunch.com/2009/06/15/facebook-employees-speak-their-mind-on-holocaust-denial.
41. Chloe Albanesius (June 2009), AppScout, www.appscout.com/2009/06/facebook_free_speech_is_really.php.
42. Chris Kelly (May 2009), The JIDF, www.thejidf.org/2009/05/chris-kelly-chief-privacy-officer.html.
43. UK Parliament (1986), Public Order Act 1986, www.statutelaw.gov.uk/content.aspx?activeTextDocId=2236942.
44. U.S. Department of State, "Contemporary Global Anti-Semitism: A Report Provided to the United States Congress," Washington, DC, 2008.
45. Michael Whine, "Devising Unified Criteria and Methods of Monitoring Anti-Semitism," Jewish Political Studies Review, vol. 21, no. 1-2 (Spring 2009), www.jcpa.org/JCPA/Templates/ShowPage.asp?DRIT=3&DBID=1&LNGID=1&TMID=111&FID=625&PID=0&IID=2966&TTL=Devising_Unified_Criteria_and_Methods_of_Monitoring_Anti-Semitism.
46. UK Parliament, Anti-Semitism on Campus (605), 3 February 2009.
47. UN General Assembly (March 2007), The Holocaust and the United Nations Outreach Programme, http://157.150.195.10/holocaustremembrance/docs/res61.shtml.
48. Jeremy Jones, "Holocaust Denial: 'Clear and Present' Racial Vilification," Australian Journal of Human Rights, vol. 1, no. 1 (1994), 169-180.
49. John Foster, "Fabricating History," in Anti-Semitism and Human Rights (Melbourne: Australian Institute of Jewish Affairs, 1985).
50. Frank Knopfelmacher, The Age, March 1979, cited in Jeremy Jones, "Holocaust Denial: 'Clear and Present' Racial Vilification," Australian Journal of Human Rights, vol. 1, no. 1 (1994), 169-180.
51. Deborah Butler, "Holocaust Denial in England," Web Journal of Current Legal Issues, 4, 1997, http://webjcli.ncl.ac.uk/1997/issue4/butler4.html.
52. United Against Holocaust Denial on Facebook (May 2009), Facebook, www.facebook.com/group.php?gid=38697069983.
53. BBC News (July 2004), BBC News, http://news.bbc.co.uk/2/hi/uk_news/magazine/3896213.stm.
54. "1,000,000 United against the BNP" (February 2008), Facebook, www.facebook.com/group.php?gid=8644741474.
55. Christopher Wolf, "Combating Anti-Semitism in Cyberspace," International Conference of the Global Forum for Combating Anti-Semitism, February 2008, www.adl.org/main_internet/wolf_proskauer.htm.
56. UN General Assembly (March 1976), Office of the High Commissioner for Human Rights, www.unhchr.ch/html/menu3/b/a_ccpr.htm.
57. Council of Europe, "The European Convention on Human Rights," Rome, 1950, www.hri.org/docs/ECHR50.html.
58. Australian Human Rights Commission (August 2007), Racial Vilification Law in Australia, www.hreoc.gov.au/racial_discrimination/cyberracism/vilification.html.
59. Media Awareness Network, www.media-awareness.ca/english/resources/legislation/canadian_law/federal/criminal_code/criminal_code_hate.cfm.
60. Criminal Code R.S.C. 1985, c. C-46, s. 319, Department of Justice, Canada, http://laws.justice.gc.ca/en/ShowFullDoc/cs/C-46//20090714/en.
61. Catharine MacKinnon, Only Words (Cambridge: Harvard University Press, 1993).
62. Mari J. Matsuda, Charles R. Lawrence III, Richard Delgado, and Kimberle Williams Crenshaw, Words That Wound: Critical Race Theory, Assaultive Speech, and the First Amendment (Boulder, CO: Westview Press, 1993).
63. Thomas David Jones, Human Rights: Group Defamation, Freedom of Expression and the Law of Nations (The Hague: Nijhoff, 1997).
64. Christopher Wolf (September 2006), ADL, www.adl.org/main_internet/internet_hate_law.htm.
65. Lindy Royce, "Guard Killed during Shooting at Holocaust Museum," CNN, 10 June 2009, http://edition.cnn.com/2009/CRIME/06/10/museum.shooting.
66. Peter J. Breckheimer II, "A Haven for Hate: The Foreign and Domestic Implications of Protecting Internet Hate Speech under the First Amendment," Southern California Law Review, vol. 75, no. 2 (September 2002), 1493-1528.
67. Christopher Wolf (April 2008), The Dangers Inherent in Web 2.0, www.adl.org/main_internet/Dangers_Web20.htm.
68. Mark Zuckerberg (February 2009), The Facebook Blog, http://blog.facebook.com/blog.php?post=51892367130.
69. Facebook (May 2009), Statement of Rights and Responsibilities, www.facebook.com/terms.php.
70. Andre Oboler "Odio y anti-Semitismo on line," 4 September 2008, www.prensajudia.com/shop/detallenot.asp?notid=10156 [Spanish], English at www.zionismontheweb.org/zionism_commentary/GoogleEarth_Facebook_english.htm.
71. Australian Government Attorney-General's Department (May 2008), Australian Government Attorney-General's Department, www.ag.gov.au/www/agd/agd.nsf/Page/Copyright_IssuesandReviews_Moralrights.
72. Mira Sundara Rajan, "Moral Rights in the Digital Age: New Possibilities for the Democratisation of Culture," in 16th British and Irish Law, Education and Technology Association Annual Conference, Edinburgh, 2001.
73. The Official Anne Frank House Website, www.annefrank.org/content.asp?PID=888&LID=2.
74. Christoph Gunkel, "Facebook und Google Earth: Anti-Semitismus im Web 2.0," Frankfurter Allgemeine Zeitung, October 2008. [German]
75. "German Pol Fined for Playing Klezmer Music," JTA, July 2009, http://jta.org/news/article/2009/07/02/1006285/german-pol-fined-for-playing-klezmer-music.
76. Peter J. Breckheimer II, "A Haven for Hate: The Foreign and Domestic Implications of Protecting Internet Hate Speech under the First Amendment."
* * *
Dr. Andre Oboler is a social media expert. He holds a PhD in computer science from Lancaster University, UK, and has been a postdoctoral fellow in political science at Bar-Ilan University in Israel and a Legacy Heritage Fellow at NGO Monitor in Jerusalem. He edits ZionismOnTheWeb.org, a website countering online hate.