Deplatforming


Deplatforming, also known as no-platforming, has been defined as an "attempt to boycott a group or individual through removing the platforms (such as speaking venues or websites) used to share information or ideas,"[1] or "the action or practice of preventing someone holding views regarded as unacceptable or offensive from contributing to a forum or debate, especially by blocking them on a particular website."[2]

History

Deplatforming of invited speakers

In the United States, the banning of speakers on university campuses dates back to the 1940s and was carried out under the universities' own policies. The University of California had a policy known as the Speaker Ban, codified in university regulations under President Robert Gordon Sproul, that mostly, but not exclusively, targeted communists. One rule stated that "the University assumed the right to prevent exploitation of its prestige by unqualified persons or by those who would use it as a platform for propaganda." This rule was invoked in 1951 to block Max Shachtman, a socialist, from speaking at the University of California at Berkeley. In 1947, former U.S. Vice President Henry A. Wallace was banned from speaking at UCLA because of his views on U.S. Cold War policy,[3] and in 1961, Malcolm X was prohibited from speaking at Berkeley in his capacity as a religious leader.

Controversial speakers invited to appear on college campuses have faced deplatforming attempts to disinvite them or to otherwise prevent them from speaking.[4] The British National Union of Students established its No Platform policy as early as 1973.[5] In the mid-1980s, visits by South African ambassador Glenn Babb to Canadian college campuses faced opposition from students opposed to apartheid.[6]

In the United States, recent examples include the March 2017 disruption by protestors of a public speech at Middlebury College by political scientist Charles Murray.[4] In February 2018, students at the University of Central Oklahoma rescinded a speaking invitation to creationist Ken Ham, after pressure from an LGBT student group.[7][8] In March 2018, a "small group of protesters" at Lewis & Clark Law School attempted to stop a speech by visiting lecturer Christina Hoff Sommers.[4] In the 2019 film No Safe Spaces, Adam Carolla and Dennis Prager documented their own disinvitation along with others.[9]

As of February 2020, the Foundation for Individual Rights in Education, a speech advocacy group, documented 469 disinvitation or disruption attempts at American campuses since 2000,[10] including both "unsuccessful disinvitation attempts" and "successful disinvitations"; the group defines the latter category as including three subcategories: formal disinvitation by the sponsor of the speaking engagement; the speaker's withdrawal "in the face of disinvitation demands"; and "heckler's vetoes" (situations when "students or faculty persistently disrupt or entirely prevent the speakers' ability to speak").[11]

Deplatforming in social media

Beginning in 2015, Reddit banned several communities on the site ("subreddits") for violating the site's anti-harassment policy.[12] A 2017 study published in the journal Proceedings of the ACM on Human-Computer Interaction, examining "the causal effects of the ban on both participating users and affected communities," found that "the ban served a number of useful purposes for Reddit" and that "Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage. Communities that inherited the displaced activity of these users did not suffer from an increase in hate speech."[12] In June 2020 and January 2021, Reddit also banned two prominent pro-Trump communities for violating the website's content and harassment policies.

On May 2, 2019, Facebook and the Facebook-owned platform Instagram announced a ban of "dangerous individuals and organizations" including Nation of Islam leader Louis Farrakhan, Milo Yiannopoulos, Alex Jones and his organization InfoWars, Paul Joseph Watson, Laura Loomer, and Paul Nehlen.[13][14] In the wake of the 2021 storming of the US Capitol, Twitter banned then-president Donald Trump, as well as 70,000 other accounts linked to the event and the far-right movement QAnon.

Donald Trump

On January 6, 2021, a joint session of the United States Congress counting the votes of the Electoral College was interrupted when rioters breached the United States Capitol. The rioters were supporters of President Donald Trump who hoped to delay and overturn his loss in the 2020 election. The event resulted in five deaths and at least 400 people being charged with crimes.[15] Certification of the electoral votes was completed only in the early morning hours of January 7, 2021. Following several tweets by Trump on January 7, 2021, Facebook, Instagram, YouTube, Reddit, and Twitter all deplatformed him to some extent.[16][17][18][19] Twitter deactivated his personal account, which the company said could be used to promote further violence. Trump then posted similar messages from the official U.S. government account @POTUS, and on January 8 Twitter banned him permanently.[20]

Trump planned to return to social media via a new platform of his own by May or June 2021, according to adviser Jason Miller on a Fox News broadcast.[21][22]

Other examples

Demonetization

Social media platforms such as YouTube and Instagram allow their content producers or influencers to earn money from the content they post (videos, images, etc.), typically paid per set number of "likes", views, or clicks. When content is deemed unsuitable for compensation but is left on the platform, it is said to be "demonetized": the producer receives no compensation for the content they created, while the content itself remains publicly available for viewing or listening.[23] In September 2016, Vox reported that demonetization, as it pertained to YouTube specifically, involved the following key points:

  • "Since 2012, YouTube has been automatically 'demonetizing' some videos because its software thought the content was unfriendly for advertisers."[23]
  • "Many YouTube video makers didn’t realize this until last week, when YouTube began actively telling them about it."[23]
  • "This has freaked YouTubers out, even though YouTube has been behaving rationally by trying to connect advertisers to advertiser-friendly content. It’s not censorship, since YouTube video makers can still post (just about) anything they want."[23]
  • "YouTube’s software will screw things up, which means videos that should have ads don’t, which means YouTube video makers have been missing out on ad revenue."[23]

Harassment and threats to employment

Deplatforming tactics have also included attempts to silence controversial speakers through various forms of personal harassment, such as doxing,[24] the making of false emergency reports for purposes of swatting,[25] and complaints or petitions to third parties. In some cases, protesters have attempted to have speakers blacklisted from projects or fired from their jobs.[26]

In 2019, for example, students at the University of the Arts in Philadelphia circulated an online petition demanding that Camille Paglia "should be removed from UArts faculty and replaced by a queer person of color."[27] According to The Atlantic's Conor Friedersdorf, "It is rare for student activists to argue that a tenured faculty member at their own institution should be denied a platform."[27] Paglia, a tenured professor for over 30 years who identifies as transgender, had long been unapologetically outspoken on controversial "matters of sex, gender identity, and sexual assault".[27]

In print media

In December 2017, after learning that a French artist it had previously reviewed was a neo-Nazi, the San Francisco punk magazine Maximum Rocknroll apologized and announced that it had "a strict no-platform policy towards any bands and artists with a Nazi ideology".[28]

Legislative responses

United Kingdom

In May 2021, the UK government under Boris Johnson announced a Higher Education (Freedom of Speech) Bill that would allow speakers at universities to seek compensation for no-platforming, impose fines on universities and student unions that promote the practice, and establish a new ombudsman charged with monitoring cases of no-platforming and academic dismissals.[29] In addition, the government published an Online Safety Bill that would prohibit social media networks from discriminating against particular political views or removing "democratically important" content, such as comments opposing or supporting political parties and policies.[30]

United States

Some critics of deplatforming have proposed that governments treat social media as a public utility in order to protect users' constitutional rights, arguing that a presence on social media websites has become imperative for full participation in 21st-century life.[31] Republican politicians have sought to weaken the protections established by Section 230 of the Communications Decency Act, which provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users, alleging that the moderation policies of major social networks are not politically neutral.[32][33][34][35]

Reactions

Justifications

According to its defenders, deplatforming has been used as a tactic to prevent the spread of hate speech and disinformation.[12] As social media has evolved into a significant source of news for its users, content moderation and the banning of inflammatory posters have been defended as an editorial responsibility comparable to that exercised by news outlets.[36]

Supporters of deplatforming have justified the action on the grounds that it produces the desired effect of reducing what they characterize as "hate speech".[12][37][38] Angelo Carusone, president of the progressive organization Media Matters for America, who had run deplatforming campaigns against conservative talk hosts Rush Limbaugh in 2012 and Glenn Beck in 2010, pointed to Twitter's 2016 ban of Milo Yiannopoulos, stating that "the result was that he lost a lot.... He lost his ability to be influential or at least to project a veneer of influence."[37]

In the United States, deprivation of rights under the First Amendment is sometimes cited as a criticism of deplatforming, but according to Audie Cornish, host of the NPR show Consider This, modern deplatforming is not a government issue. She states that "the government can't silence your ability to say almost anything you want on a public street corner. But a private company can silence your ability to say whatever you want on a platform they created."[39] Because of this, proponents say, deplatforming is a legal way of dealing with controversial users online or in other digital spaces, so long as the government is not involved with causing the deplatforming.

Critical responses

According to technology journalist Declan McCullagh, "Silicon Valley's efforts to pull the plug on dissenting opinions" began around 2018 with Twitter, Facebook, and YouTube denying service to selected users of their platforms, "devising excuses to suspend ideologically disfavored accounts."[40] In 2019, McCullagh predicted that paying customers would become targets for deplatforming as well, citing protests and open letters by employees of Amazon, Microsoft, Salesforce, and Google who opposed policies of U.S. Immigration and Customs Enforcement (ICE), and who reportedly sought to influence their employers to deplatform the agency and its contractors.[40]

Law professor Glenn Reynolds dubbed 2018 the "Year of Deplatforming" in an August 2018 article in The Wall Street Journal. Reynolds criticized the decision of "internet giants" to "slam the gates on a number of people and ideas they don't like", naming Alex Jones and Gavin McInnes.[41] Reynolds cited further restrictions on "even mainstream conservative figures" such as Dennis Prager, as well as Facebook's blocking of a campaign advertisement by a Republican candidate "ostensibly because her video mentioned the Cambodian genocide, which her family survived."[41]

In a 2019 article, Conor Friedersdorf described what he called "standard practice" among student activists: "Activists begin with social-media callouts; they urge authority figures to impose outcomes that they favor, without regard for overall student opinion; they try to marshal antidiscrimination law to limit freedom of expression."[27] Friedersdorf pointed to evidence of a chilling effect on free speech and academic freedom. Of the faculty members he had contacted for interviews, a large majority "on both sides of the controversy insisted that their comments be kept off the record or anonymous. They feared openly participating in a debate about a major event at their institution—even after their university president put out an uncompromising statement in support of free speech".[27]

References

  1. ^ The Good, The Bad, & The Semantically Imprecise: The words that defined the week of August 10, 2018, Merriam-Webster (August 8, 2018).
  2. ^ Deplatforming, Lexico.com (Dictionary.com/Oxford University Press).
  3. ^ Freeman, Jo (2000). "A Short History of the University of California Speaker Ban". JoFreeman.com. Archived from the original on December 8, 2019.
  4. ^ a b c Young, Cathy (April 8, 2018). "Half of college students aren't sure protecting free speech is important. That's bad news". Los Angeles Times. Archived from the original on February 8, 2019.
  5. ^ German, Lindsey (April 1986). "No Platform: Free Speech for all?". Socialist Worker Review (86).
  6. ^ Bueckert, Michael (April 2018). "No platform for Apartheid". africasacountry.com. Retrieved November 17, 2020.
  7. ^ Hinton, Carla (February 8, 2018). "UCO Student Group Rescinds Invitation to Christian Speaker Ken Ham". The Oklahoman. Archived from the original on May 28, 2018.
  8. ^ Causey, Adam Kealoha (February 8, 2018). "Creationist's speech canceled at university in Oklahoma". Houston Chronicle. Associated Press. Archived from the original on February 9, 2018.
  9. ^ Fund, John (November 3, 2019). "In No Safe Spaces, an Odd Couple Teams Up to Fight Free-Speech Bans". National Review. Archived from the original on December 18, 2019.
  10. ^ "Disinvitation Database". Foundation for Individual Rights in Education. Retrieved February 16, 2021.
  11. ^ "User's Guide to FIRE's Disinvitation Database". Foundation for Individual Rights in Education. June 9, 2016. Archived from the original on March 9, 2017. Retrieved February 16, 2021.
  12. ^ a b c d Chandrasekharan, Eshwar; Pavalanathan, Umashanti; et al. (November 2017). "You Can't Stay Here: The Efficacy of Reddit's 2015 Ban Examined Through Hate Speech" (PDF). Proceedings of the ACM on Human-Computer Interaction. 1 (CSCW): Article 31. doi:10.1145/3134666. S2CID 22713682.
  13. ^ Wells, Georgia (May 2, 2019). "Facebook Bans Louis Farrakhan, Alex Jones and Others as 'Dangerous'". The Wall Street Journal. Archived from the original on May 3, 2019.
  14. ^ Lorenz, Taylor (May 2, 2019). "Instagram and Facebook Ban Far-Right Extremists". The Atlantic. Archived from the original on May 3, 2019.
  15. ^ "The Capitol Siege: The Arrested And Their Stories". NPR.org. Retrieved April 18, 2021.
  16. ^ Healy, John (January 8, 2021). "Opinion: It took a mob riot for Twitter to finally ban Trump".
  17. ^ Crichton, Danny (January 9, 2021). "The deplatforming of President Trump".
  18. ^ Newton, Casey (January 6, 2021). "It's time to deplatform Trump".
  19. ^ Diaz, Jaclyn (January 13, 2021). "YouTube Joins Twitter, Facebook In Taking Down Trump's Account After Capitol Siege".
  20. ^ "The expulsion of Donald Trump marks a watershed for Facebook and Twitter". The Economist. January 10, 2021. ISSN 0013-0613. Retrieved January 10, 2021.
  21. ^ Pengelly, Martin (March 21, 2021). "Trump will use 'his own platform' to return to social media after Twitter ban".
  22. ^ Breuninger, Kevin (June 2, 2021). "Trump blog page shuts down for good". CNBC.
  23. ^ a b c d e Kafka, Peter (September 4, 2016). "YouTube 'demonetization,' explained for normals". www.vox.com. Vox. Retrieved September 14, 2022.
  24. ^ Wilson, Jason (December 18, 2018). "How the world has fought back against the violent far-right and started winning". The Guardian. Archived from the original on April 2, 2019.
  25. ^ Shirek, Jon (June 8, 2012). "9-1-1 hoax snares conservative blogger". 11Alive.com. Atlanta: WXIA-TV. Archived from the original on January 16, 2013.
  26. ^ Mishra, Manas; Balan, Akshay. "Google to pull plug on AI ethics council". Reuters. Archived from the original on April 5, 2019.
  27. ^ a b c d e Friedersdorf, Conor (May 2019). "Camille Paglia Can't Say That". The Atlantic. Archived from the original on May 1, 2019.
  28. ^ "Letters". Maximum Rocknroll (editorial statement). No. 415. December 2017. p. 8.
  29. ^ "Universities could face fines over free speech breaches". BBC News. May 12, 2021. Retrieved May 13, 2021.
  30. ^ Hern, Alex (May 12, 2021). "Online safety bill 'a recipe for censorship', say campaigners". The Guardian. Retrieved May 13, 2021.
  31. ^ Thierer, Adam (March 2012). "The Perils of Classifying Social Media Platforms as Public Utilities" (PDF). Mercatus Center (working paper). George Mason University. Retrieved September 15, 2021.
  32. ^ Robertson, Adi (June 21, 2019). "Why the internet's most important law exists and how people are still getting it wrong". The Verge. Archived from the original on February 26, 2021. Retrieved July 17, 2019.
  33. ^ Lecher, Colin (June 20, 2019). "Both parties are mad about a proposal for federal anti-bias certification". The Verge. Archived from the original on February 24, 2021. Retrieved July 17, 2019.
  34. ^ Brandom, Russell (June 17, 2020). "Senate Republicans want to make it easier to sue tech companies for bias". The Verge. Archived from the original on February 20, 2021. Retrieved June 17, 2020.
  35. ^ Kelly, Makena (September 8, 2020). "Republicans pressure platforms with new 230 bill". The Verge. Archived from the original on February 28, 2021. Retrieved September 8, 2020.
  36. ^ Yaraghi, Niam (January 10, 2021). "How should social media platforms combat misinformation and hate speech?". Brookings. Archived from the original on April 9, 2019.
  37. ^ a b Koebler, Jason (August 10, 2018). "Deplatforming Works". Motherboard. Vice Media. Archived from the original on March 19, 2019.
  38. ^ Wong, Julia Carrie (September 4, 2018). "Don't give Facebook and YouTube credit for shrinking Alex Jones' audience". The Guardian. London.
  39. ^ "Deplatforming: Not A First Amendment Issue, But Still A Tough Call For Big Tech : Consider This from NPR". NPR.org. Retrieved April 18, 2021.
  40. ^ a b McCullagh, Declan (February 2019). "Deplatforming Is a Dangerous Game". Reason. Archived from the original on March 31, 2019.
  41. ^ a b Reynolds, Glenn Harlan (August 18, 2018). "When Digital Platforms Become Censors". The Wall Street Journal. Archived from the original on March 30, 2019.

External links

  • The dictionary definition of deplatform at Wiktionary