On 5 June 2019, YouTube announced plans to implement a new policy for removing “harmful and supremacist content” (YouTube 2019). The platform periodically engages in such purges, having demonetized videos and removed controversial channels in past years in response to the charge that it had become a “right-wing safe space” (Dunphy 2017; Hern 2018; Holt 2017). This blog post offers a brief analysis of the latest purge by following links from Nazi threads on the far-right message board 4chan/pol/ to YouTube, forming an extremist, “fringe perspective” on the video platform. Did YouTube indeed successfully delete extremist content? What videos were deleted? How were they deleted? Do these deleted videos appear elsewhere on the Web?
While often characterised as an “underground” or a “sewer”, the imageboard 4chan is no lone island in the larger Web ecosystem. Strikingly, 70% of its links go to YouTube — a platform that 94% of 18 to 24 year olds in the US visit regularly (Smith and Anderson 2018). From niche videos on subcultural ephemera to so-called “bloodsport” videos featuring long-form discussions of extremist topics, 4chan seems deeply bound to Google’s video platform, in turn creating usually obscure, and sometimes problematic, pockets of cross-platform activity. Can we follow these linked-to corners of YouTube to identify what topics are discussed, and how YouTube responds to controversial content? Can we explore “4chan’s YouTube”?
Firstly, how does 4chan work and how can it be used as a “lens” onto YouTube? We focus solely on /pol/, 4chan’s political discussion board. While many of its frequenters would not associate themselves with the term, /pol/ has been seen as the source of creative energy for the “alt-right” (Nagle 2017). While notoriously hard to pin down ideologically, the alt-right is said to subscribe to biological determinist arguments concerning race and gender. This, in addition to the provocative “anything goes” tone of /pol/, created an opening for neo-Nazi white supremacists on the imageboard. Because of its well-known affordance of anonymity, reporting on 4chan has historically tended to treat its users as an undifferentiated mass of posters operating under a single cloak (Coleman 2014; Phillips et al. 2017). In addition to anonymity, 4chan is characterised by ephemerality: every post is eventually deleted from the website. To work against this mechanic and maintain ongoing discussion, some users constantly repost specific recognisable discussion topics, called “generals” (OILab 2018). These self-organised issue publics form convenient handles with which to unpack 4chan’s anonymous mass into its constituent conversational parts and, as below, serve as “windows” onto other platforms like YouTube.
On a content level, both inside and outside these general threads, 4chan/pol/ users are mostly preoccupied with conspiracy theories (Tuters et al. 2018) and antisemitic memes (Tuters and Hagen 2018). The most charitable explanation for why this is comes from Michael Malice (2019), who makes the point that if you overdose on the “red pill”, everything appears as a conspiracy of the media and the establishment. Indeed, /pol/ is well known for producing various conspiracy theories, including Pizzagate and QAnon. As we will see, it should thus come as no surprise that the most frequently linked-to YouTube videos from /pol/ tend to have an antisemitic valence, exhibiting an “irrational hatred of Jews” (Lipstadt 2019).
Methods: Fringe perspectivism
To map “extreme” parts of YouTube, we used the most extremist general thread we could find: the self-explanatory “National Socialism general” or /nsg/. We then followed the URLs to YouTube posted in these threads. Even though /nsg/ is no longer active (see figure 2), its collection of links represents an obvious reference point from which to measure antisemitic and otherwise extremist activity on YouTube. This can in turn shed light on whether, and if so how, this antisemitic sentiment survives online in the aftermath of YouTube’s latest purge. We call such a cross-platform method “fringe perspectivism”, denoting how fringe discussion fora can be used as measures of ideological discussion on mainstream platforms. We subsequently compared the deleted and the still-online videos posted in /nsg/ threads. It should be noted that the most linked-to videos are influenced by copy-pasted sets of URLs in opening posts, as well as by occasional spammers, so this only provides a view on raw activity, which does not necessarily overlap with the videos that carry the most cultural currency. Afterwards, we analysed the way in which the videos were deleted by comparing the before and after situations and grouping the error messages returned for the deleted videos. Finally, we traced where else the deleted content could still be found on the Web.
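To make the link-following step concrete, below is a minimal sketch of how one might extract YouTube video IDs from an archived set of /nsg/ posts and probe which of them are still online. The input file `nsg_posts.json` and its `comment` field are hypothetical stand-ins for a 4chan archive export, and the availability check uses YouTube’s public oEmbed endpoint rather than whatever tooling was actually used for this analysis.

```python
"""Sketch: extract YouTube links from archived /nsg/ posts and check availability."""
import json
import re
from collections import Counter

import requests

# Matches both youtube.com/watch?v=... and youtu.be/... style links.
YT_ID = re.compile(r"(?:youtube\.com/watch\?(?:[^\s]*&)?v=|youtu\.be/)([\w-]{11})")

def extract_video_ids(posts):
    """Return a Counter of YouTube video IDs found in post bodies."""
    ids = Counter()
    for post in posts:
        for vid in YT_ID.findall(post.get("comment", "") or ""):
            ids[vid] += 1
    return ids

def is_available(video_id):
    """Rough availability probe via YouTube's public oEmbed endpoint,
    which returns HTTP 200 for watchable videos and an error code otherwise."""
    resp = requests.get(
        "https://www.youtube.com/oembed",
        params={"url": f"https://www.youtube.com/watch?v={video_id}", "format": "json"},
        timeout=10,
    )
    return resp.status_code == 200

if __name__ == "__main__":
    with open("nsg_posts.json") as f:  # hypothetical archive export
        posts = json.load(f)
    counts = extract_video_ids(posts)
    for vid, n in counts.most_common(20):
        status = "online" if is_available(vid) else "gone"
        print(f"{vid}\t{n} links\t{status}")
```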
YouTube’s Great Purge of 2019
On 5 June 2019, YouTube conducted a “purge” of problematic content. The action was allegedly prompted by a feud between a Vox journalist, Carlos Maza, and a conservative YouTube pundit with a large viewership, Steven Crowder (Roose and Conger 2019). YouTube “demonetized” Crowder’s videos along with a number of popular “white nationalist” channels, including Red Ice TV. Crowder has ten times the viewership of Red Ice, but both channels have been identified as part of the so-called “alternative influence network” on YouTube, which media scholar Rebecca Lewis characterises as sharing a “reactionary” position (2018). Crowder and others dubbed this purge the “#VoxAdpocalypse” or “Adpocalypse 2.0”. While YouTube constantly polices its platform for certain types of offending content, it periodically engages in these kinds of large-scale purges, seemingly when pressured to do so by journalists. Previous instances include the deplatforming of Alex Jones (Hern 2018) as well as an earlier “Adpocalypse” (Dunphy 2017). Apart from demonetizing specific videos and entire channels (i.e. blocking ad revenue), the measures against problematic content include simply deleting videos and channels. Our intention here is not to wade into the debate concerning the demonetization of self-described “conservative” pundits like Crowder. Rather, we focus specifically on YouTube’s removal of content, a form of censorship Crowder himself appears to condone (Crowder 2019).
What was deleted from 4chan’s YouTube?
If, following our fringe perspectivist method, we treat 4chan/pol/ as a space of extreme speech, then we can arguably use it as a lens onto the extreme regions of other platforms. The question then follows: how is YouTube employed in general threads on 4chan? From the perspective of /nsg/, many of 4chan’s users, so-called anons, appeared to treat YouTube as a repository for propaganda material. Analysing the most linked-to videos, the YouTube content appears primarily as a means through which to articulate and internalise a sense of their history, political concerns, and ideology. Somewhat consistent with /pol/-anons’ use of bigoted memes, these videos thereby function as tools to construct and uphold an in/out-group distinction.
Figures 3 and 4 list the thumbnails of the thousand most-posted YouTube videos in /nsg/ threads on 5 and 6 June respectively. We can see a particular genre of videos shared within these Nazi threads: pre-digital documents that have been circulating amongst this community for some time, carrying an aura of samizdat. This material seems to have a cult-like quality in these communities; in material that remains available on YouTube, for example, uploaders annotate the very same found footage in an effort to bring out supposedly hidden messages contained in it. The videos deleted in the purge comprised:
- Archival material (also often seemingly duplicated from VHS);
- Speeches by notorious and esoteric right-wing figures;
- Assorted fascist songs playing over archival footage;
- Older samizdat-style “documentaries” concerned with Nazi apologia and Holocaust denial, with titles like “The National Socialist Revolution” and “Adolf Hitler’s Warning”.
In addition to selectively deleting such types of videos, YouTube also removed entire channels whose videos were frequently shared on /nsg/, channels with names like “Justice Germans” or “Impartial Truth”. The purge seems to have been quite thorough in removing such videos, so thorough that even some legitimate World War II archival material was purged, for instance footage posted by a Dutch historical archive (a decision that was later reversed; NOS 2019).
While we observed relatively little direct overlap between /nsg/’s YouTube links and the most popular ones on /pol/ overall, both sets are engaged with the antisemitic “Jewish Question”. Interestingly, many of the most-linked videos from all of /pol/ that remain following the purge fit this description, including content from legitimate producers like Al-Jazeera on the topic of Israeli lobbying in the United States. Of course, criticism of Israel is not equivalent to antisemitism; the argument that it is, is fallacious and immoral. That said, the videos in our sample that set out to offer such criticism are often nonetheless used in an antisemitic way, as becomes clear when one reads the comments section. To that end, YouTube seems not, for example, to have deemed the channel Anti-Zionist League as violating its recent stricter policy concerning “harmful and supremacist content”.
While the most linked-to video on /pol/ overall has now been deleted, on inspection it remains available on YouTube under other titles, as well as on archive.org under the title “Jewish Internet Activities”. This video is an amateurish, largely antisemitic “documentary” arguing that “hasbara”, the Israeli online public diplomacy program, in fact constitutes a comprehensive and global program of internet censorship. Other popular videos, which remained available after the purge, develop similarly conspiratorial narratives. For instance, they propose Israeli involvement in 9/11 (Corbettreport 2016) and the surreptitious infiltration of Jewish influence via left-wing academia, the so-called “Cultural Marxism” narrative, which is framed as responsible for the decline of Western culture (Bellicose Nation 2013). Further down the deleted video list is an actual promotional video produced by a department of the Israeli university IDC Herzliya, describing an outreach program whose stated objectives seem similar to those laid out in the now-deleted hasbara conspiracy video (Public Diplomacy Program 2016).
Remarkably, none of the top linked-to videos are by members of Lewis’ “alternative influence network”, not even the much-discussed “bloodsports” debate between alt-right pundits Richard Spencer and Carl Benjamin AKA Sargon of Akkad (Warski 2019), which had been the #1 trending video on YouTube at the time it aired. It is only further down the long tail of the videos that remain that we come across the expected “memeish” expressions of vernacular creativity (such as “Vaporwave” videos) and other alt-right preoccupations (like “race realism”). Akin to the preoccupations of /nsg/, the top linked-to videos from /pol/ overall that were removed in the purge include extreme-right archival material by the Holocaust denier David Irving, the founder of the American Nazi Party George Lincoln Rockwell, the notorious American white supremacist William L. Pierce, and former Ku Klux Klan grand wizard David Duke. Besides these “historical” documents, another YouTube link frequently used on /pol/ is the notorious “Remove Kebab” song that the Christchurch shooter played during his terror attack — which still appears to be easily found elsewhere on YouTube.
How are 4chan’s YouTube videos deleted?
What do the deleted videos tell us about YouTube’s “methods” of content deletion? Since YouTube’s official blog post announcing the purge does not outline how it determines what content violates its rules, one can only speculate on its exact detection and deletion methods. In terms of quantifiable metrics, there was no statistically significant difference between the videos that were removed and those that were spared in the 5-6 June purge: both groups have similar average view counts, “time alive”, and comment counts. This suggests that the deletion of videos was not (solely) determined by their popularity or engagement metrics.
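The post does not specify which statistical test underlies the “no significant difference” claim; purely as an illustration, the sketch below compares removed and surviving videos on a few engagement metrics with a nonparametric Mann-Whitney U test (via SciPy), assuming hypothetical per-video records collected before the purge.

```python
# Illustrative comparison of engagement metrics between removed and surviving
# videos; the field names ("views", "days_alive", "comments") are hypothetical.
from scipy.stats import mannwhitneyu

def compare_groups(removed, surviving, metrics=("views", "days_alive", "comments")):
    """Run a two-sided Mann-Whitney U test per metric and print the results."""
    for metric in metrics:
        a = [video[metric] for video in removed]
        b = [video[metric] for video in surviving]
        stat, p = mannwhitneyu(a, b, alternative="two-sided")
        print(f"{metric}: U={stat:.0f}, p={p:.3f}")
```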
The error messages returned when requesting the deleted videos provide some insight into the “how” of the deletion. Of the 3,239 deleted YouTube videos we found in the /nsg/ threads, 56% disappeared because the entire channel was removed, 25% simply listed “this video is unavailable”, 13% returned the message “This video has been removed for violating YouTube’s Terms of Service”, and only 0.6% were gone because the uploaders themselves deleted the videos. YouTube’s detection of “harmful content” thus mostly seems to occur at the channel level rather than on the basis of individual videos.
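As a rough sketch of how the error messages can be grouped, the snippet below fetches each watch page and matches it against a few removal notices. The exact strings YouTube displays change over time and differ between its HTML and API responses, so the patterns here are illustrative rather than a stable interface.

```python
"""Sketch: classify deleted videos by the removal notice shown on their watch page."""
from collections import Counter

import requests

# Illustrative removal notices; YouTube's actual wording varies over time.
REASONS = {
    "channel_removed": "account associated with this video has been terminated",
    "terms_of_service": "removed for violating youtube's terms of service",
    "removed_by_uploader": "removed by the uploader",
    "unavailable": "this video is unavailable",
}

def removal_reason(video_id):
    """Return the label of the first removal notice found on the watch page."""
    html = requests.get(
        f"https://www.youtube.com/watch?v={video_id}", timeout=10
    ).text.lower()
    for label, pattern in REASONS.items():
        if pattern in html:
            return label
    return "online_or_unknown"

def tally_reasons(video_ids):
    """Count removal reasons across a list of video IDs."""
    return Counter(removal_reason(v) for v in video_ids)
```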
What is furthermore interesting is that the purge occurred in one sweep: a week after the purge (14 June 2019), only four out of the thousand most linked-to videos in /nsg/ threads had since been deleted. Considering the purge also affected historical footage posted by a Dutch foundation (NOS 2019) and channels reporting on hate speech (Taibbi 2019), the level of human judgement in the process appears to have been relatively minimal, or otherwise erroneous. One might thus surmise that the purged videos were identified by a combination of automated text analysis and video recognition rather than manual, human tagging.
Where are they now? Deleted YouTube videos after the purge
Since 4chan/pol/ links so frequently to YouTube, the purge has been met with calls to action, such as #OpHornetsNest. Largely, though, the anons’ overall response seemed rather defeated. However, as mentioned above, anons often use YouTube more as a repository than as a news source. As such, the /pol/ approach to extremism may be less affected by the purge than one might at first imagine. While the links to many videos and channels are dead, such extreme videos are often also available on archiving sites, notably archive.org. In some cases they are immediately reuploaded to YouTube, or identical versions of previously deleted videos remain available elsewhere on the platform. Metadata for some channels furthermore appears to remain online, for instance through playlists.
It appears that when YouTube takes down content with high levels of engagement and relevance to certain political communities, chances are that this same content can easily be found elsewhere on the Web. To demonstrate this, we scraped the first five Google search results for each video title in our overall deleted-videos dataset. We obtained results from video streaming services — including YouTube itself — as well as from sources that referred to a given video. Given the extreme political nature of the deleted videos, we wanted to track where else they might easily be found online, to see whether the deletion created new zones of extremism outside Google’s video service. Our approach was first to manually categorise these videos based on the topics they referred to, including (1) videos that referred to Judaism, Jewish lobbies, and Holocaust revisionism; (2) videos that elaborated on various conspiracies (9/11, the illuminati, freemasons, the new world order, etc.); (3) footage of Nazi Germany; (4) videos that referred to the idea of the white race (European heritage or civilisation; the idea of the great replacement); and (5) videos on the 2019 Christchurch shooting. Following this categorisation, we looked at which websites hosted these now-purged videos. See figure 8 for the results.
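To illustrate the tallying step, the sketch below counts which domains appear in the collected search results. It assumes the results have already been gathered into a CSV with hypothetical `video_title` and `result_url` columns; programmatically scraping Google itself is fragile and against its terms of service, so that collection step is left out here.

```python
"""Sketch: tally the domains that host or mention the deleted videos."""
import csv
from collections import Counter
from urllib.parse import urlparse

def host_counts(path="deleted_video_search_results.csv"):
    """Count result domains; the CSV layout (one row per search result) is hypothetical."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row["result_url"]).netloc.lower()
            if domain.startswith("www."):
                domain = domain[4:]
            counts[domain] += 1
    return counts

if __name__ == "__main__":
    for domain, n in host_counts().most_common(15):
        print(f"{domain}\t{n}")
```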
Listing the websites where the deleted YouTube videos appear reveals a series of sites that mimic aspects of YouTube’s appearance as well as its video-streaming affordances. In particular, BitChute appeared as the go-to platform for most of the deleted videos. As Gab is to Twitter, or Voat is to Reddit, BitChute markets itself as a “free-speech” alternative geared towards users banned from the “corporate and boring” YouTube (BuzzFeed 2018). As figure 10 shows, links to this website from 4chan/pol/ are steadily rising. Other such sites include altCensored.com and brighteon.com, the latter of which is equipped with key features from YouTube, including a recommender system, a comment section, and a dynamic home page with recently uploaded or popular videos.
Aside from these alternative video platforms, content banned from YouTube can also be found hosted on various opinion websites. While these websites likely do not attract many visitors, they are worth mentioning here for how they, too, invest in the idea of contributing to an alternative Web for politically alternative content. Perhaps unsurprisingly, davidduke.com is where one may find another copy of the banned video “The Jewish Role in the Porn Industry Video”. Additionally, there is a selection of hyperpartisan sites, such as smashculturalmarxism.com, which presents itself as being exclusively dedicated to “information and views that are not allowed in the mainstream”, as well as alternative encyclopedias such as wikispooks.com — the latter of which passes itself off as a repository of forbidden ideas precisely because much of its content has been banned elsewhere.
Will the more active policing measures of “mainstream” platforms like YouTube impact the momentum of so-called “alt-tech” sites like BitChute? Will these alt-tech platforms create the basis for a new kind of right-wing intersectionality? So far, platforms such as Gab have had trouble achieving scale (Weill 2019), but in the aftermath of YouTube’s Great Purge, a number of popular YouTube “conservatives” and “skeptics” (including Jordan Peterson, Carl Benjamin and Michael Shermer) are promoting a YouTube alternative that will only remove content by U.S. court order — effectively the same censorship policy as the far-right and highly problematic 8chan. While such alt-tech initiatives are initially brought together in the defense of free speech, counter-extremism researchers claim that they can in effect function as “melting pots for a range of extreme right activists and groups”, places where “populist political candidates, Identitarians, neo-Nazis and alt-right trolls mingle, allowing for the transfer of ideas which leads to a more cohesive ideology” (Davey and Ebner 2017, 26). According to this theory, in trying to cleanse its reputation as “the great radicalizer” (Tufekci 2018), YouTube may in effect only be concentrating the problem elsewhere, contributing, perhaps, to a “ghettoisation” of political content not just on its own platform, but on the greater Web.
Conclusions
Without addressing the more complex issue of demonetizing conservative pundits like Steven Crowder, it appears that YouTube’s purge has successfully removed a substantial portion of extremist content from the platform, albeit with some misfires. From the “fringe perspective” of the National Socialism General (/nsg/) discussion threads on 4chan/pol/, the purge removed about half of this material. The deleted videos primarily concern archival material of speeches and other propagandistic “documentaries”. Because YouTube’s policing approach seems to have prioritised automated video recognition and content analysis over human judgement, however, the platform also removed some “legitimate” archival material in the process. If we treat /nsg/ — and for that matter 4chan/pol/ itself — as an extremist lens onto YouTube, we can furthermore note how many of the most linked-to videos that remain on YouTube still seem to serve as vehicles for these extreme discussions; not because their content is extreme, but because the comments are.
With this in mind, another approach to assessing extreme content on the platform might be to consider this type of material from the perspective of what goes on in the comment section. With this approach, we could detect overlaps between the vocabularies of YouTube commenters and discussion threads on /pol/. If the fringe perspectivist method permits one to identify a particular discursive style as extreme, and we can detect these shared vocabularies, then we might be able to better map the presence of extreme speech on both platforms.
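As a very rough sketch of what such a comparison could look like, the function below measures the overlap between the most frequent terms in a video’s comments and in a set of /pol/ posts. The tokenisation and the Jaccard measure are simple placeholders, not the method this post commits to.

```python
"""Sketch: vocabulary overlap between YouTube comments and /pol/ posts."""
import re
from collections import Counter

def top_terms(texts, n=500):
    """Return the n most frequent word-like tokens across a list of texts."""
    tokens = Counter()
    for text in texts:
        tokens.update(re.findall(r"[a-z']{3,}", text.lower()))
    return {term for term, _ in tokens.most_common(n)}

def vocabulary_overlap(comments, pol_posts, n=500):
    """Jaccard similarity (0..1) between the top-n vocabularies of two corpora."""
    a, b = top_terms(comments, n), top_terms(pol_posts, n)
    return len(a & b) / len(a | b) if (a or b) else 0.0
```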
Despite this, and although it is the right that is currently in YouTube’s sights, we should be wary of the platform’s capriciousness. Whilst some may condone its current content decisions, how might it treat a left-wing political insurgency with the aim of breaking up its monopolies? As such, at the very least, the political decisions behind efforts to monitor and censor video content should be made transparent and subject to democratic oversight.
Edits
The previous version of this article claimed that Altcensored.com was an “alt-tech video platform […] equipped with […] a recommender system, a comment section and a dynamic home-page with recently uploaded or popular videos.” This claim was incorrect and only applied to brighteon.com.
References
Chandrasekharan, E. et al. 2017. You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech. Proc. ACM Hum.-Comput. Interact., Vol. 1, No. 2, Article 31. http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf
Crowder, Steven. 2019. “The #VoxAdpocalypse is coming for YOU!” Louder with Crowder, June 5, 2019. https://www.youtube.com/watch?v=NEgvT1DsHnE&t=656s
Davey, Jacob, and Julia Ebner. 2017. “The Fringe Insurgency: Connectivity, Convergence and Mainstreaming of the Extreme Right.” London: Institute for Strategic Dialogue. https://www.isdglobal.org/wp-content/uploads/2017/10/The-Fringe-Insurgency-221017.pdf
Dunphy, Rachel. 2017. “Can YouTube Survive the Adpocalypse?” New York Magazine, December 28, 2017. http://nymag.com/intelligencer/2017/12/can-youtube-survive-the-adpocalypse.html
Hern, Alex. 2018. “Facebook, Apple, YouTube and Spotify Ban Infowars’ Alex Jones.” The Guardian, August 6, 2018. https://www.theguardian.com/technology/2018/aug/06/apple-removes-podcasts-infowars-alex-jones
Holt, Jared. 2017. “White Supremacy Figured Out How to Become YouTube Famous.” Right Wing Watch, October 2017. http://www.rightwingwatch.org/report/white-supremacy-figured-out-how-to-become-youtube-famous/
Lewis, Rebecca. 2018. “Alternative Influence: Broadcasting the Reactionary Right on YouTube.” Data & Society. https://datasociety.net/wp-content/uploads/2018/09/DS_Alternative_Influence.pdf
Lipstadt, Deborah E. 2019. Antisemitism: Here and Now. New York: Schocken.
NOS. 2019. “YouTube verwijdert archiefmateriaal Alkmaar wegens ‘haatzaaien’.” NOS.nl, June 9, 2019. https://nos.nl/artikel/2288321-youtube-verwijdert-archiefmateriaal-alkmaar-wegens-haatzaaien.html
Phillips, Whitney, Gabriella Coleman and Jessica Beyer. 2017. “Trolling Scholars Debunk the Idea That the Alt-Right’s Shitposters Have Magic Powers.” Vice, Mar. 22, 2017. https://www.vice.com/en_us/article/z4k549/trolling-scholars-debunk-the-idea-that-the-alt-rights-trolls-have-magic-powers
Roose, Kevin, and Kate Conger. 2019. “YouTube to Remove Thousands of Videos Pushing Extreme Views.” New York Times, June 5, 2019. https://www.nytimes.com/2019/06/05/business/youtube-remove-extremist-videos.html
Smith, Aaron, and Monica Anderson. 2018. “Social Media Use in 2018.” Pew Research Center, March 1, 2018. http://www.pewinternet.org/2018/03/01/social-media-use-in-2018/pi_2018-03-01_social-media_0-01/
Taibbi, Matt. 2019. “YouTube, Facebook Purges Are More Extensive Than You Think.” Rolling Stone, June 7, 2019. https://www.rollingstone.com/politics/politics-features/youtube-facebook-purges-journalists-845790/
Tufekci, Zeynep. 2018. “YouTube, the Great Radicalizer.” New York Times, March 10, 2018. https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html
Weill, Kelly. 2019. “Gab Is in Full Meltdown, and Founder Andrew Torba Blames the ‘Deep State’.” Daily Beast. https://www.thedailybeast.com/gab-is-in-full-meltdown-and-founder-andrew-torba-blames-the-deep-state
YouTube. 2019. “Our Ongoing Work to Tackle Hate.” YouTube Official Blog, June 5, 2019. https://youtube.googleblog.com/2019/06/our-ongoing-work-to-tackle-hate.html