By: Mike Johnson Jr

Date Created: July 25, 2016, 1:54 a.m.

The Coontown Breakdown [Part 1]: A Brief Analysis of the Habits of Coontown Users

Audioburn, 7/16/2015

Part 2: The Coontown Breakdown: Electric Boogaloo - More Data, Less Prose

Introduction

This morning, while reading through Reddit, through the drama of Ellen Pao and Yishan Wong, the user backlash over censorship, and the hate subreddits (especially CoonTown), I decided to do an analysis.

CoonTown may very well be gone by tomorrow, and I haven't seen any analyses done on the subreddit yet.

If you don't already know what CoonTown is, let me give you a brief overview. CoonTown is a subreddit (a subforum of Reddit) dedicated to showing black people in the worst possible light, bringing awareness to black crime and to the atrocities of the black underclass. The N-word is thrown around like a basketball at a playoffs game. They think that they hold the truth and the light, that this revelation that black people living in impoverished and neglected communities (usually with broken family structures) tend to commit crime at higher rates is utter Gospel, and that the "liberal-Jew media" is trying to hide the truth from the People by instead showing Trayvon Martin and Tamir Rice and Michael Brown, etc., while not paying attention to the real criminals. They describe their own subreddit like this: "Welcome to the home of the good boyz who dindu nuffin'. Make sure your subreddit style/CSS is enabled for maximum jungle immersion."

I would not recommend visiting, so I'm not going to post a link, but with some level 1 Google-Fu you'd be able to find it. Back to the project.

Processing the 5 hottest submissions took nearly an hour, which gave a sample of the 225 most recent users (whose posts elsewhere spanned thousands of different subreddits, with hundreds of thousands of comments). I plan to do another run overnight while I'm sleeping, hopefully for a sample size of at least 1,000 users (that run will also save the user list).

The results I found were still pretty surprising, even with such a small sample size. Let me stop typing and just show you the data.

Wait, no, not yet.

So let me explain what this is. This is a brief analysis of the subreddits that the 225 most recent posters in CoonTown frequent, sorted by total karma per subreddit, counting both comments and submissions. It was inspired by a data-visualization post I saw at /r/blackfellas a few days ago (cross-posted with /r/dataisbeautiful). /r/coontown itself is omitted.
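Stripped to its essence, the tally is just a dictionary keyed by subreddit name that sums scores as they stream in. Here's a condensed sketch of that pattern using plain dicts in place of PRAW objects (the `tally_karma` helper and toy data are mine, for illustration only):

```python
def tally_karma(items):
    """Sum item scores per subreddit: {subreddit_name: total_karma}."""
    karma = {}
    for item in items:  # items = one user's comments or submissions
        name = item['subreddit']
        karma[name] = karma.get(name, 0) + item['score']
    return karma

print(tally_karma([
    {'subreddit': 'news', 'score': 5},
    {'subreddit': 'news', 'score': 2},
    {'subreddit': 'funny', 'score': 7},
]))  # {'news': 7, 'funny': 7}
```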

Alright, enough prose, on to the data.

Code

Here is the code that made it happen:

	
		import praw
		import json

		users = []
		submissions = []
		r = praw.Reddit(user_agent='africanawiki')
		subreddit = r.get_subreddit('coontown')

		# collect submission objects for the 5 hottest threads
		for i, submission in enumerate(subreddit.get_hot(limit=5)):  # using 25 for tonight
			print 'getting submission object %s' % i
			submissions.append(r.get_submission(
				submission_id=submission.id))

		# gather every top-level comment
		root_comments = []
		for i, s in enumerate(submissions):
			print 'getting comments %s of %s' % (i, len(submissions))
			for c in s.comments:
				root_comments.append(c)

		def get_comments(comments, level):
			# walk the comment tree, recording each unique author
			# (MoreComments stubs have no author or replies, so they're skipped)
			for i, c in enumerate(comments):
				try:
					print 'getting comment count: %s in level %s' % (i, level)
					if c.author.name not in users:
						users.append(c.author.name)
				except AttributeError:
					print 'nada'  # deleted author
				if hasattr(c, 'replies'):
					get_comments(c.replies, level + 1)

		get_comments(root_comments, 0)

		# tally karma per subreddit across each user's full history
		kb_submissions = {}
		kb_comments = {}
		for idx, username in enumerate(users):
			try:
				print 'getting info for %s, %s of %s' % (username, idx, len(users))
				user = r.get_redditor(username)
				for s in user.get_submitted(limit=None):
					sub_name = s.subreddit.display_name
					kb_submissions[sub_name] = (
						kb_submissions.get(sub_name, 0) + s.score)
				for c in user.get_comments(limit=None):
					sub_name = c.subreddit.display_name
					kb_comments[sub_name] = (
						kb_comments.get(sub_name, 0) + c.score)
			except Exception:
				print 'user deleted his/her account, smart'

		karma_by_subreddit = {
			'submissions': kb_submissions,
			'comments': kb_comments,
			'users': users,
		}

		# save object to disk as json
		with open('coontown_breakdown.json', 'w') as fp:
			json.dump(karma_by_subreddit, fp)
	

Knock yourselves out analysing other subreddits.
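Once the script has written `coontown_breakdown.json`, slicing it up is easy. Here's a quick sketch (the `top_subreddits` helper and the toy data are mine; swap the toy dict for `json.load(open('coontown_breakdown.json'))` to use the real dump) that merges the comment and submission tallies and ranks subreddits by combined karma:

```python
from collections import Counter

def top_subreddits(karma_by_subreddit, n=10):
    """Merge the comment and submission karma tallies, return the top n."""
    combined = Counter(karma_by_subreddit['comments'])
    combined.update(karma_by_subreddit['submissions'])
    return combined.most_common(n)

# Toy data standing in for the saved JSON, which has the same shape:
data = {
    'comments': {'AskReddit': 120, 'news': 45},
    'submissions': {'news': 30, 'funny': 10},
}
print(top_subreddits(data, 3))  # [('AskReddit', 120), ('news', 75), ('funny', 10)]
```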

You can find more at my github repository, which also contains the code for Agile, my open-source data-visualization app, which I used (along with Highcharts) to help build the charts above.

Raw Data

Here are some links to the raw data (warning: auto-download):

Peace

Thanks for viewing; I hope you enjoyed it.

code_black()