Episode Show Notes

[START OF RECORDING]

JACK: [MUSIC] Okay, so, this episode has mature content. I don’t recommend listening to this with young ears around or on some kind of speaker where others might hear it because this episode gets into some dark stuff, so listener discretion is advised. Alright, so this episode is about a mobile app called Kik, spelled K-I-K. It was made in 2010 to help mobile users chat with one another more easily. See, back then, people who had BlackBerry phones had a hard time chatting with Android and iPhone users, so some Canadian college kids made Kik to let people chat freely no matter what type of phone they had. Kik was an immediate hit. Just one month after launching, they reported having one million users, and it’s been growing rapidly ever since. But something happened in the Kik chat app which sent this app down a dark path. (INTRO): [INTRO MUSIC] These are true stories from the dark side of the internet. I’m Jack Rhysider. This is Darknet Diaries. [INTRO MUSIC ENDS]

JACK: This is the tale of two different Kik users. First is a guy who we’ll call Doctor.

DOC: Some person, I don’t remember their name, said that Kik was a good app to meet people on and to join some communities. I figured well, why not? Might as well try it. So, I joined.

JACK: Sounds innocent enough. Everyone can use new friends, right? So, that’s why Doctor joined Kik. By the way, Doc is like, twenty-something years old and I’m just disguising his voice a bit. Now, when you join Kik, you can do so anonymously. All the questions are soft, like it asks for your e-mail but you could just put anything in there because they don’t e-mail you any sort of registration confirmation link or anything. It asks for your phone number but you can skip that, too. Yeah, when you get on Kik, you can be whoever you want without giving any details as to who you really are. The other guy we’ll hear from we’ll call Azrael.

AZRAEL: Everybody was telling me to make an account just to hang out and message them on there instead of whatever we were using at the time. I don’t even remember.

JACK: This is called the network effect and it’s powerful. The more people who join and want to chat with other people, the more they invite other people to join. So, you’re either on Kik or you’re not, and if Kik is where all your friends are, you might as well get in there and be there with them. Now, these two people, Doctor and Azrael, have entirely different stories that take them down two different paths while using Kik, and we’re gonna weave in and out and back and forth between the two of them. This should be fun.

AZRAEL: [MUSIC] Kik is an old instant messaging app.

DOC: You can chat, share pictures, videos.

JACK: On the app, you can connect with someone you know and send them private messages or you can join chat rooms. In fact, when you first install the app, it tries to show you some cool chat rooms to try joining. Rooms have themes; like, you can join a room about Pokemon or Fortnite or different regions of the world where people live.

DOC: You can find different groups based on search preferences, so you can use different tags and such to try and find specific groups that might share interests with what you’re looking for. It’s like, really, it’s ‘the road to hell is paved with good intentions.’ It has a lot of potential to be a good forum site where everyone can be there and have fun and make good groups and start good communities. But it is not maintained well enough, so it has its darker side.

JACK: Ah, yes, the dark side of Kik. That’s what I’m interested in, so let’s go there. But first, let’s back up and talk about the company who runs Kik. You’re gonna be very curious about this eventually, so we might as well get into it now. [MUSIC] Kik was created by some Canadian students at the University of Waterloo in 2010. A month after launching, it had a million users. Five years later, the Chinese company Tencent invested 50 million dollars into Kik, trying to make it the WeChat of the West. WeChat is a massive chat app in Asia, so Tencent saw potential in Kik. This helped explode Kik’s popularity. By this point, the company was called Kik Interactive and they had a whole team of people working there. There was a CEO who oversaw everything, bunches of employees, marketers, a whole team of people working on Kik, and that team was expanding fast, with dozens of employees at the time. In 2015, Kik reported it had 240 million users and 70% of them were between [00:05:00] the ages of thirteen and twenty-four.

AZRAEL: I actually went and looked at the terms of service ‘cause I wanted to actually know what you’re supposed to do with the app. It’s marketed for kids thirteen and up. It’s actually, you’re not supposed to share any not-safe-for-work imagery. They actually have certain words blocked so you can’t name your chat certain things. Like, you can’t use any curse words in the naming of your chats.

JACK: That’s right; Kik is for kids. In fact, if you go to kik.com/brands, which is the web page they use to pitch to potential advertisers, the title of the page says Reach Teens in Their World. It goes on to say that one out of three American teens use Kik. Kik loves marketing this app to teens because it’s well-known that teens are trendsetters, so if teens make a social platform popular, then everyone else will eventually come, too. That’s what happened with MySpace, Facebook, and so many others. But despite it being super popular, Kik didn’t really have a way to make money. In fact, they were losing money fast, so they had to come up with a plan to make Kik profitable. They saw that WeChat is not only a chat app but also a payment app. You can use it like Venmo or whatever and send money back and forth between users. Kik wanted to do something like that but at the same time, they saw the boom in cryptocurrencies and decided to make their own cryptocoin called Kin.

[MUSIC] In 2017, they launched Kin. Now, to start out, Kik gave themselves tons of this cryptocurrency and they were telling people that there’s two ways to initially obtain it; there would be a private pre-sale and there would later be a public sale. In both methods though, people would buy this Kin with Ethereum from Kik. The private pre-sale resulted in Kik selling 50 million dollars worth of Kin. By the time the public sale was over, Kik had made 98 million dollars from doing this initial sale of their Kin cryptocurrency. All that money went straight into Kik’s bank account. The SEC warned them about raising money through cryptocurrencies like this. They had to follow certain rules and it gets tricky. But it didn’t seem like Kik respected the rules, so in 2019 the SEC filed suit against Kik and Kin, alleging securities violations. We’re getting into some legal weeds here but it seems like Kik is saying that Kin is a cryptocurrency and it shouldn’t be considered a security, but the SEC was saying well, if you’re using it to raise money for your company, you’re sort of treating it like it is a security.

Kik was saying it was a currency; the SEC was saying it’s more like a stock. It’s tricky because, well, yeah, it is a currency, but Kik was really using it to raise money for their company, basically treating it like a security. So, Kik and the SEC went into a fierce legal battle. This legal battle was rough and changed everything about Kik. The team at Kik was tied up with this lawsuit and just loved all the money they were making with Kin, so they just sort of stopped caring about Kik, their chat app. Kin was making money. Kik was losing money. It’s as if Kik Interactive split into two when they made Kin. In fact, they made their own separate entity called the Kin Foundation which just focused on Kin. So, in 2019, Ted Livingston, the founder of both Kik and Kin, announced that Kik would no longer be supported. They were done with it and they were going to shift focus entirely to work on Kin.

In fact, they took their staff of almost 100 people down to just 19 people. The CEO said, quote, “Going forward, our nineteen-person team will be focused on one goal; getting millions of people to buy Kin to use it.” End quote. So, nobody, I mean nobody was left to work on the Kik chat app. The app was abandoned by its own company and Kik Interactive announced it would be shutting down the chat app in October 2019. [MUSIC] But then a company called MediaLab AI stepped in. This is a California-based company that owns several other chat apps like Secret, Yik Yak, and Whisper. They offered to buy Kik, so it was sold to MediaLab. We don’t know for how much but this is who owns Kik today; MediaLab AI. So, the app is still alive and growing even today. Now, you’re gonna find all this to be pretty important later, so thanks for sticking with me through this. Now, MediaLab has a history of buying failing apps and trying to make them profitable. So, what’s the first thing they do when they buy Kik?

DOC: The only thing MediaLabs did was [00:10:00] the second they took over, everybody started seeing ads on the app.

JACK: If you use Kik today, you’ll undoubtedly see ads everywhere. They’re in chat rooms; they’re in private messages. There’s pretty much a permanent banner at the top of the app that’s always displaying ads. So, that brings us to today. I haven’t seen recent numbers but the latest count that I saw was that Kik has over 300 million users and it’s owned by MediaLab AI, and they’re likely doing as little work as they can to just make it profitable and keep it going. Alright, so you get in there, you’re playing around. What do you discover when you’re in there?

DOC: So, what I discover is that there are many groups for many, many, many different things. I join a few groups, some related to some gaming stuff, and then I start noticing that some people are talking about some more kinky groups. Let’s call it that. But I also start noticing some other groups that generally just share straight-up porn.

JACK: Ah yeah, a chat program which lets you make rooms about whatever you want and post pictures and videos of whatever you want; yeah, of course there’s gonna be porn there. But you know, posting porn online is typically legal.

DOC: As long as the porn in and of itself is legal, that the actors in it are of legal age in the country that they’re from and in the country that it is being shown in, and that there are no acts in it that are illegal such as rape, as long as that is ticked off, then it is legal, to my knowledge at least.

JACK: Okay. So, how’s the porn scene on Kik? Here’s the thing that’s weird already, right? You’ve got this – the target audience is thirteen to twenty-four-year-olds. 70% of their users are in that range and yet, there are porn channels. It’s interesting that they have that already but yeah, how’s the porn scene over there on Kik?

AZRAEL: It is very active. Some of the actual social groups I joined were much less active than the porn groups.

JACK: So, when you get into a porn chan or room or whatever it’s called, what’s going on there? Are people just posting photo after photo of nudity and stuff like that or is there videos being posted, or is it – do you ask people hey, do you have something like this, or do you just – is it like TV? You just turn it on and you see what’s there or is there – it is like…

DOC: Yes to all of the above. It depends a lot from group to group. In a lot of groups when you join, first you are asked to verify because there are a lot of bots on Kik, so to avoid there being bots and people who aren’t active and such, you’re being asked to verify. So, you send a message to an admin and they tell you to send something; maybe a video, some porn video so that you show that you are willing to share, or maybe a live picture of yourself or something like that. Then you get into the group and, well, depending on the group, you can be finding all of the things that you mentioned. Some are very, very active and people just share a lot of pictures and videos. Other groups are less active and you have to specifically ask for what people want, so to speak. That’s kind of where I got my start, actually, is trading, as it’s called, where you – where people have some request for something in porn and then you try and give it to them. Then in return, they give something back. It is videos and pictures, and that’s kind of where I got really interested in it because, well, when you’ve seen one pair of boobs, you’ve seen them all, so – to a degree. So, the porn part pretty quickly got boring but trading was very, very different and a lot more fun.

JACK: So, this is how Doctor got started trading pornography on Kik. People would ask for certain types of explicit material, he’d hunt for it, find it, and then share it in the channel or privately to that user. In return he’d get some other picture or video which he’d then save so that he could maybe give that to someone else someday. This is all free, too; just show up and be active, really. But the thing is, Kik’s terms of service strictly [00:15:00] forbids this type of activity. So does Facebook’s. Pornography isn’t allowed to be posted on either of those platforms; Kik’s terms of service says users aren’t allowed to share material that’s unlawful, obscene, defamatory, libelous, threatening, or pornographic, and the list goes on. But the point here is that porn is not permitted on Kik, so it just shouldn’t even be there, yet Doctor was telling us how prolific it is on there. With a couple searches, you can quickly find channels full of porn. Yeah, a lot of their users are teenagers, so I guess we should catch up with Azrael again for a minute. So, while Doctor was on there trading porn, Azrael was on a whole different path. He was just hanging out in an anime channel doing normal things that people do in chat groups, nothing kinky or weird.

AZRAEL: One of the chats I was an administrator for got raided [MUSIC] by a source clan – were immoral raiders and they just – somebody dropped into their chat, gave them a hashtag, they hit the chat. They jumped in, tried a few spambots; spambots weren’t spamming hard enough, so they just started threatening people. They claimed to have our IPs and I knew that you couldn’t really grab IPs without some special tools so we would remove them and it was like that for a couple hours. I was just removing account after account after account. Then they started private messaging me pictures of gore and claiming that they would do that to me ‘cause they had my address and all this and that. I just didn’t care. Eventually I got legitimately angry and talked to a buddy of mine who was into the exploitation side of Kik and he got me my first actual mod that I could hit people with. That’s where it all started, I guess. [MUSIC] What happened was, I kinda started spotting these people that were using modified APKs.

JACK: APKs are Android application packages; that’s just how Android apps are bundled and distributed. What he found was that people were taking Kik and modifying the app to do different things, pretty much hacking the app itself.

AZRAEL: The only exploit back then was you could turn off your read receipts and see who was reading your messages.

JACK: Ah, right, so when you send someone a message, it’ll show you if it’s read or not. But with this modified version of Kik, you could make it so that the messages you actually did read show as unread to others.

AZRAEL: I don’t count that as an exploit, considering. But then things started ramping up and the old owners of Kik released the source code. After that, the modding community with it went wild. That’s where blue mods came in, Kingskull, and all that.

JACK: What is…

AZRAEL: Kingskull…

JACK: What are these things? What are the mods? What – is there – what, are there bots in these things or something? Is that what you’re talking about?

AZRAEL: Yes, yes. So, my custom APK, basically it has a few basic exploits; I can see who reads my messages, I can prevent people from seeing that I read messages, I can send raw XML files which are stanzas, and all that sort of stuff.

JACK: Hm, interesting. There’s a whole community on Kik who run customized versions of Kik that allows them to have different features that the normal user doesn’t have. Now, keep in mind, this also is not allowed in Kik. It’s against the terms of use. It specifically says, quote, “Except as permitted by us in writing, you agree not to reproduce, distribute, modify, prepare derivative works of, translate, reverse-engineer, reverse-compile, or disassemble the services in whole or in part.” End quote. So yeah, MediaLab doesn’t want people modifying their app to make it do extra stuff like this, but apparently MediaLab isn’t monitoring for this type of activity, so there’s quite a few people doing this. In fact, there’s a guy named Lou who made a modded version of Kik and he was actually trying to sell it.

AZRAEL: He wanted you to pay fifty bucks for it but who pays for shit?

JACK: [MUSIC] So, this is what Azrael was seeing; modded versions of Kik, and he was fascinated by all the extra things you could do that a normal user couldn’t. On top of that, he was seeing that there were all these raider clans on Kik [00:20:00] who would go infiltrate chat channels and try to grief them. He really didn’t like that his anime channel got raided, so he modded up his Kik client to help defend his channel.

AZRAEL: That has a bot system where you can lock your chat so that anybody who joins is immediately removed and a couple other things like that. You can censor words so if somebody says a censored word, they’re removed. My smart thing I did was I went and got a list of all the working raid bots and plugged them in one at a time into my censors list so any time anybody added the bot, removed. So, it completely prevented us from getting raided.
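
What Azrael is describing here is essentially a blocklist: keep a list of banned words and known raid-bot usernames, and automatically remove any member whose name or message matches one. Below is a minimal sketch of that idea in Python; the term list and the remove_user call are hypothetical stand-ins, not Kik’s API or any real modded client’s.

    # Minimal sketch of a censor-list auto-remove filter, as Azrael describes it.
    # CENSORED_TERMS and remove_user() are hypothetical placeholders; they are
    # not part of Kik or any real modded client.

    CENSORED_TERMS = {"raidbot", "spambot3000", "some banned word"}

    def remove_user(display_name: str) -> None:
        # Stand-in for whatever moderation action the client exposes.
        print(f"Removed {display_name} from the group")

    def should_remove(display_name: str, message: str) -> bool:
        # True if the member's name or message contains any censored term.
        haystack = f"{display_name} {message}".lower()
        return any(term in haystack for term in CENSORED_TERMS)

    def on_member_event(display_name: str, message: str = "") -> None:
        # Called when someone joins or posts; remove them on a match.
        if should_remove(display_name, message):
            remove_user(display_name)

    # Example: a known raid bot joining the chat gets removed immediately.
    on_member_event("RaidBot")

The chat-lock behavior he mentions would just be the same removal applied to every new join, regardless of the name.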

JACK: This was cool to him at least, to have a sort of super-powered Kik client to do extra stuff that other people couldn’t, but it fascinated him to the point where he wanted to know what other modded clients you could get with other features. He eventually found one that let him send XML stanzas to other users. Let me explain. See, Kik uses a protocol called XMPP which sends messages from one user to another. XMPP uses XML to encode and format the actual messages being sent. So, just think of XML as this language that each side of the chat app understands and it’s how communication happens in the app. Well, Azrael’s Kik client had the ability to modify the way that XML message looked when sending a message to someone. He could essentially inject a stanza, which is just a snippet of XML, into the chat, and then this would be processed by other Kik clients that are in that chat room. When another user’s client received that stanza, it would have to process it, which would sometimes make their chat app do weird things.
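
For a rough idea of what a stanza actually is: XMPP messages are just small chunks of XML. Here is a sketch of a generic message stanza built with Python’s standard xml.etree.ElementTree; the addresses and body text are made up, and real Kik traffic carries Kik-specific namespaces and attributes that this sketch leaves out.

    # A generic XMPP-style message stanza assembled with Python's standard library.
    # The "to"/"from" addresses and the body are made-up examples; real Kik stanzas
    # use Kik-specific namespaces and attributes not shown here.
    import xml.etree.ElementTree as ET

    msg = ET.Element("message", attrib={
        "to": "someuser@example.com",    # hypothetical recipient
        "from": "azrael@example.com",    # hypothetical sender
        "type": "groupchat",
        "id": "abc123",
    })
    body = ET.SubElement(msg, "body")
    body.text = "Hello, room"

    # Prints something like:
    # <message to="someuser@example.com" from="azrael@example.com" type="groupchat" id="abc123"><body>Hello, room</body></message>
    print(ET.tostring(msg, encoding="unicode"))

A stock client only ever sends well-formed stanzas like this; a modded client can hand-craft the fields, and a receiving client that trusts whatever arrives can be tricked, which is apparently how the admin-grant trick described next worked.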

AZRAEL: So, you could join a chat, send a stanza through the XML system and it was like oh, this user was just given admin. Then you have admin ability.

JACK: [MUSIC] Whoa, to get admin ability to any chat room you wanted? That’s crazy. You can just take over any chat room any time.

AZRAEL: That was patched, so that isn’t a thing anymore.

JACK: Okay, good. Glad that was fixed. People shouldn’t be able to just take over chat rooms. But we’re starting to see Azrael’s path take shape, here. He’s exploiting the Kik app to defend chat rooms. But can you guess where this is going? It doesn’t quite seem dark yet, right? It just kinda seems like a Nerf fight.

AZRAEL: Yeah. It really…

JACK: To play like, oh, I’ve got a chat room and I’m trying to defend my chat room. It doesn’t seem that big of a deal.

AZRAEL: It really wasn’t.

JACK: So, where does – how does this get worse?

AZRAEL: So, I started noticing other chats were getting raided and there were clans, and these clans had a large number of exploits. I was thinking at the time if I join one of these clans, not only will I have all of their exploits but I’ll be able to learn to the point where maybe I can pivot off Kik and help people there or I’ll just have enough exploits that nobody can touch my chat room.

JACK: There’s something funny about the way you say it. For someone just listening to it for the first time, when you say ‘clans’ and there’s, I don’t know, ‘raid clans’…

AZRAEL: It sounds very childish. I understand, no.

JACK: Alright clan, here’s what we’re gonna do; we’re gonna get together and we’re gonna raid this channel.

AZRAEL: Yeah. Basically, the clan I joined at the time was called Raindrop and they were focused on just anti-toxicity, you know what I mean? So, if one of our spies joined a chat and the chat was toxic and they would, say, share somebody’s pictures that didn’t want them shared or the admin was just really rude and mean and kicked people for no reason, we would go and take the chat.

JACK: So, while Azrael used to defend chat rooms from takeovers and spammers, he’s now become one of the spammers and raiders. Now, the tactic to take over a channel is somewhat interesting. First, a bunch of people join from his clan and then they start spamming like crazy, posting pictures, text, whatever, and it just becomes a constant stream of scrolling information. You really can’t read anything when you’re in that room. People with slow internet connections might get lagged out. People with notifications on might just get bells ringing constantly and yeah, the admins could kick people, but the raiders knew who the admins [00:25:00] were and would target them specifically with crazy stanzas and all kinds of private messages, XML that’s all buggy, and things to just make their app freeze.

Once the admins start falling out of the chat room and they can get enough people in that chat room, they can take it over. This is the kind of stuff Azrael started doing, and he was having fun with it. While all this is going on in Kik, Doctor is still over there in some other chat rooms fulfilling people’s desires to see certain porn. I’m not quite sure I understand this whole trade – the whole reason why anyone would want to trade, but I don’t think I need to really get into it, so – ‘cause it just seems like oh, you want to masturbate? Here, I’ll give you some stuff. It’s just weird to help people masturbate, but okay.

DOC: Yeah, it is weird and when you put it like that, it’s – it is very, very weird. I found it kind of oddly fun in a way just because it was – more in the challenge because very – because I was very good at trading, very good at getting a lot of stuff, so very quickly I had more than 2,000, 3,000 items in my library.

JACK: Now, keep in mind, Kik is only a mobile app, so everything these guys are doing is 100% on their phone. So when someone requested some porn, he’d have to go looking through thousands of things on his phone to get a good one and send it.

DOC: Yeah. The part that made me good at trading was also that not only did I have that many – and I got more; I got over 8,000. But I was able to pretty much memorize them. So, I had watched most of what I got in a sort of objective manner so I knew what was in it. So, if people asked something, I knew okay, do I have that or do I not have that? Do I have something that falls close to those criteria? Approximately when did I get it so that I can find it in relation to others? Then I could scroll down and find it. So, the challenge became people asked for something and I had this little calling signet that I posted out; ‘the Doctor’s in, so if you have an itch that needs scratching…’. I had something like that. Well, to a degree, I think it’s a big part of it and why people are into it, into trading, is not just the whole kink thing but power, really. I think it’s the feeling of power that you get from being able to give people what they want always and that people recognize that. When I started out, people didn’t know me at all but very quickly a lot of people started noticing that I knew my business and if – and a lot of people started saying if anyone has it, it’s Doc. Getting recognized like that is a powerful feeling and to be honest, it’s a drug in a lot of ways. It’s a weirdly good feeling.

JACK: How much time were you spending on this?

DOC: Way too much. Way, way too much time on it. This was during the first lockdown around where I lived, so I had nothing but time. Like, it was one of the first things I did when I woke up, was see do I have any requests, anyone who’s personally messaged me? Then as I would go about my morning routine, get a cup of coffee and stuff, I would start looking through what they wanted, start finding some stuff, and I would go all day, pretty much. I might be playing games on the side, maybe even chatting to some people online, but I would pretty much be on my phone the entire day.

JACK: Okay, so everyone has their own addiction. I’m not gonna judge. That’s just what Doc was into. But still, you might be asking, these are supposed to be true stories from the dark side of the internet. Raiding chat rooms and looking at porn is not that dark. Well, thanks for hanging out with me this long setting up the story because everything is about to get really dark from here. This is the last chance to turn back. After the break, it gets dark. Doc had been on a massive porn-trading binge but eventually got a request that he didn’t know how to fulfill.

DOC: People sometimes started asking for what they generally just called CP. [00:30:00] At first I didn’t know and as soon as you asked what’s CP, nobody would say and they would just close the chat, block you. But after a while I figured out CP stood for child pornography and sometimes people also just refer to it as underage which was easier to understand at first, at least. But CP was kind of like, the code word that people used, the shorthand, and if you didn’t know what it was, you weren’t supposed to know, so to speak. Those items had a lot more value. Those were the ones that some people really wanted when they wanted to test what kind of collection you had and to see, oh, you maybe have these things and these things. Then they start talking about age slowly. Like, what are the age of the actresses and oh, do you have a little bit younger? How about a little bit younger? A little bit younger?

It became really creepy very, very quickly and I couldn’t provide and I wouldn’t provide because I had some that looked very young but I knew that those were porn actresses that were like, 22+, something like that, that I had looked up to make sure. [MUSIC] But then I started getting some items that I honestly didn’t know how young they were. People started valuing those items more in certain groups and age was all of a sudden a huge value factor, so that the lower the age would get, the higher the value item, no matter if it was picture or video. For comparison, a picture of an underage girl, a lot of people would value as the same value as five to ten normal porn videos.

JACK: Now, while he’s trading porn to people all day, every day, he’s still collecting tons of porn on his phone.

DOC: Maybe I was trading with someone and then they start sending some, and then I have to say oh, no CP, dude. But I would still get these things and see them.

JACK: He was shocked that people had this type of content.

DOC: At first, some of the stuff I saw made me horrified. Like, who would even keep this around and why would people send me this when I didn’t even ask for it?

JACK: Then after he saw this a few more times, the shock wore off and after he saw it a few more times after that, he’d save that photo or video to his phone, still not giving it to others when they asked but keeping it because he was a collector.

DOC: Over time I got jaded, so I really didn’t see the age as much anymore. At some point I realized that hey, I have access to a lot of these – a lot of people are asking for it. Well, sure. The power got to my head and I was like okay, sure, I can distribute some of it. But then I set a limit; I would not go below this age.

JACK: What was – what age?

DOC: The age was fourteen, I think.

JACK: Nothing below fourteen years old.

DOC: Yeah.

JACK: Mm-hm.

DOC: Again, as soon as I accepted that, it was just a downward spiral from there, really. [MUSIC] After a while, some of the stuff – I just stopped caring and I just saw the value of the item itself, not what was actually in it. It sounds horrible, I know. It felt horrible when I realized it later but at the time, I didn’t realize what I was doing and how bad it was.

JACK: Do you have any idea how old the people were that you were trading these with?

DOC: There was a wide range. Like, the youngest people I talked to was around ten or eleven. I wouldn’t get – I never gave them any porn but a lot of people [00:35:00] were trying to – people who were child molesters, they were trying to get them to make porn and that’s usually why they – why I got in touch with them, was not to get porn from them but because I was an admin in the same rooms, so they would come to me and say hey, this person is asking for nudes of me, and I would ban them, the people who were asking for nudes.

JACK: Oh, right, this app is targeted towards teens. According to Kik, one in three American teenagers use Kik which means there’s probably a lot of pre-teens on there, too. That’s a really bad mix, when you have pre-teen children in the same chat room as people looking for child porn. This is ugly. This is really ugly and it can’t end well.

DOC: The oldest I talked to I think were in their mid-sixties.

JACK: What the hell is a sixty-year-old even doing in this chat app? Oh yeah, I know why; because porn and child porn is prolific on Kik.

DOC: I was in a lot of rooms. When I was the most active, I was active in like, thirty or so rooms.

JACK: Thirty porn rooms?

DOC: If not more.

JACK: See what I mean? I hate to say Kik just allows porn without knowing exactly what Kik is doing to stop this, but it’s incredibly easy to find porn in the app. I mean, Doctor says he was in thirty rooms himself, so you can see that this isn’t just a few bad people or just a small issue; there are a lot of people on Kik solely there for the porn. Not only that, but many of these rooms are actively posting child porn. How vast or how big is the child porn exchange system going on on Kik?

DOC: Never-ending. If you use the right tags to search for when you’re searching for the private groups – not the private – the public groups even, you can find so many groups. It’s a very active scene, unfortunately.

JACK: So, the moderators of Kik – obviously there’s moderators in each room but isn’t Kik at all moderating any of this stuff?

DOC: Honestly, I don’t know. I think they try to, to some degree. I know that they have some form of artificial intelligence running through some of the rooms looking for some things, trying to recognize some patterns and see if these and these things are posted, then they will close down the rooms, maybe ban the users, and make sure the content is no longer viewable.

JACK: How do you know that they’re running that?

DOC: Because some people I’ve traded with, later on – if I look at the history of – chat history with them, I can go back and see oh, these things are no longer available and sometimes the rooms get shut down.

JACK: Okay. I had to research this and this is what I found; Kik has a few methods to combat child pornography. In 2015, Kik partnered with Microsoft to test Microsoft’s PhotoDNA service. Apparently what Microsoft has is a database of known child porn image hashes. Now, hashes aren’t the images themselves; a hash is just a string of text, like a fingerprint of the file. If you have the hash, you can’t generate the image from it, but if you have the image, you can quickly generate its hash. So, Kik was somehow using this PhotoDNA technology to scan their images to see if there was a known child porn image on their site. But I’m not sure exactly how they did that or what they did when they found a match, or even if this PhotoDNA service is still in use anymore. The last time I heard about it was in 2015. In the case of Doc, who started using Kik in 2020, he didn’t see any evidence to suggest that this is an always-on filter. There were child porn images posted all the time to chat rooms and private messages and those rooms would stay up and online forever. If you scroll back through the history in those chat rooms, you could see child porn that was posted long ago in the past.
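
To make the fingerprint idea concrete, here is a tiny sketch using an ordinary cryptographic hash from Python’s standard library. PhotoDNA itself is a proprietary perceptual hash designed to keep matching even after an image is resized or re-compressed, so this only illustrates the one-way, match-by-fingerprint concept, not how PhotoDNA actually computes its values.

    # Illustration of the one-way "fingerprint" idea behind hash matching.
    # This uses SHA-256 from the standard library; PhotoDNA is a proprietary
    # perceptual hash, so treat this purely as a conceptual sketch.
    import hashlib

    def fingerprint(image_bytes: bytes) -> str:
        # Fixed-length hex digest; the original image can't be recovered from it.
        return hashlib.sha256(image_bytes).hexdigest()

    # A database would store only the fingerprints of confirmed illegal images.
    known_bad_hashes = set()
    confirmed_bad_image = b"...bytes of a confirmed image..."
    known_bad_hashes.add(fingerprint(confirmed_bad_image))

    def is_known_match(image_bytes: bytes) -> bool:
        # An upload can be checked against the database by fingerprint alone.
        return fingerprint(image_bytes) in known_bad_hashes

    print(is_known_match(b"...bytes of a confirmed image..."))  # True: same bytes, same hash
    print(is_known_match(b"...some other upload..."))           # False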

DOC: From what I can tell, it only starts looking in the rooms and looking at individual people if they are reported for something.

JACK: So, if a user reports another user or chat room, then some kind of anti-child porn scan triggers and may remove content. I guess from Kik’s point of view, [00:40:00] there are probably millions of images posted every minute in safe and clean chat rooms, so maybe it’s just too expensive or too hard to scan every image to see if it’s a known child porn image. So, my guess is they’re just not very thorough with trying to combat this. But I don’t want to just take one guy’s opinion about what he saw on Kik. I want to do my own research on this and see if there’s other evidence I can find about child porn on Kik. Of course, I’m not gonna go on the app and look for it myself. Uh-uh; I’m staying far away from that for sure, but if I search the internet for Kik and child porn, news stories just start jumping out at me.

HOST1: A Calabash man is accused of using an app to receive and send sexually explicit photos and videos of minors. According to arrest warrants, 19-year-old Benjamin Lindsey used Kik – that’s a messenger app – to distribute the images. The unknown children are between the ages of seven and twelve.

HOST2: The court documents say Joshua Richard Harrison confessed to posting explicit photos of children through the app Kik.

HOST3: Milwaukee County Judge Brett Blomme’s Instagram page is filled with personal and professional highlights on his way to the judiciary, but after a search in Cottage Grove of one of his two homes and his arrest, authorities say his computer activity on the app Kik links him to child porn, with uploaded images and videos consistent with child pornography through the Kik app on twenty-seven occasions last fall. Judge Blomme has been on Milwaukee County’s Circuit Court bench since August. His assignment; children’s court. As a bond condition, he’s now banned by a court of having unsupervised contact with minors. The judge leaves this county’s criminal justice system knowing he will return not to dispense justice but to potentially have to accept its consequences.

JACK: The news articles are continuous. There was a guy in Tucson, Arizona who was arrested for posting child porn to a chat room in Kik. There was a guy in New Jersey who was arrested for trying to have sex with an underage girl on Kik. In 2020, a New York man was sentenced to eleven years in prison for possessing 1,500 images and 150 videos of pornographic material involving minors that was distributed on Kik. In 2020, an Ohio man was found guilty of distributing child porn over Kik and got twenty years in prison. There are so many more cases; in fact, I’m looking at a BBC article now which is simply titled Kik Chat App Involved in 1,100 Child Abuse Cases. Just from a cursory glance, Kik has a very big problem with child abuse. Yes, this is child abuse; in fact, the common term I’ve seen used to describe this stuff is CSAM, spelled C-S-A-M which stands for Child Sexual Abuse Material.

It’s simply defined as any sexually explicit image or video of anyone under eighteen years old. This can be a very traumatic experience for kids and the trauma comes right back any time they see the image again. There are organizations out there fighting hard to stop the spread of this. Now, before we get into those organizations, I want to know what Kik is doing about this. So, I reached out. I started the way I normally reach out to anyone, which is Twitter. There’s a Kik account on Twitter. It has over 300,000 followers but looking at that account, they haven’t posted a single tweet since October 2019, over a year and a half ago, which was the same month that MediaLab bought them. However, in their bio, there’s a link for help and I need help for sure. So, I click the link which was help.kik.com. The first time I clicked it, I got a 404 page; Not Found. But then I clicked it again and it sent me to a Zendesk portal that doesn’t exist. The site just says ‘Oops, that help center no longer exists.’

Okay, so Kik has no social media anymore and their help portal is just abandoned. They just seem to have completely canceled their Zendesk service and didn’t bother to update the URL in their Twitter bio. Alright, so next, let’s see if I can get help in the app. While using the app, I did find a way to message the Kik team directly which is a perfect way to get in-app help. So, I messaged them saying I need help, and the Kik team replies, ahoy there! I say, there seems to be a child porn problem here. What are you doing to stop this? Their reply was, if you believe the message you received contains illegal content, please contact your local law enforcement agency. You can also report this to our support team. Then they give me a link for support, but the link they give me is help.kik.com which is the same link in their Twitter bio, the link that’s been dead for years. So, I tell them that link doesn’t work. They reply to me, follow us on Twitter; twitter.com/kik. I’m like, that Twitter account has been inactive for years.

Then they reply, howdy! At that point, I realized I was talking with a bot and [00:45:00] was getting nowhere and I was getting more frustrated. I really want someone from Kik to talk with me, so I looked for another way to contact them. I go to kik.com and click the Contact button. There are two e-mail addresses there. One is kiksupport@medialab.la. The other is kiksafety@medialab.la. The .la threw me off because the company’s name is MediaLab AI, but okay, whatever, I’ll try these e-mails. I e-mail Kik Support asking to speak with anyone there about this. I get a generic response; thanks for e-mailing us. We are experiencing unusually high e-mail volume and we’ll get back to you whenever we can. They never got back to me and it’s now been three months. I sent an e-mail to Kik Safety; no reply. Not even a confirmation that they got my e-mail. Yeah, that’s also been three months of waiting with nothing from them, either. I go back to their website. Who’s running Kik, I wonder?

There’s a link at the bottom of the page called Safety Center. I click it; it has lots of information on being safe in the app but then at the bottom of the page are people’s faces. Meet our Safety Advisory Board. There’s a picture of Anne, Brooke, Justin, and Hemu. These are their advisors and Brooke is specifically advising them on child exploitation problems. Perfect. So, I message her on Twitter. No response. I message her on LinkedIn. No response. I give up on her and start going down the line. Anne is another advisor. I send her a tweet. She does reply but she tells me she hasn’t worked on the Kik Advisory Board in years. I also see Justin’s on the board. I tweet at him but he replies saying nope, I haven’t been on the board for years, either. Okay, so this entire advisory board which is posted publicly on their website is no longer valid. I’m starting to think they don’t have any advisors at this point and have not updated their website in years.

It feels to me like MediaLab has abandoned Kik. Their website is defunct, their Twitter is defunct, there’s no phone number for them because there’s probably no one to answer the phone. Nobody appears to be home at Kik. But according to the Google Play Store, the app is still getting updates. About once a week it’ll have some kind of update, so someone is clearly back there doing something. But I’m telling you, after trying for months and months and months to get ahold of someone, anyone, I was completely unsuccessful. As far as I can tell, nobody is there. Once again, MediaLab is who owns Kik. They took it over when Kik was facing those SEC legal troubles and they knew it was losing money like crazy but they’re trying to make it profitable with ads. My theory is that it’s not profitable or MediaLab is just really trying hard to cut corners as much as they can by just not staffing effectively and not taking child pornography problems seriously. They only do the bare minimum just to maintain the app. To get more help, I called up my friend Katelyn.

KATELYN: [PHONE RINGING] Hello?

JACK: Hello.

KATELYN: Hey, how are you?

JACK: Great. Thanks so much for taking my call.

KATELYN: Oh, no problem.

JACK: This is Katelyn Bowden and she’s badass. In fact, she’s so badass she started something called the Badass Army, which stands for Battling Against Demeaning and Abusive Selfie Sharing. Basically, if someone takes your nude photos and posts them publicly without your consent, that’s a problem. It’s devastating and Katelyn has been helping victims of revenge porn for a while now by doing things like helping people get their images removed from the internet.

KATELYN: Yes.

JACK: Have you ever had to try to get images removed off of Kik?

KATELYN: Yes. Actually, that was one of the largest platforms that we worked with ‘cause there are so many different ways that these images are getting shared, and Kik is one of the platforms that people feel more anonymous on, so therefore they’re more likely to do this sort of thing. In the beginning it was almost impossible to get anything removed from Kik. Then for like, a few months, right in the middle – I want to say it was toward the end of 2018 or so – they were really great about responding. But it didn’t last really for very long at all. Suddenly everything just was going unanswered again.

JACK: When you say responding, there’s – this is a non-consensual posting of a nude photograph; can you please remove it, and they would remove it?

KATELYN: Yes, they would. Well, they would shut down the chat room that was – the picture was being shared in or they would shut down the user’s account that was sharing the image. Or if law enforcement was contacting them, they would respond with the information that they were asked. With Badass, we normally would use a DMCA. There was the copyright violation and they [00:50:00] just would shut down what needed to be shut down. It would remove the image.

JACK: Okay, so at some point they stopped removing images for you.

KATELYN: They stopped responding at all. There wasn’t even an easy way to get ahold of them. Suddenly the e-mail – the e-mail that we had been sending our DMCAs to was no longer active. For a while there was just no response at all. Then eventually things were getting bounced back. Then we found other e-mails for the company that had bought them and still just nothing.

JACK: Okay, so…

KATELYN: It was like we were just throwing these e-mails out there and nobody was reading them.

JACK: That was a while ago. Here we are in 2021; are – do you think these images are still up?

KATELYN: Well, actually, when you had brought up the subject of Kik with me, I was just curious; there was an account that I had used to infiltrate these picture-sharing groups. I haven’t used it in a while. I haven’t checked it in a while. But it got my curiosity going and there was a court case about a year back where the creep that had been sharing the images was found guilty and sentenced and part of my job with Badass was helping the victim clean up the mess now that it was no longer needed as evidence. So, out of curiosity, I went to that old account and I know that I had sent out DMCAs to get these images removed and then I just kind of – well, I was busy and I never really double-checked and triple-checked to make sure they were removed. Much to my dismay, they’re still up. These images are found to have been illegal. It’s not like this is just a copyright violation or anything like that in their eyes. These are images of a teenager that are still up and available.

JACK: Well, specifically under eighteen.

KATELYN: Yes, specifically under eighteen.

JACK: This is really frustrating me. Kik has completely stopped removing revenge porn photos, too? Or not even respecting DMCA takedown requests? What the hell, Kik?

KATELYN: Well, the issue is that if Kik knows that these images are out there, they then are obligated to clean them up. If they don’t clean them up and they know about it, then they’re in big trouble. So, what they’re doing right now is they’re turning a blind eye to it. They’re ostriching; they’re sticking their head in the sand and pretending this isn’t a problem.

JACK: Yeah, good point there. If Kik doesn’t know this is going on in their chat app, that’s abhorrent. If they do know and they’re not doing anything about it, that’s also abhorrent. There’s no explaining the situation at this point.

KATELYN: The issue is that Kik doesn’t moderate, okay? So, there’s nobody sitting there watching any of this stuff happen. Unless they are being made aware of the issue, they don’t know it’s happening. They’re in big trouble for not moderating at all ‘cause they’re allowing these illegal images to still be on their platform. But the issue is if they know about it and they still don’t do anything, then they’re in much bigger trouble ‘cause then they’re knowingly doing it.

JACK: What’s so frustrating about all this to me is that Kik doesn’t seem to be held responsible for any of the problems that they’re allowing to happen.

KATELYN: That is actually kind of a safe haven. I’m gonna say something that’s gonna be extremely unpopular because – and even I don’t like saying it. That is that, Section 230, it makes it so that the platform itself is not responsible for what the users do on it. If a user is committing a crime with the platform, the platform is not liable. Until a method is found to incentivize platforms taking the initiative to delete these images and to make sure that their platforms are moderated, then there’s no reason that they feel they need to do it.

JACK: I’m getting really frustrated now. My palms are actually sweating. What does a person do when they get this frustrated? That’s it; I’m calling the authorities.

JP: [PHONE RINGING] Hi, Jack. How are you?

JACK: Hi. Thanks for taking my call. So, let’s start out; who are you and what do you do?

JP: My name is JP Rigaud. I serve as a Special Agent for the Ohio Attorney General Dave Yost Bureau of Criminal Investigation. I’ve been an agent since 2012.

JACK: Okay, so, I have a crime to report.

JP: Oh, wait, you may have to talk to the police department for that, not me, but go ahead.

JACK: Okay, well, there has been quite a lot of child [00:55:00] sexual abuse material on Kik recently and I don’t think Kik is moderating this.

JP: We’ve wanted to – it probably takes up a significant portion of not just, I think, the time of the agents that I serve with but the agencies that we help out. Again, it’s – Jack, it’s fairly common that we are, in a sense, tackling these. But as you know, it’s the World Wide Web; it’s huge. It’s well beyond what any agency themselves can take on.

JACK: JP went on to explain to me that whenever they receive reports of people trading child porn, they take it very seriously and investigate the person and make arrests whenever they find evidence. The reason why I called him is because he’s dealt with this exact case a bunch of times. He’s dealt with child porn issues in the past and arrested people. So, it’s true; they do take this seriously. As we heard earlier, there’s no shortage of news stories about people being arrested for trading child porn on Kik, which is good. Those people should be stopped, but I feel like the common part of this story is that Kik is seemingly permitting this type of activity. Is there some kind of legal trouble that they can get into for allowing this?

JP: That’s a good question. I don’t know. It’s probably a little bit too big of a – I want to say something for me to put my hands around and really, I would need to really pull in some minds in the sense of prosecution, some legal perspective on – I’m not disagreeing with you that they should be about the work of making this a – how do I want to say it – a safer place not just for kids but for all of its users. But I don’t know that I could even speculate how that would happen. But when I hear you say what can we do, just keep reporting. I know NCMEC has been a fantastic organization. In a sense, many times you see these reports of the arrests and whatnot but many times it starts from them. In their history too, they were and they still are about the business of trying to protect and serve our communities in the sense of no, they’re not law enforcement but they have that drive to see something resolved.

JACK: Okay, so his suggestion is to report this to NCMEC which is spelled N-C-M-E-C. This is the National Center for Missing and Exploited Children. They run something called the Cyber Tip Line. The Cyber Tip Line was created by Congress to process reports of child sexual exploitation and then take these reports and help enrich those databases of known child porn hashes as well as reporting it to law enforcement. This is probably where that PhotoDNA service gets its hashes from. The Cyber Tip Line puts out yearly reports on what it sees and in 2020 it received 21 million reports of child sexual abuse material. Wow, that’s a lot. That poor person who has to go through all of those reports must have no hope for humanity. But I’m digging into this report to learn more. There’s two groups of people who report to the Cyber Tip Line. The first is just regular people like you and me. If you see something, you can say something and tell the Cyber Tip Line.

But in 2020 there were only 300,000 reports by the public, so where’d those other 20 million cases come from? The companies themselves who saw the explicit material on their platform self-reported. For instance, in 2020, Facebook, the company, reported 21 million times that they saw child porn on their site. Google reported that they saw it 500,000 times. Imgur.com also reported 31,000 instances of child porn to the Cyber Tip Line. So, how many times did Kik report child porn to the Cyber Tip Line in 2020? Well, if I search for Kik in the report, it’s not present. But if I search for their parent company, MediaLab, it is there and it says they reported 14,000 times to the Cyber Tip Line in 2020. But it’s not clear if that was for Yik Yak, Whisper, or Kik since they own all these apps. I asked NCMEC to clarify how many were for Kik specifically but they refused to provide extra detail. They’re super busy anyway. So, it seems like MediaLab is doing something but it just seems like the bare minimum here, like just enough to stay out of trouble or to be able to say in court yeah, we have filters in place and are actively reporting things to the Cyber Tip Line.

But they can do so much more. They’re letting so much go through without any consequences to the users trading it. But are they in violation of any laws for permitting this activity? [01:00:00] That part I don’t know. I’m not a lawyer, but I do wonder about COPPA laws. This is the Children’s Online Privacy Protection Act and it was put in place to safeguard the data of children who are under thirteen. Now, in the terms of service on Kik, it explicitly says that anyone who is under thirteen is not allowed to use the service, but is that good enough? I mean, we clearly know there are kids under thirteen on Kik, but does Kik know that, too? In 2019, TikTok was under scrutiny for violating COPPA laws. They were illegally collecting personal information from children under thirteen and they agreed to settle a lawsuit and paid 5.7 million dollars as a result of this allegation. Surely if TikTok has been found to violate COPPA laws, then Kik must certainly be violating them too, right? How do you get the FTC to bring a suit against someone? I don’t know. I suppose you’d have to start with evidence; screenshots, pictures, videos, chat messages.

But there’s no way I’m going in there and collecting that. You kidding me? I know I don’t have what it takes. I think what we have to rely on is some watchdog group or government entity seeing the prolific problem that Kik is allowing and trying to bring them to court to prove what, though? That they’re negligent at fighting child porn on their app or even obligated to do that? Section 230 says the app itself isn’t responsible for something that the user does which might be illegal. It might be really hard to prove anything here, so if a group brings this to court, good luck. But that wouldn’t be the first time Kik would face legal problems. You already heard about their SEC lawsuit. Well, they lost that case which resulted in Kik having to pay a five-million-dollar fine. Or, I guess the Kin Foundation had to pay that. But even before that lawsuit, the parent company to BlackBerry issued a lawsuit against Kik saying they were infringing upon some BlackBerry copyright. Kik settled that lawsuit and paid an unknown amount, so you can see that they haven’t been the cleanest company legal-wise, but that’s just with Kik Interactive.

What about MediaLab AI, the company that owns Kik now? Well, looking at PACER records, I do see a handful of lawsuits against MediaLab. One was a teenager who claimed she was sexually harassed on Kik and she was suing simply because the app didn’t warn parents clearly enough that pedophiles were active on this app. That case got dismissed. There was another lawsuit; it looks like WorldstarHipHop is saying that MediaLab published a copyrighted video. That one’s still going on, but there’s one more that’s pretty interesting. MediaLab owns Whisper, which is a chat app where, I guess, you can share secrets with others anonymously. Apparently in March 2020, a security researcher found that the Whisper database was sitting right there on the internet without a password. This left a lot of people exposed; ages and locations of users were leaked. If you looked at this database, there were 1.3 million users on Whisper who were fifteen and under. In total there were 900 million user records in this database breach.

So, this resulted in a class action lawsuit where people were saying they were suffering damages from getting their data exposed like this. It should have been more secure. MediaLab did reach a settlement agreement and paid the victims for this but we don’t know how much was in that settlement. The victims were seeking five million dollars but it just doesn’t say what was agreed on. Oh, and if you read the app reviews for Whisper, it also looks like that app is not doing well, either. It seems like there’s lots of scammers on the app now and maybe even prostitutes. Users are reporting there’s just not good moderation taking place there. Yeah, it doesn’t seem like MediaLab is the cleanest company legal-wise either, and apparently not the most secure because of their Whisper database breach. But maybe there’s the Parler option. Remember Parler, that right wing social media app? Well, after the insurrection on the US Capitol in January of this year, Google and Apple told Parler they must moderate their content or they’ll be removed from the app stores. That was the line they drew; they had to moderate what was going on in the app.

Parler didn’t moderate their content and so, they got kicked out. Perhaps Google or Apple should do the same with Kik. Say hey, you’ve got a real bad child porn problem going on here and you’re not working hard enough to fix it; either moderate or get kicked out. By the way, this wouldn’t be the first time Kik would be kicked out of an app store. They were first kicked out of the BlackBerry app store after their lawsuit with BlackBerry. BlackBerry really didn’t like them after that and they just removed them from the BlackBerry app store. Kik was also removed from the Windows Mobile app store. This one looks to be more voluntary, though. Kik was just done updating their Windows version and they’re just like, we’re just done with this. So, that’s two app stores they’re gone from, but they’re still present and available in two of the biggest; Apple and Google.

I think Google and Apple here have solid grounds to throw Kik out of the app stores. I mean, it’s pretty easy to go into Kik and see for yourself all the child porn that exists in there. You might have [01:05:00] to trade some child porn in order to get access to all the rooms, but once you do, you’ll be able to see it for yourself. But is that going too far, ganging up on Kik and kicking them out? Yeah, it’s hard to know what’s right and wrong these days, but let’s do a thought experiment. Suppose this app had only one use which was to spread child porn. If 100% of their users were there just for that, would anyone have a problem kicking them out of the app store then? The question becomes how many bad users or violations does the app have to have before it can be considered for removal from the app stores?

You know, when a bar that serves alcohol has too many reports of disorderly conduct, they can get their liquor license revoked, or if the bar doesn’t prevent illegal activities going on in the bar, they can also have their liquor license revoked. But there isn’t any clear rule like that with social media apps. The only way to get your app revoked is to have it removed from the app stores which are just private companies, and that’s putting a lot of burden on the app stores to figure out who should stay and who should go. It also gives them a lot of power, that they can just remove apps whenever they feel like it. But looking at the Google Play Store, there are clear guidelines that all apps must follow to be listed in their app store. Here, let me show you what they say.

HOST4: Before submitting your app, ask yourself if it’s appropriate for Google Play and compliant with local laws. Our restricted content policies cover a broad set of topics. Child endangerment is never acceptable on Google Play. If an account is found to violate these policies, we’ll take action, including reporting the account to the appropriate authorities.

JACK: Okay, so if the app is endangering children, then it’s not acceptable in the Google Play Store. But that’s not the intent or purpose of Kik. Any chat app can endanger a child, so I guess it comes down to what the users are doing in the app.

HOST4: User-generated content hosted in your app must meet certain requirements including the implementation and use of content moderation and reporting systems.

JACK: A-ha, there it is; user-generated content must be moderated or you can be kicked out. So yeah, that’s where I’m putting my finger, right there on that violation. But I think it still would be pretty hard to prove you aren’t moderating, because what does that even look like or mean? Blocking curse words in chat room names could be considered moderating and they do that right now, so what do they really need to do or change here and how would they be able to prove that to Google if Google was mandating they do this? It’s really tricky stuff. Also, don’t just take my word for this. Try this; Google the word Kik, K-I-K. Then click the News tab on Google so you see all news articles about Kik.

I guarantee you the first three, maybe even ten pages of results are all about child porn on Kik. If all the news articles about your company are talking about child porn on your app, then that should be clear evidence that your company has a very bad child porn problem. Other major publications have highlighted this problem, too. There’s a Forbes article which is titled The $1 Billion App Kik Can’t Kick its Child Exploitation Problem. That article was written in 2017, way before MediaLab even bought them, so this isn’t a new problem for Kik. There’s also a New York Times article which is titled The Wild, Popular App Kik Offers Teenagers and Predators Anonymity, and that article is about how two college guys coerced a thirteen-year-old on the app to meet them in real life, and then they killed her. If you Google other chat apps like WhatsApp, WeChat, or Signal, you simply don’t see articles like this; you hardly see any articles about child porn at all.

So again, this is more proof of how rampant the child porn problem has become on Kik. I want someone to investigate this further, to really know what’s going on, someone who has the teeth, who can actually make something change from all this. All I can do is underline and highlight that this is a problem, but nothing is going to actually happen from me talking about it on this episode. It’ll be someone else, someone who actually has the power to do something, that will actually make a change. [MUSIC] That brings us back to Azrael. Azrael is the hero you didn’t know you needed. Remember last we heard he was just getting into using modified Kik clients to take over chat rooms? Yeah, well, he was doing good with that Kik clan and then he joined a new clan called Xensec. Now, because he was the new guy, they wanted him to prove himself.

AZRAEL: What they wanted me to do was they wanted me to either bait somebody into proving themselves – proving to us that they’re a pedo or they wanted me to grab a pedo’s IP and basically [01:10:00] get all their ISP information and all that stuff.

JACK: Okay, whoa, whoa, whoa, this is the first time we’ve heard the word ‘pedo’ on here, on this interview, so was this the first time you’re experiencing that there’s pedophilia on – or…

AZRAEL: It was the first time that I saw that pedophilia was a large-scale problem, yeah. Xensec focused solely on the pedophilia problem.

JACK: So, okay, so when they said that to you, you must have been like, the what problem? How am I supposed to find these guys?

AZRAEL: Kinda, yeah. It was like, alright, so how do I get a pedo to do this? At the time, I had a stanza that – I plug in my link to the stanza and it basically shows up on the receiving end as an invisible picture. Then they’re like, why isn’t this loading? They click it; I have their IP, get all their ISP information. So, that’s what I used. Basically what happened was one of our other members came in and dropped an @ of a known pedophile. I was like…

JACK: You dropped a what? An app? An @?

AZRAEL: An @, so, your…

JACK: Oh, so it’s just your username.

AZRAEL: …profile – yeah, dropped their username. I was like, you know what? Fuck it. Not doing anything. Let’s get this dude. [MUSIC] So, I dropped the stanza and then I said something to make it so he would open the chat.

JACK: Okay, so a classic phishing scheme. He said something that he thought a pedophile would be interested in to get them to read his message. But that stanza of code he put in the private chat would collect more information on this person if they read the message. It basically tells the Kik client to reach out to a website for something, and when the user’s app goes to that website, Azrael can see what IP just went there, and that IP can tell him roughly what city and area this user is in. While you can use a VPN with Kik, I’ve been told that Kik just doesn’t work very well over VPNs.

AZRAEL: Exactly. Let me tell you, dude, pedophiles will click anything.
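
[The stanza trick Azrael describes is essentially a tracking pixel. Here’s a minimal, hypothetical sketch of what the server side of something like that could look like, in plain Python; it is not Azrael’s actual tooling, and nothing here comes from Kik itself, whose stanza format isn’t detailed in this episode. The sketch just serves an invisible 1x1 image and logs the IP address of whoever loads it.]

# Minimal sketch of an "invisible picture" IP logger (a tracking pixel).
# Hypothetical example only: the host, port, and path are made up, and this
# is not the real tooling described in the episode.
import base64
import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

# A 1x1 transparent GIF, base64-encoded: the classic tracking pixel.
PIXEL = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The requester's IP is just the other end of the TCP connection.
        ip, _port = self.client_address
        print(f"{datetime.datetime.now().isoformat()} hit from {ip} path={self.path}")
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    # Anyone whose client loads http://<your-host>:8080/pixel.gif shows up in the log.
    HTTPServer(("0.0.0.0", 8080), PixelHandler).serve_forever()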

JACK: So, he collects this extra information on the suspected pedophile and then reports all of what he finds to the admins of this new clan that he’s trying to get into.

AZRAEL: The admin’s like oh, you didn’t even tell us you were doing that. That’s great. You’re one of us.

JACK: So, he was now in a Kik clan that specifically targets pedophiles and child-porn traders.

AZRAEL: When I say pedophile, especially on Kik, I mean people that are sharing straight-up mega links to terabytes of CP. It’s just, it literally makes me sick to my stomach to think that this is out there. You know what I mean? People are exploiting children and people are paying for it and sharing it on a child’s app.

JACK: [MUSIC] I’ve been the guest of honor in this clan. I got to go in and check it out, and it’s wild. They’ve got bots that are there collecting all kinds of information, and any time the clan finds a person involved with this kinda stuff, or a chat room where it’s happening, they just feed it to the bot. The bot keeps a list of all the bad actors and chat rooms, and when the clan is ready to do a raid, they know exactly what channel to hit. Then the people in the clan will get their souped-up versions of Kik and go raid different chats in an attempt to take over the chat room and close it down. They all pile into a channel and then start spamming it.
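
[The bot Jack describes boils down to a shared watchlist: members report a suspect username or room, the bot records it, and the clan pulls the most-reported targets when planning a raid. Here’s a rough, hypothetical Python sketch of that data structure; the names (Report, Watchlist, the example handles) are made up, and it is not the clan’s real bot.]

# Rough sketch of a shared "watchlist" like the one the bot keeps.
# Hypothetical names throughout; illustration only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Report:
    kind: str            # "user" or "room"
    handle: str          # the @username or room tag that was reported
    reported_by: str     # which clan member filed the report
    note: str = ""       # free-form context
    when: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class Watchlist:
    def __init__(self) -> None:
        self._reports: dict[str, list[Report]] = {}

    def add(self, report: Report) -> None:
        # Keep every report; repeat reports on one handle raise its priority.
        self._reports.setdefault(report.handle.lower(), []).append(report)

    def targets(self, kind: str = "room") -> list[str]:
        # Handles of the given kind, most-reported first.
        matching = {
            handle: reports
            for handle, reports in self._reports.items()
            if any(r.kind == kind for r in reports)
        }
        return sorted(matching, key=lambda h: len(matching[h]), reverse=True)

# Usage: a member drops an @, the bot records it, a raid pulls the queue.
wl = Watchlist()
wl.add(Report(kind="user", handle="@reported_user", reported_by="member1"))
wl.add(Report(kind="room", handle="#reported_room", reported_by="member2"))
print(wl.targets("room"))   # -> ['#reported_room']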

AZRAEL: So, there’s two different types of spam codes. There’s a QOS attack code which is literally just as much as you can pack into a single chat bubble. Then there is crash codes. Crash codes rely on actual zero-day exploits within the Kik system which will force-close your app if you look at the code.

JACK: Now, they could use some tricks to try to take over the chat and close it down, or they could just try scaring people.

AZRAEL: We rely honestly heavily on the fear factor. If somebody comes in and just spams your room and you don’t know what’s happening and their account looks scary and the bot they’re using looks scary, you’re just going to assume that person is scary. If you are forced to exit the chat, how did they do that? How am I gonna join back? What happens if I join back? You know what I mean?

JACK: So, you’ve gone in and spammed lots of pedo rooms just – how many do you think your crew has spammed at this point, [01:15:00] or raided?

AZRAEL: We have hundreds. I personally have only raided ten or so myself. I prefer phishing at this point, taking accounts that way, because if I log into an account, I am taking that account out of play. If he owns rooms, I can purge them. If he’s victimizing a child, I’ve cut contact. You know what I mean? But we have hundreds of rooms that people who are better at spamming and attacking have taken.

JACK: I want to remind you that all this is done on mobile devices since Kik is just a mobile app. How many phones do you have?

AZRAEL: One.

JACK: You’re doing all this on one phone?

AZRAEL: Yes.

JACK: With multiple Kik apps installed?

AZRAEL: I have – let me count my Kik apps for you. One, two, three, four, five, six, seven. I have seven Kik accounts…

JACK: So, this must be…

AZRAEL: …on this one phone.

JACK: …each is a different APK ‘cause I don’t think you can have the same APK installed twice, you know…

AZRAEL: Yes. Yes, so I have seven different APKs.

JACK: So you have to multi-task on your phone, like oh, let’s go to all these different Kik apps.

AZRAEL: Yep.

JACK: Oh man, that just must be like – that must be an evening.

AZRAEL: It’s a lot to do all at once and every single time I raid, my heart is racing. It’s such a rush ‘cause there’s – you’re doing so much all at once and if you fuck up one little thing, your raid stops.

JACK: Now, Doc, the guy trading this stuff on Kik, has seen these raids in his chat rooms.

DOC: Yeah, multiple times.

JACK: Of course, he finds it annoying.

DOC: Usually it’s not that effective but I have seen sometimes where one person comes in, starts talking, being nice, and then they realize there’s child pornography in this room and they start sending very, very hateful messages almost as if a bot takes over, because they send it to a lot of people very, very quickly. Then they start spamming messages, often just some long copy-paste message they send over and over and over again. They get banned by the admin but then later, they join again. They shouldn’t be able to join because they’re banned but they do, and then they start spamming again. Then another person comes in and starts spamming, and then another person comes in and starts spamming. Then there’s usually three or four people at least that keep spamming and no matter what you do as an administrator, you cannot get them out. They keep being able to join for some reason, and then people start leaving because it’s annoying. Because even if you block this person yourself, you will still see that a message is being hidden from you, so you will still see these – ‘this person has been blocked’, ‘this person has been blocked’ filling up your screen over and over again, so the rooms kinda die from that.

JACK: Has your account ever been crashed? Does your app crash sometimes or hang or you just can’t type anything anymore?

DOC: My app didn’t crash but I know that a lot of peoples’ apps did. I think I was just generally using a good phone. People on iPhones usually had their apps crash.

JACK: What a weird battle this is.

AZRAEL: In my clan right now, we have literally thousands of chats, hundreds of accounts that need to be taken down. There’s only twenty of us. We can only do so much. It’s gotten to the point where I literally have nightmares about a child being victimized because I didn’t take out that chat or that account.

JACK: I thought it was ridiculous to talk about chat-raiding clans at first. Then, I guess I still think it’s ridiculous. Like, what does this even matter, right? But now it looks like Azrael is a vigilante of some kind. He uses his raiding skills for good. He cares about combating child porn so much that he feels like it’s his duty to do something. He can’t just stop when he knows it’s constantly going on and no one is stepping up to fix this problem. Now, Azrael’s clan has of course reported numerous people and chat rooms to Kik. They say for the most part, Kik does not take action. If they do take action, it’s typically months later and then they might see a ban or a [01:20:00] closed channel. So, they’re doing their part by telling Kik about this problem.

AZRAEL: Recently I tried to – so, I’ll grab somebody’s ISP information, run that through some tools to get as much information as I possibly can, and then I will drop that to the Federal Bureau of Investigation’s Cyber Crime Division, which is absolutely scary to me because cyber-crimes. I’m firmly grey hat, so I could get myself in trouble.

JACK: Yeah, I mean, what he’s doing is phishing people and doxxing them and he’s using a hacked Kik client that goes against the terms of use on Kik, too. I bet if Kik knew about the clan or Azrael, they would immediately ban them from the app because they’re breaking the terms of use, which is just ridiculous. Yes, the power they have can be used for something awful, but they’re using these powers to fight something much more awful on Kik.

DOC: At a point, I kinda realized what I was doing and how bad it was, and I – well, really, I freaked out and I didn’t sleep for like, two whole days. In my sleep-deprived state, I sent an e-mail to you which probably looked kinda panicked in a way. It was, and I was – I needed a voice of reason, someone I trusted who knew sort of the inside of how different IT security things work and who knew the potential of me having the contacts I did, but also the danger of having the items. I was being – I’ve been told by you that I should delete them.

JACK: Yeah, I said why don’t you just delete everything? Uninstall the app and stop?

DOC: Yeah. In my power state, in the state of having that much power, it was tough. It was very, very hard to do because it felt wrong. I had worked very, very hard to get more than 8,000 items, items that I had catalogued in my head, so I knew them by heart. I knew so many things that I had, all these things, and people look up to me. I was being told to delete them, to throw away all this power. It took me three more days and I talked to a friend of mine whom I really, really trust about it as well. I ended up deleting it. I deleted everything I had.

JACK: You think someone should just go shut this thing down like they did with Parler?

DOC: I want Kik to shut down but I also want the police to actually utilize what is right in front of them. Kik is such an easy place to get – to spot and catch predators, but they aren’t being caught in there. Things are running rampant and they could use this tool at their disposal, but they’re not. While I want it to close, I also want it to be exploited by the police, so to speak.

JACK: You might be wondering if maybe this is some sort of honeypot set up by the police. Well, that’s actually happened a few times. There’s a Forbes article about a guy who was arrested for trading child porn on Kik. The police commandeered his phone and got access to his Kik account, but they didn’t delete it or shut it down. They made a deal with this child porn trader to keep trading on Kik so they could collect information on other Kik pedophiles. The guy made the deal and stayed on Kik trading child porn while the FBI watched over his shoulder. It’s weird and creepy as hell that this happened, especially because this Forbes article says it’s not clear if the FBI actually caught anyone else from this operation. What’s more is that Forbes asked Kik about this operation and Kik had no idea there was an undercover operation going on in their chat app. So, this tells me Kik wasn’t working with the FBI on this one. All this is to say that yeah, while the police may be using the app to do some sort of sting operations, I don’t think there’s any major coordination between Kik and the FBI to conduct lots of sting operations or anything. If this is some sort of honeypot, it’s permitting quite a lot of vile stuff to slip through the cracks.

KATELYN: I absolutely do believe that Kik should be held to the same standards as every other social media platform, including Parler. If they’re refusing to moderate and refusing to remove illegal content, [01:25:00] then they don’t have a place among the app stores if that’s the way that they want to do business.

AZRAEL: I have really mixed emotions about that. This is my hacker origin story. I came from Kik. Everything I know came from Kik. To have it shut down would hurt me but after everything I’ve seen and everything I’ve experienced, I’m scared that there’s really no other way. I also strongly dislike the idea of the powers that be just flipping a switch. You know what I mean?

JACK: No, I don’t. Why would that be a thing? I think it would shut down this whole system and that’d be great.

AZRAEL: It would, but then that’s a scary line to cross, you know? With Parler, I never got on Parler, so I can’t exactly say they didn’t deserve it. But I don’t think anybody deserves it. You know what I mean? Nobody should be able to just flip a switch and just end things but in terms of Kik, I think that might be the only way.

(OUTRO): [OUTRO MUSIC] Thank you Azrael and Doc for sharing your very personal stories with us. Also, thank you to Katelyn Bowden and JP Rigaud for lending your voice and being part of this, too. If you ever encounter child porn yourself, please report it to the Cyber Tip Line which you can find at cybertipline.org. If you’re a listener all caught up and can’t wait for more episodes, then you must find this show valuable, so please consider donating to the show on Patreon. This will tell me loud and clear that you love it and want more of it, and it’ll give me the means to keep it going. So please head over to patreon.com/darknetdiaries and show your support. Thanks. This show is made by me, the leader of the operators, Jack Rhysider. Research and fact-checking by the disciple Sean Summers, editing help this episode by the pack leader Damienne, and our theme music is by the rust devil, Breakmaster Cylinder. Even though the NSA is one of the few government organizations that actually listens to you, this is Darknet Diaries.

[OUTRO MUSIC ENDS]

[END OF RECORDING]

Transcription performed by LeahTranscribes