Grand Canyon Times

Friday, May 17, 2024

Grand Canyon Times Podcast: Facebook Whistleblower Ryan Hartwig

Phoenix-area resident Ryan Hartwig, who gained national attention for "hidden camera" footage about Facebook, joined the Grand Canyon Times Podcast. 

This episode is also available on Apple Podcasts and Spotify.

In 2020, Hartwig released hidden camera footage from his work as a “content moderator” for Facebook while working at the Phoenix offices of Cognizant. That footage, released through Project Veritas, featured conversations with co-workers and “documents showing Facebook’s policies and patterns of bias,” said Hartwig.

Hartwig also co-authored the 2021 book, “Behind the Mask of Facebook: A Whistleblower’s Shocking Story of Big Tech Bias and Censorship.”

He recently starred in the movie “Police State,” by filmmaker Dinesh D’Souza, in which Hartwig is interviewed about censorship by tech companies. 


Full transcript of this podcast:

[00:00:00] Leyla Gulen: Welcome to the Grand Canyon Times podcast. I'm your host, Leyla Gulen. In this episode, we welcome our guest, Ryan Hartwig. Ryan became one of those people most of us never know exists: the mysterious, faceless, nameless wizard behind the curtain, if you will, moderating your social media posts.

[00:00:22] Leyla Gulen: Specifically, Facebook, for whom he worked through a third party contractor, as a content moderator for nearly two years. That is, until he peeled back his own curtain with his 2021 book, Behind the Mask of Facebook: A Whistleblower's Shocking Story of Big Tech Bias and Censorship. Ryan, welcome. Thank you. Thanks for having me on.

[00:00:44] Leyla Gulen: You have one of the most, I think, interesting jobs. Being a social media moderator, what was it like having access to accounts and seeing things that you often can't unsee? 

[00:00:57] Ryan Hartwig: It's definitely, definitely a very unique [00:01:00] job in that, yeah, we're seeing things that most people have never seen before in their lives.

[00:01:05] Ryan Hartwig: The worst of the worst on the internet. We were reviewing cartel videos occasionally, pornographic content. Yeah, it was really absurd and horrible content. We also had the light-hearted content, the memes and the political content. So it was kind of a mixed bag, because one moment you'd be reviewing a cartel video, the next moment you'd be reviewing a funny meme.

[00:01:25] Ryan Hartwig: So it was quite the experience.

[00:01:27] Leyla Gulen: Sounds like a rollercoaster of emotions.

[00:01:30] Ryan Hartwig: Yeah, no, definitely, because you're dealing with stuff that's traumatic, and you're trying to keep it all together, and we had counselors on site. Most of us were pretty young; like, I was in my late 20s, most of the people there were in their early 20s, so people with not much life experience, but we're seeing some of the craziest stuff on the internet.

[00:01:49] Leyla Gulen: Yeah, I want to dig into that a little bit in a moment, but as a moderator, to what extent did you have access to a person's or a group's account? Were these private accounts, direct messages [00:02:00] even? I think people who use these social media platforms are always so curious as to who the eyeballs are behind the scenes whenever they type a word or post a picture.

[00:02:15] Ryan Hartwig: Yeah, so the access we had was mainly limited. Occasionally we had access to direct messages, but normally it was just posts, comments, and videos. And I couldn't just delete someone's account; I would be deleting the content itself. So as far as whether someone's whole account would be taken down, I didn't control that, I would just delete one post at a time. Content would just pop up on my screen; I didn't choose who I reviewed, I couldn't target people or search accounts. I could just make a determination on whether the content violated, and I had to follow a very, uh, wordy Facebook [00:03:00] Community Standards policy and internal documents.

[00:03:04] Leyla Gulen: Yeah, it'd be very easy to go down a rabbit hole, I think, with what you did over at Cognizant. And that was the tech company, the third party company that was contracted through Facebook. How did that arrangement happen? 

[00:03:16] Ryan Hartwig: Yeah, so that arrangement, it was a two-year, $300 million contract. And Cognizant normally isn't in that space, and it's a very unique space.

[00:03:24] Ryan Hartwig: But basically what happened is, after 2016, and this is what our superiors would tell us, after 2016, Facebook realized that Russia stole the election, or influenced the election, and so they wanted to prevent election interference. So there was a big push in 2016-2017 to bring a lot of these content moderation jobs onshore.

[00:03:45] Ryan Hartwig: So prior to this, most of the content moderators were overseas, which makes sense, it's a lot more affordable. You know, a lot of the people overseas don't understand American politics. So there was a huge push to, yeah, hire U.S.-based content moderators. [00:04:00] And so that's when that big hiring push started in 2017.

[00:04:03] Ryan Hartwig: So it was, yeah, a three-year contract from 2017 to 2020 with Cognizant. And I think at its peak we probably had at least a thousand content moderators at our site here in Phoenix, Arizona, off the I-17 and Dunlap. So that's basically how it started. And they were hiring, we were making 15 bucks an hour.

[00:04:24] Ryan Hartwig: It was close to where I lived, so it was convenient. But yeah, that's kind of how they got the contract. 

[00:04:29] Leyla Gulen: I mean, $15 an hour to have to sift through some of the worst things that you couldn't even make up in your own mind. Now, I know that this company has come under a lot of fire for working conditions, both in the Phoenix location and also in Tampa, where people were suffering from post-traumatic stress disorder, one person had a heart attack. I mean, were working conditions dire, in your opinion?

[00:04:58] Ryan Hartwig: So, in my opinion, [00:05:00] they did try, at least at my site. I think Tampa may have been worse, but in Phoenix we had the counselors on site, we had counseling we could call on. As for the workplace itself, some people imagine it being like working in a dungeon, a lot of black walls, just sitting in front of our screens all day.

[00:05:16] Ryan Hartwig: And Casey Newton with The Verge did an article about that, and he came to our site. But you know, as far as the workplace itself, it was pretty cheerful as far as the decor, and we had a break room with snacks, and you could literally take a break anytime you wanted, like a mental health wellness break.

[00:05:34] Ryan Hartwig: So I felt pretty well taken care of in that regard. I utilized the counselors pretty often. It was stressful.

[00:05:41] Leyla Gulen: A lot of them, though, kind of know what they're getting, what they're signing up for too, right?

[00:05:46] Ryan Hartwig: Yeah, they let you know. I mean, in the job interview itself, they tell you, they give you examples of what you might see. They put a laptop in front of me and showed me examples of, like, anime porn, [00:06:00] very graphic pornographic content, and a cartel video, I believe.

[00:06:05] Ryan Hartwig: And they said, hey, this is what you're going to be seeing on a daily basis. And I asked them, can I skip things that are too disturbing? And they're like, no. So yeah, we knew what we were getting into.

[00:06:15] Leyla Gulen: That's inherent to the job. The whole point is to not skip and to delete, right?

[00:06:20] Ryan Hartwig: Right, when you review it, you have to watch the video. So if you have a video where someone's arms are literally being cut off and they're being tortured by the cartel, you have to review the thumbnails and watch the first 15 seconds of the video with audio.

[00:06:36] Ryan Hartwig: And if there's nudity in the video, that would be ranked higher than, like, a mutilated arm. Yeah, so that's why you have to watch the video, because if you deleted it for the severed arm, you would get it wrong if you can see their genitalia in the video.

[00:06:58] Leyla Gulen: Really? What's [00:07:00] the philosophy behind that? 

[00:07:02] Ryan Hartwig: I don't know. I don't know why they would care so much about the categorizing of that. 
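For readers trying to picture the review logic Hartwig describes, here is a minimal illustrative sketch in Python. The label names and their ordering are assumptions drawn from his account, not Facebook's actual policy tooling.

```python
# Illustrative sketch (not Facebook's actual rulebook): Hartwig describes a
# severity hierarchy where the highest-ranked violation found in a piece of
# content determines what gets logged, which is why the whole clip has to be
# reviewed. The label names and ordering below are assumptions for illustration.
SEVERITY = {
    "adult_nudity": 3,      # ranked above graphic violence, per his description
    "graphic_violence": 2,
    "benign": 1,
}

def violation_to_log(detected: list[str]) -> str:
    """Return the single highest-ranked violation a reviewer would record."""
    if not detected:
        return "benign"
    return max(detected, key=lambda label: SEVERITY.get(label, 0))

# A cartel video that also contains nudity is logged under nudity; deleting it
# for the severed arm alone would count as the "wrong" decision he mentions.
print(violation_to_log(["graphic_violence", "adult_nudity"]))  # adult_nudity
```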

[00:07:06] Leyla Gulen: I mean, it happens in video games all the time. Arms, heads, legs cut off, blood splattered everywhere. But then when it comes to the naked body, that's where they pump the brakes.

[00:07:19] Ryan Hartwig: Yeah, I don't know why it's higher in the hierarchy, but we did have that hierarchy to follow. And so, yeah, it was kind of like watching a Mortal Kombat video, but with real people. And yeah, it was pretty graphic.

[00:07:28] Leyla Gulen: That sounds awful. Well, you mentioned the cartel, and you dealt with the Latin American market.

[00:07:33] Leyla Gulen: You're fluent in Spanish. You also worked on North American content. So can you just talk a little bit about the differences between what was coming out of the two hemispheres?

[00:07:45] Ryan Hartwig: Yeah. So I had access to the training decks for both the North America and the Latin America queues. So for the first year and a half or so,

[00:07:54] Ryan Hartwig: I was mainly just in the Latin American queues. So I was seeing a lot of content in Venezuela and Peru. There was [00:08:00] a lot of hate speech because a lot of Venezuelans were moving to Peru, like emigrating, leaving Venezuela and going to Peru. A lot of them destitute, kind of this mass migration thing. So there was a lot of hate speech toward the Venezuelans.

[00:08:12] Ryan Hartwig: And then we had the Mexico election in 2018, the Mexican presidential election. So I saw a lot of content related to that. There was a lot of sexting or explicit kind of sexual messages between minors in Mexico, teenagers. And yeah, so, so we saw there was some Venezuela election content as well, because there was, in that time period, there was kind of an uprising, an attempted coup, I believe.

[00:08:38] Ryan Hartwig: Um, and we had training decks for countries across the world as far as elections. So we monitored and enforced election content in Canada, and there were training decks for Taiwan, Spain, Poland, Hungary. That's something that was kind of disturbing, how focused Facebook was on election content.

[00:08:59] Leyla Gulen: Yeah, that's really [00:09:00] interesting. And I want to get into that as well. Just as a starting example, I wanted you to describe how Katie Hobbs, uh, Governor of Arizona, colluded with Twitter and Facebook to censor her political opponents, and what legal cases are in progress on that end.

[00:09:21] Ryan Hartwig: So with Katie Hobbs, she obviously was secretary of state when she, uh, ran for governor. There is evidence, I guess, showing that she colluded with Facebook and Twitter, I believe, to censor her political opponents. And that's part of this larger case that we now have with Missouri versus Biden.

[00:09:40] Ryan Hartwig: Where we're seeing, we just had a court issue an injunction against the government saying, hey, you can't communicate with these social media companies anymore. So that's something that I don't have direct evidence of. I mainly just show that Facebook was censoring conservatives and targeting Trump. And they have an amazing ability to influence elections and [00:10:00] suppress conservatives.

[00:10:00] Ryan Hartwig: So, yeah, if you look at the sheer numbers, at our site in Phoenix, which is no longer active, the contract ended, we had maybe a thousand, let's just say a thousand content moderators, and if they're each reviewing 200 pieces of content a day, you can do the math, and you're looking at thousands and thousands of pieces of content being moderated.

[00:10:24] Ryan Hartwig: So that's, yeah, Facebook's ability to influence elections is tremendous. And it's even worse when you have government officials colluding with Facebook to target their political opponents. 
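To make the volume Hartwig gestures at concrete, here is a quick back-of-the-envelope calculation. The 1,000-moderator headcount and the 200 items per moderator per day are his rough figures from the interview, not official numbers.

```python
# Back-of-the-envelope throughput estimate using Hartwig's rough figures;
# both inputs are his approximations, not reported Facebook or Cognizant data.
moderators = 1_000           # his estimate of peak headcount at the Phoenix site
items_per_moderator = 200    # pieces of content he says each person reviewed daily

items_per_day = moderators * items_per_moderator
print(f"Items reviewed per day at one site: {items_per_day:,}")         # 200,000
print(f"Items reviewed per year at one site: {items_per_day * 365:,}")  # 73,000,000
```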

[00:10:36] Leyla Gulen: That's amazing. Yeah. In fact, in your book, you tell specific cases of personalities who are protected by Facebook or who had certain privileges, such as CNN's Don Lemon.

[00:10:48] Leyla Gulen: I mean, he's since been fired for other reasons, but I'm curious to know what other personalities were protected by Facebook. Obviously, you had mentioned left leaning, so we can kind of predict as to who some of those [00:11:00] names were, but can you tell us?

[00:11:02] Ryan Hartwig: Yeah. So with Don Lemon, he said white males are terror threats, and Facebook gave us the guidance that, yes, we know this goes in violation of our policies for hate speech, but we are making a newsworthy exception. So the rules don't apply whenever they don't want them to apply.

[00:11:20] Ryan Hartwig: And then Greta Thunberg, she was being attacked, being called retarded, that kind of word. And Facebook said, oh, you can't do that, you can't call her retarded. Even though for any other minor public figure, anyone under 18, you could call them retarded, but just not Greta.

[00:11:40] Ryan Hartwig: So they made a specific carve-out of their policies for that. And then Alyssa Milano, the actress, she was weighing in about abortion in Alabama. And she said men should not be allowed to make laws about women's bodies, and that violates Facebook's hate speech policy because it's excluding men from the [00:12:00] political process.

[00:12:01] Ryan Hartwig: And so Facebook said, hey, we're going to allow that, we're going to allow hate speech from Alyssa Milano. So if you're Don Lemon or Alyssa Milano or Greta Thunberg, you get special protections.

[00:12:11] Leyla Gulen: Now, okay, so it all comes back down to freedom of speech, the First Amendment. So obviously, once the horse has left the barn, that is it.

[00:12:22] Leyla Gulen: And that's exactly what Facebook and social media is. So, it's this thing that has just mushroomed and grown so exponentially. When you talk about a thousand moderators behind screens, moderating 200 pieces of content every single day, I mean, truly, Ryan, though, that's gotta be a drop in the bucket. How many members are on Facebook today?

[00:12:43] Ryan Hartwig: Yeah, I mean, you look at how many people are online, if you include WhatsApp, it's almost a third of the population of the world. Three billion.

[00:12:52] Leyla Gulen: All right. Yeah. How many people?

[00:12:54] Ryan Hartwig: Well, it's over, there's probably, you know, 9 or 10 billion people in the world now.

[00:12:57] Leyla Gulen: Yeah, exactly. Exactly. So [00:13:00] everyone listening can do the math, but yeah.

[00:13:04] Leyla Gulen: Now, where is the freedom of speech component to this? Because essentially this is supposed to be a platform, and I'm not sitting here saying you can just post whatever you want. I mean, you'd hope that people would be responsible in the things that they post, but you can't trust everybody, and they're going to do what they want to do.

[00:13:24] Leyla Gulen: But honestly, though, when it comes down to free speech that's obviously not really harmful and is obviously someone's opinion, how and why does that get flagged, deleted, locked, time and time again? And do you foresee that ever changing?

[00:13:43] Ryan Hartwig: Yeah, so there's a First Amendment issue to it, um, that's part of it. And, yeah, there's definitely algorithms and filters they use to, um, detect certain types of content and flag it without human intervention.

[00:13:59] Ryan Hartwig: [00:14:00] So I'm sure even now, especially with artificial intelligence and AI, they're using that even more to filter out certain content and hashtags that they don't want on their platforms. So yeah, they really, their whole argument was, Hey, sign up for free. This is a free speech platform. Mark Zuckerberg has said that before.

[00:14:17] Ryan Hartwig: It's not true. And they're able to shape the narrative, they're able to influence public discourse. And, you know, you think about three billion people, a third of the world's population, on this platform. These companies have way too much power, Facebook and Google.

[00:14:34] Ryan Hartwig: Just one example that comes to mind is when Trump gave his State of the Union speech in, I believe, 2019, Facebook told us to look for hate speech coming from the President of the United States. They're clearly not trying to help public discourse. There's bad speech, there's hate speech, I don't like the word hate speech, but there's people who say bad things on the internet, or [00:15:00] in real life, and if someone's walking on the sidewalk and they say bad things, we can't throw them in jail for that, but Facebook does that all the time. The answer to bad speech is more speech, so they're almost making the problem worse.

[00:15:13] Ryan Hartwig: If you look at the division in our country today, Facebook doesn't help our country at all as far as being united. So, you know, there's a First Amendment issue, and there's also a Fifth Amendment issue. The Fifth Amendment deals with due process. So if you deprive someone of liberty or property, say you have a Facebook page with a million followers, and you're a right wing conservative, and from one day to the next Facebook just decides to delete your page, I mean, you're being deprived of liberty and property.

[00:15:44] Ryan Hartwig: So a friend of mine, Jason Fyk, he had that happen, and it wasn't even political content. He had a Facebook page in 2016 with over 40 million followers, and he wasn't paying enough in advertising, so Facebook took his page down [00:16:00] and they sold it to his competitor. So who was his competitor?

[00:16:03] Ryan Hartwig: It was someone similar, who was paying more in advertising to Facebook. So he sued Facebook. His name is spelled F-Y-K, so it was Fyk versus Facebook. Since 2018, he's been suing them. He went to the Supreme Court; they chose not to hear the case. So he's never had his day in court, and now he's suing the government under a constitutional challenge.

[00:16:24] Ryan Hartwig: So those are, we've seen some First Amendment lawsuits against Twitter and Facebook. Like Trump had a lawsuit, a few other people. So that's one avenue, but now that we have the collusion between the government, like with the Missouri vs. Biden case, now we see, okay, Facebook is really just acting as a government agency, and they're being a state actor.

[00:16:43] Ryan Hartwig: So, the perfect example of this is earlier this year we had this landmark EPA case that came out about the same time as the Roe versus Wade case. The EPA, the Environmental Protection Agency, they [00:17:00] can't make laws, right? But the Supreme Court basically said, hey, you can't make up your own rules, you have to let Congress make the laws.

[00:17:07] Ryan Hartwig: Like, they were deciding limits on pollution levels and things like that. So really, that's what Facebook is becoming, a state agency, a government agency, and they're making their own rules as far as, okay, what should I take down? Should I take down Don Lemon?

[00:17:21] Ryan Hartwig: Should I take down Greta? They're making it up as they go. And if they're colluding with the government, and the government is telling them who to shut down, like the FBI is telling them who to shut down, who to censor, then the decisions about those rules should be put in place by Congress, not Facebook.

[00:17:39] Ryan Hartwig: And so that's one argument and that's some legal background on some of these cases.

[00:17:45] Leyla Gulen: That's so fascinating. And I want to talk also about your efforts that led to Mark Zuckerberg's criminal referral to the DOJ. Can we dig into that a bit?

[00:17:56] Ryan Hartwig: Yeah, of course. And so, you know, I talked about that criminal referral to [00:18:00] the DOJ in my book, Behind the Mask of Facebook, which you can find on Amazon or Barnes & Noble.

[00:18:05] Ryan Hartwig: Because my book has all this evidence of Facebook censoring conservatives, and Mark Zuckerberg, in April of 2018, testified that they do not censor political content. Those are his words when he was testifying at a subcommittee hearing. And so when I went public in June of 2020, I supplied this evidence to Congressman Matt Gaetz, and he submitted a criminal referral to the DOJ for Mark Zuckerberg, because Mark Zuckerberg allegedly lied to Congress by saying they do not censor political content.

[00:18:38] Ryan Hartwig: And the 300 pages of evidence in my book, Behind the Mask of Facebook, show the absolute opposite: every time Trump was going to say something, whether about immigrants or about South Africa, Facebook was telling us to be on alert and to watch for this content. And it wasn't just Trump, it was any kind of person on the right that [00:19:00] Facebook would smear or put on a list from the Southern Poverty Law Center about being a white supremacist.

[00:19:06] Ryan Hartwig: So if you're right wing, if you're Tommy Robinson in the UK, who's an activist, or if you're Faith Goldy, a Canadian activist, or Gavin McInnes, the founder of the Proud Boys, automatically you're on this hate figure list. And you're censored under the same policy; it's called Dangerous Individuals and Organizations.

[00:19:26] Ryan Hartwig: You're in the same category of policy as terrorists. So that's what the criminal referral to the DOJ was. I don't think anything came of it. But, you know, at least it's a step in the right direction, where we need to hold these individuals and organizations accountable.

[00:19:42] Ryan Hartwig: So I also helped with a criminal complaint and an FEC complaint against Facebook in Michigan. There was a race between John James and Gary Peters. John James was a Republican, and he was putting out ads against Gary Peters talking about the transgender issue, like transgender women, or men, in [00:20:00] sports, uh, high school sports and whatnot.

[00:20:04] Ryan Hartwig: And so Facebook was flagging and taking down these political ads because they were supposedly hate speech against trans people. So I worked with a legal organization in D.C., and they filed an FEC complaint using some of my evidence. I helped do that, and then also my evidence was used in Laura Loomer's lawsuit against Twitter and Facebook.

[00:20:26] Ryan Hartwig: It's more of a RICO lawsuit. And it's funny, because I was just watching this documentary about John Gotti and the mafia in the '80s, how they went after the mafia. And they were able to successfully prosecute them using RICO laws. And if you look at what Big Tech is doing, they're like a mafia organization.

[00:20:47] Ryan Hartwig: They're criminal organizations, because they're saying, hey, come on our platform, it's free. And then they steal your data, which maybe you could agree to, arguably. But then they change the terms of service. And then they say, [00:21:00] okay, you need to pay money to advertise.

[00:21:02] Ryan Hartwig: But they themselves are actually competing against their own users for advertising space. And then they're arbitrarily censoring conservatives and influencing elections. So if you think about a crime organization: if you're influencing elections, if you're enforcing election law, if you're targeting the president of the United States and trying to silence half the country, you'd be like, okay, that sounds like a pretty bad organization.

[00:21:31] Ryan Hartwig: What, what kind of organization could do that? And the answer is Facebook has been doing all that and they're still doing it. 

[00:21:37] Leyla Gulen: Wow. I mean, do you ever feel that there's going to be a day that they're going to get their comeuppance? That they're going to have to face the music, or is this just a beast that is unbeatable?

[00:21:55] Ryan Hartwig: I'm mildly optimistic. I think that we're making some progress with Jason Fyk's lawsuit, [00:22:00] where we can maybe fix the internet law or have it interpreted the correct way. But I think it's just a matter of getting people off of the platform, because even after learning all of this, people have known for years that Facebook censors conservatives, and a lot of conservatives are still on there.

[00:22:15] Ryan Hartwig: So we have made some progress in getting to other platforms like Rumble or Gab, and having these platforms that are blockchain-based, kind of decentralized networks, that the government can't just shut down. What was it, uh, right after January 6th, they shut down that one, it wasn't Truth Social, it was, uh, Gettr, I believe, they shut down one of those websites.

[00:22:36] Ryan Hartwig: Oh, right. Yep. So we need to have an internet that's decentralized, and this is vital not only for elections, but for human rights activists. So if you're in a country in the Middle East, if you're in Gaza, or you're reporting on war crimes, you need to be able to get the word out. And Facebook also was involved, like a few years ago in 2018, they were making [00:23:00] exceptions to make Israel look bad.

[00:23:03] Ryan Hartwig: So this one example I have from April of 2018, they made an exception for a video showing an Israeli sniper killing a Palestinian soldier. Normally you would delete that. But they made an exception for it because it was the Israeli killing the Palestinian. And so, yeah, Facebook is a dangerous organization if you're dealing with politics and elections and even human rights.

[00:23:31] Ryan Hartwig: They can shut down any, anything on their platform. So if someone, a human rights activist in Egypt is trying to get the word out about some atrocities, I mean, Facebook can willy nilly just censor anything they'd like. 

[00:23:43] Leyla Gulen: Yeah. How reliant on Facebook is the world population these days? I think it was such a novelty when it first came onto the scene, and everybody wanted to open up an account and have a profile and use it. And I understand it's really become [00:24:00] a tool for all kinds of things, for positive things as well, a way for organizations that don't have two pennies to rub together to get the word out about the good things that they're doing in the community.

[00:24:11] Leyla Gulen: So it's not all bad all the time. However, what you're saying basically is that people should just have an honest moment with themselves, decide they've had enough, delete their profile, move on, go outside, smell the roses, write a letter to their friends, and that's how we're going to start keeping in touch with everybody.

[00:24:34] Ryan Hartwig: Yeah, and there's a good argument for just how much time we spend in front of a screen. I mean, you look at just how unhealthy it is, and it's addictive. And just recently, 41 attorneys general filed a lawsuit against Facebook, basically saying, hey, this platform is being used against minors in a bad way.

[00:24:53] Ryan Hartwig: And yeah, it's just horrible for mental health. I mean, the more time you spend in front of a screen or on Instagram or Facebook, it's bad. As far as how widely Facebook is used, it's still a very powerful marketing tool for businesses. I don't know what percentage of the U.S. is on Facebook, but I would imagine at least half.

[00:25:11] Ryan Hartwig: So you've got like a hundred million people on Facebook, and if you're a business and you want to market, then obviously it's a no-brainer. So yeah, it's a great marketing tool. And Twitter's a great marketing tool as well for businesses. But I think Facebook more so.

[00:25:28] Ryan Hartwig: So if you've got the advertisers and you've got the money, then people are going to continue using it. Yeah, it's going to be hard to kind of pull people away from that.

[00:25:35] Leyla Gulen: Yeah, I gotta be honest though, I have put all of it down unless it's absolutely critical for work. And I'm telling you, Ryan, I feel like a new person.

[00:25:49] Leyla Gulen: Things that I just never cared about would inundate my feed and my mind and my heart throughout the day. [00:26:00] And it was such a bummer. And as soon as I just put it down, just started getting involved in other things, picked up hobbies that I'd put down a long time ago, I just feel so much lighter, so much better.

[00:26:14] Leyla Gulen: It's a tangible shift, I will say, in my mood.

[00:26:20] Ryan Hartwig: Yeah, I think that's important. I think it's really about mental health, and it's not just Facebook and social media, but that is a big part of it. But just the amount of time we spend in our email inboxes. And my Instagram, I had 4,000 followers on it, and it got deleted early this year for, like, sexual exploitation, which I don't understand, how I was trying to exploit people.

[00:26:39] Ryan Hartwig: So, yeah, I don't know why they banned it. Oh, I know why they banned it, 'cause they know who I am. But yeah, it was nice to go out and connect with people, but it was time consuming, because every time something's happening, you post a little photo here, and you add up those seconds during the day. And then my Twitter, with 35,000 [00:27:00] followers, got deleted after January 2021, and it hasn't been restored by Elon yet.

[00:27:06] Ryan Hartwig: So I'm back on Twitter a little bit with a few followers. And it is addictive, there's that little dopamine release, that little kick. But actually, even though I was forced off of Instagram, I've kind of enjoyed that little break.

[00:27:21] Leyla Gulen: Yeah. I mean, I think that's what people need to remember: it's not some club that you endeavor to belong to your entire life, or this legacy saying that it's a favor to you if you belong. I mean, truly, we're the ones; they need us. And then when they kick us out, they do us a favor, because it's so unhealthy in so many ways.

[00:27:48] Ryan Hartwig: Yeah, it really is not healthy. And I did an interview in Spain a couple weeks ago, and there was a company there that had a bunch of content moderators like myself, and they all did a mass walk-off, like, [00:28:00] I don't know, went on strike or whatever, because of the working conditions. And so I think a big part of this is the mental health, not only for the content moderator, but also for the users.

[00:28:08] Ryan Hartwig: Like, is it really healthy to be scrolling through a feed? My wife still uses Facebook, and I see her, like, scrolling through the feed sometimes with my brother-in-law. And yeah, Facebook has hired a lot of people, like psychologists and whatnot, to learn how to keep people on their platform, because the more time you're on their platform, the more money they make from their advertisers.

[00:28:32] Leyla Gulen: Yeah, well, that's the thing. They need us. We're dollar signs to them. Yeah. Each user is a dollar sign. I want to talk about what states can do at the local level to prevent Facebook from influencing elections.

[00:28:43] Ryan Hartwig: Yeah, that's a good question. So on a state level, for example, in Arizona, there are a lot of election laws, and we didn't get training about state-specific election laws when we were at Facebook enforcing election content. So if I go to the Schlotzsky's [00:29:00] deli here in Arizona on election day with my "I voted" sticker:

[00:29:05] Ryan Hartwig: If Schlotzsky's gives me a sandwich because, oh, you voted, here's a coupon, or here's a free sandwich, that violates state election law. And so, I mean, I think Facebook pretty much ignores state governments, unless you're Katie Hobbs and you want to censor your political opponents; then they'll talk to you.

[00:29:26] Ryan Hartwig: But why isn't there a local office for Facebook in each state where they're liaising with state election officials? Like Ken Bennett, he's the former Secretary of State; they could create an office and communicate with Facebook. But I don't think that exists.

[00:29:43] Ryan Hartwig: I know Facebook had like a national election content center where they're tracking elections, but why are they in that business to begin with? If I run across anything that's election related, Facebook should just ignore it across the board. I mean, it doesn't affect them. [00:30:00] There's no reason for them to be in there unless they're trying to influence the election.

[00:30:04] Ryan Hartwig: And so on a state level it's kind of hard, because you've got Section 230, which is passed by Congress. So on a local level, the legislature can maybe try to pass some kind of legislation. I know Texas and Florida have tried, not too successfully, to pass some kinds of laws preventing censorship and whatnot online.

[00:30:24] Ryan Hartwig: I know the legislature passed some anti-bullying legislation that made it easier to push back, like a misdemeanor if you're continually harassing someone or doxxing someone online. But what I'd like to see is more of, like, a free speech bill, something that would say, hey, you can't do this. It's tough, though, because Facebook is operating

[00:30:47] Ryan Hartwig: online in our states. And if you think about how the internet is handled locally: if you go online, on Amazon or any online store, and you're in Arizona and you buy something, there's a sales tax, [00:31:00] right? So Arizona gets a chunk of that. And it wasn't always like that; they passed a law.

[00:31:04] Ryan Hartwig: So now they can do that. Yeah, so if they can pass a law about how you use the internet, and Arizona gets its taxes, why can't they pass a law about how Facebook operates on the internet in Arizona, right?

[00:31:20] Leyla Gulen: Well, I think the answer is they can, they just don't, right?

[00:31:23] Ryan Hartwig: And I'll be honest, I haven't gotten very far with that. I mean, I know a few legislators, but the last couple of years they've just been focusing on election laws and haven't really gotten anything done. But I would like to see a little more push on Big Tech. Yeah, that would be great, just for people to be aware of, you know, why is this company out in California enforcing state election law?

[00:31:49] Leyla Gulen: Yeah, and you had mentioned Section 230, and before we go, I wanted to talk a bit about that, because Section 230 got a lot of [00:32:00] light, uh, shed on it just a couple of years ago; there was a lot of discussion about it. It was pretty bipartisan too, I think, as far as trying to wrap everyone's arms around what it meant, and that it wasn't supposed to be this kind of free-for-all for Big Tech to just do whatever they want without any consequences.

[00:32:23] Ryan Hartwig: Yeah. So Section 230 is largely misunderstood. That law from 1996, the Communications Decency Act, was basically about wanting online forums to be able to exist and not be sued out of oblivion. So it's basically saying, hey, let's say you're on a news site and you post a bunch of comments on a news article; that website's not responsible for those comments.

[00:32:47] Ryan Hartwig: So I can't sue the online newspaper because there's some comments from online users that are defamatory or whatever. So that's kind of the basis for Section [00:33:00] 230. And that was at the beginning of the internet; the internet was barely, like, a baby back then. Since then, the whole landscape of the internet has changed dramatically, but really what happened is the courts have misinterpreted Section 230.

[00:33:14] Ryan Hartwig: So there's two parts, Section 230(c)(1) and Section 230(c)(2). And one of those has that phrase, "in good faith." So they're supposed to remove content in good faith. And the courts have misinterpreted it and given these protections under (c), I believe it's (c)(1) or (c)(2), let's see here.

[00:33:39] Ryan Hartwig: Yeah, so they've kind of mixed it up and given the blanket protection to both, when it should only be protected under the paragraph that says "in good faith." So it's allowing them to delete anything they want, um, without the good faith. Because this says they can restrict access to anything that's obscene, lewd, [00:34:00] lascivious, filthy, excessively violent, harassing, or otherwise objectionable.

[00:34:03] Ryan Hartwig: Whether or not such material is constitutionally protected. So that's the thing about the First Amendment. It's like, oh, I can post whatever I want, free speech. Well, this says Congress gave them the authority to take stuff down, even if it's constitutionally protected. Yeah, so the courts have misinterpreted it, and we're trying to fix that.

[00:34:21] Ryan Hartwig: So I don't think writing new legislation is the right answer. If we try to write a new law now, the lobby would just have too much influence on it. If we can have the Supreme Court reinterpret the law the correct way, then that would be great. And there's also an antitrust argument.

[00:34:38] Ryan Hartwig: So that's Section 230 in a nutshell, but it's been misinterpreted, and we need to fix it. If you look at antitrust, you just look at how these companies are colluding, both with each other, and the percent of market share they have. That's also super dangerous for society, having these monopolies. So that's a little bit about Section 230. Do you want more detail on Section 230?

[00:35:03] Leyla Gulen: I think it's good to just kind of remind our audience what Section 230 is by definition, and how Congress was really trying to bring it back into the light, I think, in light of the 2016 elections and everything else.

[00:35:23] Leyla Gulen: I mean, there was a confluence of events that sort of brought 230 back onto the tips of everyone's tongues, and I'm kind of curious where we go from there, because it's no longer the flavor of the month anymore. Other things have replaced it. Have people behind the scenes been continuing along the path of trying to rework it, or has it just kind of faded into the background until we get another surge of interest?

[00:35:56] Ryan Hartwig: Yeah, that's the thing with changing laws. [00:36:00] For a while, Section 230 and Big Tech was always a big hot-button topic, and they bring in members of Big Tech, they testify to Congress, blah, blah, blah, and the senators all say they're fighting against Big Tech. Right? It's a good talking point, but really, has anything changed?

[00:36:15] Ryan Hartwig: And you look at these members of Congress, even on the right, like even Jim Jordan, who've accepted money from Facebook. We talked about the uniparty and the left and the right colluding and just working together, and that's basically what's happening with Big Tech: they're not doing anything.

[00:36:32] Ryan Hartwig: And we have members of Big Tech going from Twitter back to government; there's this revolving door, especially in the Biden administration, between government officials and Big Tech executives. So they spend a couple of years in San Francisco, then they go back to D.C.

[00:36:51] Ryan Hartwig: So I don't think most senators want to disrupt the status quo. And there was even a chance earlier this year, [00:37:00] when Ted Cruz filed an amicus brief about a case involving Section 230, trying to hold Facebook liable for a terrorist who posted terrorist content, and then there was an attack in California.

[00:37:12] Ryan Hartwig: And even Neil Gorsuch on the Supreme Court said, hey, we're not going to make any changes to Section 230, it's too disruptive. He literally said in his opinion, we're not going to disrupt the status quo. So here we are, like 20 years into the internet age, this revolutionary time, and the laws and the lawmakers are just kind of behind the times with how much technology has changed.

[00:37:33] Ryan Hartwig: And granted, it's been a rapid change the last few years, just amazingly revolutionary as far as how society has changed and been influenced by the internet. But here we are, 20 years in, and these huge companies, Facebook and Google, control more of our world than any other dictator in the history of the world.

[00:37:58] Ryan Hartwig: I mean, they're more powerful than [00:38:00] Stalin and all the biggest companies, the steel companies, uh, Carnegie. They're more powerful than any other entity in the history of humanity, I would argue.

[00:38:10] Leyla Gulen: That's really incredible. This has been such a fascinating conversation, Ryan, I have to tell you.

[00:38:15] Leyla Gulen: Now, what social platforms are you on today where people can find you? 

[00:38:21] Ryan Hartwig: Yeah, so I'm primarily on Gab, G-A-B dot com, and my tag there is just RealRyanHartwig. So I like Gab, I like Minds, and there's one called Bastyon as well, which is more of a blockchain-based decentralized network. Yeah, so Gab, Minds, Bastyon, those are all great sites to be on.

[00:38:42] Ryan Hartwig: I think I created an account, but I need to go back on Truth Social. I met Devin Nunes last week, and I think he works with Truth Social. Oh yeah. Yeah, those are all good.

[00:38:54] Leyla Gulen: Yeah, okay, alright, I was just curious. And then of course you've got a website.

[00:38:57] Ryan Hartwig: Yeah, my website's just ryanhartwig.org.

[00:38:59] Ryan Hartwig: [00:39:00] And you can find my book on there as well, Behind the Mask of Facebook, or on Amazon or anywhere else. Um, but yeah, RyanHartwig.org. So you can reach out to me, and read my book and let me know what you think about it.

[00:39:13] Leyla Gulen: That sounds great. And real quick before I let you go, what's coming up for you in the near future, in the next couple of months, in 2024?

[00:39:20] Ryan Hartwig: So in 2024, yeah, I'm going to work on another book about Section 230 with Jason Fyk, and just continue to fight and make people aware of what Facebook's doing during an election year.

[00:39:34] Leyla Gulen: Yeah, I mean, it's coming up, and it's coming up real fast. It's hard to imagine that we're already in the month of November.

[00:39:40] Leyla Gulen: So I wish you luck, and Ryan, thank you so much for being so bold and brave and coming out with this book. Again, Ryan Hartwig, author of Behind the Mask of Facebook: A Whistleblower's Shocking Story of Big Tech Bias and Censorship. He's an officer with the Social Media Freedom Foundation and a Facebook insider

[00:39:59] Leyla Gulen: with Project [00:40:00] Veritas. Ryan, thank you for joining us. Thank you.
