Hateful Comments? Take Them to Reels: Mark Zuckerberg’s Meta and TikTok
Often, on a TikTok video of a teenager doing something that could easily be considered embarrassing, like singing out of key or obsessing over something others find cringe, I’ll come across a comment telling the creator their video belongs on Reels. Even more upsettingly, you’ll find that comment, word for word, on videos of people with disabilities simply existing. Commenters say it because they believe it is easier to be mean to people on Meta’s platforms: the subtext is that if these users posted to Reels instead of TikTok, they’d receive the harassment the commenters believe they deserve.
And yet, it’s TikTok staring down the barrel of a nationwide ban this week. Meta’s Mark Zuckerberg, on the other hand, has decided to make his platforms more dangerous to appease the incoming president.
As you’ve likely already heard, Meta announced on Tuesday that it would be ending its third-party fact-checking program, replacing it with X-style Community Notes, and “restoring free expression” to its platforms. To accomplish the latter part, the company will relocate its trust and safety operation from California to Texas, purportedly to avoid liberal bias (but not conservative bias, I guess?) and focus its moderation filters on illegal content like terrorism and child sexual abuse material rather than what it calls “lower-severity violations.”
All of this comes with an update to Meta’s Community Guidelines, including its Hateful Conduct policy, that essentially allows users to make blatantly homophobic, transphobic, sexist, and racist posts without consequences, as my colleague Kate Knibbs reported this week. For Platformer, Casey Newton noted that, lost amongst other changes, Meta removed a sentence from its guidelines “explaining that hateful speech can ‘promote offline violence.’” Doing so immediately following the anniversary of January 6 is truly something to behold!
Why destroy protections for Meta’s most vulnerable users? Because the old policies were out of touch with mainstream discourse, and recent elections feel like a cultural tipping point toward once again prioritizing speech, according to a video statement released on Tuesday by Facebook founder and CEO Mark Zuckerberg. (Zuckerberg, no one’s idea of a political theorist, didn’t really explain why fact-checking, itself the sort of speech that free-speech activists have long held is the appropriate answer to bad speech, isn’t worth prioritizing, nor did he explain what he has against the many forms of speech that Meta will still suppress. It seems that free expression is simply whatever Meta isn’t banning at a given moment.)
Source: The Internet’s Future Is Looking Bleaker by the Day
The Supreme Court will also take up TikTok’s lawsuit against the US government over its attempt to ban the app nationwide. The court doesn’t have much time to save the app: the ban’s deadline is less than two weeks away.

On Zuckerberg’s lies about the Biden White House and the “censorship” of covid origins
Internal emails released by Jordan’s committee reveal that Zuckerberg wanted to blame the Biden White House for how Facebook chose to moderate the “lab leak” conspiracy theory of covid origins. “Can we include that the WH put pressure on us to censor the lab leak theory?” he asked in one conversation, only to be told, “I don’t think they put specific pressure on that theory.”
In his letter to Jordan’s committee, Zuckerberg writes, “Ultimately it was *our* decision whether or not to take content down.” Emphasis mine. “Like I said to our teams at the time, I feel strongly that we should not compromise our content standards due to pressure from any Administration in either direction – and we’re ready to push back if something like this happens again.”
But the biggest lie of all is a lie of omission: Zuckerberg doesn’t mention the relentless pressure conservatives have placed on the company for years, pressure that has now clearly paid off. Zuckerberg is particularly full of shit here because the internal communications Jordan released document exactly this!
Rogan sets his own tone by referring to moderation as censorship while serving up a series of softballs. The idea that the government was forcing Zuckerberg to “censor” news about covid and covid vaccines, Hunter Biden’s laptop, and the election is something of a running theme throughout the interview. When Zuckerberg isn’t outright lying about any of this, he’s quite vague — but in case you were wondering, a man who was formally rebuked by the city of San Francisco for putting his name on a hospital while his platforms spread health misinformation thinks that “on balance, the vaccines are more positive than negative.” Whew!
Some research suggests conservative misinformation is more likely to draw fact-checks than liberal misinformation, which means conservatives are more likely to be moderated. In this sense, perhaps it wasn’t Facebook’s fact-checking systems that had a liberal bias, but reality.
Still, in the 2020 election, Facebook — along with other social media networks — took a harsher stance on fake news, making it harder for Macedonian teenagers to make a profit off Trump supporters. During his Rogan interview, Zuckerberg now characterizes this intervention as giving “too much deference to a lot of folks in the media who were basically saying, okay, there’s no way that this guy could have gotten elected except for misinformation.”
Early in the interview, Zuckerberg tests the waters to see how much pushback he’ll get; Rogan is a notoriously soft interviewer — it’s like listening to your dumbest stoned friend hold a conversation — but he does occasionally challenge his guests, noting, for instance, that the First Amendment can have limits, like the rule that you can’t shout fire in a crowded theater.
At some level, you can only start a company if you believe in giving people a voice, and that, Zuckerberg insists, is what he has said from the very beginning. I was not born yesterday, so I remember his first attempt to get rich: FaceMash, a Hot or Not clone where he uploaded photos of his fellow female students to be rated without their consent. I suppose giving people a voice is one way of describing that. Personally, I’d call it “creep shit.”
But Zuckerberg wants us to believe this isn’t about politics at all. Getting Rogan’s listeners riled up about Zuckerberg’s enemies and handing Republicans a new tech company to target is just a coincidence, as are the changes allowing more hate speech on his platforms, changes that just happen to pacify Republicans. All of this has nothing to do with the incoming administration, Zuckerberg tells Rogan. “I think a lot of people look at this as simply a political event because they think the timing is good for it, and they are like, hey, you are doing this right after the election,” he says. “We try to have policies that reflect mainstream discourse.”
“They wanted to investigate the theory that they found. They were trying really hard, right? To like, to like, find, find some theory, but it, like, I don’t know. I don’t know how this stuff works, and it was just the party and the government, there was just sort of. I mean, I’ve never been in government. I don’t know if it’s a directive, or a quiet consensus, that we don’t like these guys. They are not doing what we want. They are going to be punished. It is very difficult to be at the other end of that,” Zuckerberg says.
This is a compelling demonstration that jujitsu and MMA training (or hunting pigs in Hawaii, or making your neck real thick, or whatever) isn’t going to help you act aggressive if you’re constitutionally bitchmade. And blaming the consumer protection bureau for a witch hunt is very disrespectful to those of us who have watched Republicans target Facebook for years. That’s what this whole performance is about: getting Trump, Vance, Jordan, and the rest of the Republican party to lay off. After all, the Cambridge Analytica scandal cost Facebook just $5 billion. If Zuckerberg plays ball, his next privacy whoopsie could be even cheaper.
In fact, Zuckerberg even offers Republicans another target: Apple. According to the Facebook founder, the way Apple makes money is by squeezing people, and he has a whole list of complaints about the company.
Some of these Apple issues matter, and there is a legitimate antitrust case against the company. But that isn’t what’s on Zuckerberg’s mind. The complaint he cares about is the last one on his list: Apple building anti-tracking features into its operating system. Facebook criticized the changes in newspaper ads, and the policy cost social media companies almost $10 billion in lost revenue, according to the Financial Times. It turns out that if you ask people whether they want to be tracked, most of them say no. That is bad for Facebook’s business.
Is it working? The bros were already upset about how social media needed more masculine energy to win. But Dave Portnoy, Barstool’s founder, is not fooled by this shit.
A lie, it is said, can fly halfway around the world while the truth is still getting its boots on. Meta, the company that owns Facebook, said this week that it plans to scrap its fact-checking program, which was first set up in 2016 and paid independent groups to verify articles and posts.
The company said that it wanted to counter the political bias of fact-checkers. “Experts, like everyone else, have their own biases and perspectives. This showed up in the choices people made about what to fact check and how to do it.”
Nature spoke to communication and misinformation researchers about whether fact-checking works.
Ideally, the goal is to stop people from forming misperceptions in the first place, one researcher told Nature. “But if we have to work with the fact that people are already exposed, then reducing it is almost as good as it’s going to get.”
Alexios Mantzarlis, a former fact-checker and director of the Security, Trust, and Safety Initiative at Cornell Tech in New York City, says that fact-checks can still be useful even when they don’t change minds.
On Facebook, articles and posts deemed false by fact-checkers are currently flagged with a warning. Flagged content is also shown to fewer users by the platform, Mantzarlis says, and users are more likely to ignore it than to engage with it.
Flagging posts as problematic could also have knock-on effects on other users that are not captured by studies of the effectiveness of fact-checks, says Kate Starbird, a computer scientist at the University of Washington in Seattle. “Measuring the direct effect of labels on user beliefs and actions is different from measuring the broader effects of having those fact-checks in the information ecosystem,” she adds.
“It’s largely because conservative misinformation is the stuff that is being spread more,” says one researcher. “When one party, at least in the United States, is spreading most of the misinformation, it’s going to look like fact-checks are biased because they’re getting called out way more.”