Social Media Coercion and Murthy v. Missouri: A "Last Ditch Effort," Says House Subcommittee Ranking Member Plaskett
Joel Kaplan, a public policy executive at the time, presciently warned against suggesting Facebook censored the lab leak theory at the White House's behest because it would "supercharge" conservative critiques that it's "collaborating" with the Biden administration "to censor speech." (Kaplan previously served in the George W. Bush administration. While employed as a Facebook executive, he sat behind Brett Kavanaugh in a show of support during the Senate hearing on Christine Blasey Ford's allegation that the Supreme Court nominee had sexually assaulted her.)
If the Supreme Court decides that the communications were illegal, it would be because they rose to the level of coercion.
Newly released private conversations among Meta’s top executives, for example, give a new glimpse into how the company navigated a tense relationship with the Biden administration in the early days of the campaign to vaccinate Americans against covid-19, particularly after President Joe Biden himself accused the company of “killing people.”
Amazon, Meta, and the other companies did not respond to requests for comment. The Verge also reached out to the White House and Democratic Judiciary Committee staff for comment on the report but did not receive a response in time for publication.
The case over the Biden administration's alleged coercion of social media platforms into making certain moderation decisions is still pending at the Supreme Court. Murthy v. Missouri turns on where to draw the line between (entirely legal) persuasion by the government and (illegal) coercion. The shift in language brings the Jordan report more in line with the core arguments in that case.
At Wednesday’s hearing, Select Subcommittee Ranking Member Stacey Plaskett, a Democrat who represents the US Virgin Islands, accused Republicans of holding the hearing now as a “last ditch effort to influence the Supreme Court opinion in the case of Murthy v. Missouri.”
According to Plaskett, the subcommittee's investigation gathered hundreds of hours of testimony showing that social media companies only took action when content violated their internal policies.
But Republicans, Plaskett said, have repeatedly declined to make that testimony public and declined to give Democrats "hundreds of hours of video taken during those investigations." When Plaskett asked to enter several transcripts of interviews with tech executives into the record, there was an objection. Jordan said Republicans plan to release the rest of the transcripts once they have talked to every witness and their counsel to make sure they are comfortable with it.
Whether any of the White House's pressure amounted to coercion is a question the Supreme Court will answer in the coming months.
The exchange also appears to show that, rather than leaving Facebook's top executives feeling beholden to Biden's will, the incident actually pushed them to want to engage less with the federal government, questioning the case for further engagement if the administration was more interested in criticizing the company than in actually addressing the problems.
Zuckerberg chimed in, asking, “Can we include that the WH put pressure on us to censor the lab leak theory?” But Clegg threw cold water on that, saying, “I don’t think they put specific pressure on that theory — it was always ‘do more’ generic pressure.”
As one of the executives put it, it seemed like a good reminder that when they "compromise our standards due to pressure from an administration" in either direction, they will often regret it later.
And another thought from the exchange: "Did Trump say things this irresponsible? If Trump blamed a private company not himself and his govt, everyone would have gone nuts."
In June 2021, a trust and safety executive explained in an email to Zuckerberg that some of the third-party fact-checkers the company relied on had either rescinded their false ratings of the lab leak theory or acknowledged uncertainty about it. The executive said that in February 2021 the company had removed posts containing any of five claims rated false by its fact-checking network, including that the disease was man-made or engineered by a government or country. The decision was made in response to continued public pressure and tense discussions with the new administration, which, based on the timing, would have been the Biden administration.
The trust and safety executive said that, in February, the team was asked to revisit the decision later in the year to determine whether the company should remove such posts or merely reduce their distribution.
Source: Republicans release tech executives’ internal communications
The phrase "compromise our standards due to pressure from an administration" is one that conservatives will no doubt zero in on, since it suggests Meta was initially pressured into flagging the lab leak hypothesis as misinformation. Buyer's remorse is only possible when you're free to make (or not make) a purchase.
The committee says it’s reviewed “tens of thousands of emails and other relevant nonpublic documents” that it says show that the “Biden White House coerced companies to suppress free speech.”
That framing is significant: it's also the focus of a major Supreme Court case, expected to be decided by the end of June, that will have major ramifications for the federal government's ability to communicate with social media firms.
According to the report, YouTube shared a new proposed vaccine safety policy in September 2021, after months of engagement. Back in July of that year, YouTube's public policy team had declined to commit to any new policies when pressed by a Biden administration official, and responded to a question about what it calls "borderline content" with stats about the low reach that such content already receives. On September 21st, a member of the YouTube policy team asked a White House official about dates to walk through the new policy and seek feedback on it. That official, Rob Flaherty, apologized for not responding to the previous message but said he had seen the news and that it seemed like a great step.
During arguments in Murthy v. Missouri, Justice Elena Kagan was skeptical of a monthslong gap between the Biden administration asking Facebook not to distribute a post about vaccine hesitancy and the platform allegedly blocking a health group as a result.
We know Jordan, who chairs the committee and subcommittee that released this report, is invested in the outcome of the Supreme Court case: he attended the oral arguments in person. An opinion is expected by the end of June. House Republicans are also working on legislation that would allow people to sue executives for censoring their speech.
Leah Feiger: Welcome to WIRED Politics Lab, a show about how tech is changing politics. I'm the senior politics editor at WIRED. Today we're talking about an exclusive WIRED story, published just today, that shows how far-right extremists are organizing on Facebook. Militia extremists have been reorganizing after laying low for several years. They've formed a couple hundred Facebook groups and profiles, both public and private, to recruit members and organize local militia activity across the country. That isn't allowed under Meta's own standards, but these groups are on Facebook, working together to promote combat training and prepare for the election. David Gilbert is a senior reporter at WIRED, and he's joining me to discuss all of this. Hey, David.
Leah Feiger is @LeahFeiger. David Gilbert is @DaithaiGilbert. Tess Owen is @misstessowen. Write to us at politicslab@wired.com. Be sure to subscribe to the WIRED Politics Lab newsletter.
Tune In and Subscribe to WIRED Politics Lab: A Podcast All About Disinformation
You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:
If you're on an iPhone or iPad, you can just tap the link to open the app called Podcasts. Or you can download an app like Overcast or Pocket Casts and search for WIRED Politics Lab. We're on Spotify too.
There may be errors in this transcript. David Gilbert: Hey there. I'm David Gilbert, senior reporter on the WIRED Politics team. Before we begin today's podcast, I wanted to mention that we're putting together an episode all about disinformation, a kind of guide to disinfo, and we need you to be involved. Do you have any questions about disinfo? We want to hear all of them. We'll be reading through the mailbag and answering your questions on the show. Please send your questions to politicslab@wired.com. That's politicslab@wired.com. And one more thing: if you haven't checked out our newsletter, please sign up. There's a link in our show notes. Thanks.