The leaders of Google, Facebook and Twitter testified on Thursday before a House committee in their first appearances on Capitol Hill since the start of the Biden administration. As expected, sparks flew.
The hearing was centered on questions of how to regulate disinformation online, although lawmakers also voiced concerns about the public-health effects of social media and the borderline-monopolistic practices of the largest tech companies.
On the subject of disinformation, Democratic legislators scolded the executives for the role their platforms played in spreading false claims about election fraud before the Capitol riot on Jan. 6. Jack Dorsey, the chief executive of Twitter, admitted that his company had been partly responsible for helping to circulate disinformation and plans for the Capitol attack. “But you also have to take into consideration the broader ecosystem,” he added. Sundar Pichai and Mark Zuckerberg, the top executives at Google and Facebook, avoided answering the question directly.
Lawmakers on both sides of the aisle returned often to the possibility of jettisoning or overhauling Section 230 of the Communications Decency Act, a federal law that for 25 years has shielded tech companies from liability for harm caused by speech their users post on their platforms.
These Big Tech companies are among the wealthiest in the world, and their lobbying power in Washington is immense. There are also major partisan differences over how Section 230 ought to be changed, if at all. But lawmakers and experts increasingly agree that the tide is turning in favor of comprehensive internet regulation, which would most likely include some adjustments to Section 230.
To get a sense of where things stand, I caught up by phone with Jonathan Peters, a professor of media law at the University of Georgia, who closely follows Big Tech regulation. Our conversation has been lightly edited and condensed.
In her introductory remarks at the hearing today, Representative Jan Schakowsky of Illinois said, “Self-regulation has come to the end of its road.” What does she mean when she talks about an era of “self-regulation” on the internet? And how was that allowed to take hold?
The background of this hearing is that platforms like Facebook, Twitter, Instagram and YouTube, and big parent companies like Google, have come to have an enormous amount of power over the public discourse. And the platforms routinely conduct worldwide private speech regulation, through enforcement of their content rules and their community guidelines, deciding what may be posted, when to honor any request to remove content and how to display and prioritize content using algorithms.
Another way of putting it is that they are developing a de facto free-expression jurisprudence, against the backdrop of the platforms’ business and legal interests and their self-professed democratic values. That has proved extremely difficult in practice.
The internet exists on a layered architecture of privately owned websites, servers and routers. And the ethos of the web, going back to its early days, has been one of cyber-libertarianism: the theory that the web is, by design, supposed to be a lightly regulated environment.
What these hearings are trying to explore is the question, as you mentioned: Have we reached the end of that self-regulatory road, where the government ought to have a greater role than historically it has had in this space?
With all of that in mind, is regulatory legislation from Congress likely? How does President Biden’s arrival in the Oval Office change the prospects?
It’s interesting: What Biden said as a candidate and what Biden has done as president are a little bit different. As a candidate, Biden said he would favor revoking Section 230. But he does not have even the Democratic votes to go through with a full revocation, although an amendment might be possible. I think he’s facing the political reality that it is going to be a harder sell than he had initially thought.
In terms of whether broad antitrust legislation might pass this Congress, it does seem possible. Antitrust issues in the social media space have generated more interest in the last couple of years than they had in the previous 15 or 20 years combined. If I could put that in a little bit of historical context for you: 2019 marked the 100th anniversary of a monumental dissenting opinion in a Supreme Court case called Abrams v. United States. That was the case in which Justice Oliver Wendell Holmes really gave rise to our modern First Amendment, and to the enduring concept of the value of a free trade in ideas in the marketplace.
With the rise of social media, our free-speech landscape today looks vastly different from the one Holmes knew when he wrote those words. He was warning of the dangers of the government’s ability to censor critics or other disfavored speakers, whereas now the entities best able to restrict our speech are nongovernmental internet and web platforms.
So, many traditional First Amendment principles don’t map easily onto our reconstructed speech landscape. And I think the central concern in these antitrust cases is the power at the heart of what these companies do. It’s not that they produce widgets; they play a significant role, every day, in public discourse on matters of public interest.
Have the events of Jan. 6 and the entire experience of the 2020 election — which was riddled with false claims about voting and fraud — affected the likelihood of change? Did it really turn up the urgency in a meaningful way around web regulation?
I would say that it did. And it also clarified the differences, in terms of why the Democrats believe that reform is necessary and why the Republicans believe that it is. There is a growing consensus that we need more regulation to ensure the openness and usefulness of the web, but Democrats and Republicans disagree on why.
Democrats generally would argue that the platforms allow too much harmful user content to be hosted and spread — the kind of misinformation and disinformation we saw around the 2020 election, some of which of course contributed to or caused the Capitol insurrection. I would say that Democrats are also concerned with bullying, harassment and threats; hate speech; criminal activity that occurs on social media platforms; and the presence of dangerous organizations like terrorist groups or violently graphic content, and the effect those might have.
Republicans, by contrast, have sounded some of those same concerns. But they have focused a lot more on their concern that platforms censor conservative viewpoints — that the platforms are engaging in viewpoint discrimination. I’m not convinced that there is evidence of that, but that claim was made more loudly after President Trump was deplatformed by several of these major social media companies. I think it gave them another arrow in their quiver to try to advance that rhetorical argument that they had been making before the Capitol attack.