It was déjà vu all over again.
The chief executives of Google parent Alphabet Inc., Facebook Inc. and Twitter Inc. alternately were filleted, grilled and otherwise pummeled before the House Committee on Energy & Commerce on Thursday.
“It is now painfully clear that neither the market nor public pressure will force these social-media companies to take the aggressive action they need to take to eliminate disinformation and extremism from their platforms,” committee chairman Frank Pallone Jr. (D., N.J.) said in opening remarks during the hearing. “And, therefore, it is time for Congress and this Committee to legislate and realign these companies’ incentives to effectively deal with disinformation and extremism.”
“These platforms are hotbeds of disinformation despite new policies,” committee member Jan Schakowsky (D., Ill.), who has introduced a bill to protect consumers online, said in a lacerating statement. “Disinformation was rampant” during the 2020 election and pandemic, she added.
Federal lawmakers’ distrust of tech may have taken on a personal edge following the Jan. 6 attack on the Capitol, which was fomented in great part by far-right vitriol on Facebook, Google’s YouTube, and Twitter.
Twitter CEO Jack Dorsey acknowledged that social media bore some responsibility for spreading the false information that led to the insurrection, but said the problem was “more complex,” part of a larger information ecosystem and an overheated political climate.
The next question is whether all the talk and hand-wringing leads to legislation, say antitrust and tech law experts. Tech legal expert Jenny Lee said members from both parties demonstrated in Thursday’s hearing that they are “galvanized and ready to proceed with serious legislation.”
“If Jan. 6 wasn’t enough to get them to acknowledge their role, it’s unclear anything ever will be,” Elizabeth Renieris, founding director of the Notre Dame-IBM Technology Ethics Lab at the University of Notre Dame, told MarketWatch. “It’s time to move beyond self-regulation and pass meaningful legislation to limit the power of these platforms, including through comprehensive federal privacy legislation, competition-related measures, and new consumer protection rules, among others.”
Read more: Zuckerberg, Dorsey to argue their platforms mirror a fractured society in House hearing Thursday
Facebook CEO Mark Zuckerberg said the lashing tone of a small percentage of the platform’s content reflected a bitterly divided nation, and that the company had taken several steps to remove such material. He embraced the idea of accountability standards for clearly illegal types of content such as child trafficking and terrorism.
Read more: Google, Facebook undertake appeasement campaigns before Thursday CEO showdown in House
But his testimony did little to mollify the committee — particularly Reps. Mike Doyle (D., Pa.) and Cathy McMorris Rodgers (R., Wash.) — who relentlessly pressed the executives on the effects of social media on children and underrepresented groups, among others. They, and other members, were especially alarmed by Facebook’s plans to develop a version of Instagram for those 13 and younger.
Social-media companies have been vilified for spreading falsehoods about the 2020 presidential election result, COVID, and vaccinations, as well as helping organizers of the Jan. 6 assault on the Capitol recruit attendees and stoke violent unrest.
Nearly two-thirds of anti-vaccine content on major social-media platforms is linked to a dozen individuals or organizations, according to a new report by the Center for Countering Digital Hate and Anti-Vax Watch. The report is based on analysis of a sample of anti-vaccine content shared or posted on Facebook and Twitter more than 812,000 times between Feb. 1 and March 16.
“The platforms’ inability to deal with the violence, hate and disinformation they promote on their platforms shows that these companies are failing to regulate themselves,” said Emma Ruby-Sachs, executive director of SumOfUs, a nonprofit advocacy organization. “After the past five years of manipulation, data harvesting, and surveillance, the time has come to rein in Big Tech.”
Still, the “half measures” floated during the hearing “aren’t going to help,” Ben Pring, co-founder of the Center for the Future of Work at Cognizant, told MarketWatch.
“Repealing Section 230 [a law that grants tech companies immunity from liability for content that appears on their platforms], prohibiting political advertising on social media, and making people own what they say on social media are steps that need to be taken ASAP, before things get really out of hand,” Pring said.
“The issue with platforms like Facebook and Twitter? There aren’t steadfast policies in place to curb misinformation,” American University professor Jason Mollica told MarketWatch.
Representatives from Apple Inc. and Amazon.com Inc. were spared the congressional inquisition, but their companies are under increasing scrutiny from European regulators.