Lawmakers on Tuesday pressed policy executives from Facebook, Twitter and YouTube for more transparency about the algorithms used on their platforms, and criticized business models that rely heavily on user engagement.
At a Senate Judiciary subcommittee hearing, Democrats and Republicans raised concerns that tailored recommendation algorithms can lead users to extremist content and amplify misinformation faster than it can be removed.
“Algorithms have great potential for good,” said Republican Senator Ben Sasse of Nebraska. “They can also be misused, and we, the American people, need to be thoughtful about them.”
Sasse, the ranking member of the Subcommittee on Privacy, Technology, and the Law, said that while the services offered by the social media companies are free, “there is someone who is really getting our attention, shortening our attention spans, and driving us” into often poisonous echo chambers.
The Tuesday morning hearing came as lawmakers on both sides of the aisle have signaled an increased appetite in recent months for regulating social media platforms, specifically targeting Section 230 of the Communications Decency Act of 1996, which shields internet companies from liability for user content.
House Democrats introduced a bill last month to narrowly amend Section 230 and hold social media companies accountable when their algorithms share and amplify harmful content that leads to offline violence. In recent months, Democrats and Republicans alike have floated other proposals to address antitrust concerns and rein in the power of big tech companies.
Monika Bickert, vice president for content policy at Facebook, said it was not in Facebook’s financial or reputational interest to push users toward extreme content. She said the platform uses a ranking algorithm, which users can turn off, to sift through thousands of potential posts and surface the content each user is likely to find most relevant.
“The algorithm looks at many signals, including how often the user typically comments or likes content from that particular source, how recently that content was published, and whether the content is in a format like a photo or video,” said Bickert.
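The signals Bickert names — a user’s past engagement with a source, recency, and content format — can be illustrated with a simplified ranking sketch. The scoring function, weights, and field names below are illustrative assumptions for exposition, not Facebook’s actual algorithm or values.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    source: str        # who published the post
    age_hours: float   # hours since publication
    fmt: str           # "photo", "video", or "text"

# Illustrative format weights -- assumed values, not Facebook's.
FORMAT_WEIGHT = {"video": 1.3, "photo": 1.2, "text": 1.0}

def score(post: Post, interaction_rate: dict) -> float:
    """Combine the three signals Bickert described into one score.

    interaction_rate maps a source to how often this user typically
    likes or comments on that source's content (0.0 to 1.0).
    """
    affinity = interaction_rate.get(post.source, 0.0)  # past engagement with source
    recency = math.exp(-post.age_hours / 24.0)         # newer posts score higher
    fmt = FORMAT_WEIGHT.get(post.fmt, 1.0)             # format preference
    return (1.0 + affinity) * recency * fmt

# Rank a hypothetical feed: highest score first.
feed = [
    Post("local_news", age_hours=2.0, fmt="text"),
    Post("close_friend", age_hours=6.0, fmt="photo"),
    Post("brand_page", age_hours=1.0, fmt="video"),
]
rates = {"close_friend": 0.9, "local_news": 0.2}
ranked = sorted(feed, key=lambda p: score(p, rates), reverse=True)
```

Even in this toy version, a post from a frequently engaged-with friend outranks newer content from low-affinity sources, which hints at why critics at the hearing tied engagement-driven ranking to echo chambers.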
Alexandra Veitch, YouTube’s director of government affairs, defended the algorithms the company uses to recommend videos, arguing that its efforts to raise content from authoritative sources and reduce recommendations of borderline content and harmful misinformation outweigh other recommendation signals.
Lauren Culbertson, Twitter’s head of U.S. public policy, said the company is researching the possible negative consequences of its algorithms. But she urged lawmakers to also consider the positive role algorithms play on social media platforms.
Culbertson also argued that algorithms can be used to root out harmful content. “We need to make sure that regulations allow companies to use technology to solve some of the problems the technology itself poses,” Culbertson said.
In addition to representatives from the three platforms, Tristan Harris, co-founder and president of the Center for Humane Technology, and Joan Donovan, research director at Harvard’s Shorenstein Center on Media, Politics and Public Policy, also testified at the hearing. Both experts provided technical background and context to inform lawmakers’ questions.
Harris, a former design ethicist at Google, likened the social media companies to drug dealers and characterized the officials present from Facebook, Twitter and YouTube as hostages to their companies’ business models.
He said their business models depend on keeping users engaged. “That means we are more valuable as people and as citizens of this country when we are addicted, outraged, polarized, narcissistic and disinformed,” Harris said.
Harris also dismissed the companies’ arguments that they are working to promote trustworthy and authoritative content.
“They’re still creating that kind of digital addiction, the dopamine loop,” Harris said. “Nothing they say makes much sense until you realize a gun is being held offstage and they are saying the things they are saying,” he added.
Missouri Republican Senator Josh Hawley agreed with Harris that the business model of social media companies is based on bringing users back for more. “It’s an attention treadmill, it’s an addiction economy,” said Hawley. “They designed it that way, addiction is the design,” he added.
Donovan said the mass distribution of misinformation on social media platforms is a design feature, not a flaw in the system.
“Social media products amplify novel and outrageous statements to millions of people faster than timely, local, relevant, and accurate information,” Donovan said. She argued that the repetition of posts, the redundancy of the same content appearing across multiple platforms, and algorithmic reinforcement of related content all combine to drive users down the rabbit hole.
Delaware Democratic Senator Chris Coons, chair of the Subcommittee on Privacy, Technology, and the Law, said he wanted to work with colleagues on bipartisan solutions, whether voluntary measures or regulatory reforms.
Coons opened the hearing by urging the social media platforms to learn from one another and build on best practices.
“Algorithms affect what literally billions of people read and see every day, and what they think every day,” Coons said, adding that it makes sense for companies to have tools to sort the content users want to engage with.
But Coons also said that reliance on algorithms is proving detrimental to public discourse.
“None of us want to live in a society that is hopelessly politically divided as the price for remaining open and free,” said Coons. “But I am also aware of the fact that we don’t want to unnecessarily restrict some of the most innovative and fastest growing companies in the West. Reaching that balance will require more discussion.”