
Zuckerberg says Facebook not responsible for Capitol riots


Facebook chief executive Mark Zuckerberg has denied his social media platform paved the way for the storming of the US Capitol on January 6, as Silicon Valley executives faced a fresh barrage of criticism over their content moderation failures during a bruising hearing in Congress.

Appearing before a House panel on Thursday via video link alongside Twitter head Jack Dorsey and Google’s chief Sundar Pichai, Zuckerberg rejected suggestions from politicians that Facebook bore responsibility for the riots by allowing misinformation, hate speech and online extremism to flourish on its platform.

“The responsibility here lies with the people who took the actions to break the law and . . . also the people who spread that content, including the [former] president [Donald Trump], but others as well,” Zuckerberg said.

He also said claims that Facebook’s advertising-driven business model amplifies provocative and polarising speech were “not accurate”, adding: “I believe that the division we see today is primarily the result of a political and media environment that drives Americans apart.”

However, he later acknowledged that the company needed to do “further work” to make its moderation “more effective”. 

Dorsey took a more conciliatory tone, saying: “We make mistakes in prioritisation and in execution.”

The hearing marked the third time the executives have been hauled before US politicians in less than six months, as lawmakers seek to rein in Big Tech.

In a sign of how much political anger the companies have generated, the three chief executives faced overwhelmingly hostile questioning from both parties.

While Democrats wanted to focus on misinformation, Covid-19, and the Capitol riot, Republicans were more interested in complaining that social media companies were censoring conservatives.

Several members from both parties, however, spoke approvingly of limiting the legal protections given to online platforms under Section 230 of the 1996 Communications Decency Act.

Under that law, companies are not legally responsible for the content users post on their websites. But many members of Congress want to limit when those protections should apply.

Michael Doyle, a Democratic representative from Pennsylvania, said: “Time after time you are picking engagement and profit over the health and safety of your users, our nation and our democracy . . . We will legislate to stop this. The stakes are simply too high.”

Zuckerberg said he would back Section 230 reforms, suggesting the government set up a third-party body to assess whether platforms are doing enough to remove unlawful content. He later suggested smaller platforms could be exempt from that oversight.

Google’s Pichai spoke in more cautious terms about potential changes to the law, citing fears over “unintended consequences” including the harming of free expression.

By contrast, Dorsey argued that neither a government nor a private company should be the arbiter of truth — instead touting Twitter’s early efforts to build a “decentralised” content moderation system, which would be open source and not run by any one organisation.

The hearing came after the platforms ushered in eleventh-hour changes to their content moderation policies both in the lead-up to the 2020 US election and after the vote, in reaction to fierce criticism from academics and the press. 

Following the Capitol riot, in which five people died, many critics argued the measures were too little, too late and that enforcement was patchy, pointing to the platforms’ failure to curb unfounded conspiracy theories about rigged voting machines pushed by Trump and his supporters. 

During Thursday’s hearing, lawmakers demanded more transparency and auditing of the platforms’ secretive algorithms.

When asked if he would consider opening up Facebook’s algorithms to scrutiny, Zuckerberg was hesitant, citing privacy concerns. But he added that it was an “important area of study”. 

Dorsey said that “giving people more choice” over the algorithms that curate their feeds was vital to tackling misinformation, and called for “more robust appeals processes”. 

Zuckerberg also faced repeated questioning about Facebook’s effect on children’s mental health. He confirmed earlier BuzzFeed reports that the company was exploring a child-friendly version of Instagram, called “Instagram for Kids”.

While both Zuckerberg and Pichai were keen to engage with the committee’s questions, Dorsey could at times barely conceal his contempt. 

Exasperated with some of the executives’ responses, Billy Long, a Republican from Missouri, asked the chief executives: “Do you know the difference between these two words: ‘yes’ and ‘no’?” Soon afterwards, Dorsey took to Twitter and posted a poll offering only the options “yes” and “no”. 

Dorsey’s online response did not appear to impress the committee. Kathleen Rice, a Democrat from New York, later noted drily: “Your multitasking skills are quite impressive.”

 
