Facebook was slow to act as online hate raged over Trump post, leaked documents show


COLUMBUS, Ohio (AP) – Reports of hateful and violent Facebook posts started pouring in on the night of May 28 last year, shortly after then-US President Donald Trump sent a warning on social media that Minneapolis looters would be shot.

It had been three days since Minneapolis police officer Derek Chauvin knelt on George Floyd’s neck for more than eight minutes, until the 46-year-old Black man lost consciousness and showed no signs of life. A video taken by a bystander had been viewed millions of times online. Protests had engulfed Minnesota’s largest city and would soon spread to cities across America.

But it was only after Trump posted about Floyd’s death that reports of violence and hate speech rose “rapidly” on Facebook across the country, according to an internal company analysis of the former president’s social media post.

“These THUGS are dishonoring the memory of George Floyd, and I won’t let that happen,” Trump wrote at 9:53 a.m. on May 28 from his Twitter and Facebook accounts. “Any difficulty and we will assume control but, when the looting starts, the shooting starts!”

The former president has since been suspended from Twitter and Facebook.

Leaked Facebook documents provide a glimpse of how Trump’s social media posts stoked further anger in an already deeply divided country – one that a Facebook employee would later describe as effectively “on fire” with reports of hate speech and violence on the platform. Facebook’s own automated internal checks, designed to catch posts that break its rules, predicted with almost 90% certainty that Trump’s post violated the tech company’s rules against inciting violence.

Still, the tech giant took no action on Trump’s message.

Offline the next day, protests – some of which turned violent – engulfed nearly every American city, large and small.

US President Donald Trump speaks with members of the press on the South Lawn of the White House in Washington, May 30, 2020 (Patrick Semansky / AP)

“When people look back at the role Facebook played, they won’t say Facebook caused it, but Facebook was certainly the megaphone,” said Lanier Holt, a communications professor at Ohio State University. “I don’t think there’s any way they can say they didn’t exacerbate the situation.”

Social media rival Twitter, meanwhile, reacted swiftly at the time by covering Trump’s tweet with a warning and banning users from sharing it further.

Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by legal counsel for former Facebook employee-turned-whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The Wall Street Journal previously reported that Trump was one of many high-profile users, including politicians and celebrities, who were exempt from some or all of the company’s normal enforcement policies.

Hate speech and reports of violence were mostly confined to the Minneapolis area after Floyd’s death, the documents reveal.

“However, after Trump’s May 28 post, the situation really escalated across the country,” according to the internal memo, dated June 5 of last year.

The internal analysis shows a fivefold increase in reports of violence on Facebook, while complaints of hate speech tripled in the days following Trump’s post. Reports of fake news on the platform doubled. Reshares of Trump’s post generated a “substantial amount of hateful and violent comments,” many of which Facebook struggled to remove. Some of those comments included calls to “start shooting these thugs” and “f—- white.”

On June 2, “we can clearly see that the whole country was basically ‘on fire’,” a Facebook employee wrote of the increase in hate speech and reports of violence in the June 5 memo.

A protester watches a National Guard soldier as protests continue against the death of George Floyd near the White House in Washington, DC on June 3, 2020 (Alex Brandon / AP)

Facebook says it is impossible to separate how many reports of hate speech were prompted by Trump’s post itself versus the broader controversy over Floyd’s death.

“This spike in user reports is the result of a pivotal moment in the racial justice movement – not of a single Donald Trump post about it,” a Facebook spokesperson said in a statement. “Facebook often reflects what is happening in society, and the only way to prevent spikes in user reports during these times is to not allow them to be discussed on our platform at all, which we would never do.”

But the internal findings also raise questions about Facebook CEO Mark Zuckerberg’s public statements made last year as he defended his decision to leave Trump’s post untouched.

On May 29, for example, Zuckerberg said the company had looked closely at whether Trump’s comments violated any of its policies and concluded that they did not. Zuckerberg also said he left the post up because it warned people about Trump’s plan to deploy troops.

“I know many people are upset that we’ve left the president’s posts up, but our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies,” Zuckerberg wrote on his Facebook account on the night of May 29, as protests erupted across the country.

Still, Facebook’s own automated enforcement systems had determined that the post likely broke its rules.

“Our violence and incitement classifier was almost 90% certain that this (Trump) post violated… Facebook policy,” the June 5 analysis reads.

That contradicts conversations Zuckerberg had with civil rights leaders last year to allay concerns that Trump’s post was a specific threat to Black people protesting Floyd’s death, said Rashad Robinson, president of Color Of Change, a civil rights group. The group also launched a boycott of Facebook in the weeks following Trump’s post.

“To be clear, I had a direct argument with Zuckerberg days after that post, where he gaslit me and specifically pushed back on any notion that it violated their rules,” Robinson said in an interview with the AP last week.

Facebook CEO Mark Zuckerberg sits down to testify before a joint hearing of the Senate Commerce and Judiciary Committees on Capitol Hill in Washington, April 10, 2018 (Alex Brandon / AP)

To limit the ex-president’s ability to stir up hateful reactions on its platform, Facebook employees suggested last year that the company limit reshares of similar posts that might violate Facebook’s rules in the future.

But Trump continued to use his Facebook account, followed by more than 32 million people, to inflame his supporters for much of the rest of his presidency. In the days leading up to the deadly January 6 siege of the US Capitol, Trump regularly promoted false claims that widespread election fraud had cost him the White House, prompting hundreds of his supporters to storm the Capitol and demand that the results of a fair election be overturned.

It wasn’t until after the Capitol riot, and as Trump was preparing to leave the White House, that Facebook pulled him off the platform in January, announcing that his account would be suspended until at least 2023.

There’s a reason Facebook has waited so long to take action, said Jennifer Mercieca, a professor at Texas A&M University who has studied the former president’s rhetoric closely.

“Facebook really benefited from Trump and Trump’s ability to draw attention and engagement through outrage,” Mercieca said. “They wanted Trump to continue.”
