Michigan Senator Asks Big Tech CEOs How They Stop Calls Of Violence From Spreading Online


A man accused of making a bomb threat at the Library of Congress broadcast his standoff with police live on Facebook.

His posts remained online for several hours before Facebook deactivated his account and removed the videos, although clips of his anti-government tirade continue to circulate on social media. The incident has rekindled criticism of how social media platforms respond to users who advocate violence or broadcast crime in real time.

U.S. Sen. Gary Peters, D-Bloomfield Township, referred to the live broadcast of the bomb threat in a letter asking the CEOs of Facebook, YouTube and Twitter to provide information on how they moderate content that advocates violence. Peters said each platform has been used to disseminate manifestos or to post videos of violent acts.

Related: Stopping hate speech online is difficult. New research is teaching machines to find white nationalist content.

The request is part of an ongoing investigation into domestic terrorism, sparked by the January 6 riot at the United States Capitol, that Peters leads as chairman of the Senate Committee on Homeland Security and Governmental Affairs. In letters to the big tech CEOs, Peters expressed concern about “violent extremists” using social media platforms to raise funds, radicalize others and organize attacks.

Peters also asked for information on how targeted advertising tools allow advertisers to deliver messages to certain groups based on keywords such as “white supremacists” and “neo-Nazi”. Micro-targeted ads and algorithms designed to boost engagement suggest that tech companies are profiting from content that amplifies political violence, Peters said.

“There is a financial incentive for social media platforms like Facebook to keep users engaged on their platforms and to view content, including extremist content,” Peters wrote in his letter.

Related: Peters-led Senate group continues investigation into domestic terrorism in America

The Michigan senator acknowledged that private companies are protected from legal liability and have the right to decide what is allowed on their websites. He also acknowledged efforts by Facebook, YouTube and Twitter to remove content that violates their terms of service, but said questions remain about how the platforms are used to incite violence. Peters said more transparency is needed to ensure that white supremacist and anti-government propaganda is not easily accessible with just a few clicks.

Specific examples of incidents involving each of the three tech companies were described in Peters’ letters to Facebook CEO Mark Zuckerberg, YouTube CEO Susan Wojcicki and Twitter CEO Jack Dorsey.

The three platforms were used by a network of groups and users to organize “Stop the Steal” rallies in Washington, D.C. The rallies were promoted around the false narrative that former President Donald Trump won the 2020 election.

The rallies were followed by a violent riot at the United States Capitol, where participants broadcast live as they entered the building and confronted police. Rioters used YouTube and Facebook to livestream their actions inside the Capitol during the attack, and some used the streams to ask for donations. Mentions of “civil war” and calls for violence spiked during the riot.

A group of men accused of plotting to kidnap Gov. Gretchen Whitmer also used private Facebook groups to share recordings of training exercises, discuss their plans and recruit new members, according to federal prosecutors.

Facebook was also used to organize a violent neo-Nazi rally in Charlottesville, Virginia, in 2017. Twitter users promoted baseless claims that a child sex ring was being run out of a pizza restaurant in Washington, D.C., which led a gunman to enter the restaurant in 2016.

A man who killed 51 worshipers at mosques in New Zealand in 2019 said racist YouTube videos were an important source of inspiration for the attack.

Peters cited a report that found YouTube allows advertisers to use racial epithets and phrases associated with domestic extremist groups as keywords to find videos and channels for targeted ads. This allows advertisers to use hateful terms to target ads on YouTube, according to the report.

Peters also expressed concern about Facebook’s advertising tools. The senator cited a BuzzFeed report that found Facebook users who posted misinformation about the election or the January 6 riot received targeted ads for armored vests, weapon accessories and other military-style equipment.

“It is not known to what extent Facebook’s policies and practices continue to generate similar targeted ads for individuals associated with domestic extremist groups and how they may generate income through their placement,” Peters wrote in his letter to Facebook.

LEARN MORE ABOUT MLIVE:

About 1,300 Afghan refugees are arriving in Michigan, more than the state has seen in the past decade.

Most abortions take place in the first trimester and more facts about abortion rates in Michigan

Proposed Congressional Districts Call for Big Changes in Mid-Michigan, Metro Detroit

Michigan on track to set road fatality record this year, yet again


