For Your Information

Facebook Algorithms Promote Violence, Hatred and Conspiracy Theories

The Wall Street Journal has published a series of articles called the "Facebook Files," based on tens of thousands of pages of internal research provided by a former Facebook employee, Frances Haugen. Haugen subsequently testified before a U.S. Senate subcommittee.

The information from the documents, as well as people's experience with Facebook, shows that the algorithms Facebook uses serve to promote and "amplify" incitement to violence, including communal violence, as well as racist, misogynist and homophobic content and division among the people. The algorithms themselves are secret; their effects are known, but not exactly how they work.

Facebook introduced new algorithms in 2018 because "engagement" was declining, which was a threat to its profits. People were spending less time on Facebook. This is still the case, and Facebook's new markets are not in North America; India alone is said to have 340 million Facebook users.

Facebook said the new algorithms would promote "engagement" between friends and family, portraying the platform as one where all are "equal participants."

The Facebook Files provide the internal documents which show that Facebook was more than aware the algorithms served to "amplify sensational posts," including incitement to violence, racist and misogynist content, and the promotion of hate and self-loathing. This comes out in the Facebook Files and elsewhere, with other news outlets publishing further information based on the redacted documents presented to the U.S. Senate committee hearings.

Internal Facebook research reaches the same conclusion, finding that the algorithms have had a negative effect on public discourse. Proposals to fix this from the groups Facebook set up internally to study the matter have been largely ignored, because implementing them would "reduce engagement" and have a negative impact on the pursuit of maximum profits.

The suggestion is that this content, which harms society, is "amplified" because it is what people respond to, comment on, share and "like." In fact the algorithms have changed the way news media operate, with outlets complaining to Facebook that news items or features on matters such as healthy eating could not get any "reach." Political parties which form the cartel party systems of government have likewise acknowledged that Facebook's changes altered how they frame their platforms.

Facebook's "amplification" of what it calls sensational posts, together with the use of fake accounts and single user multiple accounts (SUMAs) to post material which the algorithms will give preference to, changes the whole equation by presenting a fraudulent portrait of what is "popular." For instance, one report says eight people were the original source of false information about COVID-19 and negative effects of vaccinations which "went viral" -- this is to say that Facebook algorithms took it "viral." It is not a matter of what people using their own name and account are in fact saying and sharing. The rulers, pundits and media then use the hysteria created by the algorithms to declare that "people" are racist, misogynist, extremist and so on.

Another thing the hearings on the Facebook Files have clearly shown is that the "community standards," under which violent posts and whatever else the secret vested interests have designated are supposed to be removed, do not apply to politicians with legislative seats or cabinet posts, "well-known personalities," entertainers, and the like. Millions of users are considered VIPs exempt from the "community standards." Facebook uses a program called "XCheck" that in 2020 covered 5.8 million people who were "whitelisted" and exempt from the usual standards. They have impunity to post anything they want; it will either not be removed, or be removed only with the approval of Facebook's top executives, or only after it has had time to go viral.

The role of fake accounts and SUMAs in posting huge amounts of content is also significant. Facebook's algorithms permit a single user with multiple accounts to generate a large amount of traffic conducive to promotion by the algorithms. Facebook researchers found that links popular with "heavy users" were disproportionately associated with false information, and that viral content favoured conspiracy theories, hate speech and hoaxes.

Facebook claims to regularly remove fake accounts, although researchers agree that there is no systematic approach. SUMAs are not against its standards, and multiple accounts are considered a significant source of new accounts. Of course, some SUMAs are legitimate, as a person might have one account strictly for communicating with family and close friends and another which is public. However, Facebook is not forthcoming about the significance of fake accounts or SUMAs to its bottom line, or about their relationship to what it calls "heavy use" and the association it has found between heavy use and hate, violence, false information and conspiracy theories. The researchers also conducted dozens of experiments in which they found that as the speed and length of a sharing chain grew, so did the odds that the content was toxic.

Facebook researchers created an account for a fictitious person they named Carol Smith. Her account was one of several fictitious "users" created in 2019 and 2020 by researchers studying the effect of the algorithms introduced in 2018. The profile described Smith as a politically conservative mother from Wilmington, North Carolina, with an interest in politics, parenting and Christianity, who followed Fox News and then-President Donald Trump. In just two days, Facebook was recommending groups dedicated to QAnon for Smith to join. Within one week her feed was filled with groups and pages that had violated Facebook's own rules, including those against hate speech and disinformation, NBC News reported. The researcher described Smith's Facebook experience as "a barrage of extreme, conspiratorial, and graphic content." The same conclusion was repeated consistently with the other fictitious accounts created by the researchers. These findings were included in the documents presented to the Securities and Exchange Commission and provided to Congress in redacted form.

Finally, the documents indicate that Facebook has resisted all recommendations from its own staff to make the changes to its algorithms necessary to stop its active role in promoting hatred, violence, and racist, misogynist and homophobic content. Instead, it is "experimenting" with what it calls "disaggregating harmful networks," which means Facebook decides which groups or organizations are creating "social harm" and then deprives the group of new members and minimizes connections between its existing members. Any group or organization so targeted would have its content "demoted" in news feeds, and users would not be notified of its posts. The documents show that Facebook actively suppressed the "Patriot Party Movement" after the January 6 rampage on Capitol Hill, as well as a German group called Querdenken. Whether such suppression is currently being used against other groups and organizations has not been revealed.


This article was published in

Volume 51 Number 11 - November 7, 2021

Article Link:
https://cpcml.ca/Tmlm2021/Articles/M510116.HTM

Website:  www.cpcml.ca   Email:  editor@cpcml.ca