PublicWire | Emerging Market Stock News

Four revelations from the Facebook Papers

by PublicWire
October 25, 2021
in Technology
Reading Time: 5 mins read

Facebook is battling its gravest crisis since the Cambridge Analytica scandal after a whistleblower accusing the company of placing “profit over safety” shed light on its inner workings through thousands of pages of leaked memos.

The documents were disclosed to US regulators and provided to Congress in redacted form by Frances Haugen’s legal counsel. A consortium of news organisations, including the Financial Times, has obtained the redacted versions received by Congress.

Earlier this month, Haugen testified before Congress that the social media company does not do enough to ensure the safety of its 2.9bn users, plays down the harm it can cause to society and has repeatedly misled investors and the public. The Wall Street Journal also ran a series of articles based on the documents, called the Facebook Files.

Here are four surprising revelations the documents contain:

Facebook has a huge language problem

Facebook is often accused of failing to moderate hate speech on its English-language sites, but the problem is far worse in countries that speak other languages, even though the company promised to invest more after being blamed for facilitating genocide in Myanmar in 2017.

One 2021 document warned of the very low number of content moderators covering the Arabic dialects spoken in Saudi Arabia, Yemen and Libya. Another study of Afghanistan, where Facebook has 5m users, found that even the pages explaining how to report hate speech were incorrectly translated.

Facebook allocated only 13 per cent of its budget for developing misinformation detection algorithms to the world outside the US © Ilana Panich-Linsman/The Washington Post/Getty

The failings occurred even though Facebook’s own research marked some of the countries as “high risk” because of their fragile political landscape and frequency of hate speech.

According to one document, the company allocated 87 per cent of its budget for developing its misinformation detection algorithms to the US in 2020, versus 13 per cent to the rest of the world.

Haugen said Facebook should be transparent about the resources it devotes to each country and language.

Facebook often does not understand how its algorithms work

Several documents show Facebook stumped by its own algorithms.

One September 2019 memo found that men were being served up 64 per cent more political posts than women in “nearly every country”, with the disparity particularly pronounced in African and Asian countries.

While men were more likely to follow accounts producing political content, the memo said Facebook’s feed ranking algorithms had also played a significant role.

Facebook found that men were being served up 64 per cent more political posts than women in ‘nearly every country’ © Paul Morris/Bloomberg

A memo from June 2020 found it was “virtually guaranteed” that Facebook’s “major systems do show systemic biases based on the race of the affected user”.

The author suggested the news feed ranking may be more influenced by people who share frequently than by those who share and engage less often, a behavioural difference that may correlate with race. The result is that content from certain races is prioritised over others.

As its AI failed, Facebook made it harder to report hate speech

Facebook has long said its artificial intelligence programs can spot and take down hate speech and abuse, but the files show its limits.

According to a March 2021 note by a group of researchers, the company takes action on as little as 3 to 5 per cent of hate speech and 0.6 per cent of violent content. Another memo suggests it may never manage to get beyond 10 to 20 per cent, because it is “extraordinarily challenging” for AI to understand the context in which language is used.

Nevertheless, in 2019 Facebook had already decided to rely more on AI and to cut the money it was spending on human moderation of hate speech. In particular, the company made it harder to report hate speech and to appeal against moderation decisions.

Facebook said “when combating hate speech on Facebook, our goal is to reduce its prevalence, which is the amount of it that people actually see”. It added that hate speech accounts for only 0.05 per cent of what users view, a figure that it has reduced by 50 per cent over the past three quarters.

Facebook fiddled while the Capitol burned 

The documents reveal Facebook’s struggle to contain the explosion of hate speech and misinformation on its platform around the January 6 riot at the US Capitol in Washington, which prompted turmoil internally.

Facebook had switched off hate speech emergency safeguards before the Capitol riot in Washington on January 6 © Leah Mills/Reuters

Memos show that the company switched off certain emergency safeguards in the wake of the November 2020 election, only to scramble to turn some back on again as the violence flared. One internal assessment found that the swift implementation of measures was hindered by waiting for sign-off from the policy team.

Even proactive actions failed to have the desired effect. In October 2020, Facebook publicly announced it would stop recommending “civic groups”, which discuss social and political issues. However, because of technical difficulties in implementing the change, 3m US users were still recommended at least one of the 700,000 identified civic groups on a daily basis between mid-October 2020 and mid-January 2021, according to one research note.

Facebook’s response

Facebook declined to comment on some of the specifics of the allegations, instead saying that it did not put profit ahead of people’s safety or wellbeing and that “the truth is we’ve invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook”.

