Facebook whistleblower Frances Haugen testifies before the Senate.

Haugen accuses Facebook of causing upheaval in communities all around the world.

Salem Ghebremedhin

Senators listen as former Facebook employee and whistleblower Frances Haugen (center) testifies before a Senate Committee on Commerce, Science, and Transportation hearing on Capitol Hill, October 5, 2021, in Washington, DC.

Whistleblower Frances Haugen's testimony before Congress about the practices of her former employer, Facebook, was both fascinating and unsettling for many. Haugen's testimony came after Antigone Davis, Facebook's global head of safety, was questioned about the company's harmful influence on children and teenagers. Davis stuck to Facebook's script, frustrating senators by failing to answer questions directly. Haugen, a former Facebook product manager who worked on civic misinformation, proved unexpectedly more forthcoming.

Haugen is an algorithm expert who has worked as a product manager at companies including Google, Pinterest, and Yelp. While at Facebook, she worked on issues such as democracy, misinformation, and counter-espionage.

During the hearing, Haugen said in her opening statement: “Having worked on four different types of social networks, I understand how complex and nuanced these problems are. However, the choices being made inside Facebook are disastrous for our children, for our public safety, for our privacy and for our democracy and that is why we must demand Facebook make changes.”

Of course, Facebook has been involved in a number of controversies over the years, including the Cambridge Analytica privacy scandal, the Myanmar genocide, and Russian electoral meddling. Haugen, however, went further, comparing Facebook's conduct to that of corporations selling cigarettes and narcotics. To illustrate her point, she highlighted, among many other practices, Facebook's:

  • Use of "engagement-based ranking," which amplifies extreme emotions and damaging material, with consequences including greater exposure of adolescent girls to anorexia content, political conflict within families, and ethnic violence in Ethiopia.
  • Role as a platform used to plan the January 6 insurrection.
  • Decision to dissolve its civic integrity team after the 2020 election and before the January 6 attack on the Capitol.
  • Failure to stop the spread of misleading vaccine information.
  • Burying of its own internal research on Instagram's impact on the mental health of adolescent girls.
  • Failure to protect the most vulnerable groups, for example, widows and individuals who have relocated to new cities, from misinformation that drags them into rabbit holes.
  • Allowing authoritarian or terrorist leaders to use the platform to spy on rivals.
  • Allowing high-profile users to violate its content policies.

Later on, Haugen told '60 Minutes' that she had been part of a civic integrity team that Facebook dissolved right after the 2020 election. Facebook had put precautions in place to minimize misinformation ahead of the 2020 U.S. presidential election, then disabled those protections after the election, only to turn them back on after the January 6 assault on the United States Capitol. According to Haugen, this is an irresponsible way of dealing with political matters.

Haugen concluded the interview by stating “The thing I’m asking for is a move [away] from short-termism, which is what Facebook is run under today. It’s being led by metrics and not people. With appropriate oversight and some of these constraints, it’s possible that Facebook could actually be a much more profitable company five or 10 years down the road, because it wasn’t as toxic, and not as many people quit it."

Senator Blumenthal further suggested that Haugen return for another hearing regarding her concerns that Facebook is a threat to national security. Although Facebook executives spoke out against Haugen after the hearing, policymakers were reportedly impacted by her testimony.


Sources

Givhan, Robin. "The Whistleblower Came to Advocate for Humans over Algorithms." Washington Post, October 5, 2021.

Granville, Kevin. "Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens." New York Times. Accessed April 4, 2022.

Prentice, Robert. "A Whistleblower Faces Down Facebook." Ethics Unwrapped, McCombs School of Business. Accessed April 4, 2022.


Comments

  1. Whenever I see these types of hearings, it's disappointing to see how polished the defense is with their legal jargon. It seems the only time hard-hitting questions land is outside the hearing, from the press.

  2. It is interesting that she thinks Facebook would be more profitable if it curbed the misinformation on the site. I would like to see the numbers on what percentage of people sign up for Facebook for the misinformation versus the number of people who quit because of it.

  3. It is really disappointing the extent that tech companies will go to garner clicks and keep their engagement up regardless of the impact that harmful/false content can have on their users. It doesn't matter if the site makes their users happy or frustrated as long as people keep on coming back.

