A Facebook employee testifying on Capitol Hill is not exactly newsworthy these days. Nevertheless, the most recent witness distinguished herself by calling out abuses within the company she had worked for over the previous two years. Frances Haugen left Facebook in the spring of 2021 after becoming disillusioned with the company’s tendency to prioritize profits over the public good.

While various Facebook scandals have subjected executives to the scrutiny of congressional hearings over the years, Ms. Haugen’s testimony is one of the first times that a Facebook insider has spilled the beans about the inner workings of the social media behemoth. She also had reams of internal documents to back up her claims that she shared with lawmakers. What we learned about the company and its priorities may change the way we regulate social media in the future.

Who Is Frances Haugen?

A big part of Facebook’s problem stems from having such a polished and well-spoken witness articulate the problems she observed during her two-plus years of employment. Ms. Haugen joined Facebook in June 2019 with an eye toward improving how the company handles misinformation. She began as a product manager on its civic integrity team, which concentrated on election-related issues around the globe. A data scientist with an MBA from Harvard, Ms. Haugen had previously worked at Google, Pinterest, and Yelp. However, she maintains that Facebook was far worse than the others in valuing user engagement above the safety of its products.

Facebook and the Riot at the Capitol

Ms. Haugen acknowledged Facebook’s mission of trying to connect people around the world while explaining that content with higher engagement (in the form of likes, comments, and shares) gets higher distribution. Apparently, Facebook changed its algorithm in 2018 to expand “meaningful social interactions” via “engagement-based rankings.”

She continued that much of Facebook’s research showed that angry content had a better chance of engagement, which content producers and political parties were only too eager to use to their advantage.

To its credit, Facebook strengthened its safety measures to cut down on misinformation during the 2020 presidential campaign between Joe Biden and President Donald Trump. Yet Ms. Haugen added that these systems were weakened or simply shut down after Biden’s victory for the sake of higher growth.

Indeed, just after the 2020 U.S. election, Facebook decided to disband the civic integrity team. Ms. Haugen believes this decision contributed to the Jan. 6 Capitol riot, which was predicated on the lie that the election was stolen. Consequently, she felt she had to speak out because the company was no longer interested in the safety of its community.

Instagram and the Damage to Young Girls

Ms. Haugen also highlighted the pernicious effects of Facebook’s Instagram app on young women, especially teenage girls. The internal research she leaked revealed that Instagram was more damaging to mental health than other social media platforms. In fact, one survey showed that 30% of teenage girls felt Instagram made their body dissatisfaction worse. The research pointed to a feedback loop: as young women view content promoting eating disorders, their body image grows more negative, they become more depressed, and then they use the app even more, exposing themselves to still more of that content.

Facebook Around the World

Of course, Facebook’s dangerous influence does not stop at U.S. borders. Ms. Haugen also charged that the social media giant does not devote enough effort to its safety programs to cover all the languages the platform supports. Each time Facebook expands into a new language area, it must spend roughly as much building safety systems for that language as it did for English. However, in many areas of the world, being safe simply doesn’t make economic sense for Facebook.

Moreover, as Ms. Haugen noted, Facebook is well aware that permitting dangerous content keeps users engaging with its platform. Unfortunately, she witnessed Facebook repeatedly choose profits whenever what was best for the public conflicted with what was best for Facebook.

What Have We Learned?

One of the things we witnessed during Ms. Haugen’s testimony is that Democratic and Republican lawmakers seem united in wanting to curtail the harm Facebook does to teenagers. Indeed, a few senators discussed bills that would add safeguards for younger users.

We also found out that lawmakers have become more tech savvy. In their questions, they examined the role that Facebook’s algorithms play in spreading questionable content and how the company changes its algorithm to favor some content over others. This represents a marked change from a few years ago when some lawmakers still didn’t know how the company made money.

Perhaps the most jarring revelation, however, was that Ms. Haugen had unveiled only the tip of the iceberg. For this reason, she urged lawmakers to demand more documents and internal research from Facebook. She maintained that far more transparency is necessary before we can begin to fully understand, and hopefully regulate, social media.

Given Congress’ recent history with tech regulation, few people are holding their breath waiting for new laws that address the issues raised. Nonetheless, Ms. Haugen has opened a virtual Pandora’s box, pushing Congress to consider vital questions about how to restrain the enormous power of social media and make it accountable to the public good. Whether Congress springs into action or not, she has forced many of us to confront an uncomfortable truth: some social media companies are less concerned about the harm they cause than about the enormous profits they stand to make.