Should Facebook protect us from ourselves?

Photo courtesy of Casey Gomez

More than two years ago, I came to Tech as a freshman. I am an international student, and this would be the first time that I would spend an extended amount of time in a country other than my own.

Having previously travelled around Europe, I had certain ideas about the culture of the “West.” It was safe, predictable and liberal, in stark opposition to the conservatively Muslim culture of my native Pakistan, a state in perpetual political turmoil.

It goes without saying that in my time here I have realized that things are more nuanced than that. The United States is a deeply divided country, and the truth is that there was never a time when it was not.

Perhaps most revealing is what happened during the last presidential election: Trump won.

The day after the election, many of us who live inside liberal echo chambers, myself included, tried to work out how we had gotten to this point. Everything I had ever seen in my Facebook newsfeed had been anti-Trump, so how could we find ourselves here? What happened? Hillary Clinton even wrote a book about it.

Nearly a year after his election, an unlikely target of confusion and grief has emerged from the ashes: Facebook.

Since the days following the election, Facebook has been blamed as a purveyor of fake news. With the ongoing investigation into the Russian government’s efforts to influence the election, special counsel Robert Mueller asked for details on Russian ad purchases on the website. Since then, Zuckerberg has vowed to hire 1,000 more ad reviewers for the website. But there is little that Facebook can do to combat false news articles that are read and shared by ordinary Americans. Regardless of where those news stories originate, they are the kind of content users wish to see.

Our feeds are designed to be echo chambers. They rely on our responses and “likes” to show us the kind of content we want to see. This does not bode well in terms of access to information and unbiased reporting, but we need to see Facebook for what it is: a social media platform and, in a way, a reflection of the bigger world. It thrives on user-generated content. Whether on our phones or offline, we look for information that reaffirms what we want to believe is true. And in either case, we know that not everything we see and hear is true.

But the facts do not matter anymore. If you want proof of that, just take a look at Washington, D.C.’s offerings of “alternative facts.” Facebook had nothing to do with it.

There is talk of the social media giant verifying the news stories that are shared, and removing those that are false. As it officially stands, Facebook does not allow advertisers to run ads that link to stories that have been marked false by third-party fact-checking organizations.

However, giving moderators too much power over content that is uploaded onto the platform raises further questions about censorship and free speech.

The social media giant has published an article titled “Tips to Spot False News,” including advice like checking dates, watching for unusual formatting and using multiple news sources to verify stories. There is little, however, that they can do beyond advising caution.

Trump was voted in by ordinary Americans. Social media gives everyone a very loud voice, something American politics has never lacked.

In amplifying those voices, it deepens a divide that has existed in American society since its inception. An algorithm is not to blame for that.

Pushing social media platforms to police content shared by users for accuracy would not address the root of the problem: people will believe what they want to believe.
