News Beat Web Exclusives

The Rise of Facebook & Downfall of Social Discourse

Written by Rashed Mian | Oct 3, 2017 6:28:08 PM

After reports began surfacing Sunday night of a gunman indiscriminately shooting concertgoers during an open-air show in Las Vegas, it didn’t take long for social networks and search engines to promote erroneous conspiracy theories and fabricated posts on their respective sites.

Alas, it’s a phenomenon that continues mostly unabated despite tech companies saying they’ve committed resources to combat the rampant spread of politically motivated faux news.

Case in point: Web sleuths on 4chan, a popular destination for the alt-right, claimed to have identified the gunman shortly after the Las Vegas shooting. The discovery was serendipitous: Not only had they stumbled on the man they believed committed this heinous act of terror, but it turned out he also happened to be a member of several anti-Trump Facebook pages. This aligned perfectly with their politics: a dangerous liberal who belonged to the so-called “alt-left.”

We’ve seen this story before. After the Boston Marathon bombings in April 2013, internet detectives also took it upon themselves to seek out the suspects. They thought they had their man in a 22-year-old Brown University student who, it turned out, had gone missing weeks before the attack and was later found dead. Before the authorities identified the real perpetrators, much of the internet simply presumed the suspects had to be brown men.

These witch hunts not only resulted in innocent people being wrongfully identified as perpetrators of horrific acts of violence; both examples also fit perfectly with narratives espoused in the dark corners of the web, and now in mainstream politics.

What these examples demonstrate, unfortunately, is a desire among people in the 21st century to use the Internet to formulate a fictitious motive and disseminate it widely within a matter of seconds, with little consequence.

In fact, they might actually reap rewards for their actions. The 4chan narrative that emerged after Sunday’s shooting found its way onto Google’s search results. So for a brief period, people searching Google for information about the attack were directed to this erroneous exchange on 4chan.

After the search giant was called out, it issued a statement that said in part: “This should not have appeared for any queries, and we’ll continue to make algorithmic improvements to prevent this from happening in the future.”

The paradigm of how people consume news and information, and the manner in which it circulates to friends and family, shifted a long time ago. It’s natural now for consumers to open their Facebook and Twitter apps immediately upon hearing reports of traumatic events, in the hope of instantaneously gathering facts. Yet what began as an avenue to inform innocent users seeking truths about breaking news events has transformed into a dangerous game of misinformation, whereby anyone with an ideology and a false narrative to weave can pervert social discourse on a grand scale.

It also demonstrates how America’s era of tribal politics is sewn into our social fabric, even offline.

"This partisan polarization affects the way Americans of all political stripes consume information," Amanda Taub and Brendan Nyhan wrote in The New York Times in March. "People are more likely to believe stories that come from their side of the political divide, particularly if an authority figure vouches for them. And they are more likely to share news with their preferred slant as a way of showing they are good members of their political tribe."

When Facebook first came into existence, it was a way for college kids who had outgrown MySpace to share drunken photos with friends at other universities. Innocent enough. But Facebook has quickly grown into one of the most powerful media sites in the world, boasting more than 2 billion active users.

Its algorithm is designed to make sure people stay on Facebook once they sign on. That means delivering news stories, credible or not, that fit their perceived ideologies. Thus, they’ll continue accessing the app or site to reinforce their perception of the world: a “world” engineered by an algorithm based on users’ habits and likes. It’s simple, really: an alternative reality based on your actions within the site. Since we can’t create one on our own, Facebook does it for us. What’s more, you have access to this perverted realm at any moment: in bed, on the train, at the ballgame, or even during the most intimate moments with a loved one.
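To make that feedback loop concrete, here is a deliberately simplified sketch in Python. It is not Facebook’s actual system; the tags, data structures, and scoring are hypothetical. But it shows how ranking a feed by overlap with a user’s past likes naturally surfaces more of what that user already believes.

```python
# A toy illustration (not Facebook's real algorithm) of engagement-based
# ranking: posts similar to what a user already liked score higher, so the
# feed keeps reinforcing existing views.
from collections import Counter

def build_profile(liked_posts):
    """Tally topic tags from posts the user previously liked."""
    profile = Counter()
    for post in liked_posts:
        profile.update(post["tags"])
    return profile

def rank_feed(candidate_posts, profile):
    """Order candidates by overlap with the user's existing preferences."""
    def score(post):
        # Counter returns 0 for unseen tags, so unfamiliar topics sink.
        return sum(profile[tag] for tag in post["tags"])
    return sorted(candidate_posts, key=score, reverse=True)

liked = [{"tags": ["gun-rights", "politics"]}, {"tags": ["gun-rights"]}]
candidates = [
    {"id": 1, "tags": ["gun-control", "politics"]},
    {"id": 2, "tags": ["gun-rights", "politics"]},
]
print([p["id"] for p in rank_feed(candidates, build_profile(liked))])  # [2, 1]
```

Real ranking systems are vastly more complex, but the loop is the same: the more a feed optimizes for what you already engaged with, the narrower your window on the world becomes.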

Google is not much different. As Michael Patrick Lynch, professor of philosophy and director of the Humanities Institute at the University of Connecticut, told me: “What the internet is good at doing is keeping track of our preferences and predicting our preferences, our desires. So in a sense, it’s a sort of desire machine, and we get what we want from it. Of course, what we want and what’s true are two different things.”

Therefore, depending on your ideology, whether you’re a climate change skeptic or a gun control idealist, Google delivers search results that will most likely confirm your biases.

The Internet can be a vexing place, given the millions of sites around the world crying out for our attention. But search engines and social media sites personalize our Internet existence so as to make it a more pleasurable experience. Tech sites can make millions by generating content we want to consume. And once Facebook knows I like a certain type of razor or footwear, I can be targeted by companies confident I’ll be interested in their products.

Or our interests can be used in a much more nefarious way.

According to ProPublica, Facebook permitted advertisers to target users who appeared interested in such topics as “Jew haters” or “History of ‘why jews ruin the world.’”

It was also recently reported that Russian operatives allegedly paid Facebook to target users based on their political beliefs. It’s almost impossible to know how these ads changed people’s perception of a given candidate; the election was so divisive that it’s not a stretch to suggest most people’s minds were already made up. But it goes without saying that all forms of disinformation should be purged from social media sites.

And it’s no surprise that fake news was rampant on Facebook during the presidential election, given how much credence users place on posts that show up in their news feeds. A Pew Research Center study released in May 2016 found that more than 60 percent of Americans get news from social media sites, which is to be expected, given Facebook’s dominance. Perhaps even more disconcerting, and what needs to be addressed, is the tendency of social media users to share posts based on headlines alone, without ever reading the story.

In response to the controversy over anti-Semitic ad targeting, Sheryl Sandberg, Facebook’s chief operating officer, said: “Hate has no place on Facebook—and as a Jew, as a mother, and as a human being, I know the damage that can come from hate. The fact that hateful terms were even offered as options was totally inappropriate and a fail on our part.”

It seems social media, as The Atlantic recently pointed out, is failing us. Perhaps the only way to rectify what society has collectively broken is to go back to basics: Make media literacy courses more widely available and teach younger generations that Facebook, Twitter, Instagram, Google, or wherever else they get their news in the future, is not reality. It’s what we want it to be. And that’s probably not a good thing.