The Game Being Played Around You (And that Qur'an Burning Fiasco)
Updated May 11, 2021
This post is part of a series on the #FiqhOfSocialMedia which has now been published as a book. Please visit Amazon to purchase.
There are two things I came across recently that completely rocked my world in terms of how I understand my interaction with the internet, and social media in general. "Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, 'Why is this so important?' And Zuckerberg said, 'A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.' And I want to talk about what a Web based on that idea of relevance might look like."
That's the opening to this powerful, nine-minute TED Talk about 'online filter bubbles' by Eli Pariser.
I've done some freelance social media consulting on the side, and one of the most frustrating things has been figuring out Facebook. I don't mean the basics - I mean understanding what Facebook does behind the scenes that determines whether a fan actually sees your post.
See, on Twitter, when you follow someone, you automatically see everything they post. That's intuitively how it would work. Facebook was kind of the same way. If you "liked" a page, or "friended" someone - you would expect to see their updates. After all, you voluntarily chose to follow them.
But Facebook decided to implement an algorithm that decides for you whether a post is important or not. A number of factors go into this - how many other people liked it, commented on it, or shared it, and how many of those people are friends with you. The only way around it is for the person posting to pay Facebook to 'boost' the post (one obvious motivation for mucking with the system so much).
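Just to make that concrete, here's a toy sketch of how an engagement-based ranking like that might work. To be clear, this is my own illustration - the weights, the Post fields, and the relevance_score function are all made up, not Facebook's actual code - but it captures the basic idea: posts get scored by raw engagement and friend overlap, boosted posts get a multiplier, and only the top of the sorted list ever reaches your feed.

```python
# Toy illustration (NOT Facebook's actual algorithm) of an
# engagement-plus-friends feed ranking. All weights are invented.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    likes: set = field(default_factory=set)     # user ids who liked it
    comments: set = field(default_factory=set)  # user ids who commented
    shares: set = field(default_factory=set)    # user ids who shared it
    boosted: bool = False                       # author paid to promote it

def relevance_score(post: Post, my_friends: set) -> float:
    """Score a post for one viewer: raw engagement, weighted up when the
    people engaging are the viewer's own friends, plus a paid 'boost'."""
    engaged = post.likes | post.comments | post.shares
    raw = len(post.likes) + 2 * len(post.comments) + 3 * len(post.shares)
    friend_overlap = len(engaged & my_friends)
    score = raw + 5 * friend_overlap
    if post.boosted:
        score *= 2  # paying the platform pushes the post up the feed
    return score

# Posts from pages you follow get sorted by this score; only the top few
# ever show up in your feed - the rest silently drop out.
posts = [
    Post(author="news_page", likes={"aisha", "omar"}, shares={"zaid"}),
    Post(author="brand_page", likes={"u1", "u2", "u3"}, boosted=True),
]
my_friends = {"aisha", "omar", "zaid"}
feed = sorted(posts, key=lambda p: relevance_score(p, my_friends), reverse=True)
print([p.author for p in feed])  # the friend-heavy post outranks the boosted one
```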
Google does something similar. If you and a friend both run the same search, chances are you'll get completely different results. It tries to be smart, factoring in signals like your location, whether you're on a mobile device, and so on.
Naturally, as you start to consume one type of content, or prefer one viewpoint over another, these algorithms start serving up content that you agree with.
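And here's that feedback loop in miniature. Again, this is just an illustrative sketch (the tags, weights, and function names are all invented, not any real platform's code), but it shows how a few clicks on one viewpoint are enough for a simple recommender to bury the opposing one.

```python
# Toy sketch of the filter-bubble feedback loop: every click nudges the
# viewer's interest profile, and the next batch of recommendations is
# ranked by how closely it matches that profile. Everything here is invented.
from collections import Counter

interests = Counter()  # the viewer's learned preference profile

def record_click(article_tags):
    """Each click reinforces the tags of whatever was just read."""
    interests.update(article_tags)

def recommend(candidates, k=3):
    """Rank candidate articles by overlap with the learned profile."""
    def match(article):
        return sum(interests[tag] for tag in article["tags"])
    return sorted(candidates, key=match, reverse=True)[:k]

# After a couple of clicks on one viewpoint, the opposing one sinks.
record_click(["viewpoint_a", "sports"])
record_click(["viewpoint_a"])
candidates = [
    {"title": "More of viewpoint A", "tags": ["viewpoint_a"]},
    {"title": "The case for viewpoint B", "tags": ["viewpoint_b"]},
]
print(recommend(candidates, k=1))  # viewpoint A wins; B quietly disappears
```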
"So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, 'It will be very hard for people to watch or consume something that has not in some sense been tailored for them'....in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important..."
This point made me think, but it wasn't until I read Trust Me I'm Lying by Ryan Holiday that it really hit home just how deep the manipulation game around us runs. Ryan has worked with a number of best-selling authors (like Tim Ferriss), and was the director of marketing for the clothing brand American Apparel.
In this book he gives a behind-the-scenes look at the economics of the blogging industry. This is important because he illustrates how the financial bottom line drives reporting that is unethical, exaggerated, unverified, and sometimes simply fabricated.
One of the examples he used to illustrate this was something most (if not all) of us are familiar with: that crazy guy, Terry Jones, who wanted to hold a Qur'an burning. Jones put up billboards in front of his church and got covered by a small local paper. This was picked up by a small website (Religion News Service). Yahoo then linked to the short article, and in turn a number of other blogs started picking it up. As the story quickly moved up the chain, CNN decided it needed to jump in and interview Jones. The media finally came to its senses, realizing that airing such footage could have serious consequences, and decided not to broadcast it. Under pressure, Jones backed down. Crisis averted.
A few months later, he decided to try again. The media threatened a blackout, but he went ahead with the Qur'an burning. About 20 people attended, and no one covered the story - except one freelance reporter from Agence France-Presse. Since AFP is syndicated on Google and Yahoo, the story started to gain publicity. As more and more blogs linked to it, it got too big to ignore. Now everyone had to comment on it. Within days, 27 people were killed in the ensuing riots in Afghanistan - a very real, deadly consequence of Journalism 2.0.
If you've ever been on a news website, you've no doubt been drawn to its 'most emailed' list. If you've ever visited BuzzFeed or Upworthy, you'll notice an overt formula at play in the headlines and article structure. You've seen lists of things split up into annoying slideshows. None of this is by accident. Trying to go viral (and get views, traffic, and revenue) is the name of the game. It's the only way to stay alive.
When that is the motivator, people no longer have any reason to present something valuable, authentic, or even challenging. Instead you get fake news. The best way to explain it is by watching this clip (I hate South Park, but it's unbelievably illustrative of what actually goes on). In a world of shock-and-awe viral content, we stop seeking honesty and reality. It's simply too boring.
Holiday goes on to show how these same underlying incentives end up producing snark, character assassination (ever seen that happen to an Islamic scholar online?), online vigilantism, and mob justice.
There used to be a time when we trusted journalism. If something was reported on CNN, or in the New York Times, or in Time magazine, we assumed there was journalistic integrity behind it. There was fact-checking, source-checking, and editorial oversight. We can no longer make these assumptions. In fact, the yellow journalism of old has simply been reincarnated as the online information diet we each consume daily.
We're sometimes a bit too trusting of everything we see online. We're quick to read and share things without verifying what actually happened - and in this ultra-fast information age, by the time the record is corrected, it's too late.
A little bit of pause, caution, and healthy skepticism can go a long way as we peruse online content.
Imagine seeing debates on Facebook, and then, as you 'like' the viewpoints you agree with, Facebook stops showing you the opposing ones - whether it's an Islamic debate, a political one, or even sports.
Watch the TED Talk and let me know your thoughts. If you can read the book, I highly recommend it (I consider it a five-star book).
What do we need to teach, particularly from a faith perspective, about understanding this online beast? What issues related to this do you want to understand in more detail, and what type of material would you find helpful?
You can subscribe to the email list to send me your thoughts or Tweet me.