If you’ve been on any of the usual social networks today, you’ve heard that Facebook has made some changes to its news feed that have frustrated a lot of Facebook users. I’m one of those frustrated users, but I’ve heard enough people ask “What’s the big deal?” to start to wonder why these changes bother me so much. More on that in a bit. First, a summary of the changes:
- When you log into Facebook, the main thing you see is your news feed. This is a stream of posts, photos, links, and other items shared on Facebook by your Facebook friends. Prior to last night, the news feed had two modes: “top news” and “most recent.” In “top news” mode, you saw a selection of items shared by your friends, a selection determined by Facebook’s algorithms. How those algorithms worked wasn’t made public by Facebook, but they seemed to prioritize items shared by Facebook friends with whom you interact regularly, by commenting on or liking their posts. In “most recent” mode, you saw all items shared by your friends, in reverse chronological order, with the most recent items at the top.
- Last night, Facebook rolled out changes that combined these two modes, with an emphasis on the algorithms. Now in your news feed, you see “top stories” (Facebook calls them stories now, not posts) that Facebook thinks you’ll find interesting. Again, it’s not clear how Facebook’s algorithms guess what you’ll find interesting, but the idea is that no matter how long it has been since you logged in (minutes or weeks or something in between), you’ll see a selection of “stories” that you missed while you were away, in theory the “stories” you’ll find most interesting.
- Although the “most recent” option for the news feed has been taken away, Facebook added a “ticker” over on the right sidebar that displays, in real-time, everything your friends are doing on the site: sharing stories, liking stories, commenting on stories, and so on. The ticker updates automatically, and (in theory) includes everything that every one of your friends is doing, so it functions similarly to the “most recent” mode of the old news feed.
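The difference between the two old modes can be sketched in a few lines of code. This is purely a toy illustration: Facebook has never published its ranking algorithm, so the interaction-count score used here is an assumption, not the real thing.

```python
from datetime import datetime

# Toy model of the two old news feed modes. The scoring rule is an
# illustrative assumption; Facebook never disclosed its actual algorithm.

posts = [
    # (author, timestamp, how often you've interacted with that author)
    ("Alice", datetime(2011, 9, 21, 9, 0), 14),
    ("Bob",   datetime(2011, 9, 21, 12, 30), 0),
    ("Carol", datetime(2011, 9, 21, 8, 15), 5),
]

def most_recent(posts):
    """'Most recent' mode: every post, newest first."""
    return sorted(posts, key=lambda p: p[1], reverse=True)

def top_news(posts, limit=2):
    """'Top news' mode: a *selection*, ranked by how much you interact
    with each author -- posts from quiet friendships drop out entirely."""
    ranked = sorted(posts, key=lambda p: p[2], reverse=True)
    return ranked[:limit]

print([p[0] for p in most_recent(posts)])  # ['Bob', 'Alice', 'Carol']
print([p[0] for p in top_news(posts)])     # ['Alice', 'Carol'] -- Bob never appears
```

The key point the sketch makes concrete: "most recent" is lossless (a sort), while "top news" is lossy (a ranked cut), and with the changes only the lossy view remains as the default.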
Facebook has made some other changes recently, like adding automatically created lists of friends and allowing people to “subscribe” to other people’s posts without friending them, but it’s the news feed changes that bug me the most. Why is that?

As I mentioned recently, I’m big on “input.” I follow a lot of blogs, listen to a lot of podcasts, and spend a fair amount of time on Twitter. That’s how I learn about new ideas, find new resources, and discover new perspectives. I take the same approach to each of those sources in that I read or listen to the oldest items in the “stream” first and work my way to the latest items in chronological order. This probably strikes some people as odd, since there are many who just dip into their Twitter feeds as time allows to read the latest tweets. I don’t, for the most part. There’s something about missing a tweet or a blog post or a podcast episode that I don’t like. The latest tweet / post / episode was created by someone who experienced those earlier tweets / posts / episodes, so if I skip that older content, I might not have sufficient context for understanding the newer content. It’s a bit like missing a few episodes of a TV show like 24 or Fringe; you want to catch up on the episodes you missed, not just see the newest one.
I’ve used Facebook in the same way, typically reading my Facebook friends’ posts in chronological order. I couldn’t always keep up, since I rarely check Facebook during the work day. I would often check after work and “fast forward” to the latest posts. And that’s one thing that bothers me about the new Facebook news feed: I’ve lost the ability to read everything, in chronological order. Facebook has taken that option out of my hands. I can’t access Facebook content in the way I prefer to do so, and I have less control than I did within the world of Facebook before the changes.
Where has that control gone? It’s been given to Facebook’s algorithms, the ones that guess which of my friends’ stories I’ll be most interested in seeing at any given time. I don’t trust the algorithms, and I’m not the only one. Eli Pariser has a new book out, The Filter Bubble: What the Internet Is Hiding From You, in which he explores our increasing reliance on algorithms to filter the massive amount of content we can access through the Internet. When you search for something on Google, you’re trusting Google’s algorithms to return the most useful and relevant content. Google’s algorithms are incredible, but they’re still returning certain results and not showing you other possible results. You have to trust those algorithms to give you meaningful results.
I’m not arguing that you shouldn’t use Google to find useful content among the billions of pages on the Internet, and I don’t think Pariser is, either. But it’s important to know that the algorithms you use have inherent biases. The example Pariser often cites is how the “top news” mode of his Facebook news feed stopped showing him posts by his conservative friends. Facebook’s algorithms had noticed that Pariser interacted more (commenting, liking) with posts from his liberal-leaning friends, so the “top news” view started showing him just those posts. He had been placed in a “filter bubble,” where certain opinions and perspectives were hidden from him by Facebook’s algorithms.
The “filter bubble” isn’t a big problem if you know it’s there and can change your behavior so as to mitigate its effects. For instance, Pariser could start commenting on and liking his conservative friends’ Facebook posts. That would teach the algorithms that he finds them interesting, and they’d start appearing in his news feed. Of course, Pariser might not want to “like” certain opinions expressed by his friends, but Facebook doesn’t give us other quick ways to mark a post as interesting. As Pariser often says, there’s a “like” button, but no “important” or “interesting” button. The terms Facebook has chosen for its interactions make it challenging to fight the filter bubble.
The big problem is that many people don’t realize they’re in filter bubbles and, as a result, they don’t know what they’re missing. Consider the implications of this. There are many people who get most of their news through Facebook, through the links posted by their Facebook friends and the pages they’ve “liked.” Might their views of the world be incomplete if Facebook’s algorithms filter out certain perspectives or topics? It’s not that the algorithms target particular political views or categories of news, it’s that people might not interact with such posts (through commenting or liking) often enough for the algorithm to consider them of possible interest. Without knowing you’re in a filter bubble and working to break outside of that bubble occasionally, you run the risk of ending up in an echo chamber, not exposing yourself to new ideas and perspectives.
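The self-reinforcing nature of this is worth making explicit: a post has to be shown before you can interact with it, and interaction is what keeps it being shown. Here is a deliberately simplified simulation of that feedback loop; the scores, cutoff, and update rule are all invented for illustration and make no claim about Facebook’s real system.

```python
# Toy feedback loop (an illustrative assumption, not Facebook's real behavior):
# the feed shows only the highest-scoring topics, and a topic's score can
# only grow when it is shown *and* interacted with. A topic that falls
# below the cutoff is never shown again, so it can never recover.

scores = {"sports": 5, "politics": 1, "music": 3}

def feed(scores, size=2):
    """Show only the `size` highest-scoring topics."""
    return sorted(scores, key=scores.get, reverse=True)[:size]

def step(scores, interactions):
    """Shown topics you interact with gain score; unseen topics cannot."""
    for topic in feed(scores):
        if topic in interactions:
            scores[topic] += 1
    return scores

# You happen to interact only with sports posts for a few sessions.
for _ in range(3):
    step(scores, interactions={"sports"})

print(feed(scores))        # ['sports', 'music'] -- politics never reappears
print(scores["politics"])  # 1 -- its score is frozen, since it's never shown
```

Notice that nothing in the loop targets politics specifically; the topic simply starved because it was never displayed, which is exactly the point about the algorithm not needing to be biased in order to produce a bubble.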
Algorithms aren’t all to blame for the echo chamber effect. As Ethan Zuckerman argues in this TED talk, we very often choose to follow / friend / like people who have similar perspectives to ours. In other words, most of us put ourselves in echo chambers more than we’d like to admit. How many of your Facebook friends have political views very different from yours? But when algorithms put us in filter bubbles, and thus echo chambers, without us knowing it? That’s a big problem. And with the “most recent” mode of the Facebook news feed no longer available, everyone using Facebook has now been put inside a filter bubble.
In short, I don’t trust the algorithms. They’re hiding my Facebook friends’ content from me, and I don’t know if they’re making good choices when they do so. I would much rather see all the content from my friends, whether I interact with that content or not. Facebook has made that much more difficult now. What am I missing that the algorithms aren’t showing me? I have friends whose posts I always read, although I rarely interact with those posts. How will the algorithms know that I value those posts? I don’t want my relationships with my friends managed by Facebook’s algorithms; I want to manage those relationships directly.
Moreover, Facebook’s algorithms are hiding my content from my friends, in ways over which I have no control. Just as I’d like to control how I access my friends’ content, without interference by the algorithms, I would like my friends to control what they see from me. Why should Facebook stand in the way of that?
So what is there to do? I could leave Facebook. There’s nothing stopping me, right? Certainly, Google+ is looking like a more user-friendly social networking site now, and it’s quite possible that these news feed changes on Facebook will lead to a migration of Facebook users over to Google+. But, frankly, most of my friends are on Facebook, and relatively few of them are on Google+. I have friends from high school and college and church and work who are all over the world now. I use Facebook to maintain these relationships. Leaving Facebook would mean severing ties with many people I still like very much. Sure, that’s how things worked before Facebook came along, but now that I’ve experienced the benefits of maintaining connections with old friends through Facebook, walking away from all those friends would be a real loss.
And, really, if my beef is that the new news feed makes it more difficult to see what my friends are posting, leaving Facebook—and thus not seeing any of their posts—doesn’t really seem like a solution.
In summary, I’m invested in Facebook since so many of my friends are there. I want to see everything that those friends share on Facebook, with the option of doing so in chronological order. I want control over how I access content within Facebook. I want my Facebook friends to control how they access my content. I don’t want to cede that control to algorithms whose decisions don’t always align with my preferences. And I don’t want the millions of Facebook users to end up in filter bubbles created by those algorithms. Given the key role that Facebook plays in our local and global communities, I think it’s appropriate that Facebook is accountable to those communities, not just to the advertisers who are paying to be seen by members of those communities.
It’s true that when you’re on Facebook, you’re not having lunch, you are the lunch. (That is, Facebook is serving up our attention to its advertisers.) And while a restaurant might not care if the hamburger complains about the service, we aren’t hamburgers. We’re what makes Facebook a place worth visiting. And if Facebook isn’t responsive to our concerns about the ways they treat us, then that’s a problem.
Image: “sky-net,” Hani Amir, Flickr (CC)