How to improve your Facebook feed so we see the next Trump coming


What would've happened if America had seen this coming?

For much of the world, the election of Donald Trump was unfathomable mere hours before it happened. Helping that along were a slew of polls and nonstop commentary from experts, entertainers, and laypeople alike, all presenting Hillary Clinton as the superior candidate, on course for a cakewalk of a win. They were wrong, to an almost universal degree. And the fact that so few anticipated Trump's victory calls into question how America gets its information: what were we missing, and why?

It's a question that inevitably leads us to Facebook.

Facebook certainly isn't our only source of news, but no single platform reaches as many people: roughly 170 million daily active users in the U.S., tens of millions more than the number of people who voted. It's been argued, convincingly, that Facebook isn't doing enough to combat blatantly untrue news articles that appear in the news feed, and that it hasn't lived up to (or, worse, has actively shirked) its responsibilities as a distributor of content.

But do Facebook's users bear some of that responsibility, too? The site's algorithm is complex and inherently adaptive; it serves up content based on your behavior, because its only mission is to keep you on Facebook. If you like and engage with inflammatory articles from the alt-right, you'll probably end up seeing commentary from National Review. Likewise, if you share John Oliver's latest diatribe, you'll be more likely to see Samantha Bee's next monologue in your feed.
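
To make that adaptive behavior concrete, here's a minimal, purely hypothetical sketch of engagement-driven ranking. Facebook's real system is far more complex and not public; the post structure, topic labels, and scoring rule below are illustrative assumptions, not its actual method. The only point is the core idea: the more you've engaged with a topic, the higher similar posts score in your feed.

```python
from collections import Counter

def rank_feed(candidate_posts, engagement_history):
    """Hypothetical sketch: order posts by how much their topics overlap
    with topics you've already liked or shared.

    candidate_posts: list of dicts like {"title": str, "topics": set[str]}
    engagement_history: list of topic strings from posts you engaged with
    """
    topic_counts = Counter(engagement_history)

    def score(post):
        # A post's score is the total engagement weight of its topics.
        return sum(topic_counts[topic] for topic in post["topics"])

    return sorted(candidate_posts, key=score, reverse=True)

if __name__ == "__main__":
    history = ["late-night satire", "late-night satire", "progressive politics"]
    posts = [
        {"title": "Samantha Bee's latest monologue", "topics": {"late-night satire"}},
        {"title": "A conservative policy explainer", "topics": {"conservative politics"}},
    ]
    for post in rank_feed(posts, history):
        print(post["title"])
```

Under these assumptions, the satire clip outranks the policy explainer simply because it resembles what you've already engaged with, which is exactly the dynamic described above.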

This is how "filter bubbles" are made. The term entered the lexicon after a 2011 TED Talk by Eli Pariser, who warned against immersing ourselves in content that's only, or at least predominantly, agreeable. Filter bubbles are fueled by confirmation bias: our inherent tendency to engage with ideas we already agree with and dismiss the ones we don't.
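
The feedback loop is easy to see in a toy simulation. The one below is again purely illustrative, assuming a made-up user who engages with agreeable posts 90% of the time and a feed that boosts whatever gets engagement; none of the numbers come from Facebook. Within a few rounds the simulated feed is almost entirely one-sided.

```python
import random

random.seed(0)

CAMPS = ["left-leaning", "right-leaning"]
weights = {"left-leaning": 1.0, "right-leaning": 1.0}  # the feed starts balanced
user_preference = "left-leaning"                        # the bias being confirmed

for round_number in range(1, 6):
    # The feed shows 10 posts, drawn in proportion to current weights.
    shown = random.choices(CAMPS, weights=[weights[c] for c in CAMPS], k=10)

    for camp in shown:
        # Confirmation bias: 90% chance of engaging with agreeable posts,
        # 10% chance with disagreeable ones.
        engaged = random.random() < (0.9 if camp == user_preference else 0.1)
        if engaged:
            weights[camp] += 1.0  # engagement teaches the feed to show more of this

    share = shown.count(user_preference) / len(shown)
    print(f"Round {round_number}: {share:.0%} of the feed matches the user's views")
```

Neither side of the loop has to be extreme for the bubble to form; a mild preference, amplified round after round, is enough.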

We're now seeing the large-scale effect of a nationwide filter bubble, and it's not healthy. Whether Facebook actually had a role in the outcome of the election is debatable, but there's no question it was a primary mover in the conversation. Only now it looks like there were actually two conversations going on, with little discourse between them.

It doesn't have to be that way, though. Facebook's algorithm isn't inherently biased, and you can even make it work against your confirmation bias if you try. Here are a few straightforward ways to get a news feed with a more diverse range of viewpoints, and pop that filter bubble.