I’ll start with the question this experiment set out to answer: if your Facebook News Feed is delivering too much content you don’t care about and not enough that you do, can you fix it by giving the algorithm the right signals? In a word, yes. But it takes some effort.

Cameron Uganec, who leads the content and social teams here at Hootsuite, recently shared on Instagram a post-it note I stuck to my monitor that read “Must Facebook.” He thought it was funny (“I can’t imagine there are many offices around the world where this is appropriate,” he said), but for me it was a necessary reminder. After years of neglecting my News Feed and using Facebook only for basic messaging and events, I no longer included it in my daily content consumption routine. As I discovered, persistence is the key to shaping your feed.

Now that I’m about three weeks into this experiment, I don’t need the reminder. I’ve re-established the habit of consuming content that my friends share on Facebook. And when I say “my friends” I mean my actual friends. Which brings me to my first of three takeaway observations from this experiment:

The News Feed algorithm is more adept at figuring out your social graph than your interest graph

Of the five things I did to fix my feed, the change that had the biggest impact was adding some of my closest friends to my “Close Friends” list. My plan was to game this functionality by finding a select group of people I know and like who share great content and elevating them to ‘close’ status. I figured that if I populated this list not with my IRL best friends but with the people whose content I’d be most interested in, I’d achieve a positive effect on the quality of my feed. But instead I inadvertently ended up adding my closest IRL friends anyway.

It just happened. Whenever I saw content from a person who really is important in my life, I added that person to the list. Before long, my “Close Friends” were, as Facebook intended, my close friends. Which really shouldn’t be a surprise. The core insight that drove Facebook’s meteoric growth was that our social graph is a huge determinant of what content we will find relevant. In other words, who we know predicts what we care about.

As our friends lists grew and friending became a matter of basic politeness, the effectiveness of the social graph in determining relevance diminished. That’s why Facebook created functionality intended to restore the social graph signals that made the News Feed great when we all had fewer than 100 friends. It worked. I was very glad, for instance, to learn about my friend Melissa’s Kilimanjaro charity climb and the fundraisers she’s planning. I set out to make my feed a better content discovery tool, not to become better informed about what’s happening in my friends’ lives, but I achieved both.

Where I saw a less pronounced change was in the algorithm’s response to the signals I gave it about the topics I’m interested in. Maybe too few of my friends are sharing content about the intersection of social media and politics. Or maybe the algorithm is designed to pay closer attention to who’s doing the sharing than to what is being shared. Whatever the reason, the part of the plan intended to fill my feed with things I’m interested in through strategic Liking didn’t produce significant results.

I was disappointed that I’d failed to teach Facebook to interpret my interest graph until I read my colleague Evan Lepage’s post last week about Social Network Features You Need To Stop Ignoring. As he explained, Facebook’s Interest lists can accomplish much of what I was trying to do: “In addition to being a great way to categorize your Facebook interests and create a smoother, more efficient experience, this saves you the hassle of having to Like multiple pages, and then having a News Feed flooded with their updates.”

Mow the grass and your favourite neighbours will stop by more often

This experiment demonstrated the importance of basic maintenance. In the top-right corner of every News Feed item there’s a downward arrow that opens a drop-down menu containing the two most important buttons for anyone who wants to curate a better content experience: “I don’t want to see this” and “Unfollow [name].” Unfollow is a simple but effective tool that does exactly what it sounds like it does. If you don’t want to take the big step of unfriending someone whose content is fouling your feed, you can ensure you don’t see anything more from that person. Careful unfollowing had a dramatic effect, not only on the quality of the content in my feed, but on my daily experience of Facebook. If you unfollow one person every day for a month, you’re all but guaranteed a better feed.

The “I don’t want to see this” function hides a post, so if there’s something you don’t want to see getting a lot of engagement, you can banish it no matter how many Likes and comments are piling up. I can only speculate about how the mysterious algorithm treats these signals, but beyond hiding the posts I want to ignore, the effect appears to be positive: through a mix of hiding posts and regularly unfollowing friends, I’ve significantly decreased the frequency of items in my News Feed that I don’t care about.

If you don’t regularly tell Facebook what you want to click, read, and share, your feed becomes overgrown with content you don’t care about. The News Feed works best with just a little effort put toward trimming and pulling up the weeds. Because we’re always accumulating new friends, and by extension new content, we have to put in a modicum of work if we want the algorithm to know what to feed us. Among the five changes I made to my Facebook habits over the course of this experiment, identifying the things I don’t want to see more of has had the most dramatic effect. Again, it just takes some effort.
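To make the gardening metaphor concrete, here’s a toy sketch of how this kind of feedback loop might work. To be clear, Facebook’s real ranking algorithm isn’t public; everything below (the Source and Post classes, the affinity weights, the scoring formula) is a hypothetical illustration of the principle that hiding and unfollowing are strong negative signals, while Likes and comments are weaker positive ones.

```python
# Toy model of a feed ranker (hypothetical -- not Facebook's actual algorithm).
# It illustrates how hard negative signals (hide, unfollow) can outweigh
# raw engagement when deciding what surfaces in a feed.

from dataclasses import dataclass


@dataclass
class Source:
    """A friend or Page whose posts can appear in the feed."""
    name: str
    affinity: float = 1.0    # learned estimate of how much you care about them
    unfollowed: bool = False


@dataclass
class Post:
    source: Source
    likes: int = 0
    comments: int = 0
    hidden: bool = False     # user clicked "I don't want to see this"


def score(post: Post) -> float:
    """Return a relevance score; higher scores surface earlier in the feed."""
    if post.source.unfollowed or post.hidden:
        return 0.0           # hard negative signals remove the post entirely
    engagement = post.likes + 2 * post.comments
    return post.source.affinity * (1 + 0.01 * engagement)


def hide(post: Post) -> None:
    """Hiding a post also nudges the source's affinity downward."""
    post.hidden = True
    post.source.affinity *= 0.8


# Example: a little pruning beats raw engagement.
friend = Source("close friend", affinity=3.0)
acquaintance = Source("noisy acquaintance", affinity=1.0)
feed = [Post(friend, likes=4), Post(acquaintance, likes=250, comments=40)]

hide(feed[1])  # one click, and the viral post you don't care about is gone
feed.sort(key=score, reverse=True)
for post in feed:
    print(f"{score(post):.2f}  {post.source.name}")
```

In this sketch a single hide zeroes out a post that hundreds of Likes couldn’t save, and it also lowers the source’s affinity so future posts from that source rank lower too, which matches the experience described above: a few deliberate negative signals reshape the feed faster than passive consumption does.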

The “filter bubble” effect has been misunderstood (including by me)

Activist and media theorist Eli Pariser coined the term “filter bubble” in his 2011 book warning against the evils of algorithms that feed us too much content confirming what we already believe and too little that makes us uncomfortable. In a subsequent TED talk he explained how he personally experienced this effect in his social media feeds: “Facebook was looking at which links I clicked on, and it was noticing that I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.”

Mat Honan’s Like experiment seemed to confirm the notion that the bubble reinforces our existing political tendencies when Facebook’s algorithm responded to his Like of a conservative post by delivering ever more conservative posts. The difference is that Pariser’s theory describes what he believes is the typical outcome of a typical user’s authentic behaviour. Honan, however, deliberately sent Facebook the wrong signals, and there’s only so much we can learn about fixing a News Feed from an experiment intended to break it.

My experience was different from both Pariser’s and Honan’s. In Vancouver, where I live, a long and bitter labour dispute between the provincial government and the teachers’ union came to a head over the course of this experiment. Not being a parent, I had spent most of the months-long conflict sitting on the fence. But as my feed filled up with content shared by parents whose kids weren’t in school due to the strike, I empathized more with their predicament and viewed the issue more through their eyes.

Polling throughout the dispute showed that BC parents supported the teachers, but prior to this experiment I had experienced the conflict mainly through the local media’s daily coverage of who appeared to be winning. Through my Facebook feed, I learned that most of my friends who are parents were more supportive of the teachers’ union than I had been, and with the benefit of their perspectives, my opinion shifted their way. I can’t conclusively say that this experiment has broadened my political horizons, but it did show me what lots of people I respect thought about an important issue.

If the only content I consumed were to be filtered through Facebook’s News Feed algorithm, I’d likely find myself agreeing with my friends more often, and the filter bubble would arguably prevent me from being exposed to opinions and ideas I don’t agree with. If, however, like most people, I continue to get news and information from a diverse range of sources, it will have the opposite effect: my Facebook News Feed will be a collection of links to stories I’d never have read if I hadn’t been exposed to my friends’ divergent perspectives.

Maintain your social networks with Hootsuite. Sign up for a free 30-day Hootsuite Pro trial today!