Apparently Facebook becomes a really bad place when you or someone you’re friends with “likes” everything. That appears to be the lesson we learn from an experiment by Wired writer Mat Honan, who tried liking every single thing he came across on Facebook for two days (other than a friend’s post about a death in the family).
Chances are, none of your friends are doing this, but it highlights just how much the things our “friends” engage with actually impact what we see in our News Feeds. The way Facebook presents content to us has changed a lot over the years. Gone are the days when the News Feed simply showed you posts from your friends and the Pages you’ve liked in chronological order. Things are much more complicated now.
Is Facebook’s News Feed content selection adequate for your own needs? Do you think they should be doing things differently? How so? Share your thoughts in the comments.
You remember Honan. He was the guy who wrote about getting hacked in Wired a couple of years ago, and his story led to Apple taking more security precautions with Apple ID.
There’s a lot of intrigue when it comes to the Facebook News Feed, especially for marketers, who have in recent months experienced a major downturn in organic post reach. Facebook has made changes to its algorithm aimed at improving the quality of the content users see, or at least that’s how the company presents it. If Honan’s story is any indication, however, all of that can be quickly derailed by anyone who wants to screw with it.
It’s not really surprising that the News Feed would turn to garbage for the person liking everything they come across. If you “like” a bunch of stuff that you really don’t like, it’s going to send Facebook a message that you really do like that kind of stuff. You shouldn’t be surprised when it completely pollutes your experience.
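To see why indiscriminate liking would wreck a feed, here is a minimal toy sketch of an engagement-weighted ranker. This is purely illustrative: Facebook's actual News Feed algorithm is proprietary and vastly more complex, and the `rank_feed` function, topic labels, and post structure below are all hypothetical.

```python
# Toy model of engagement-weighted feed ranking (hypothetical; NOT
# Facebook's real algorithm, which is proprietary and far more complex).
from collections import Counter

def rank_feed(posts, liked_topics):
    """Score each post by how often the user has 'liked' its topic.
    Indiscriminate liking inflates every topic's weight, so noise
    crowds out the content the user genuinely cares about."""
    weights = Counter(liked_topics)
    return sorted(posts, key=lambda p: weights[p["topic"]], reverse=True)

# A user who has blindly liked mostly brand content:
liked = ["brands", "brands", "politics", "friends"]
posts = [
    {"id": 1, "topic": "friends"},
    {"id": 2, "topic": "brands"},
    {"id": 3, "topic": "politics"},
]

# Brand content now floats to the top of the feed.
print([p["id"] for p in rank_feed(posts, liked)])
```

Even in this crude model, the feedback loop is visible: every stray like shifts the weights, and the ranking follows the noise rather than genuine interest.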
The more interesting, and somewhat troubling, part is that someone doing what Honan did can pollute someone else’s feed.
“While I expected that what I saw might change, what I never expected was the impact my behavior would have on my friends’ feeds,” he writes. “I kept thinking Facebook would rate-limit me, but instead it grew increasingly ravenous. My feed became a cavalcade of brands and politics and as I interacted with them, Facebook dutifully reported this to all my friends and followers.”
“That first night, a small little circle with a dog’s head popped up in the corner of my phone,” Honan continues. “A chat head, from Facebook’s Messenger software! The dog turned out to be my old WIRED editor, John Bradley. ‘Have you been hacked,’ he wanted to know. The next morning, my friend Helena sent me a message. ‘My fb feed is literally full of articles you like, it’s kind of funny,’ she says. ‘No friend stuff, just Honan likes.’ I replied with a thumbs up. This continued throughout the experiment. When I posted a status update to Facebook just saying ‘I like you,’ I heard from numerous people that my weirdo activity had been overrunning their feeds. ‘My newsfeed is 70 percent things Mat has liked,’ noted my pal Heather. Eventually, I would hear from someone who worked at Facebook, who had noticed my activity and wanted to connect me with the company’s PR department.”
This is obviously a problem, and now that Honan’s story is out and is quickly racking up the social shares, it’s probably going to give others some mischievous ideas. Hopefully Facebook is paying attention, and doesn’t let this get out of control. We’ve reached out to Facebook for comment on the issues with its algorithm and how abuse can potentially affect other users’ experiences. We’ll update accordingly.
The good news for brands and publishers is that brand and publisher content dominated Honan’s News Feed after he did this. What a powerful propaganda machine Facebook can be.
The fact that Honan’s friend asked if he’d been hacked is worth paying attention to as well. Facebook account hackings are not all that uncommon, and if friends don’t know any better, there’s no telling what impressions they might come away with. I had at least two different friends discover their accounts had been hacked within the past couple of months.
The broader issue is that Facebook insists on determining what to show users in their News Feed algorithmically, and there appears to be a real flaw. It doesn’t help users’ perception that this comes less than a month after the public caught wind of Facebook’s controversial “emotion” experiment, which freaked a lot of people out. If you missed that, it was discovered that Facebook tested showing people more positive and more negative content in their News Feeds back in 2012 to see what kind of effects it had on their emotions. More negative content unsurprisingly put people in worse moods.
In light of Honan’s experiment, you have to wonder to what degree your friends’ Facebook activity is affecting your own mood. You obviously see content from some people more than others – those Facebook has decided it should show you more from. Let’s hope Facebook has decided to show you stuff from people that won’t make you feel bad. (I have to admit, I see some pretty sad stuff on there day to day. Not that this is always necessarily a bad thing; it’s just worth remembering that Facebook is choosing what to show you.)
Honan’s experiment also comes after Facebook has been cracking down on unreliable likes. Clearly they have some more work to do. In a recent platform update, Facebook banned developers from incentivizing users to like their Facebook pages on Facebook and within apps.
Caleb Garling at The Atlantic also recently blogged about manipulating Facebook’s algorithm, forcing his article to the top of people’s News Feeds after being angered by Facebook burying it the first time he shared it:
I posted: “Hey everyone, big news!! I’ve accepted a position trying to make Facebook believe this is an important post about my life! I’m so excited to begin this small experiment into how the Facebook algorithms processes language and really appreciate all of your support!”
The first like and comment came almost instantly. I liked back. Then a few more. People were playing along. I liked them all back. Then momentum began to pick up: You could almost feel two great blue hands ratcheting the post up my friends’ feeds. Then victory: Around the 39-minute mark after I published the status update, my friend Casey told me my status—rather than possible updates from about 1,000 friends—was at the top of his feed. Nine minutes later, another friend confirmed the same. More and more people said the post was firmly at the top of their feed—and not just (actual) friends, but former colleagues I hadn’t talked to in years. After 90 minutes, the post had 57 likes and 25 commenters.
For the next two days, the likes and comments poured in, and people reported my status was still at the top of their feed. (Some even asked how to make it go away.) As of this writing, it has 134 likes and 62 comments.
At least Facebook has basically admitted its algorithm isn’t very sophisticated at determining quality content. Apparently it’s not very sophisticated in certain other areas either.
Does Facebook do enough to prevent manipulative “engagement” to justify its algorithmic approach to News Feed content delivery? Tell us what you think.
Image via Facebook