Cognizant, perhaps, that its algorithm doesn't give users enough control over what they see, Facebook has announced a subtle update to its much-criticised news feed: for the first time, users will be able to choose for themselves which posts and pages appear at the top of their feeds.
"We know that ultimately you're the only one who truly knows what is most meaningful to you," product manager Jacob Frantz said in a statement, "and that is why we want to give you more ways to control what you see."
To try out the new feature, users on iOS can open "news feed preferences" and tap "prioritise" to see a list of friends and followed pages whose posts appear in their feed. (Android and desktop versions will roll out later.)
Selecting preferred friends puts a star above their photos. Those friends' posts will then appear above the algorithmically ranked news feed, in their entirety.
This is a significant change from how news feed works now. Facebook's home stream is, by all accounts, a pretty mysterious beast: 30 per cent of American adults get their news there, according to a recent study, but most don't understand its mechanisms - or, when it comes to controlling it, how little agency they have.
Facebook uses a slate of factors, including "whom you tend to interact with, and what kinds of content you tend to like and comment on", to surface the posts it thinks you're most likely to read. But the system is opaque; we don't quite know how our inputs map to Facebook's outputs - if we're even aware that we're "inputting" anything. (According to one oft-quoted paper, more than 60 per cent of Facebook users don't even realise that a system algorithmically ranks and filters the posts they see.)
In any case, that has made the news feed a common target of critics, who argue that it is fundamentally disempowering: while algorithms helpfully power much of what we see online, this one makes choices for you without your conscious or considered input.
"The questions that concern me are how these algorithms work, what their effects are, who controls them, and what are the values that go into the design choices," the sociologist Zeynep Tufekci wrote in May. "At a personal level, I'd love to have the choice to set my news feed algorithm to 'please show me more content I'd likely disagree with.'"
This change doesn't go quite that far, of course; nor does it give users the ability to see exactly what signals they're sending to Facebook, or to correct misinterpretations in that data. (Just because I clicked a high school classmate's baby picture once - once, for Zuck's sake! - does not mean I want to see every post from that person.)
Still, it's a step in the right direction - though such steps have downsides of their own. Everything you tell Facebook, from your home town to your LGBT-ally status to the posts you want to see in your news feed, is another data point in the constellation Facebook uses to monetise you.