Finally, Facebook trusts you to choose what you want to see in your news feed


Cognizant, perhaps, that its algorithm doesn’t give users enough control over what they see, Facebook this week announced a subtle update to its much-criticized news feed: For the first time, users will be able to choose for themselves which posts and pages appear at the top of their feeds.

“We know that ultimately you’re the only one who truly knows what is most meaningful to you,” product manager Jacob Frantz said in a statement, “and that is why we want to give you more ways to control what you see.”

To try out the new feature, users on iOS (Android and desktop versions are rolling out later) can open “news feed preferences” and tap “prioritize” to see a list of friends and followed pages whose posts appear in their feed.

Selecting preferred friends puts a star above their photos. Those friends’ posts will then appear above the algorithmically ranked news feed, in their entirety.
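The ordering rule itself is simple enough to sketch. The snippet below is a minimal illustration of the behavior as described, not Facebook’s actual code; the Post record, the ranked_score field and every other name here are invented for the example.

    # Sketch of the "prioritize" ordering described above.
    # Post, ranked_score, order_feed, etc. are invented names, not Facebook's API.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        text: str
        ranked_score: float  # stand-in for whatever score the feed algorithm assigns

    def order_feed(posts, starred_friends):
        # Starred friends' posts come first, in their entirety;
        # everything else follows in algorithmic order.
        starred = [p for p in posts if p.author in starred_friends]
        ranked = sorted(
            (p for p in posts if p.author not in starred_friends),
            key=lambda p: p.ranked_score,
            reverse=True,
        )
        return starred + ranked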

This is a significant change from how the news feed works now. Facebook’s home stream is, by all accounts, a mysterious beast: 30 percent of American adults get their news there, according to a recent study, but most don’t understand its mechanisms, or, when it comes to controlling it, their own lack of agency.

Facebook uses a slate of factors, including “whom you tend to interact with, and what kinds of content you tend to like and comment on,” to surface the posts it thinks you’re most likely to read. But the system is opaque; we don’t know quite how our inputs map to Facebook’s outputs, if we’re even aware that we’re “inputting” anything. (According to one oft-quoted paper, more than 60 percent of Facebook users don’t even realize that a system algorithmically ranks and filters the posts they see.)
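To make the idea concrete, here is a toy scoring function in the same spirit. The weights and feature names are made up for illustration and bear no relation to Facebook’s real model.

    # Toy illustration of signal-based ranking; the weights are invented.
    def score_post(post: dict, affinity: dict) -> float:
        # `affinity` maps an author to how often the viewer interacts with them;
        # `post` carries simple engagement counts. Higher score = shown earlier.
        whom_you_interact_with = affinity.get(post["author"], 0.0)
        what_you_engage_with = post["likes"] + 2 * post["comments"]
        return 0.7 * whom_you_interact_with + 0.3 * what_you_engage_with

    # Posts are then shown in descending score order:
    posts = [
        {"author": "alice", "likes": 3, "comments": 1},
        {"author": "bob", "likes": 40, "comments": 12},
    ]
    affinity = {"alice": 9.0, "bob": 0.5}
    feed = sorted(posts, key=lambda p: score_post(p, affinity), reverse=True)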

In any case, that’s made the news feed a common target of critics, who argue that it is fundamentally disempowering: While algorithms helpfully power much of what we see online, this one makes choices for you without your conscious or considered input.

“The questions that concern me are how these algorithms work, what their effects are, who controls them, and what are the values that go into the design choices,” the sociologist Zeynep Tufekci wrote in May. “At a personal level, I’d love to have the choice to set my newsfeed algorithm to ‘please show me more content I’d likely disagree with.’”

This change doesn’t go quite that far, of course; nor does it give users the ability to see exactly what signals they’re sending to Facebook, or to correct misinterpretations in that data. (Just because I clicked a high school classmate’s baby picture once — once, for Zuck’s sake! — does not mean I want to see every post from that person.)

“Overall Facebook has moved in the right direction — more, simplified user controls — but it still greatly structures the feed to its own priorities,” Tufekci wrote in an email to the Post, “defined by pushing stories that garner ‘likes’ rather than stories that matter to their users.”

Credit: Facebook Newsroom | Washington Post
