Leave it to the users—leveraging audiences to vet content

By Kara Tabor

Photo via SEOPlanter/Flickr


Speak the truth, or get demoted on social media. That's the impetus behind Facebook's newest feature, which lets users sort the truth from the fake, and no, I'm not referring to whether your friends' Instagram snap truly has #nofilter. With the addition of an option to label a post as false information (within the post-reporting feature that already lets users flag content they object to for any of a variety of reasons), users can now have their say in one of the fundamental questions of the internet: Is this for real?


In her article for Nieman Lab, Caroline O'Donovan points out that while Facebook said the scam-and-hoax flagging mechanism isn't controlled by humans (at least not by humans working at Facebook), it is leveraging the discerning power of its very human users to call out articles that aren't really what they purport to be. As the article notes, fake news isn't a massive problem on the platform, yet the company is taking measures against the spread of misinformation from parody news sites and from users trying to trick their friends.

But in looking at this added level of content flagging, I wonder what effects beyond filtering will result. Will users start flagging content they simply disagree with as false? Will true articles with unbelievable headlines get flagged by users who don't bother to click the link and investigate, making assumptions instead? Most importantly, though, will this tool help legitimate, solid articles and content rise in the News Feed?

It leaves me wondering whether this new feature, if used frequently enough by a large enough portion of Facebook's users, will have a democratizing effect similar to Reddit's. Since its early days, that site has relied on the number of upvotes or downvotes a post receives to determine how content appears to users and what ends up on the front page. This doesn't always push the most serious or newsworthy content to the front, but in particular subreddits, like r/news, it helps the good stuff rise to the top.
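Reddit's actual ranking math is more involved and has changed over time, but the core idea, that net votes determine visibility, can be sketched in a few lines of purely illustrative Python (the posts and scoring below are invented for the example):

    # Minimal sketch of vote-based ranking: posts with more net upvotes
    # float toward the top. Reddit's real ranking also weighs factors
    # like post age; the data and scoring here are purely illustrative.
    posts = [
        {"title": "Local election results", "upvotes": 340, "downvotes": 25},
        {"title": "Obvious hoax story", "upvotes": 12, "downvotes": 180},
        {"title": "Breaking wire report", "upvotes": 95, "downvotes": 10},
    ]

    def score(post):
        # Net votes: a crude stand-in for "the crowd thinks this is good."
        return post["upvotes"] - post["downvotes"]

    # Highest-scoring posts appear first, i.e. they "rise to the top."
    for post in sorted(posts, key=score, reverse=True):
        print(f"{score(post):>5}  {post['title']}")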

The basic liking feature on Facebook already plays a role in helping the algorithm decide how to display content in the News Feed. Now we'll have to see whether this next level of user control makes a difference that reputable sites and pages can leverage to get their content more visibility.

