Under the guise of combating “fake news,” social media giant Facebook has unveiled a new censorship scheme that effectively allows the platform to either reward or penalize users based on what it deems to be the overall “trustworthiness” of their browsing, liking, and sharing habits.
Facebook says its new “trust ratings” system, which resembles a social engineering program already being used throughout communist China to control people’s speech both online and off, will track the online behavior of its users and assign secret ratings that ultimately determine whether or not users remain “visible” online.
So-called “malicious actors,” as defined by Facebook, might have their content blanked out from other people’s Facebook timelines, for instance. Users who are determined by Facebook to be purveyors of “fake news” can expect to be similarly censored, or taken less seriously when reporting content to Facebook moderators – and all of this in the name of improving the “credibility” of the Facebook experience.
“One of the signals we use is how people interact with articles,” stated Tessa Lyons, head of Facebook’s anti-fake news initiative.
“For example, if someone previously gave us feedback that an article was false and the article was confirmed by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.”
While Lyons says the Facebook trust ratings system “isn’t meant to be an absolute indicator of a person’s credibility,” it is being actively used as a metric in determining the overall “risk” of each Facebook user’s actions online.
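To make the weighting Lyons describes concrete, here is a minimal, purely illustrative sketch in Python. Facebook has not published its actual algorithm, so every name and formula below (the Reporter class, the trust_weight method, the Laplace smoothing) is an assumption for illustration, not Facebook’s implementation:

```python
# Purely illustrative sketch -- Facebook's real system is unpublished.
# Assumed model: a user's fake-news reports are weighted by how often
# their past reports agreed with independent fact-checker verdicts.

from dataclasses import dataclass


@dataclass
class Reporter:
    """Tracks one user's history of flagging articles as false."""
    reports_made: int = 0
    reports_confirmed: int = 0  # flags later confirmed false by a fact-checker

    def record(self, confirmed: bool) -> None:
        """Update the history after a fact-checker rules on a flagged article."""
        self.reports_made += 1
        if confirmed:
            self.reports_confirmed += 1

    def trust_weight(self) -> float:
        """Laplace-smoothed agreement rate in [0, 1].

        New users start near 0.5; indiscriminate flaggers whose
        reports are rarely confirmed drift toward 0.
        """
        return (self.reports_confirmed + 1) / (self.reports_made + 2)


def weighted_flag_score(reporters: list[Reporter]) -> float:
    """Sum of trust weights for everyone who flagged one article.

    A moderation queue could prioritize articles whose weighted
    score, rather than raw flag count, crosses a review threshold.
    """
    return sum(r.trust_weight() for r in reporters)


# Example: one reliable reporter's flag outweighs three flags from
# someone whose reports are almost never confirmed.
reliable = Reporter(reports_made=10, reports_confirmed=9)
spammer = Reporter(reports_made=50, reports_confirmed=2)
print(weighted_flag_score([reliable]))     # ~0.83
print(weighted_flag_score([spammer] * 3))  # ~0.17
```

The design point this sketch illustrates is the one Lyons gestures at: a single flag from someone whose past reports were confirmed by fact-checkers counts for more than a pile of flags from an indiscriminate reporter.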
What this means for the average Facebook user remains unclear, however. A company spokesperson told The Sun that the trust ratings system isn’t a “centralized ‘reputation’ score,” but rather a safeguard against users who might try to “game the system” by indiscriminately flagging news they don’t like as fake.
“The reason we do this is to make sure that our fight against misinformation is as effective as possible,” the spokesperson added.
But those familiar with China’s so-called “social credit” system say that Facebook’s new trust ratings system is eerily similar to it. Under the Chinese program, the government tracks people’s social media, online shopping, and other habits and assigns them a score that affects their ability to function in society.
Chinese citizens with a lower social credit score might have more trouble than others taking out loans or even using public transport, while those with higher scores are granted more freedoms by the authoritarian leaders of their country.
A Chinese citizen’s overall social credit score is determined largely by how obedient he or she is to the government status quo. Those who best comply are considered “model” citizens, while those who engage in online behaviors that the government doesn’t like are oftentimes “blacklisted,” preventing them from traveling, buying, and selling.
In many ways, Facebook’s new policies to overcome the “anxiety and division” that founder Mark Zuckerberg says his platform created are creepily similar to China’s totalitarian social credit score system. If you do what Facebook likes, then you’ll presumably have a higher “trust rating.” But if you don’t, then you can expect to be penalized.
“Facebook has a lot of work to do,” explained Zuckerberg at a hearing earlier this year on the role that his social media site may have played in the alleged Russian collusion narrative that liberals have been parroting nonstop since the results of the presidential election. Keep in mind, however, that it is Facebook that has been actively censoring conservative content, presumably to sway the upcoming midterm elections.
To keep up with the latest news about technocratic totalitarianism, visit Precrime.news.