

Google doubles down on deceiving customers about its political manipulation
By News Editors // Jun 26, 2019

After multiple sources corroborated the longstanding accusation that Google stealthily infuses its political preferences into its products, the company has continued to claim neutrality, leading its executives to give incongruous answers to lawmakers’ questions.


(Article by Petr Svab republished from TheEpochTimes.com)

A June 24 exposé by Project Veritas showed several Google employees and a cache of internal documents describing methods Google has used to tweak its products to surreptitiously push its users toward a certain worldview.

One employee even appeared to say, when caught on hidden camera, that Google’s goal was preventing President Donald Trump, or anybody like him, from being elected again—an assertion confirmed by another employee who spoke under the condition of anonymity.

Google spokespeople have failed to produce an official response, but two of its executives were questioned about the revelations—one at a June 25 Senate hearing and one at a House hearing the following day.

During the June 26 House Homeland Security Committee hearing, Rep. Debbie Lesko (R-Ariz.) confronted Derek Slater, Google’s global director of information policy, with one of the leaked documents on “algorithmic unfairness” (pdf).

“Imagine that a Google image query for ‘CEOs’ shows predominantly men. Even if it were a factually accurate representation of the world, it would be algorithmic unfairness,” the document says, explaining that in some cases “it may be desirable to consider how we might help society reach a more fair and equitable state, via … product intervention.”

“What does that mean Mr. Slater?” Lesko asked.

“I’m not familiar with the specific slide,” he said. “But I think what we’re getting at there is when we’re designing our products, again, we’re designing for everyone. We have a robust set of guidelines to ensure we’re providing relevant, trustworthy information. We work with a set of Raters around the world, around the country, to make sure those Search Rater Guidelines are followed, those are transparent, available for you to read on the web.”

“All right. Well, I personally don’t think that answered the question at all,” she replied.

Similarly, Maggie Stanphill, Google’s head of Digital Wellbeing, was questioned by Senate Commerce Committee member Ted Cruz (R-Texas) the day before.

He asked whether Stanphill agreed with a quote from one of the leaked documents saying that Google should “intervene for fairness” in its machine-learning algorithms. Stanphill said she didn’t agree with it.

But Google has already put the “fairness” doctrine into practice, based on what the employees and the documents in the Project Veritas report say.

‘Algorithmic Unfairness’

“Our goal is to create a company-wide definition of algorithmic unfairness that … establishes a shared understanding of algorithmic unfairness for use in the development of measurement tools, product policy, incident response, and other internal functions,” says a document last updated in February 2017.

“What they’re really saying about fairness is that they have to manipulate their search results so it gives them the political agenda that they want,” the unidentified insider said.

For instance, when one types “men can” followed by a space into the Google search bar, the search engine suggests phrases such as “men can have babies,” “men can get pregnant,” and “men can have periods.”

When one types “women can” followed by a space, the suggestions show phrases such as “women can vote,” “women can do anything,” and “women can be drafted.”

This isn’t because these phrases are so popular among users, but because the “fairness” algorithm pulled them from so-called “sources of truth”—they reflect the political narrative Google desires, the insider said.

Moreover, Google has adopted the doctrine while keeping its users in the dark, he said. One of the documents says “it is not a goal at this time to release this definition [of algorithmic unfairness] externally.”

Read more at: TheEpochTimes.com


