It’s hardly a secret that in today’s technologically driven world, almost nothing is private or sacred anymore. This is especially true of our social media-centric daily lives, where every tidbit of the day, from the foods we eat to how often our babies cry, is on full display for the world to see. But what about our emotions? A former Facebook executive says that Facebook is all over those, too, despite claims by the social media giant to the contrary.
In a recent op-ed for The Guardian, former Facebook product manager Antonio Garcia-Martinez warns that Facebook definitely has the ability to target advertising based on people’s emotional states, a fact revealed just days prior in a report involving two Australian executives. But the extent to which such information is used to sell people things they don’t need is the subject of much debate.
According to the report, a “leaked” presentation showed that Facebook has the capacity to gather data and pinpoint people’s current emotional states in order to target them with exploitative advertising. Teenagers who feel “insecure” or “worthless,” for instance, can be identified based on the way they use Facebook, and that information is used to drive them towards products or services – all without their consent, of course.
Following the report’s release, Facebook denied that this micro-targeted advertising scheme was real, insisting that the company in no way offers “tools” that target people based on their emotional states. But according to Garcia-Martinez, this isn’t exactly true; it appears to be more public relations spin meant to convince people that Facebook’s endeavors aren’t as nefarious as they actually are.
“Converting Facebook data into money is harder than it sounds, mostly because the vast bulk of your user data is worthless,” says Garcia-Martinez, the “whistleblower” who’s blowing the lid off what really happens at Facebook unbeknownst to its users.
“But occasionally, if used very cleverly, with lots of machine-learning iteration and systematic trial-and-error, the canny marketer can find just the right admixture of age, geography, time of day, and music or film tastes that demarcate a demographic winner of an audience. The ‘clickthrough rate,’ to use the advertiser’s parlance, doesn’t lie.”
Again, while it’s debatable just how often Facebook utilizes this technology – or even whether it does at all – there’s no denying that it exists. And if it can bring in even more profits for Zuckerberg and Company, then surely it’s being utilized to the fullest extent possible, presumably in accordance with the law (though this isn’t guaranteed), and all at the expense of your personal privacy.
This is the price that people pay for using “free” services like Facebook, of course. Whatever the data-miners can gather from your browsing habits, including which Facebook pages you visit and how often you visit them, they will do everything in their power to monetize. The only way to stop it, in Garcia-Martinez’s view, is for people to make it known that they don’t approve of such intrusion.
“The hard reality is that Facebook will never try to limit such use of their data unless the public uproar reaches such a crescendo as to be un-mutable,” Garcia-Martinez says.
“Which is what happened with Trump and the ‘fake news’ accusation: even the implacable Zuck had to give in and introduce some anti-fake news technology. But they’ll slip that trap as soon as they can. And why shouldn’t they? At least in the case of ads, the data and the clickthrough rates are on their side.”