Think before you ‘like’: Facebook’s way of controlling its users

October 1, 2016 — by Patrick Li

A sophomore discusses the problem of Facebook’s targeted ads.

In the midst of this year’s presidential campaign, leading candidates are trying every method to sway the average American voter, whether through billboards, television ads or social platforms like Twitter. However, they have another tactic up their sleeves: one that is a bit more downplayed, but much more effective.

At this time of the year, Facebook users’ news feeds are cluttered with videos and text posts about the presidential campaign. What the ordinary user may not be aware of is that Facebook tailors video selections based on previous views — after a user watches a video endorsing Hillary Clinton, for example, Facebook will assume that the user likes and wants liberal-leaning content.

The problem lies in the probability of error in Facebook’s estimation of a person’s political views, as a user’s political category is also largely based on their Facebook friends’ online activity. A liberal user, for instance, can be presented with a barrage of ads promoting Donald Trump if their friends are classified as conservative.

This type of advertisement exposure might seem harmless with food products or weight training programs, but when applied to something as important as making decisions about who will be the next president of the United States, it is a lot more dangerous.

Usually, when a TV ad comes on, it’s easy for the viewer to simply ignore it. On Facebook, however, advertisements are mixed in with timeline updates and posts from political pages about the presidential campaign, making it hard to distinguish an advertisement from an ordinary post.

Because the person may not even realize they are viewing an ad, the chance that their opinion will be swayed by it increases.

People need to be able to make their decisions consciously, but the critical thinking that goes into decision-making is being distorted by social media.

Furthermore, school systems today are heavily centered around encouraging students to form their own opinions about issues. When Facebook classifies its users, it is essentially choosing a user’s political preference for them without showing them the other side.

Because Facebook is an ad-supported corporation, and advertisers pay to promote their products and ideas, it is unlikely to discontinue these tactics for the sake of political objectivity. For their part, users must be aware of these influences and keep in mind that many of these posts are just advertisements — and that they are not getting the full story.
