By Steven Adamo
Facebook has been a convenient way of keeping up to date with news, friends, family and various interests. Its involvement, however, in spreading misinformation during the 2016 presidential election is unacceptable.
False information spreads easily on Facebook, whose algorithms target news stories and advertisements based on each user’s data, creating biases. This leaves many people uninformed on issues that don’t fit their political and social leanings.
According to the Washington Post, a Russian troll farm called the Internet Research Agency placed ads on Facebook that focused on politically divisive issues like gun rights, race and immigration.
Targeted advertising is a big industry; Facebook alone made $2.7 billion in targeted ad sales in 2011.
Last year, ProPublica reported that Facebook’s ad system also allowed advertisers to control who could see their ads based on race.
When logging into Facebook, the user is immediately met with news stories and “trending stories.”
With Facebook’s uniform design, stories from reputable news sources look identical to those focused on conspiracy theories, lies and hate speech.
The Guardian reported last year that Facebook fired a group of human employees whose job it was to remove false news articles from the “trending” section of the news feed.
Facebook has 1.18 billion users who rely on their news feeds for information. The Guardian reported that Facebook has become the world’s largest distributor of news.
During the 2016 presidential campaign, misinformation and divisive speech were heavily spread on Facebook, including stories like “Clinton emails linked to political pedophile sex ring.”
To combat these issues, a network of news and technology companies, including Facebook and Twitter, created the First Draft Coalition.
According to its website, the coalition’s goal is to “improve the quality of information on social media.” Facebook also created the Facebook Journalism Project, a program that is too little, too late.
In a live broadcast last month, Facebook CEO Mark Zuckerberg gave specifics on his plan to combat this issue. One solution Zuckerberg announced was the potential hiring of 1,000 humans to look for “subtle expressions of violence” in all new ads.
In December of last year, Facebook announced that it would combat false news by working with five fact-checking organizations including FactCheck.org and Snopes.
The feature that allows Facebook users to flag misinformation was introduced in March. However, it has had little impact because most false news stories tagged “disputed” went viral before being labeled as such.
Other versions of the same false news stories also circulate without the disputed tag and are still being shared.
Social media sites such as www.ello.co provide free profiles and an option to opt out of data reuse, a practice that other social media sites should adopt.