Facebook’s passive-aggressive, love-hate, on-again off-again relationship with news

Facebook has been struggling in recent weeks with how it handles news, and following that struggle has been interesting. Here are a few recent milestones:

  1. May 2015: Facebook courts the big news outlets with “Instant Articles” – a way to get their content in front of more users through the news feed while keeping those users from leaving to read the news on external sites.
  2. May 2016: Conservative groups accuse Facebook of censoring right-wing news in its Trending News section. Facebook denies the allegations.
  3. June 2016: Facebook changes the newsfeed algorithm to prioritize personal stories over news.
  4. August 2016: Facebook fires its Trending News editors and lets the algorithm make the decisions. The algorithm is gamed and features fake stories.
  5. September 2016, yesterday: Facebook removes a Pulitzer-winning photo of the Vietnam War, and an accompanying article, for violating its community standards. It restores the photo a day later after much admonishment.

It is the last point of contention that I find truly interesting. The photo, known as The Terror of War, is largely credited with helping end the Vietnam War in spite of, or maybe even because of, how disturbing it is. Taking the photo was only the first step; it was its publication on every media platform of the day that caught Americans’ attention and influenced public opinion. It wasn’t just an editorial choice that drove publication, it was also the competitive nature of the media at the time: if one paper wouldn’t print it, another would. In today’s world, that competition is almost gone. The staggering stat that has to figure into this discussion is just how many people get their news from Facebook, be it via shared links in the news feed or Trending News topics. According to Pew Research, “Facebook [reaches] 67% of U.S. adults… two-thirds of Facebook users who get news there, then, amount to 44% of the general population.”

Facebook is stuck between a rock and a hard place. It strives for engagement: it truly wants users to spend lots of time on the site and in the app. Facebook would prefer that “engagement” mean sharing original content and personal news, but that requires user effort. It is much simpler for users to share a link, and many of those links point to news sites. It is then up to Facebook’s opaque news feed, Trending News, and content removal algorithms to decide who sees those shares. It’s a tough place for an algorithm to be.

In the machine learning session at Google I/O this year, Aparna Chennapragada said something that stuck with me: “You want to look at problems that are easy for machines and hard for humans, the repetitive things, and then make sure that those are the problems you throw machine learning at.” Sorting news and adhering to community standards is the opposite: hard for machines and easy for humans. It’s a task where throwing more computational power at the problem won’t necessarily improve performance, and where that performance will almost always reflect the biases of the algorithm’s creators.

Algorithm-driven Trending News on Facebook: nothing about the US election, Syria, or North Korea’s nuclear tests, front page news elsewhere.

Zeynep Tufekci, who has done a lot of work on the intersection of Facebook and news, had this interesting observation about the difficulty of creating unbiased algorithms: “If Google shows you these 11 results instead of those 11, or if a hiring algorithm puts this person’s résumé at the top of a file and not that one, who is to definitively say what is correct, and what is wrong? Without laws of nature to anchor them, algorithms used in such subjective decision making can never be truly neutral, objective or scientific.”

Regarding news, Ms Tufekci claimed that Facebook’s news feed algorithm “largely buried news of protests over the killing of Michael Brown by a police officer in Ferguson, Mo., probably because the story was certainly not ‘like’-able and even hard to comment on. Without likes or comments, the algorithm showed Ferguson posts to fewer people, generating even fewer likes in a spiral of algorithmic silence.” Ms Tufekci argues that algorithms alone cannot decide which news items to show, much less which to promote.

So back to the rock and hard place. Nearly half of all Americans get their news from Facebook, which means that even though Mr Zuckerberg says Facebook is “a tech company, not a media company,” it cannot avoid its growing influence over which stories Americans (and others around the globe) are exposed to. In the controversy surrounding the takedown of The Terror of War, Espen Egil Hansen, the editor-in-chief and CEO of Aftenposten, the Norwegian newspaper that shared the photo, accused Mr Zuckerberg of “thoughtlessly abusing your power over the social media site that has become a lynchpin of the distribution of news and information around the world.” He added: “I am worried that the world’s most important medium is limiting freedom instead of trying to extend it, and that this occasionally happens in an authoritarian way.”

Restoring the photo may have resolved this particular case, but it definitely opened a can of worms. Facebook will need to find a way to deal with news items, and based on how its algorithms have performed so far, it won’t be able to rely on them alone. Contrary to Mr Zuckerberg’s stated desires, that may mean involving human decision makers. It will be interesting to see what Facebook decides to do.
