This week in Facebook: two steps forward, one step back

It’s been an interesting week (or was it a month?) of Facebook-related news, and since I’ve been critical of Facebook in the year since the election, it’s nice to see that change is coming. Even more so, it’s commendable to see Facebook move from its “the idea that fake news on Facebook influenced the election in any way is a pretty crazy idea” denial to an awareness and admission of the monster it has created. To be fair, I haven’t seen other businesses, tech or otherwise, even come close to admitting something that detrimental to their business. However, it’s not all rainbows. Facebook seems to have taken two steps forward, but also one step back. Let’s start with the positive:

Step forward number one: admitting that Facebook may be bad for users’ mental health.

Can social media create truly meaningful social interactions?

Back in December, Facebook admitted something that critics and academics have been saying for a long time: spending time on social media can be bad for us. In a separate post, Mark Zuckerberg said he would try to refocus the newsfeed: “One of our big focus areas for 2018 is making sure the time we all spend on Facebook is time well spent” and that he’s “changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” Meaningful social interactions are a fine goal, but one that seems to contradict Facebook’s business model, where the metric is engagement and the goal is to keep users on the site for as long as possible so that they can see more finely targeted ads.

Spending less time on social media may be good for humanity but not so good for Facebook’s bottom line, which is why I admire the admission. It’s extremely difficult for a successful business to admit that it harms humanity. Cigarette companies were forced to do so, big sugar hasn’t done so, and big soda is fighting it every step of the way. This isn’t, and has never been, a business norm, so it’s refreshing to see Facebook move past denial and try to do better.

Joe Edelman wrote two very interesting posts addressing this change. The first explores what meaningful social interaction can mean online; the second outlines possible ways to improve the experience and actually promote significant interactions, and why doing so will mean completely remaking Facebook to reflect those different values. It’s fascinating to read someone who has both a deep understanding of what meaningful social interactions really mean and a sense of how to apply those principles to a virtual space. I highly recommend both.

Step forward number two: admitting that Facebook is bad for democracy.

In another candid post, Samidh Chakrabarti, Product Manager of Civic Engagement, went through all the talking points that Facebook critics (including me) have been making about how Facebook encourages divisiveness and doesn’t do enough to either block fake news stories or promote legitimate ones. Side note: Sarah Frier has mentioned that the same team at Facebook is also responsible for international work with “world leaders, some of whom use it against their citizens.”

In this article, the Washington Post takes an in-depth look at what happened before and after the 2016 election to change Facebook’s mind. It was a long process, driven mainly by Facebook’s own employees.

The step back: using “the community” to judge the trustworthiness of news sources.

Facebook said last week that it would change its newsfeed algorithm (again) to prioritize “trustworthy” news sources, and that it will let “the community” decide which sources those are, using that feedback to rank the newsfeed. It turns out that Facebook is gathering that feedback with a short, two-question survey:

  • Do you recognize the following websites? (Yes/No)
  • How much do you trust each of these domains? (Entirely/A lot/Somewhat/Barely/Not at all)
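
Facebook hasn’t published how it turns these answers into a ranking signal, so as a purely illustrative sketch (with hypothetical response data), here is one plausible way such domain trust scores could be aggregated, and why the result is so coarse:

```python
from collections import defaultdict

# Hypothetical survey responses: (user, domain, recognizes, trust)
# trust maps the survey's 5-point scale to 0.0-1.0; None when unrecognized.
responses = [
    ("u1", "example-news.com", True, 1.0),   # "Entirely"
    ("u2", "example-news.com", True, 0.25),  # "Barely"
    ("u3", "example-news.com", False, None),
    ("u1", "obscure-blog.net", False, None),
    ("u2", "obscure-blog.net", True, 1.0),
]

def trust_scores(responses):
    """Average trust among respondents who recognize a domain,
    scaled by how widely the domain is recognized at all."""
    totals = defaultdict(lambda: [0, 0, 0.0])  # domain -> [asked, recognized, trust_sum]
    for _, domain, recognizes, trust in responses:
        t = totals[domain]
        t[0] += 1
        if recognizes:
            t[1] += 1
            t[2] += trust
    scores = {}
    for domain, (asked, recognized, trust_sum) in totals.items():
        if recognized == 0:
            scores[domain] = 0.0  # never recognized -> scores zero, fairly or not
        else:
            scores[domain] = (trust_sum / recognized) * (recognized / asked)
    return scores

print(trust_scores(responses))
```

Under any aggregation along these lines, a niche but reliable outlet is punished simply for being unrecognized, while a widely known but polarizing one can still score well, which is exactly the lack of nuance I worry about.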

Will this simple survey, lacking as it is in nuance and impartiality, be the solution to rating trustworthiness? I doubt it. In the part of his post discussing false and misleading news, Samidh Chakrabarti expressed awareness that anything Facebook does might not be enough: “Even with all these countermeasures, the battle will never end. Misinformation campaigns are not amateur operations. They are professionalized and constantly try to game the system. We will always have more work to do.”

Yes, there will always be work to do, but given the extent of the problem and its far-reaching negative implications, maybe it’s time to go beyond community monitoring and machine learning and bring in actual, professional human editors. This may be Facebook’s intent in hiring “10,000 workers, including academics, subject matter experts and content moderators” – a team to understand and classify the content shared on Facebook. If that’s true, then that, along with significant changes to the newsfeed, may mean fewer fake and divisive posts are shared, which in turn could make for a better experience. Normally I’d end by saying this will be interesting to follow; today I’ll add the hope of seeing real change and results before more democratic institutions are harmed. Here’s to a better 2018.
