We’re almost one year past the election, and today there are hearings on Capitol Hill about how Russian media outlets and advertisers on Facebook, Google, and Twitter may have influenced its outcome. Facebook especially is in the hot seat, as it has been deemed the most influential. Only this week it was revealed that 59% of Americans saw the Russian ads before the election. That’s an astounding number, especially considering that the ads targeted undecided voters and those most susceptible to being swayed.
This should be a period of reckoning for tech, and especially for Facebook, about the enormous influence it has on the world today and how it handles that power. The focus shouldn’t just be on Russian advertising but on why that advertising was so effective. This comes down to four points:
- Divisiveness as a product. Let’s start with Mark Zuckerberg’s opening statement in today’s earnings report: “Our community continues to grow and our business is doing well, but none of that matters if our services are used in ways that don’t bring people closer together.” That has certainly been Facebook’s stated mission for a while now, but it’s not exactly what the service is doing. It amplifies the more extreme societal viewpoints and encourages arguments. It’s a sad truth that controversy drives engagement. Maybe that metric needs to change.
- Dwindling trust in media. This is the bulk of what prompted me to write last year’s post-election post, and recent research by Omidyar Network and Edelman Intelligence shows a steep decline in trust in media. Several factors contribute to this decline. First is the rise of citizen journalism, where everyone can contribute. It makes it difficult for readers to separate the wheat from the chaff, the truth from the partial truth from the completely fake. Second is the corner that traditional media — institutions with newsrooms, research staff, and, above all, a reputation to uphold — have been pushed into over the last two decades. First the internet took away their advertising revenue, and then social media (especially Facebook) took away their traffic and constantly drove them to change strategies based on seemingly fickle algorithmic changes in the newsfeed. Only this week Facebook, yet again, changed its media policies, ostensibly to combat “fake news” but in reality capturing the big fish in its net as well.
- Censorship, or, using a friendlier term, deciding what people see. Facebook does this every single time a user peruses the Newsfeed: it decides what to show them. It does that by showing users what it thinks they will like, because what they like will keep them on the site longer. By showing users what they want to see, it effectively drowns out opposing political, religious, and social views. The fact that those views are presented by friends and family, people the user respects and knows personally, gives them even greater significance. Choosing what to show users in their newsfeed may be the most controversial decision Facebook makes, and the filter bubble around users only strengthens that control.
- Effective persuasion. This is the most dangerous because it’s the most subversive: Facebook already has countless data points about every individual user, and the ability to reach that user in the most effective manner available today. It also knows how to engage that user and keep them coming back to the service, be it on the site or via the app, time and time again. That’s its secret sauce, the reason its advertising profits dwarf everyone’s except Google’s. Listen to Zeynep Tufekci’s talk from last week to understand just how well Facebook targets and, for lack of a less evil-sounding word, manipulates users: “It’s because it works great as a persuasion architecture. But the structure of that architecture is the same whether you’re selling shoes or whether you’re selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that’s what’s got to change.”
Finally, for further reading, Ben Thompson wrote an excellent summary of today’s hearings but ended with an interesting conclusion. He said that “I still believe that, on balance, blaming tech companies for the last election is, more than anything, a convenient way to avoid larger questions about what drove the outcome. And, as I noted, the fact is that tech companies remain popular with the broader public.”
I disagree, though I don’t think “blame” is the right term here. Could the proliferation of fake news, the erosion of trust in fact-based media, the rise of highly divisive rhetoric, and the specific targeting of undecided individuals have happened without social media? It’s about how platforms such as Facebook are built and how they are being subverted. I’m more concerned about how Facebook harms our democracy than about how foreign entities are playing the game these platforms created. Facebook is troubling because while it’s built to connect people in a good way, it’s also built to bring out the worst in our collective behavior.
In conclusion, I bring you Tim Cook’s words, when asked about Russian influence: “I don’t believe that the big issue are ads from foreign government. I believe that’s like .1 percent of the issue. The bigger issue is that some of these tools are used to divide people, to manipulate people, to get fake news to people in broad numbers, and so, to influence their thinking, and this, to me, is the No. 1 through 10 issue.”
Update, November 2nd: The Verge published a small, partial set of the ads that were run on Facebook by Russian operatives. They’re ugly, and their purpose is clearly to drive violence, hatred, and fear. With these messages and Facebook’s optimized delivery platform, it’s sadly easy to see just how much divisive harm was done.