Yesterday a post made the rounds about how Facebook was apparently (Facebook confirmed it on Monday, only to deny it this morning) using location data to populate the People You May Know feature. The post gave the example of parents of suicidal teens who attended an anonymous meeting and were then suggested to each other, putting a name to a face in defiance of the very premise of the meeting. Reporter Violet Blue added to the list of problematic location-based suggestions: anonymous meetings such as AA, OKCupid and Tinder dates, and reporters being connected to protected sources. An account for escorts also mentioned it as a problem.
Assuming for the sake of argument that Facebook is suggesting friends based on a shared location, let’s look at the motivation behind it. Facebook PMs know that their goal is to have users spend more time on the site and that users open Facebook to stay in touch with their friends. If more friends means more visits and greater engagement, a specific goal might be to prompt users to connect with more people. Once users have already connected with family, local friends, coworkers, former coworkers, classmates, former classmates, and interest and community groups, it’s time for the PM to get creative. Oh, and make it frictionless, of course, so that users have to make as little effort as possible, and delight them with unexpected suggestions. So while I really understand what Facebook was going for in suggesting friends based on a shared location, they might not have thought through all the possible user scenarios.
It also points to another issue: when does something intended to be frictionless cross the line, by making too many assumptions or taking too many shortcuts, and become creepy for users instead of magical? I have a few guidelines:
- Transparency: users need to understand how the magic works. I’m not talking about understanding the algorithm or the physics. After all, users don’t need to understand how an internal combustion engine works to drive a car. Yet, to feel secure, users need to understand some part of what the algorithm does and, more importantly, what input it used to come up with that output.
- Context: there needs to be a reason for the magic to happen. That could be the result of an action the user has taken or of a contextual setting, but not out of the blue. Users need to be able to associate the magic with the flow it is part of.
- Control: users need to have some control over the input. Often, that level of granular control just isn’t offered and, to be fair, it can lead to an overabundance of options. Users also need to understand which controls affect a specific feature’s operation. Usually they have to dig deep for settings that they think might turn off a feature they don’t like. There needs to be a direct link from the feature to the setting that turns it off.
One example I have seen of this magic working well is Google Now. Google Now tries to show me a limited list of stories that it hopes might interest me. Today it suggested a link to a story about John Oliver’s latest rant. Why? Below the link and blurb it tells me that “You’ve shown an interest in John Oliver.” That statement is short and it’s true: I have searched for and watched Mr Oliver’s rants on YouTube several times.
Then, Google Now offers me a way to control the stories I am shown, right from the story itself. Am I not interested in John Oliver any more? Or just not in stories from Fast Company? Or maybe I don’t want its suggestions at all any more. All of these options are easily accessed in exactly the right place. Google Now gave me transparency (without explaining its entire machine learning algorithm), the right context (a story about a topic I had expressed interest in), and the ability to control what I see in the future. As a user I was delighted, and I will keep tuning Google Now to meet my needs. Google gets my clicks and a bit more knowledge about what I like. Win-win, right?
Could Facebook do something like this? Absolutely, though it depends on how open they want to be with that information. Regardless of whether they use a shared location as input for the People You May Know feature, being more transparent about it could have prevented the negative press. Letting users turn it off would allow them to keep using the app while still giving Facebook access to their location, two things Facebook undoubtedly wants.
So yes, create magical moments, delight the users, but also keep them informed, give them options, and let them opt out.