When the algorithm needs to be tempered with humanity

Yesterday Uber again caught the public's eye when surge pricing went into effect in Sydney, Australia, during a hostage situation, as people tried to leave the area. Keeping in mind that the Silicon Valley echo chamber amplifies responses to tech stories, I expected this to remain a minor issue. Yet it later made the mainstream news outlets as well, as part of the reporting on the ongoing situation.

Initially, Uber Sydney justified the surge pricing in a tweet.

In theory, Uber is right. Uber's algorithm, attuned to shifts in supply and demand, correctly responded to increased rider demand to get out of Sydney's financial district as soon as possible. The higher prices were intended to bring more drivers to the area, and maybe they did. Yet users reported never having seen the fare multiplier reach 4 before, and Uber eventually promised to refund all rides out of Sydney during the hostage situation. The response was mostly a cry for Uber to use some "human decency" when raising prices, and some went as far as to say that drivers might not be as motivated by money as Uber thinks. For some drivers, helping others even at a normal rate might have been a sufficient motivator to pick up rides in Sydney's financial district.

A surge multiplier of 4! Uber surge pricing during the Sydney hostage situation. Source: Mashable


It's understandable that an algorithm can't grasp "human decency." Artificial intelligence is not quite there yet. However, humans can still intervene when the algorithm reaches an unexpected result. Why would a surge multiplier of four apply to downtown Sydney on a normal Monday in December, when historically Monday rider demand in Sydney hasn't exceeded driver supply? We talk a lot about analyzing ever more user data to determine expected product behavior. In Uber's case, the system might have recognized an abnormal situation and alerted a human.
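The kind of safeguard described above could be as simple as a sanity check against historical demand before a surge price is applied. Here is a minimal sketch; the area keys, historical peak values, and the `alert_human` hook are all hypothetical, not anything from Uber's actual system:

```python
# Hypothetical safeguard: hold abnormally high surge multipliers for
# human review instead of applying them automatically.

# Assumed historical maximum multiplier per area/time bucket (illustrative).
HISTORICAL_PEAK = {"sydney_cbd_monday": 1.8}

# Flag any proposed multiplier more than 50% above the historical peak.
REVIEW_THRESHOLD = 1.5

def alert_human(area_key: str, multiplier: float) -> None:
    # Stand-in for paging an on-call operator or opening a review ticket.
    print(f"REVIEW: {area_key} requested surge x{multiplier}")

def apply_surge(area_key: str, proposed_multiplier: float) -> float:
    peak = HISTORICAL_PEAK.get(area_key, 1.0)
    if proposed_multiplier > peak * REVIEW_THRESHOLD:
        # Abnormal demand spike: cap the fare at the historical peak
        # and let a human decide whether the full surge is appropriate.
        alert_human(area_key, proposed_multiplier)
        return peak
    return proposed_multiplier
```

With these assumed numbers, the 4x multiplier from the Sydney incident would have been capped at the historical peak and escalated to a person, while ordinary fluctuations would pass through untouched.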

As this incident unfolded last night, I tried to remember other incidents where "the algorithm" had been blamed for a service's misbehavior. Google's AdSense is usually a finely tuned machine, able to determine which ads are most likely to be clicked (thus generating income for Google) based on the user's search terms, location, and the past performance of the ad and its specific copy. The AdSense algorithm constantly optimizes ad display for maximum benefit to both Google and the advertiser. Almost two years ago, a professor at Harvard realized that on AdSense "a black-identifying name was 25 percent more likely to get an ad suggestive of an arrest record." She explained this as an unfortunate but automatic extension of the AdSense algorithm, which reflects the racism exhibited over time by users.

A possible scenario imagined by Salon makes a lot of sense. “An employer is Googling prospective job applicants. Some of those applicants have black-identified names. Due to his or her personal racism, the employer happens to be more likely to click on the ads that suggest “Arrested?” next to the black-identified names. And over time, Google’s AdSense algorithm learns that “ads suggestive of an arrest record” work better when associated with black-identified names.”

As in the Uber case, the algorithm is doing what it is supposed to do. But here too, could a human touch help prevent misuse? The problem may be harder to spot with AdSense, yet it is possible to address it by using historical data to weed out potential racial bias.
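Using historical data to surface this kind of bias could start with a simple audit: measure how often a sensitive ad category is served to each group of queries and compare the rates. The sketch below is purely illustrative; the log format, field names, and group labels are assumptions for the example, not anything from Google's systems:

```python
from collections import defaultdict

# Hypothetical audit: given historical ad-serving logs, compute the rate
# at which "arrest record" ads were shown for each name group.
def arrest_ad_rate(logs):
    served = defaultdict(int)   # total ads served per group
    arrest = defaultdict(int)   # "arrest record" ads served per group
    for record in logs:
        group = record["name_group"]
        served[group] += 1
        if record["ad_type"] == "arrest_record":
            arrest[group] += 1
    return {group: arrest[group] / served[group] for group in served}

# Toy log data standing in for real serving history.
logs = [
    {"name_group": "black_identified", "ad_type": "arrest_record"},
    {"name_group": "black_identified", "ad_type": "neutral"},
    {"name_group": "white_identified", "ad_type": "neutral"},
    {"name_group": "white_identified", "ad_type": "neutral"},
]
rates = arrest_ad_rate(logs)
```

A large gap between the groups' rates would be the trigger: flag it for a human to review, rather than letting the optimizer keep reinforcing the pattern.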

The bottom line: when your algorithm gets a product in trouble too often, it's time to stop blaming it and either change it or add a dash of humanity.


3 thoughts on “When the algorithm needs to be tempered with humanity”

  1. What a nice post. Thanks.
    I would go further.
    The ‘amorality’ of surge pricing and the advertising calculations accentuates the little-appreciated fact that there is actually no algorithm for management (let alone for good judgement). We have come to put too much stock in algorithms, and disaster awaits.
    An algorithm is a finite set of instructions, operating on a set of inputs (plus history) to produce a set of outputs. Computer programs are the exemplars.
    It has been well known for decades — with mathematical certainty — that all algorithms have fundamental limitations. Remarkably simple problems are known to have no algorithmic solution. For instance, no single algorithm can ever tell us whether any given computer program will halt or not. And no single algorithm can efficiently compute the solution to all forms of the travelling salesperson problem.
    Now let’s turn to human affairs — like the fair pricing of taxi rides — and ask whether we’re doing it right by entrusting computers to come up with the answers.
    The reality of human judgement is that we never know all the inputs in advance. In hard real life problems, there is always novelty — something we didn’t think of when we first wrote the rules. No algorithm can EVER behave well when the inputs are unbounded.
    It certainly sucks that Uber rips us off in times of emergency. But financial ruin can result from algorithmic trading when it runs amok, and much worse is to come if we continue to naively install algorithms at the heart of e-healthcare, artificial intelligence and the Internet of Things.

    • Steve, thanks for that comment. I agree “No algorithm can EVER behave well when the inputs are unbounded.” This Uber example is in some ways trivial. It’s just a fee that is easily avoidable. But what happens when algorithms influence someone’s ability to get a job? To get a loan? To get the right healthcare, as you mentioned?
      Perhaps algorithms need to have more triggers for human interaction? More triggers to say “something isn’t right here, why don’t we have someone take a look?” It will be interesting to see how products integrate the human/unexpected factor.

  2. Pingback: Price gouging and relying on an algorithm to prevent fraud | What it all boils down to
