There are nearly a million active Uber drivers in the United States and Canada, and none of them have human supervisors. It’s better than having a real boss, one driver in the Boston area told me, “except when something goes wrong.”
When something does go wrong, Uber drivers can’t tell the boss or a co-worker. They can call or write to “community support,” but the results can be enraging. Cecily McCall, an African-American driver from Pompano Beach, Fla., told me that a passenger once called her “dumb” and “stupid,” using a racial epithet, so she ended the trip early. She wrote to a support rep to explain why and got what seemed like a robotic response: “We’re sorry to hear about this. We appreciate you taking the time to contact us and share details.”
The rep offered not to match her with that same passenger again. Disgusted, Ms. McCall wrote back, “So that means the next person that picks him up he will do the same while the driver gets deactivated” — fired by the algorithm — because of a low rating or complaint from an angry passenger. “Welcome to America.”
Over the past four years, I have traveled more than 5,000 miles in more than 25 cities, interviewing 125 drivers for Uber and other ride-hailing apps, as well as taxi drivers, and observing hundreds more. And I have spent countless hours in Facebook groups and other online forums for drivers, which collectively have 300,000 members, to better understand their experiences. I have learned that drivers at ride-hailing companies may have the freedom and flexibility of gig economy work, but they are still at the mercy of a boss — an algorithmic boss.
Data and algorithms are presented as objective, neutral, even benevolent: Algorithms gave us super-convenient food delivery services and personalized movie recommendations. But Uber and other ride-hailing apps have taken the way Silicon Valley uses algorithms and applied it to work, and that’s not always a good thing.
The algorithmic manager seems to watch everything you do. Ride-hailing platforms track a variety of personalized statistics, including ride acceptance rates, cancellation rates, hours spent logged in to the app and trips completed. And they display selected statistics to individual drivers as motivating tools, like “You’re in the top 10 per cent of partners!”
Uber uses the accelerometer in drivers’ phones, along with GPS and gyroscope data, to give them safe-driving reports that track their performance in granular detail. One driver posted to a forum that a grade of 210 out of 247 “smooth accelerations” earned a “Great work!” from the boss.
Surge pricing, which multiplies prices for passengers and earnings for drivers during periods of high demand, is another form of algorithmic management that encourages drivers to relocate to certain areas at certain times. The drivers get in-app notifications, heat maps and emails with real-time and predictive information about spikes in demand. A driver who wants to go home and is trying to log out might be prompted with an automatic message: “Your next rider is going to be awesome! Stay online to meet him.”
It’s easy enough to dismiss those gentle nudges, but in-app notifications like “Fares are at 3.0X right now!” or “There are lots of events in New Orleans this weekend where we expect Uber demand to be high!” raise expectations and are hard for drivers to ignore. By wording their expectations as helpful hints rather than orders, ride-hailing companies can avoid the appearance of a direct supervisory relationship with their drivers. Some Uber drivers say they feel misled when they travel to an area of surging demand only to find that the higher rates have disappeared by the time they arrive. The consensus in driver forums is, “Don’t chase the surge.”
Uber takes fees and commissions on every ride, and complaints about low pay and rate cuts are common. In 2016, Uber began charging passengers on some rides more than drivers were paid for those rides, without telling drivers, under a policy called “upfront pricing.”
By the logic of Silicon Valley, the company was simply trying out a new pricing policy, but many drivers were angry that their livelihoods were part of the experiment. One group of drivers filed a class-action lawsuit in San Francisco, arguing that Uber violated the terms of its contract by changing the policy without notifying drivers. Uber has said it did not violate the contract because drivers continue to be paid per mile and per minute. A settlement is pending, and drivers can now view the prices charged to passengers.
While critics use the language of the workplace to describe the treatment of drivers, the language of technology can deflect such concerns. When payments for trips are missing, labor advocates might call it wage theft, but Uber says it’s a glitch. When Uber charges passengers what it predicts they are willing to pay based on their route rather than standard rates, economists may call it price discrimination, but Uber explains it as an innovation in artificial intelligence.
Other tools, like the rating system, serve as automatic enforcers of the nudges made by algorithmic managers. In certain services on Uber’s platform, if drivers fall below 4.6 stars on a 5-star rating system, they may be “deactivated” — never “fired.” So some drivers tolerate bad passenger behavior rather than risk losing their livelihoods because of retaliatory reviews.
To be sure, drivers are not simply passive victims of algorithms. Uber drivers figured out the upfront pricing scheme by sharing pictures of passengers’ receipts alongside their own pay stubs in online driver forums.