20 January 2020

There once were fears that computers would take jobs and replace workers. It didn't happen. Instead, computers became their bosses. Why? Because algorithms are more rational decision-makers than humans. They have no emotion, only rules and procedures. They don't organise moments of formal appraisal, but run in the background of workers' computers, monitoring them continuously. They don't explain what went wrong or offer warm encouragement to do better; they only deliver automatic sanctions when there is a deviation from the norm. What to do?
An article we wrote in Administrative Science Quarterly looked at eBay business sellers, who are a good example because they had algorithmic bosses long before Uber drivers and Amazon Mechanical Turk workers did. We saw that despite working comfortably from home and having discretion over the way they organise their days, they found that algorithmic bosses can be quite difficult to work for.
The main problem with algorithmic bosses is their power. They make decisions based on evaluations from buyers who, generally, remain anonymous. The algorithm compiles evaluations automatically and calculates an average score, which ranks sellers and triggers sanctions. By giving more power to one side (the buyers), the algorithm disempowers the other side (the sellers), who are at the mercy of unfair buyer evaluations and feel deprived of constructive communication to resolve issues.
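To make the mechanism concrete, here is a minimal Python sketch of how such a feedback-driven scoring system might work. The names and the 4.5 threshold are illustrative assumptions, not eBay's actual rules; the point is that an average of anonymous scores is the only input, and a sanction follows automatically when it dips below a cut-off.

```python
# Minimal sketch of a feedback-driven sanction mechanism.
# All names and thresholds are illustrative assumptions, not eBay's actual rules.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Seller:
    name: str
    ratings: list[float] = field(default_factory=list)  # buyer scores, e.g. 1-5

def record_rating(seller: Seller, score: float) -> None:
    """Buyers' evaluations are compiled automatically, with no review step."""
    seller.ratings.append(score)

def evaluate(seller: Seller, threshold: float = 4.5) -> str:
    """The average score alone decides the outcome; no context, no appeal."""
    if not seller.ratings:
        return "no data"
    avg = mean(seller.ratings)
    return "ok" if avg >= threshold else "sanction"  # e.g. lower search ranking

# Usage: a single harsh review can tip the average below the threshold.
seller = Seller("example_seller")
for score in (5, 5, 5, 1):
    record_rating(seller, score)
print(evaluate(seller))  # -> "sanction" (average 4.0 < 4.5)
```

In this toy version, as in the system the article describes, the sanction is triggered by the number alone: there is no step at which the seller can explain what happened.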
Can sellers appeal to the platform? It's not that simple, because the algorithm is the platform. When sellers try to talk to real people, eBay staff tend to hide behind the algorithmic rules, saying that there is nothing they can do. And most replies to queries are automated (through the algorithm!), which reinforces the feeling of working for a distant machine.
From the worker's point of view, the algorithmic boss starts to look like a prison. It monitors them all the time, and it is always right, even when the data it uses is wrong. Far from giving workers more autonomy, the algorithmic boss creates a new form of bureaucracy: it singles out workers by their profile and average score, it turns customers into a myriad of evaluators, and it applies rigid instructions to workers.
In a new world of work where subjective judgement is compiled and quantified by technologies, human activities become subject to metrification, classification, comparison, and market competition. Our insights stress the urgent need to revisit the way we conceptualise power at work, at the intersection of markets and bureaucracies. Online platforms redefine the link between the social and the material, generating new challenges for workers. Our article shows that it is possible for workers to become more resilient and better able to cope with these challenges.
How? The eBay sellers provide some answers, and their approach will work for others too. The first lesson is that with algorithmic bosses, it is more difficult to launch collective protests than with human bosses, because of the fear of retaliation (the algorithm monitors everything!) and the isolation (the algorithm keeps everybody busy behind their screen). Therefore, online workers tend to rely on themselves.
The second lesson is that you can 'work around the algorithm' by manipulating the data. When a problem arises during a transaction, instead of following standard procedures, eBay sellers call their buyers, find a compromise, reassure them and, if a bad review has already been posted, negotiate to have it removed.
Also, sellers are on the alert for buyers who look suspicious. Maybe their behaviour matches that of a known manipulative buyer; maybe the tone or content of their communication gives off warning signs. In each case, the response is the same: refuse to sell to that buyer, thereby avoiding the risk that the buyer will manipulate their data.
Sellers with some experience have created their own rules, forcing buyers to abide by them and blocking them if necessary. Of course, this is not what their algorithmic boss expects them to do. But as they use the platform day in, day out, they develop a deep knowledge of the algorithm. They take advantage of some of its features and exploit the small gaps. In other words, they develop practices around the algorithm to cope with working for an algorithm.

Corentin Curchod is a Senior Lecturer in Strategic Management and Organisation.
Read the full article: .