12 civil society organisations tell delivery platforms it’s time to deliver answers on how they use algorithms to manage their workers
EDRi member Privacy International and more digital rights groups, together with trade unions, call out food delivery platforms for their algorithmic management of workers. In an open letter co-signed by 12 organisations, they make three clear recommendations for the platforms to improve.
Algorithmic management affects the lives of people working for gig-economy platforms
Digital rights groups have come together with trade unions to write an open letter calling on food delivery platforms to maintain a public register of the algorithms they use to manage workers, to accompany algorithmic decisions with personalised explanations, and to allow workers, their representatives, and public interest groups to test how those algorithms work.
Algorithmic management of workers has become the norm for gig-economy platforms. Decisions made by algorithms can determine how much people get paid, what work they receive, and even if their account (and employment) is suspended. But workers are too often left in the dark about how those algorithms work and how those decisions get made, leaving them playing a game they don’t know the rules to.
It is time for food delivery platforms to deliver answers
That’s why 12 organisations are telling these platforms: it’s time to deliver answers. They are calling on Deliveroo, Just Eat and Uber Eats to implement three clear recommendations:
- Give drivers and riders an actual understanding of the terms of their employment.
- Give drivers and riders a way of understanding and challenging specific decisions that affect them.
- Give drivers and riders a mechanism to uncover issues built into the algorithms they’re subjected to.
The organisations argue that giving people a basic understanding of how they’re being managed – what to look out for, even how to do well – is not too much to ask. And yet, over recent years, there have been serious abuses of power by these platforms, with workers left in the dark about the jobs they rely on to feed their families.
One Uber driver – Pa, who won a settlement from Uber in 2024 – faced increasingly persistent requests for pictures, likely used to carry out ID checks. One night, after completing a shift, Pa opened the app to check his earnings and found that his account had been deactivated, cutting off his access to his job suddenly and without warning. Uber claimed he had been allowing someone else to use his account – but he hadn’t. Pa believed that Uber was using facial recognition technology that struggles to recognise the faces of darker-skinned people. All he wanted was for a human to review the pictures he had taken – clearly of the same person. But Uber did not offer him this option.
In this case, Pa took Uber to court and won a settlement. But for most drivers in his position – at Uber or on other platforms – this kind of redress is difficult, if not impossible, to achieve.
Platforms can’t keep racing to the bottom on workers’ rights
If Pa had known which algorithms Uber uses to manage its workers, he would likely have understood why he was being asked to send so many pictures, giving him the opportunity to raise the issue with Uber.
If Uber had given him an individualised explanation of its decision, along with contact details and information on how to request a review, he might never have found himself suddenly without an income.
Lastly, if Uber allowed workers, their representatives, and public interest groups to review its algorithms, then maybe Pa would never have been subjected to a facial recognition system he believes to be racially discriminatory. Discriminatory outputs could be caught earlier, and drivers could trust that the systems managing them aren’t discriminating against them.
12 organisations demand accountability and oversight of the technology governing workers’ futures
With their current level of transparency, the algorithms on these platforms undermine workers’ rights to dignity and autonomy. Implementing these recommendations would be a crucial step to ensuring accountability and appropriate oversight of the technology governing workers’ futures.
That’s why Privacy International and 11 other organisations are telling these platforms to improve how they use algorithms and how automated decision making is used in the workplace. You can join this call by sharing this campaign and telling these platforms it’s #TimeToDeliverAnswers.
Contribution by: EDRi member, Privacy International