Food delivery apps urged to reveal how algorithms affect UK couriers’ work

Mon, 20 Jan 2025, 14:21

Takeaway delivery apps are facing pressure to crack open the black-box algorithms that govern the work of more than 100,000 couriers in the UK and reveal more about how decisions are made on pay and access to jobs.

A coalition including the TUC, Amnesty International, couriers’ unions and the campaign group Privacy International claim the opaque use of algorithms is “automating exploitation”. They say withholding vital information from couriers about their work is “creating precarity, stress, and misery”.

The call for greater openness targets Uber Eats, Deliveroo and Just Eat - the UK and Ireland’s three dominant platforms in takeaway delivery, with a combined annual turnover of almost £9bn. Just Eat’s 88,000 couriers deliver about 4.7m meals and grocery orders a week. It echoes growing pressure on the UK government to increase transparency about AI systems in the public sector, such as the welfare system.

In a letter seen by the Guardian, the workers’ groups accuse the companies of “leveraging black-box algorithms to make decisions about deactivation, work allocation and pay without sufficient explanation, stripping workers of the ability to understand and challenge those decisions”.

The coalition includes the App Drivers and Couriers Union and the Worker Info Exchange, which both represent gig economy workers in the UK. The group said: “We believe the foundation of respect is transparency. Yet current systems withhold vital information from workers.”

The companies insist they do provide information to their couriers, but Privacy International’s legal officer, Jonah Mendelsohn, said workers should not be expected to “play a game that they don’t know the rules for”.

“Too often workers are left in the dark about the reasons why they have been fired, underpaid, or that they’ve been discriminated against as more and more decisions impacting them are made by algorithms,” he said. “At a time when AI governance is under global scrutiny, from new EU legislation to the UK’s commitment for public authorities to publish details about their algorithms, it’s time that these gig economy platforms catch up and deliver answers.”

The US state of Colorado recently passed laws that require ride-sharing companies to spell out the exact circumstances that would lead to a driver being deactivated or suspended, and to communicate more clearly to drivers and customers about how fares are calculated and what the expected costs will be. Uber has mounted a legal challenge to the legislation, claiming it violates first amendment rights to free speech and could cause accidents because drivers would have to deal with more information on their phone screens.

The call for greater transparency comes as AI systems play an ever-increasing role in working lives. New polling, shared exclusively with the Guardian, shows that 62% of people are worried about risks associated with AI tools.

The biggest concerns relate to the threat of cyber-attacks, job losses, misinformation and accidents caused by unreliable AI systems replacing humans, according to the research, which was conducted by Public First for the Centre for Long-Term Resilience.

Keir Starmer shifted the UK government’s position last week from a focus on the existential risks posed by AI towards an embrace of its potential for economic growth and improvement of public services. As more and more employers integrate AI into their processes, workers are likely to want to know more about how they are affected by it.

Deliveroo calls its algorithm Frank, describing it as “made up of machine-learning technology, which predicts the timings of every order”, and says it is designed to “help riders make more deliveries and therefore make more money”.

A spokesperson for the firm said its website carried information for riders about how the algorithm offers orders and calculates fees, that riders receive information when they sign up, and that they can raise questions with support staff, who may escalate them to data protection officers.

Terminations of riders’ accounts as a result of suspicious activity are reviewed by a human and are not automated, they said.

“We take our legal obligations regarding transparency and data protection very seriously, and see this as a core part of treating riders with dignity and respect,” the spokesperson said. “We understand that effective communication about this, and the systems that support Deliveroo’s business operations, are an important element of our relationship with riders.”

Just Eat said its couriers earn more than the London living wage “for the time they are on an order”. A spokesperson said: “We maintain an open dialogue with our courier partners through regular communications, including face-to-face events across the country called StreetMeets, where we invite and share feedback on issues that are important to them.”

Uber was approached for comment.
