Fighting for algorithmic justice: lessons learned in working closely with affected people

Bits of Freedom shares lessons learned while working on “Amsterdam Top400”, an invasive municipal project that involved the use of predictive policing and led to unwanted interference in the private lives of young people. Together with a coalition of professionals from different backgrounds and affected individuals, they explored the possibility of holding the municipality of Amsterdam accountable for violations of children’s rights, data protection law, and fundamental freedoms.

By Bits of Freedom (BoF) (guest author) · January 21, 2026

Amsterdam Top400

In 2015, the municipality of Amsterdam created the Top400, a crime prevention program targeting young people, to complement the Top600, a list of 600 young repeat offenders of high-impact crimes. Unlike the Top600, inclusion on the Top400 did not require a criminal conviction.

Children could be added to the list based on a wide range of criteria, including being suspected of a criminal offense, having changed schools frequently, being a victim or witness of domestic violence or other crimes, or being a sibling of someone on the aforementioned Top600 list.

For two years, an algorithm was used to select young people. Once listed, children were intensively monitored by the municipality, the police, their schools, so-called ‘street-coaches’ and dozens of other affiliated partners. The program took a mixed approach of care and repression: it functioned as a system of surveillance and control, resulting in the stigmatisation of youth and increased distrust in governmental institutions. Despite its impact, it was never properly evaluated from the perspective of the affected community.

Bringing different kinds of expertise together

To respond to the program, a coalition was formed to challenge its impact, bringing together organisations and people with varied expertise alongside impacted communities. Fair Trials provided insights into the use of similar lists in the UK and their experience in fighting them. Controle Alt Delete brought in the perspective of Dutch youth being harassed by the police. The lawyers of the Public Interest Litigation Project (PILP) and scholar Dr. Fieke Jansen started a Freedom of Information Act (FOIA) investigation. Under this framework, the coalition gained access to policy documents, offering insights into how the program was supposed to function according to the municipality, as well as into its impacts and harms.

The results of the FOIA investigation were published in a report, launched together with the documentary “Mothers”, directed by journalist and documentary maker Nirit Peled, maximising attention through different channels to create impact. Both the report and the documentary showed the injustice and wrongdoing by the municipality of Amsterdam and the impact on the lives of children and their families. The mayor, however, kept defending the project in the media and seemed unwilling to stop the unlawful practices.

Facing this resistance, we decided to explore litigation opportunities, relying on a legal analysis prepared by a lawyer in our coalition. The outcome: we could sue the municipality for several breaches of the General Data Protection Regulation (GDPR), as well as of the Convention on the Rights of the Child and Article 8 of the European Convention on Human Rights.

Centering affected people in decision-making

The diversity of expertise, roles and perspectives in this coalition was very valuable and complementary. An advisory group consisting of people from the affected community was a necessary precondition for the work to be carried out. PILP was able to talk to several people who had been on the Top400 list and others from the affected community. While many recognised the need for action, they were reluctant to engage directly or be visibly connected to the effort. Having reached adulthood and moved beyond the program, they understandably wished to leave that chapter behind. Nevertheless, a number of individuals eventually agreed to participate in our advisory group.

Through a series of meetings, we explained the legal situation, the possible courses of action and the consequences each would entail. In turn, participants shared their experiences and indicated what the most helpful next steps would be. Crucially, these meetings were held between every step we took – whether engaging with the municipality, drafting letters, launching campaigns, or working with the media – ensuring that affected people remained actively involved in shaping the strategy.

Early on, it became clear that many of them were still living with trauma, pain and fear. After the mayor’s dismissive response to the documentary and the report, there was support for strong legal action, but also deep concern about personal visibility and the risk of further harm. The best way forward became clear: we decided to prepare for a court case against the municipality of Amsterdam, with Bits of Freedom as a party represented by PILP. We did this by sending a letter with our demands to the municipality.

Preparing for court

We formally urged the municipality to end the covert observation by street-coaches, to stop the mandatory and broad sharing of information between the program director and the schools, to financially compensate people who were selected by the algorithm, and to issue a statement acknowledging the unethical nature of this practice.

We invited the municipality for a conversation to see if we could reach agreement in a settlement, which we regarded as an obligatory preliminary step. If the demands were not met, we would go to court. In its response, the municipality partially met two of our demands: it agreed to stop covert observation and to place limitations on data sharing with schools. However, the municipality claimed that its current practices differed from the policy documents uncovered through the FOIA investigation and proposed drafting new policy documents to clarify its approach.

At this stage, a settlement looked achievable. Discussion focused on the content of the settlement rather than on whether an agreement could be reached at all. However, when the officials went to the mayor for approval, which seemed only a formality, she refused to agree and stood by the Top400 approach. In her view, no unlawful conduct had taken place, making a settlement impossible.

The settlement procedure had taken up a lot of time, yet it brought some results: the municipality had written new policy documents setting out better practices regarding the street-coaches and the schools. However, it failed to address the harm done to those selected by the algorithm.

Necessary change in strategy

Our initial reaction was to proceed to court. Yet the municipality’s amendments to its policy and the fact that the algorithm had only been used temporarily significantly weakened the legal case. As a result, the lawsuit we had previously considered very promising became a case with little chance of success: there was a real risk that a judge would find insufficient grounds to rule on the matter. A loss in court could have had broader negative consequences, legitimising similar practices in Amsterdam and beyond.

In the interest of the affected community, we decided to take a different approach. We shifted our focus towards public accountability: we adapted our communication strategy and raised awareness about the mayor’s position and the substantive changes in work processes and policy documents. While some of these changes were positive, without a broader shift in consciousness they amounted to little more than symbolic concessions.

At the same time, we expanded our work, placing the protection of children’s rights, particularly in the context of technologies such as AI, high on our agenda. We view the municipality of Amsterdam’s practice of profiling citizens, especially children, as part of a wider trend in which governments deploy data-driven tools to cast suspicion on individuals without formal grounds. We believe it is important to continue opposing this and to highlight the consequences for children, families, our society, and the rule of law. We are prepared for the next round in the fight for algorithmic justice.

Our takeaways

Along the way, we identified a number of insights that strengthened our approach and could be of use to others exploring similar avenues:

  • Working in a coalition allowed us to approach the problem from different angles, combining expertise and effectively dividing the workload;
  • Engaging an affected community requires time, transparency and sustained trust-building, particularly when communities have been previously harmed by institutions, as in our case;
  • Consulting affected communities at every step of the process is crucial: in this way, you jointly decide what the next step(s) should be;
  • Being attentive and mindful of the community’s concerns is essential. People are generally fearful of engaging in lawsuits and speaking to the media; many would prefer to leave painful experiences behind. Be prepared to carry visibility and responsibility when individuals cannot safely do so themselves.

Ultimately, these lessons reinforce the urgency of standing firm against harmful algorithmic practices and defending the rights of those most affected.

Contribution by: EDRi member, Bits of Freedom