Why 'Ditch the algorithm' is the future of political protest | Louise Amoore

19 August 2020

An improbable nightmare that stalked students in the past was tearing open an envelope to find someone else’s exam results inside. On 13 August, for tens of thousands of A-level students in England, this became a reality. The predictive algorithm developed by the qualifications regulator Ofqual disregarded the hard work of many young people in a process that ascribed weight to the past performance of schools and colleges. As one teenager described the experience of being downgraded: “I logged on at 8am and just started sobbing.” Three days later, the A-level debacle sparked protests in English cities, with young people bearing placards reading “The algorithm stole my future” and “Fuck the algorithm”.

The protests marked an unusual convergence of politics and predictive models. That the government subsequently U-turned on its decision, allowing students to revert to centre-assessed grades (CAGs), could be seen as a turning point when the effects of algorithmic injustice were brought into clear focus for all to see.

The injustices of predictive models have been with us for some time. The effects of modelling people’s future potential – so clearly recognised and challenged by these students – are also present in algorithms that predict which children might be at risk of abuse, which visa applications should be denied, or who has the greatest probability of committing a crime. Our life chances – whether we get a visa, whether our welfare claims are flagged as fraudulent, or whether we’re designated at risk of reoffending – are becoming tightly bound up with algorithmic outputs. Could the A-level scandal be a turning point for how we think of algorithms – and if so, what durable change might it spark?

Resistance to algorithms has often focused on issues such as data protection and privacy. The young people protesting against Ofqual’s algorithm were challenging something different. They weren’t focused on how their data might be used in the future, but on how data had been actively used to change their futures. The potential pathways open to young people were narrowed, limiting their life chances according to an oblique prediction.

The Ofqual algorithm was the technical embodiment of a deeply political idea: that a person is only as good as their circumstances dictate. The model took no account of how hard a school had worked, its appeal system sought to deny individual redress, and of the centres’ own inputs only the “ranking” of students remained.

In the future, challenging algorithmic injustices will mean attending to how people’s choices in education, health, criminal justice, immigration and other fields are all diminished by a calculation that pays no attention to our individual personhood. The A-level scandal made algorithms an object of direct resistance and exposed what many already know to be the case: that this type of decision-making involves far more than a series of computational steps.

In their designs and assumptions, algorithms shape the world in which they’re used. Deciding whether to include or exclude a data input, or how to weight one feature against another, is not merely a technical question – it is also a political proposition about what a society can and should be like. In this case, Ofqual’s model decided that good teaching, hard work and inspiration could not make a difference to a young person’s life and their grades. The politics of the algorithm were visible for all to see.
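To make that mechanism concrete, here is a minimal sketch of the kind of standardisation described above – a deliberately simplified stand-in, not Ofqual’s actual model. The function name, the five-entry threshold and the data are illustrative assumptions; the real model was considerably more complicated.

    # Illustrative sketch only -- a simplified stand-in for the kind of
    # standardisation described above, not Ofqual's actual model.
    def award_grades(historical_distribution, ranked_students,
                     centre_grades, small_entry_threshold=5):
        # Small subject entries (an illustrative threshold): the
        # teacher-assessed grades stand, unadjusted.
        if len(ranked_students) < small_entry_threshold:
            return dict(centre_grades)

        # Larger cohorts: convert the centre's historical grade shares
        # into cumulative boundaries, e.g. {"A": 0.2, "B": 0.4, "C": 0.4}
        # becomes [("A", 0.2), ("B", 0.6), ("C", 1.0)].
        boundaries, running = [], 0.0
        for grade, share in historical_distribution.items():
            running += share
            boundaries.append((grade, running))

        # Map each student onto that distribution by rank alone -- the
        # only per-student input that survives from the centre.
        results, n = {}, len(ranked_students)
        for rank, student in enumerate(ranked_students):
            percentile = (rank + 0.5) / n
            for grade, upper in boundaries:
                if percentile <= upper:
                    results[student] = grade
                    break
        return results

    # A cohort of six, every one centre-assessed at grade A, at a school
    # that historically produced 20% As, 40% Bs and 40% Cs:
    cohort = ["Amira", "Ben", "Chloe", "Dev", "Ella", "Farah"]
    print(award_grades({"A": 0.2, "B": 0.4, "C": 0.4},
                       cohort, {s: "A" for s in cohort}))
    # {'Amira': 'A', 'Ben': 'B', 'Chloe': 'B', 'Dev': 'B',
    #  'Ella': 'C', 'Farah': 'C'}

On these assumptions, five of the six students lose the grade their teachers judged they had earned, purely because of where their school’s past cohorts landed – essentially the dynamic the protesting students objected to.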
Many decisions – from what constitutes a “small” subject entry to whether a cohort’s prior attainment should nudge down the distribution curve – had profound and arbitrary effects on real lives. Student A, who attended a small independent sixth form and studied ancient history, Latin and philosophy – each with entries of fewer than five – would have received her results unadjusted by the algorithm. Meanwhile, student B, at a large academy sixth form studying maths, chemistry and biology, would have had her results downgraded by the standardisation model and missed her university offer grades.

Algorithmic outputs are not the same as accountable decisions. When teachers across the country gathered the evidence for each of their students, agonising over rankings and discussing marginal differences in mock grades or coursework with their colleagues, they faced profound and unavoidable uncertainty – particularly once educational inequalities are factored in. The Royal Statistical Society has highlighted the difficulties and uncertainties involved in something as complex as anticipating grades, though its offer to lend its expertise to Ofqual was rebuffed.

Grappling openly and transparently with difficult questions, such as how to achieve fairness, is precisely what characterises ethical decision-making in a society. Instead, Ofqual responded with non-disclosure agreements, offering no public insight into what it was doing as it tested competing models. Its approach was proprietary, secretive and opaque. Opportunities for meaningful public accountability were missed.

Algorithms offer governments the allure of definitive solutions and the promise of reducing intractable decisions to simplified outputs. This logic runs counter to democratic politics, which express the contingency of the world and the deliberative nature of collective decision-making. Algorithmic solutions translate that contingency into parameters that can be tweaked and weights that can be adjusted, so that even major errors and inaccuracies become mere matters of fine-tuning. This algorithmic worldview is one of defending the “robustness”, “validity” and “optimisation” of opaque systems and their outputs, closing off the spaces for public challenge that are vital to democracy.

This week, a generation of young people exposed the politics of the algorithm. That may itself be an important turning point.

• Louise Amoore is professor of political geography at Durham University, and author of Cloud Ethics: Algorithms and the Attributes of Ourselves and Others
