Designing for Fairness: Unpacking and Reimagining the COMPAS Risk Assessment Algorithm
Overview
In this case study, I examined the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) risk assessment algorithm, a tool used to predict the risk of recidivism (the likelihood of reoffending). Despite its intentions, COMPAS has come under scrutiny for perpetuating racial biases that disproportionately affect Black defendants. In response to these challenges, I outlined the implications of the algorithm's biases and proposed a set of recommendations aimed at redesigning the tool to enhance fairness and equity.
Role: Researcher
Duration: 7 weeks
Objectives
The primary objective of this study was to identify and address the discriminatory biases within the COMPAS algorithm and propose actionable design improvements to ensure more equitable outcomes in the criminal justice system. This was broken down into the following aims:
Identify the extent of racial discrimination and bias in COMPAS, including affected groups and causes.
Analyse COMPAS's impact on ethnically diverse people, particularly Black defendants, and on the criminal justice system.
Examine socio-technical theories to mitigate discrimination in COMPAS.
Propose design changes to enhance fairness and objectivity in the criminal justice system.
- Objective: Uncover and address racial biases in the COMPAS algorithm, focusing on its disproportionate impact on Black defendants, and propose design improvements that promote fairness and equity within the criminal justice system.
- Approach: Using peer-reviewed articles and credible sources, I researched and analysed algorithmic discrimination, the COMPAS tool, and potential design improvements to form the core elements of the case study.
- Findings: COMPAS disproportionately labels Black defendants as high-risk and fails to account for complex social and individual factors, contributing to systemic inequities.
- Recommendations: Adopt inclusive design practices, build simplified but accurate models, improve data collection, and redefine fairness to address societal biases and discrimination in decision-making processes.
- Key insight: Addressing COMPAS's racial bias requires more than algorithmic fixes; it demands systemic change in the criminal justice system alongside inclusive design and improved data practices.
- Reflection: The case study reinforced my commitment to ethical design, emphasising the need for ongoing reflection, inclusivity, and a rethinking of fairness to develop technology that promotes justice and equity in society.
Methods
Data was gathered from peer-reviewed articles and credible sources using targeted search keywords to understand algorithmic discrimination, the implications of COMPAS, and design proposals for improvement.
Key Issues
Racial Bias and Discrimination: COMPAS has been criticised for systematically labelling Black defendants as high-risk more frequently than their white counterparts, leading to disproportionately harsher sentencing (see the sketch below for how this disparity can be measured). This bias reflects deeper societal prejudices embedded in the historical data used by the algorithm.
Algorithmic Limitations: The tool’s reliance on historical data and predetermined risk factors often fails to account for the nuanced and evolving nature of individual identities and socio-political contexts.
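To make the bias critique concrete, here is a minimal, illustrative sketch in Python of how such a disparity can be measured. The records and group labels are hypothetical and invented for illustration; this is not COMPAS data or any published analysis, only the kind of false positive rate comparison that underpins the critique.

```python
# Illustrative only: hypothetical defendants, not real COMPAS data.
# A high false positive rate means people who did NOT reoffend were
# nevertheless labelled high-risk.

def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` who were labelled high-risk."""
    negatives = [r for r in records if r["group"] == group and not r["reoffended"]]
    if not negatives:
        return float("nan")
    flagged = sum(1 for r in negatives if r["high_risk"])
    return flagged / len(negatives)

# Toy records: group membership, the algorithm's label, and the actual outcome.
records = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
]

for g in ("A", "B"):
    print(f"Group {g} false positive rate: {false_positive_rate(records, g):.2f}")
```

If this rate is consistently higher for one group, non-reoffending members of that group bear the cost of wrongful high-risk labels more often, which is the core of the disparity described above.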
Design Recommendations
The following recommendations aim to bolster fairness, reduce discrimination, and raise awareness and understanding, ultimately encouraging positive change within the criminal justice system and wider society.
Inclusive Design Practices
Technology development must incorporate diverse perspectives and actively counteract biases—designers must adopt anti-racist and anti-discriminatory approaches to ensure inclusivity.
Designing for Justice, Fairness and Accuracy
Implement simplified, transparent models that retain the predictive performance of more complex ones (see the sketch after this list).
Introduce targeted policy interventions to address biases in decision-making processes.
Recognise and mitigate the limitations of statistical notions of justice.
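As a hedged illustration of the simplified-models point above, the sketch below shows a transparent, points-based risk score in Python. Every factor, weight, and threshold is invented for illustration and has no empirical validity; the point is only that such a model can be read, audited, and contested line by line, unlike an opaque proprietary score. Transparency alone does not make a model fair, but it makes its assumptions open to challenge.

```python
# Hypothetical, unvalidated points-based score: factors and weights are
# invented for illustration. The value lies in being fully inspectable.

POINTS = {
    "prior_convictions_3_plus": 2,
    "age_under_25": 1,
    "current_charge_violent": 1,
}

def risk_points(defendant: dict) -> int:
    """Sum the points for every factor that applies to this defendant."""
    return sum(w for factor, w in POINTS.items() if defendant.get(factor))

def risk_band(points: int) -> str:
    """Map a point total onto a coarse, human-readable band."""
    if points == 0:
        return "low"
    return "medium" if points == 1 else "high"

example = {"prior_convictions_3_plus": True, "age_under_25": False}
print(risk_band(risk_points(example)))  # -> "high" (2 points)
```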
Implementing Hermeneutic Technologies
Use hermeneutic technics, which locate the interpretive (enigmatic) position between the technology and the world rather than between the user and the world, to represent diverse user experiences accurately. As a result, designs should focus on minimising systemic biases, ensuring that technological artefacts do not perpetuate discrimination.
Adaptive Data Practices
Improve data reporting and collection methods to better capture racial and ethnic information.
Support laboratory and field studies to refine understanding of racial discrimination and validate findings in real-world contexts.
Reimagining Fairness
Shift the definition of fairness from a purely mathematical perspective to one that addresses systemic biases and societal impacts—recognising that even highly accurate algorithms can reinforce existing prejudices.
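The worked example below, with confusion-matrix counts invented purely for illustration, shows why a narrowly mathematical view of fairness can mislead: two groups can be scored with identical overall accuracy while non-reoffenders in one group are wrongly flagged as high-risk at nearly twice the rate of the other.

```python
# Invented confusion-matrix counts (not COMPAS results): equal accuracy,
# unequal error burden when the groups' base rates of reoffending differ.

def rates(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        # Non-reoffenders wrongly labelled high-risk:
        "false_positive_rate": fp / (fp + tn),
    }

group_a = rates(tp=50, fp=20, fn=10, tn=20)  # higher base rate of reoffending
group_b = rates(tp=12, fp=22, fn=8, tn=58)   # lower base rate of reoffending

print(group_a)  # accuracy 0.70, false positive rate 0.50
print(group_b)  # accuracy 0.70, false positive rate 0.275
```

By the accuracy measure alone, the model treats both groups identically, yet it concentrates the harm of wrongful high-risk labels on one of them, which is why fairness has to be defined in terms of impacts as well as statistics.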
Conclusion
The COMPAS algorithm, though intended as a tool for assessing recidivism risk, exemplifies significant problems of racial bias and systemic inequity. The proposed recommendations address these challenges through inclusive design, improved data practices, and a redefined understanding of fairness. Ultimately, achieving true justice may require moving beyond risk assessment algorithms altogether and fundamentally addressing the biases within the criminal justice system itself.
Reflection
Demonstrating a commitment to rethinking and improving technological systems to foster fairness and equity, this project reinforces my belief in the profound impact that design and technology can have on social justice. Researching the COMPAS algorithm illuminated the complex interplay between technological systems and societal inequities, highlighting that, while algorithms like COMPAS aim to bring objectivity to decision-making processes, they often mirror and even amplify existing biases if not carefully managed.
Designing for equity requires more than just technical adjustments; it demands a deep commitment to inclusivity and a willingness to challenge entrenched biases. By integrating diverse perspectives and adopting a critical approach to data and design, we can create tools that not only perform well but also promote fairness and justice.
The challenge of addressing biases in predictive algorithms is a technical and moral issue, calling for continuous reflection, adaptation, and a readiness to rethink our definitions of fairness and justice. This project strengthened my resolve to pursue ethical design practices and advocate for systems that are not only accurate but also equitable. Moving forward, I am committed to applying these insights to future projects and contributing to a broader dialogue on the role of technology in shaping a just society.