In a March 22 seminar, law professor Ngozi Okidegbe of Boston University will discuss how biased data can reproduce unjust outcomes in bail decisions.

Seminar to discuss algorithmic discrimination

Jurisdictions are increasingly employing pretrial algorithms as a solution to the racial and socioeconomic inequities in the bail system. But in practice, pretrial algorithms have reproduced the very inequities they were intended to correct. Scholars have diagnosed this as a biased data problem: pretrial algorithms generate racially and socioeconomically biased predictions because they are constructed and trained with biased data.

In an article published in the Cornell Law Review, law professor Ngozi Okidegbe of Boston University contends that biased data is not the sole cause of algorithmic discrimination. Another reason pretrial algorithms produce biased results is that they are exclusively built and trained with data from carceral knowledge sources — the police, pretrial services agencies, and the court system.

Okidegbe and Windsor Law professor Danardo Jones will discuss the issue in a seminar Friday, March 22, hosted by the LTEC Lab Seminar Series and the Transnational Law Racial Justice Network.

“Discredited Data: The Epistemic Origins of Algorithmic Discrimination” will be offered online and in person from 2 to 4 p.m. Lunch will be served to in-person guests at 1:30 p.m. in room 0140, Windsor Law. Register here.

Okidegbe calls for a shift away from carceral knowledge sources toward non-carceral knowledge sources drawn from communities most impacted by the criminal legal system. Though data derived from community knowledge sources have traditionally been discredited and excluded from the construction of pretrial algorithms, she says, tapping into them offers the potential to produce racially and socioeconomically just outcomes.

Her full article, “Discredited Data,” is available here.
