
# Credit Prediction Model Audit

In 2019, Apple received backlash on social media after its newly launched Apple Card product appeared to offer higher credit limits to men than to women. In multiple cases, married couples found that the husband received a credit limit 10-20x higher than the wife's, even when the couple held joint assets.

From a regulatory perspective, financial institutions operating in the United States are subject to laws prohibiting discrimination on the basis of race, gender, or other protected classes. With the increasing prevalence of automated decision systems in lending, experts have raised concerns that these systems could exacerbate existing inequalities in access to credit.

Although the two are intertwined, algorithmic fairness is not the same concept as anti-discrimination law. An AI system can comply with anti-discrimination law while still exhibiting fairness-related concerns; conversely, some fairness interventions may themselves be illegal under anti-discrimination law. Xiang and Raji discuss the compatibilities and disconnects between anti-discrimination law and algorithmic notions of fairness. This case study focuses on fairness in financial services rather than on compliance with financial anti-discrimination regulations.