November 5, 2019

12:00 pm – 1:30 pm

Venue

Clark Hall, Room 316

12:00 – 1:00 Seminar
1:00 – 1:30 Lunch

"Fairness by Causal Mediation Analysis: Criteria, Algorithms, and Open Problems"

Abstract: Systematic discriminatory biases present in our society influence the way data is collected and stored, the way variables are defined, and the way scientific findings are put into practice as policy. Automated decision procedures and learning algorithms applied to such data may serve to perpetuate existing injustice or unfairness in our society.

We consider how to solve prediction and policy learning problems in a way that "breaks the cycle of injustice" by correcting for the unfair dependence of outcomes, decisions, or both, on sensitive features (e.g., variables that correspond to gender, race, disability, or other protected attributes). We use methods from causal inference and constrained optimization to learn outcome predictors and optimal policies in a way that addresses multiple potential biases which afflict data analysis in sensitive contexts.

Our proposal comes equipped with the guarantee that solving prediction or decision problems on new instances will result in a joint distribution where the given fairness constraint is satisfied. We illustrate our approach with both synthetic data and real criminal justice data.
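For readers unfamiliar with the general idea of learning a predictor under a fairness constraint, the following minimal sketch may help. It is not the speaker's causal mediation method; it only illustrates the constrained-optimization pattern the abstract alludes to, using a hypothetical synthetic dataset and an assumed constraint that bounds the covariance between the predictor's score and a sensitive attribute.

```python
# Illustrative sketch only (assumptions: synthetic data, a simple covariance
# constraint between the score and the sensitive attribute). This is NOT the
# causal-mediation-based approach described in the talk.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data: x = features, a = sensitive attribute, y = binary outcome.
n = 2000
a = rng.integers(0, 2, size=n)                  # sensitive attribute (two groups)
x = rng.normal(size=(n, 3)) + 0.5 * a[:, None]  # features correlated with a
y = (x @ np.array([1.0, -0.5, 0.3]) + rng.normal(size=n) > 0).astype(float)
X = np.column_stack([np.ones(n), x])            # add intercept column

def neg_log_lik(w):
    """Average negative log-likelihood of a logistic regression model."""
    z = X @ w
    return np.mean(y * np.logaddexp(0.0, -z) + (1 - y) * np.logaddexp(0.0, z))

def score_attr_cov(w):
    """Empirical covariance between the linear score X @ w and the attribute a."""
    z = X @ w
    return np.mean((a - a.mean()) * (z - z.mean()))

# Fairness constraint (assumed form): |cov(score, a)| <= c.
c = 0.05
constraints = [
    {"type": "ineq", "fun": lambda w: c - score_attr_cov(w)},
    {"type": "ineq", "fun": lambda w: c + score_attr_cov(w)},
]

result = minimize(neg_log_lik, x0=np.zeros(X.shape[1]),
                  method="SLSQP", constraints=constraints)
print("fitted weights:", result.x)
print("score/attribute covariance:", score_attr_cov(result.x))
```

The tolerance c and the covariance constraint are placeholders; the talk concerns richer, causally defined fairness criteria (e.g., constraints on path-specific effects) rather than this simple associational one.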

Bio: Ilya Shpitser is a John C. Malone Assistant Professor of Computer Science and a member of the Malone Center for Engineering in Healthcare. Dr. Shpitser works on causal and semi-parametric inference, missing data, dependent data, and algorithmic fairness, with applications in medicine and public health.