
Automatically signaling quality? A study of the fairness-economic tradeoffs in reducing bias through AI/ML on digital platforms

Lauren Rhue

Digital platforms have a widely documented problem with bias, which can hurt their bottom line through missed opportunities, bad publicity, or even legal action. There is evidence that bias decreases when a platform conveys a signal of quality for underrepresented groups. This paper studies the practicalities of implementing a fair and equitable designation of quality for crowdfunding projects and the potential economic tradeoffs, focusing on Kickstarter.com’s Projects We Love (PWL) badge. We perform an observational and simulation study that leverages state-of-the-art algorithmic fairness techniques to distribute PWL badges equitably.

We examine the distribution of PWL badges across a sensitive attribute and find that the badges help underrepresented groups yet are distributed inequitably. Using algorithmic fairness techniques, we identify projects with high PWL propensity that would otherwise be overlooked. We find that distributing the badge equitably yields comparable performance on business objectives while decreasing disparities in both badge designation and funding success for an underrepresented group. To the best of our knowledge, this is the first paper to study the business implications of implementing algorithmic fairness recommendations on digital platforms and the potential economic tradeoffs therein.
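The abstract does not specify which fairness technique the authors use, but the idea of distributing a fixed number of badges equitably can be illustrated with a minimal sketch. The code below assumes a hypothetical propensity model has already scored each project; it then applies a simple demographic-parity-style rule, selecting the highest-propensity projects within each group so that every group's badge rate matches the overall rate. The function name and inputs are illustrative, not from the paper.

```python
import numpy as np

def equitable_badge_allocation(scores, groups, total_badges):
    """Allocate a fixed badge budget so each group's selection rate
    roughly matches the overall rate (demographic parity), choosing
    the highest-propensity projects within each group.

    scores: per-project badge-propensity scores (hypothetical model output)
    groups: per-project sensitive-attribute labels
    total_badges: total number of badges to hand out
    """
    scores = np.asarray(scores, dtype=float)
    groups = np.asarray(groups)
    n = len(scores)
    overall_rate = total_badges / n  # target selection rate for every group
    selected = np.zeros(n, dtype=bool)
    for g in np.unique(groups):
        idx = np.flatnonzero(groups == g)
        # Per-group quota; rounding may shift the total slightly
        # (a real system would reconcile the remainder).
        k = round(overall_rate * len(idx))
        top = idx[np.argsort(scores[idx])[::-1][:k]]
        selected[top] = True
    return selected
```

Under this rule, high-scoring projects from an underrepresented group are surfaced even when the platform's unconstrained ranking would pass them over, which is the mechanism the paper's simulation study evaluates against business objectives.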

Email us at digitalinitiative@hbs.edu for information on attending this seminar.
