CS 5/692, Winter 2020
February 10 / Week 6
Algorithmic Bias 2: NLP
Readings
Required
- Bolukbasi, T., Chang, K.-W., Zou, J. Y., Saligrama, V., & Kalai, A. T. (2016). Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. In Advances in Neural Information Processing Systems 29 (pp. 4349–4357).
- Caliskan, A., Bryson, J. J., & Narayanan, A. (2017). Semantics derived automatically from language corpora contain human-like biases. Science, 356(6334), 183–186.
- Nissim, M., van Noord, R., & van der Goot, R. (2019). Fair is Better than Sensational: Man is to Doctor as Woman is to Doctor. arXiv preprint arXiv:1905.09866.
- Gonen, H., & Goldberg, Y. (2019). Lipstick on a Pig: Debiasing Methods Cover up Systematic Gender Biases in Word Embeddings But do not Remove Them. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (pp. 609–614). Minneapolis, Minnesota: Association for Computational Linguistics.
- Prates, M. O. R., Avelar, P. H., & Lamb, L. C. (2019). Assessing gender bias in machine translation: a case study with Google Translate. Neural Computing and Applications, 14(1), 1–19.
Optional