CS 5/662, Winter 2021
Week 3 / Wed, Jan 20
- Rogers, A., & Augenstein, I. (2020). What Can We Do to Improve Peer Review in NLP? In Findings of the Association for Computational Linguistics: EMNLP 2020.
- Belz, A. (2009). That’s Nice… What Can You Do With It? Computational Linguistics, 35(1), 111–118.
- Fokkens, A., van Erp, M., Postma, M., Pedersen, T., Vossen, P., & Freire, N. (2013). Offspring from Reproduction Problems: What Replication Failure Teaches Us. ACL 2013.
- Bender Chapter 2
- J&M Chapter 6 (Optional: 18)
- Eisenstein Chapter 14 covers some of the same ground from a different perspective; I strongly recommend reading it in addition to the J&M chapter (it’s pretty short, too).
- Goldberg Chapters 8, 10, and 11
- Lilian Weng has two excellent blog posts that provide a survey of word embedding methods.
- Robyn Speer’s How to make a racist AI without really trying
- Arora, S., Li, Y., Liang, Y., Ma, T., & Risteski, A. (2016). A Latent Variable Model Approach to PMI-based Word Embeddings. Transactions of the Association for Computational Linguistics, 4, 385–399.
  - If you really want to grok what’s going on with word embeddings, read this one.