Commit d094438e authored by friebolin

Update table of contents

parent 11ad0bc9
@@ -7,16 +7,16 @@ Members of the project:
 - Mira Umlauf [umlauf@cl.uni-heidelberg.de](mailto:umlauf@cl.uni-heidelberg.de)
 # Table of contents
-1. 📚 [Project documents](#documents)
+1. 📚 [Project documents](#project-documents)
 2. 🔎 [Metonymy Resolution](#metonomy)
-3. 📈 [Data Augmentation](#augmentation)
+3. 📈 [Data Augmentation](#data-augmentation)
 4. 💡 [Methods](#methods)
     1. 📝 [Backtranslation](#backtranslation)
     2. 🍸 [MixUp](#mixup)
 5. 🗃️ [Data](#data)
-6. 🛠️ [Set Up](#setup)
+6. 🛠️ [Set Up](#set-up)
 7. ⚙️ [Usage](#usage)
-8. 🏯 [Code Structure](#structure)
+8. 🏯 [Code Structure](#code-structure)
 9. 📑 [References](#references)
 ***
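The anchor fixes in this hunk follow the way GitLab auto-generates heading ids: the slug is roughly the heading text lowercased, with punctuation dropped and whitespace turned into hyphens, which is why `#setup` had to become `#set-up`. A minimal sketch of that rule (the `heading_anchor` helper is hypothetical, not part of this repo, and approximates GitLab's behavior):

```python
import re

def heading_anchor(heading: str) -> str:
    # Approximation of GitLab's heading-to-anchor slug rule:
    # lowercase, drop punctuation, join words with hyphens.
    cleaned = re.sub(r"[^\w\s-]", "", heading.strip().lower())
    return re.sub(r"\s+", "-", cleaned)

print(heading_anchor("Project documents"))  # project-documents
print(heading_anchor("Set Up"))             # set-up
print(heading_anchor("Code Structure"))     # code-structure
```

Links like `[Set Up](#set-up)` then resolve against the generated id, so the TOC anchors must match the slugged heading text exactly.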
@@ -221,10 +221,15 @@ For `<COMMAND>` you must enter one of the commands you find in the list below, w
 - [Downloaded TWM datasets](./swp/swp-data-augmentation-for-metonymy-resolution/datasets/li_twm)
 [^1]: Fairseq Tool.
 [^2]: Backtranslation paper.
 [^3]: Zhang, Hongyi, Cissé, Moustapha, Dauphin, Yann N. & Lopez-Paz, David. mixup: Beyond empirical risk minimization. *CoRR*, 2017.
 [^4]: Sun, L., Xia, C., Yin, W., Liang, T., Yu, P. S., & He, L. (2020). Mixup-Transformer: Dynamic data augmentation for NLP tasks. arXiv preprint arXiv:2010.02394.
-[^5]: Li et al.
+[^5]: Li, Haonan, Vasardani, Maria, Tomko, Martin & Baldwin, Timothy. ["Target word masking for location metonymy resolution."](https://aclanthology.org/2020) Proceedings of the 28th International Conference on Computational Linguistics, December 2020.
 [^6]: Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv preprint arXiv:1810.04805.
 [^7]: Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., ... & Stoyanov, V. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv preprint arXiv:1907.11692.