Anthology ID: 2021.naacl-main.141
Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: June
Year: 2021
Address: Online
Editors: Kristina Toutanova,

Abstract: Automated metaphor detection is a challenging task to identify the metaphorical expression of words in a sentence. To tackle this problem, we adopt pre-trained contextualized models, e.g., BERT and RoBERTa. To this end, we propose a novel metaphor detection model, namely metaphor-aware late interaction over BERT (MelBERT). Our model not only leverages contextualized word representation but also benefits from linguistic metaphor identification theories to detect whether the target word is metaphorical. Our empirical results demonstrate that MelBERT outperforms several strong baselines on four benchmark datasets, i.e., VUA-18, VUA-20, MOH-X, and TroFi.