NLP-CIC @ PRELEARN: Mastering prerequisites relations, from handcrafted features to embeddings

Jason Angel, Segun Taofeek Aroyehun, Alexander Gelbukh

Research output: Contribution to journal › Conference article › peer-review

3 Citations (Scopus)

Abstract

We present our systems and findings for the prerequisite relation learning task (PRELEARN) at EVALITA 2020. The task aims to classify whether a pair of concepts holds a prerequisite relation or not. We model the problem using handcrafted features and embedding representations for in-domain and cross-domain scenarios. Our submissions ranked first in both scenarios, with average F1 scores of 0.887 and 0.690, respectively, across domains on the test sets. We have made our code freely available.

Original language: English
Publication: CEUR Workshop Proceedings
Volume: 2765
Status: Published - 2020
Event: 7th Evaluation Campaign of Natural Language Processing and Speech Tools for Italian. Final Workshop, EVALITA 2020 - Virtual, Online
Duration: 17 Dec. 2020 → …
