NLP-CIC @ PRELEARN: Mastering prerequisites relations, from handcrafted features to embeddings

Jason Angel, Segun Taofeek Aroyehun, Alexander Gelbukh

Research output: Contribution to journal › Conference article › peer-review

3 Scopus citations

Abstract

We present our systems and findings for the prerequisite relation learning task (PRELEARN) at EVALITA 2020. The task is to classify whether a pair of concepts holds a prerequisite relation or not. We model the problem using handcrafted features and embedding representations, for both in-domain and cross-domain scenarios. Our submissions ranked first in both scenarios, with average F1 scores across domains of 0.887 and 0.690, respectively, on the test sets. We have made our code freely available.
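The abstract frames the task as binary classification over concept pairs. A minimal sketch of that framing is below; the concept names, embedding dimensionality, and the particular pair-feature construction (concatenation plus element-wise difference) are illustrative assumptions, not the paper's actual feature set.

```python
import numpy as np

# Illustrative toy setup only: random vectors stand in for the pretrained
# concept embeddings a real system would use.
rng = np.random.default_rng(0)
concepts = ["algebra", "matrice", "autovalore", "derivata"]
emb = {c: rng.normal(size=8) for c in concepts}  # 8-dim toy embeddings

def pair_features(a, b):
    """Turn a candidate (prerequisite, target) concept pair into a single
    fixed-size feature vector: concatenation of the two embeddings plus
    their element-wise difference (a common, assumed pairing scheme)."""
    va, vb = emb[a], emb[b]
    return np.concatenate([va, vb, va - vb])

# Each labeled pair becomes one row for a downstream binary classifier.
X = np.stack([pair_features("algebra", "matrice"),
              pair_features("matrice", "autovalore")])
print(X.shape)  # (2, 24): two pairs, 8 + 8 + 8 features each
```

Any standard binary classifier (e.g. logistic regression) can then be trained on such rows against prerequisite/non-prerequisite labels.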

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 2765
State: Published - 2020
Event: 7th Evaluation Campaign of Natural Language Processing and Speech Tools for Italian, Final Workshop, EVALITA 2020 - Virtual, Online
Duration: 17 Dec 2020 → …

