Recognizing Textual Entailment via Multi-task Knowledge Assisted LSTM

Lei Sha, Sujian Li, Baobao Chang, Zhifang Sui

Key Laboratory of Computational Linguistics, Ministry of Education
School of Electronics Engineering and Computer Science, Peking University
Collaborative Innovation Center for Language Ability, Xuzhou 221009
******@pku., {lisujian, chbb, szf}***@

Abstract. Recognizing Textual Entailment (RTE) plays an important role in NLP applications like question answering, information retrieval, etc. Most previous works either use classifiers with elaborately designed features and lexical similarity measures, or bring distant supervision and reasoning techniques into the RTE task. However, these approaches are hard to generalize due to the complexity of feature engineering and are prone to cascading errors and data sparsity problems. To alleviate these problems, some work uses LSTM-based recurrent networks with word-by-word attention to recognize textual entailment. Nevertheless, such work does not make full use of a knowledge base (KB) to help reasoning. In this paper, we propose a deep neural network architecture called Multi-task Knowledge Assisted LSTM (MKAL), which aims to conduct implicit inference with the assistance of a KB and uses predicate-to-predicate attention to detect the entailment between predicates. In addition, our model applies a multi-task architecture to further improve performance. The experimental results show that our proposed method achieves competitive performance compared to previous work.

1 Introduction

In natural language, a common phenomenon is that there are many ways to express the same or similar meaning. To discover such different expressions, the Recognizing Textual Entailment (RTE) task was proposed to judge whether the meaning of one text (denoted as H) can be inferred (entailed) from the other one (T) [3]. For many natural language processing applications, like question answering and information retrieval, which need to deal with the diversity of natural language, recognizing textual entailment is a critical step.
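To make the task setup concrete, the sketch below is a hypothetical illustration of the standard RTE formulation, not the MKAL architecture proposed in this paper: a premise T and a hypothesis H are each encoded with an LSTM, a word-by-word style attention attends over T for every position of H, and the pair is classified into three labels. The PyTorch module, layer sizes, and the three-way label set are assumptions made only for illustration.

# A minimal sketch (assumed PyTorch; not the authors' MKAL model) of an
# attention-based LSTM classifier for a premise/hypothesis pair (T, H).
import torch
import torch.nn as nn


class AttentionEntailmentLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encode_t = nn.LSTM(embed_dim, hidden_dim, batch_first=True)  # premise T
        self.encode_h = nn.LSTM(embed_dim, hidden_dim, batch_first=True)  # hypothesis H
        self.classify = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, t_ids, h_ids):
        t_states, _ = self.encode_t(self.embed(t_ids))  # (B, |T|, d)
        h_states, _ = self.encode_h(self.embed(h_ids))  # (B, |H|, d)
        # Dot-product attention: score every T position against every H position.
        scores = torch.einsum('bid,bjd->bij', t_states, h_states)   # (B, |T|, |H|)
        weights = torch.softmax(scores, dim=1)                       # normalize over T
        context = torch.einsum('bij,bid->bjd', weights, t_states)    # (B, |H|, d)
        # Pool the attended premise context and the hypothesis states, then classify.
        pooled = torch.cat([context.mean(dim=1), h_states.mean(dim=1)], dim=-1)
        return self.classify(pooled)


# Usage with dummy token ids, e.g. T = "A man is playing a guitar", H = "A man plays music".
model = AttentionEntailmentLSTM(vocab_size=10000)
t_ids = torch.randint(0, 10000, (1, 6))
h_ids = torch.randint(0, 10000, (1, 4))
logits = model(t_ids, h_ids)  # shape (1, 3): entailment / contradiction / neutral

The design choice here mirrors the word-by-word attention line of work cited above: rather than compressing T into a single vector, each hypothesis token retrieves the premise evidence most relevant to it before the final decision.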