A Hierarchical LSTM Model for Joint Tasks

Qianrong Zhou, Liyun Wen, Xiaojie Wang, Long Ma, and Yue Wang

School of Computer, Beijing University of Posts and Telecommunications, Beijing, China
{zhouqr,wenliyun,xjwang,miss_longma,wangyuesophie}***@

Abstract. Previous work has shown that jointly modeling two Natural Language Processing (NLP) tasks is effective for achieving better performance on both. Many task-specific joint models have been proposed. This paper proposes a Hierarchical Long Short-Term Memory (HLSTM) model and some of its variants for modeling two tasks jointly. The models are flexible enough to model different types of task combinations and avoid task-specific feature engineering. Besides exploiting correlation information between the tasks, our models take the hierarchical relations between the two tasks into consideration, which is not discussed in previous work. Experimental results show that our models outperform strong baselines on three different types of task combination. While both correlation information and hierarchical relations between the two tasks help to improve performance on both, the models especially boost the performance of the task on top of the hierarchical structure.

Keywords: Hierarchical LSTM · Joint modeling

1 Introduction

It is a common situation in Natural Language Processing (NLP) that two tasks interact with each other, for example, Chinese word segmentation and POS-tagging, POS-tagging and chunking, or intent identification and slot filling in goal-driven spoken language dialogue systems. Usually, the second task is modeled after the first one is finished, since the first task is thought to be more fundamental, or lower-level, than the second. This is the so-called pipeline method, i.e. low-level tasks are followed by high-level ones. For example, chunking in character-based languages such as Chinese, Japanese and Thai requires word segmentation and POS-tagging as pre-processing steps [1,2,3]. In Spoken Language Understanding (SLU), intent is first identified as a classification problem.
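To make the kind of hierarchical joint model described above concrete, the following is a minimal sketch in PyTorch (the paper does not specify an implementation). The choice of tasks (slot filling below, intent classification on top), all layer sizes, and the way the intent head reads the upper LSTM's final state are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of a two-level hierarchical LSTM for two joint tasks:
# a lower-level per-token task (slot filling) and a higher-level
# per-sentence task (intent classification) stacked on top of it.
# NOT the paper's exact model; sizes and task choices are assumptions.
import torch
import torch.nn as nn

class HLSTMSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, n_slots, n_intents):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Lower LSTM serves the low-level task (slot filling).
        self.lower_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Upper LSTM reads the lower LSTM's states, so the high-level
        # task (intent) builds on the low-level task's representations.
        self.upper_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.slot_head = nn.Linear(hidden_dim, n_slots)      # per-token labels
        self.intent_head = nn.Linear(hidden_dim, n_intents)  # sentence label

    def forward(self, tokens):
        emb = self.embed(tokens)                  # (batch, seq, emb_dim)
        low_out, _ = self.lower_lstm(emb)         # (batch, seq, hidden)
        slot_logits = self.slot_head(low_out)     # one slot label per token
        up_out, _ = self.upper_lstm(low_out)      # hierarchy: upper reads lower
        intent_logits = self.intent_head(up_out[:, -1])  # last state -> intent
        return slot_logits, intent_logits

# Joint training: summing the two task losses lets gradients from the
# top task also shape the shared lower layers (toy data for illustration).
model = HLSTMSketch(vocab_size=1000, emb_dim=32, hidden_dim=64,
                    n_slots=10, n_intents=5)
tokens = torch.randint(0, 1000, (2, 7))
slot_gold = torch.randint(0, 10, (2, 7))
intent_gold = torch.randint(0, 5, (2,))
slot_logits, intent_logits = model(tokens)
loss = (nn.functional.cross_entropy(slot_logits.reshape(-1, 10),
                                    slot_gold.reshape(-1))
        + nn.functional.cross_entropy(intent_logits, intent_gold))
loss.backward()
```

Stacking the upper LSTM on the lower one, rather than letting both tasks share a single flat layer, is what makes the model hierarchical: the top task consumes the lower task's hidden states instead of the raw input, mirroring the pipeline's low-to-high ordering while still training both tasks jointly.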