TY - CONF
ID - SisLab3171
UR - http://www.aclweb.org/anthology/D18-1250
A1 - Le, Hoang Quynh
A1 - Can, Duy Cat
A1 - Vu, Tien Sinh
A1 - Dang, Thanh Hai
A1 - Pilehvar, Mohammad Taher
A1 - Collier, Nigel
Y1 - 2018///
N2 - Experimental performance on the task of relation classification has generally improved using deep neural network architectures. One major drawback of reported studies is that individual models have been evaluated on a very narrow range of datasets, raising questions about the adaptability of the architectures, while making comparisons between approaches difficult. In this work, we present a systematic large-scale analysis of neural relation classification architectures on six benchmark datasets with widely varying characteristics. We propose a novel multi-channel LSTM model combined with a CNN that takes advantage of all currently popular linguistic and architectural features. Our 'Man for All Seasons' approach achieves state-of-the-art performance on two datasets. More importantly, in our view, the model allowed us to obtain direct insights into the continued challenges faced by neural language models on this task. Example data and source code are available at: https://github.com/aidantee/MASS.
TI - Large-scale Exploration of Neural Relation Classification Architectures
SP - 2266
M2 - Brussels, Belgium
AV - public
EP - 2277
T2 - Conference on Empirical Methods in Natural Language Processing (EMNLP 2018)
ER -