eprintid: 3477
rev_number: 7
eprint_status: archive
userid: 383
dir: disk0/00/00/34/77
datestamp: 2019-06-04 02:50:44
lastmod: 2019-06-04 02:50:44
status_changed: 2019-06-04 02:50:44
type: article
metadata_visibility: show
creators_name: Le, Hoang Quynh
creators_name: Can, Duy Cat
creators_name: Ha, Quang Thuy
creators_name: Collier, Nigel
creators_id: lhquynh@vnu.edu.vn
creators_id: catcd@vnu.edu.vn
creators_id: thuyhq@vnu.edu.vn
creators_id: nhc30@cam.ac.uk
title: A Richer-but-Smarter Shortest Dependency Path with Attentive Augmentation for Relation Extraction
ispublished: pub
subjects: IT
divisions: fac_fit
abstract: To extract the relationship between two entities in a sentence, two common approaches are (1) using their shortest dependency path (SDP) and (2) using an attention model to capture a context-based representation of the sentence. Each approach suffers from its own disadvantage of either missing or redundant information. In this work, we propose a novel model that combines the advantages of these two approaches. It is based on the basic information in the SDP, enhanced with information selected by several attention mechanisms with kernel filters, namely RbSP (Richer-but-Smarter SDP). To exploit the representation behind the RbSP structure effectively, we develop a combined deep neural model with an LSTM network on word sequences and a CNN on the RbSP. Experimental results on the SemEval-2010 dataset demonstrate improved performance over competitive baselines. The data and source code are available at https://github.com/catcd/RbSP.
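The SDP approach mentioned in the abstract can be illustrated with a minimal, stdlib-only sketch: a breadth-first search over an undirected dependency graph between the two entity tokens. The sentence, edges, and relation labels below are invented for illustration and are not taken from the paper's released code.

```python
from collections import deque

# Hypothetical dependency parse of a SemEval-style sentence:
# "The burst has been caused by pressure."
# In a real pipeline these (head, dependent) edges would come from a
# dependency parser; here they are hand-coded for the example.
edges = [
    ("caused", "burst"),     # nsubjpass
    ("caused", "has"),       # aux
    ("caused", "been"),      # auxpass
    ("caused", "pressure"),  # agent
    ("burst", "The"),        # det
    ("pressure", "by"),      # case
]

def shortest_dependency_path(edges, source, target):
    """BFS over the undirected dependency graph; returns the token path."""
    graph = {}
    for head, dep in edges:
        graph.setdefault(head, []).append(dep)
        graph.setdefault(dep, []).append(head)
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_dependency_path(edges, "burst", "pressure"))
# → ['burst', 'caused', 'pressure']
```

The path keeps only the tokens linking the two entities ("burst", "caused", "pressure"), which is exactly the compactness, and the potential information loss, that RbSP's attentive augmentation is designed to address.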
date: 2019-06-02
date_type: published
publisher: Association for Computational Linguistics
official_url: https://www.aclweb.org/anthology/N19-1298
contact_email: lhquynh@vnu.edu.vn
full_text_status: public
publication: 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics
volume: 1
pagerange: 2902-2912
refereed: TRUE
citation: Le, Hoang Quynh and Can, Duy Cat and Ha, Quang Thuy and Collier, Nigel (2019) A Richer-but-Smarter Shortest Dependency Path with Attentive Augmentation for Relation Extraction. 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics, 1. pp. 2902-2912.
document_url: https://eprints.uet.vnu.edu.vn/eprints/id/eprint/3477/1/N19-1298.pdf