eprintid: 3041
rev_number: 9
eprint_status: archive
userid: 244
dir: disk0/00/00/30/41
datestamp: 2018-08-03 15:41:57
lastmod: 2018-08-03 15:41:57
status_changed: 2018-08-03 15:41:57
type: conference_item
metadata_visibility: show
creators_name: Nguyen, Ngoc Khuong
creators_name: Le, Anh Cuong
creators_name: Nguyen, Viet Ha
creators_id: cuongla@vnu.edu.vn
creators_id: hanv@vnu.edu.vn
title: An Attention-Based Long-Short-Term-Memory Model for Paraphrase Generation
ispublished: pub
subjects: IT
divisions: fac_fit
abstract: Neural-network-based sequence-to-sequence models have been shown to be an effective approach to paraphrase generation. In paraphrase generation, some words in the source should be ignored when generating the target text, but current models do not pay enough attention to this problem. To overcome this limitation, we propose a new model, a penalty coefficient attention-based Residual Long-Short-Term-Memory (PCA-RLSTM) neural network, which forms an end-to-end paraphrase generation model. Extensive experiments on the two most popular corpora (PPDB and WikiAnswers) show that our proposed model outperforms state-of-the-art models on the paraphrase generation problem.
date: 2018
date_type: completed
official_url: https://link.springer.com/chapter/10.1007%2F978-3-319-75429-1_14
full_text_status: none
pres_type: paper
pagerange: 166-178
event_title: IUKM 2018
event_dates: 04 February 2018
event_type: conference
refereed: TRUE
citation: Nguyen, Ngoc Khuong and Le, Anh Cuong and Nguyen, Viet Ha (2018) An Attention-Based Long-Short-Term-Memory Model for Paraphrase Generation. In: IUKM 2018, 04 February 2018.