TY - CONF
ID - SisLab3041
UR - https://link.springer.com/chapter/10.1007%2F978-3-319-75429-1_14
A1 - Nguyen, Ngoc Khuong
A1 - Le, Anh Cuong
A1 - Nguyen, Viet Ha
Y1 - 2018///
N2 - Neural network based sequence-to-sequence models have been shown to be an effective approach to paraphrase generation. In paraphrase generation, some words in the source should be ignored when generating the target text, but current models do not pay enough attention to this problem. To overcome this limitation, in this paper we propose a new model, a penalty coefficient attention-based Residual Long Short-Term Memory (PCA-RLSTM) neural network, which forms an end-to-end paraphrase generation model. Extensive experiments on the two most popular corpora (PPDB and WikiAnswers) show that our proposed model's performance is better than that of the state-of-the-art models for the paraphrase generation problem.
TI - An Attention-Based Long-Short-Term-Memory Model for Paraphrase Generation
SP - 166
AV - none
EP - 178
T2 - IUKM 2018
ER -