Adaptive Sparse Transformer for Improving the ROUGE-1 Score in Scientific Paper Text Summarization
(1, 2, 3) Informatics Engineering Study Program, Universitas Kristen Petra, Surabaya
(*) Corresponding Author
References
Cachola, I., Lo, K., Cohan, A. & Weld, D. S. 2020. TLDR: Extreme Summarization of Scientific Documents. Findings of the Association for Computational Linguistics: EMNLP 2020. DOI=https://doi.org/10.18653/v1/2020.findings-emnlp.428.
Cohan, A., Dernoncourt, F., Kim, D. S., Bui, T., Kim, S., Chang, W. & Goharian, N. 2018. A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents. Proceedings of NAACL-HLT 2018, 615–621. DOI=https://doi.org/10.18653/v1/n18-2097.
Correia, G. M., Niculae, V. & Martins, A. F. T. 2019. Adaptively Sparse Transformers. Proceedings of EMNLP-IJCNLP 2019, 2174–2184. DOI=https://doi.org/10.18653/v1/d19-1223.
El-Kassas, W. S., Salama, C. R., Rafea, A. A. & Mohamed, H. K. 2020. Automatic Text Summarization: A Comprehensive Survey. Expert Systems with Applications, 113679. DOI=https://doi.org/10.1016/j.eswa.2020.113679.
Huang, D., Cui, L., Yang, S., Bao, G., Wang, K., Xie, J. & Zhang, Y. 2020. What Have We Achieved on Text Summarization? Proceedings of EMNLP 2020. DOI=https://doi.org/10.18653/v1/2020.emnlp-main.33.
Ju, J., Liu, M., Gao, L. & Pan, S. 2020. SciSummPip: An Unsupervised Scientific Paper Summarization Pipeline. Proceedings of the First Workshop on Scholarly Document Processing, 318–327. DOI=https://doi.org/10.18653/v1/2020.sdp-1.37.
Peters, B., Niculae, V. & Martins, A. F. T. 2019. Sparse Sequence-to-Sequence Models. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 1504–1519. DOI=https://doi.org/10.18653/v1/P19-1146.
Pilault, J., Li, R., Subramanian, S. & Pal, C. 2020. On Extractive and Abstractive Neural Document Summarization with Transformer Language Models. Proceedings of EMNLP 2020, 9308–9319. DOI=https://doi.org/10.18653/v1/2020.emnlp-main.748.
Sun, X. & Zhuge, H. 2018. Summarization of Scientific Paper Through Reinforcement Ranking on Semantic Link Network. IEEE Access, 6, 40611–40625. DOI=https://doi.org/10.1109/ACCESS.2018.2856530.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł. & Polosukhin, I. 2017. Attention Is All You Need. Advances in Neural Information Processing Systems, 30, 5999–6009. DOI=https://doi.org/10.48550/arXiv.1706.03762.
Verma, S. & Nidhi, V. 2017. Extractive Summarization Using Deep Learning. Research in Computing Science, 147(10). DOI=https://doi.org/10.13053/rcs-147-10-9.