Optimizing BBC News Summarization with a BiLSTM-Transformer Method

Authors

  • Rafael Austin, Universitas Esa Unggul
  • Andhika Dwi Rachmawanto, Universitas Esa Unggul
  • Michael Jeconiah Yonathan, Universitas Esa Unggul
  • M Naufal Arriz, Universitas Esa Unggul
  • Vitri Tundjungsari, Universitas Esa Unggul

DOI:

https://doi.org/10.55606/jitek.v6i1.10669

Keywords:

BiLSTM, AI, Transformer, Embedding, Summarizer

Abstract

The rapid growth of digital news, such as that published by the BBC, makes it difficult for readers to absorb dense information in limited time. This research proposes an automated text summarization system built on a hybrid BiLSTM-Transformer architecture to produce concise yet contextually accurate summaries. The model combines a BiLSTM, which captures local sequential relationships, with the Transformer's self-attention mechanism, which handles global context, mitigating the computational limitations of standalone Transformers. Using a self-embedding approach, the system processes text in an unsupervised manner, making it suitable for datasets without ground-truth summaries. Evaluation was conducted on 50 samples from the XSum dataset and 25 live BBC news links, with performance measured by cosine similarity to assess contextual preservation. The model achieved closely matched average cosine similarities of 0.7959 on the dataset samples and 0.7877 on the new data. These findings indicate that the hybrid model effectively preserves semantic integrity and produces reliable summaries of complex news articles.
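The abstract does not include the authors' code, but the pipeline it describes (self-learned embeddings fed through a BiLSTM for local context, a Transformer encoder for global context, and cosine similarity for scoring) can be sketched compactly. The following is a minimal, hypothetical PyTorch sketch, not the paper's implementation; the class name, hyperparameters, toy tokenizer, and sample sentences are all illustrative assumptions.

```python
# Hypothetical sketch of the hybrid described in the abstract: self-learned
# embeddings -> BiLSTM (local sequential context) -> Transformer encoder
# (global self-attention) -> mean-pooled sentence vectors scored by cosine
# similarity. Names and hyperparameters are illustrative, not the authors'.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMTransformerEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)       # "self-embedding": learned, no pretraining
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)            # local sequential relationships
        layer = nn.TransformerEncoderLayer(d_model=2 * hidden, nhead=n_heads,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)  # global context

    def forward(self, token_ids):                            # (batch, seq_len) -> (batch, 2*hidden)
        x = self.embed(token_ids)
        x, _ = self.bilstm(x)
        x = self.transformer(x)
        return x.mean(dim=1)                                 # mean-pooled sentence embedding

def toy_tokenize(sentence, vocab):
    """Whitespace tokenizer that grows its vocabulary on the fly (toy only)."""
    ids = [vocab.setdefault(w, len(vocab)) for w in sentence.lower().split()]
    return torch.tensor([ids])                               # batch of one sentence

# Unsupervised extractive scoring: the sentence closest (by cosine similarity)
# to the mean document embedding is kept as a one-line summary.
vocab = {}
model = BiLSTMTransformerEncoder(vocab_size=10_000).eval()   # untrained, demo only
sentences = [
    "The BBC reported record rainfall across Wales.",
    "Forecasters expect conditions to ease by Friday.",
    "Local councils opened emergency shelters overnight.",
]
with torch.no_grad():
    vecs = torch.cat([model(toy_tokenize(s, vocab)) for s in sentences])
doc = vecs.mean(dim=0, keepdim=True)                         # document embedding
scores = F.cosine_similarity(vecs, doc)                      # contextual-preservation score
print(sentences[int(scores.argmax())])                       # highest-scoring sentence
```

Scoring each sentence against the mean document embedding mirrors the unsupervised, cosine-similarity-based evaluation the abstract reports; a trained encoder would replace the randomly initialized one shown here.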

Published

2026-03-09

How to Cite

Rafael Austin, Andhika Dwi Rachmawanto, Michael Jeconiah Yonathan, M Naufal Arriz, & Vitri Tundjungsari. (2026). Optimalisasi Summarization Berita BBC dengan Metode BiLSTM-Transformer. Jurnal Informatika Dan Teknologi Komputer (JITEK), 6(1), 30–41. https://doi.org/10.55606/jitek.v6i1.10669