Published on: May 2026
A HYBRID TRANSFORMER ARCHITECTURE FOR ACADEMIC SUMMARIZATION WITH TABULAR AND NARRATIVE OUTPUTS
Harshad Pawar, Sarvesh Rathod, Piyush Bothe, Sarthak Janrao
How to Cite this Paper
Pawar, H., Rathod, S., Bothe, P., & Janrao, S. (2026). A Hybrid Transformer Architecture for Academic Summarization with Tabular and Narrative Outputs. International Journal of Creative and Open Research in Engineering and Management, 02(05). https://doi.org/10.55041/ijcope.v2i4.1037
Pawar, Harshad, et al. "A Hybrid Transformer Architecture for Academic Summarization with Tabular and Narrative Outputs." International Journal of Creative and Open Research in Engineering and Management, vol. 02, no. 05, 2026. https://doi.org/10.55041/ijcope.v2i4.1037
Pawar, Harshad, Sarvesh Rathod, Piyush Bothe, and Sarthak Janrao. "A Hybrid Transformer Architecture for Academic Summarization with Tabular and Narrative Outputs." International Journal of Creative and Open Research in Engineering and Management 02, no. 05 (2026). https://doi.org/10.55041/ijcope.v2i4.1037
References
- [1] N. Radha, R. Swathika, M. K. B, and K. R. Uthayan, "AI-Driven Summarization of Academic Literature using Transformer Model," in *2024 Second International Conference on Inventive Computing and Informatics (ICICI)*, 2024.
- [2] M. Ulker and A. B. Ozer, "Abstractive Summarization Model for Summarizing Scientific Article," *IEEE Access*, vol. 12, pp. 91252-91262, 2024.
- [3] B. Khan, M. Usman, I. Khan, J. Khan, D. Hussain, and Y. H. Gu, "Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data," *IEEE Access*, vol. 13, pp. 37557-37571, 2025.
- [4] S. Mukhtar, S. Lee, and J. Heo, "A Multidocument Summarization Technique for Informative Bug Summaries," *IEEE Access*, vol. 12, pp. 158908-158926, 2024.
- [5] M. Azam, S. Khalid, S. Almutairi, et al., "Current Trends and Advances in Extractive Text Summarization: A Comprehensive Review," *IEEE Access*, vol. 13, pp. 28150-28166, 2025.
- [6] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," arXiv:1810.04805, 2018.
- [7] M. Lewis, Y. Liu, N. Goyal, M. Ghazvininejad, A. Mohamed, O. Levy, V. Stoyanov, and L. Zettlemoyer, "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension," arXiv:1910.13461, 2019.
- [8] C. Raffel, N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li, and P. J. Liu, "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer," arXiv:1910.10683, 2019.
- [9] C.-Y. Lin, "ROUGE: A Package for Automatic Evaluation of Summaries," in *Proceedings of the ACL Workshop: Text Summarization Branches Out*, 2004, pp. 74–81.
- [10] I. Beltagy, K. Lo, and A. Cohan, "SciBERT: A Pretrained Language Model for Scientific Text," arXiv:1903.10676, 2019.
Ethical Compliance & Review Process
- All submissions are screened for plagiarism.
- Review follows the journal's editorial policy.
- Authors retain copyright.
- Peer Review Type: Double-Blind Peer Review
- Published on: May 03, 2026
This article is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. You are free to share and adapt this work for non-commercial purposes with proper attribution.

