IJCOPE Journal


International Journal of Creative and Open Research in Engineering and Management

A Peer-Reviewed, Open-Access International Journal Supporting Multidisciplinary Research, Digital Publishing Standards, DOI Registration, and Academic Indexing.
Journal Information
ISSN: 3108-1754 (Online)
Crossref DOI: Available
ISO Certification: 9001:2015
Publication Fee: 599/- INR
Compliance: UGC Journal Norms
License: CC BY 4.0
Peer Review: Double Blind
Volume 02, Issue 05

Published on: May 2026

A HYBRID TRANSFORMER ARCHITECTURE FOR ACADEMIC SUMMARIZATION WITH TABULAR AND NARRATIVE OUTPUTS

Harshad Pawar, Sarvesh Rathod, Piyush Bothe, Sarthak Janrao

BE IT Students, International Institute of Information Technology, Pune, India

Article Status

Plagiarism Passed Peer Reviewed Open Access


Abstract

The rapid expansion of academic publications has created an urgent need for tools that can automatically condense and interpret research information. This study introduces a hybrid summarization framework that combines transformer-based models with a vector similarity engine to produce summaries in both descriptive and tabular formats. The proposed system employs BERT to extract semantically rich sentence representations and BART to generate coherent abstractive summaries, while FAISS is used to cluster and retrieve the most informative content efficiently. Unlike conventional summarizers that generate only one form of output, our approach produces both a fluent narrative summary and a structured table containing essential experimental details such as datasets, evaluation metrics, and key findings. Experiments conducted on a curated collection of academic papers demonstrate that this dual-format model achieves higher ROUGE and F1 scores than individual transformer baselines. The results suggest that integrating extractive and generative stages within a single pipeline can improve both factual reliability and readability, offering a scalable solution for academic knowledge summarization.
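The extractive stage described in the abstract (semantically rich sentence embeddings ranked and retrieved for the generative model) can be sketched in miniature. The snippet below is an illustrative stand-in, not the authors' implementation: it assumes sentence embeddings have already been produced (e.g., by BERT), and it replaces FAISS-based retrieval with a plain NumPy centroid-similarity ranking; the helper name `select_informative` is hypothetical.

```python
import numpy as np

def select_informative(embeddings, k):
    """Rank sentences by cosine similarity to the document centroid and
    return the indices of the top-k most central sentences, preserving
    their original order. This is a simplified stand-in for retrieving
    the most informative content from a vector index."""
    X = np.asarray(embeddings, dtype=float)
    # Normalize rows so dot products become cosine similarities.
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    centroid = X.mean(axis=0)
    centroid = centroid / np.linalg.norm(centroid)
    scores = X @ centroid
    top = np.argsort(-scores)[:k]
    return sorted(top.tolist())  # keep document order for readability
```

In the full pipeline, the selected sentences would then be concatenated and passed to an abstractive model such as BART for the narrative summary, while field extraction over the same sentences would populate the tabular output.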

 

How to Cite this Paper

Pawar, H., Rathod, S., Bothe, P., & Janrao, S. (2026). A Hybrid Transformer Architecture for Academic Summarization with Tabular and Narrative Outputs. International Journal of Creative and Open Research in Engineering and Management, 02(05). https://doi.org/10.55041/ijcope.v2i4.1037

Pawar, Harshad, et al. "A Hybrid Transformer Architecture for Academic Summarization with Tabular and Narrative Outputs." International Journal of Creative and Open Research in Engineering and Management, vol. 02, no. 05, 2026, doi:10.55041/ijcope.v2i4.1037.

Pawar, Harshad, Sarvesh Rathod, Piyush Bothe, and Sarthak Janrao. "A Hybrid Transformer Architecture for Academic Summarization with Tabular and Narrative Outputs." International Journal of Creative and Open Research in Engineering and Management 02, no. 05 (2026). https://doi.org/10.55041/ijcope.v2i4.1037.


References


  1. [1] N. Radha, R. Swathika, M. K. B, and K. R. Uthayan, "AI-Driven Summarization of Academic Literature using Transformer Model," in *2024 Second International Conference on Inventive Computing and Informatics (ICICI)*, 2024.

  2. [2] M. Ulker and A. B. Ozer, "Abstractive Summarization Model for Summarizing Scientific Article," *IEEE Access*, vol. 12, pp. 91252-91262, 2024.

  3. [3] B. Khan, M. Usman, I. Khan, J. Khan, D. Hussain, and Y. H. Gu, "Next-Generation Text Summarization: A T5-LSTM FusionNet Hybrid Approach for Psychological Data," *IEEE Access*, vol. 13, pp. 37557-37571, 2025.

  4. [4] S. Mukhtar, S. Lee, and J. Heo, "A Multidocument Summarization Technique for Informative Bug Summaries," *IEEE Access*, vol. 12, pp. 158908-158926, 2024.

  5. [5] M. Azam, S. Khalid, S. Almutairi, et al., "Current Trends and Advances in Extractive Text Summarization: A Comprehensive Review," *IEEE Access*, vol. 13, pp. 28150-28166, 2025.

  6. [6] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," arXiv:1810.04805, 2018.

  7. [7] M. Lewis, Y. Liu, N. Goyal, M. Ghazvininejad, A. Mohamed, O. Levy, V. Stoyanov, and L. Zettlemoyer, "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension," arXiv:1910.13461, 2019.

  8. [8] C. Raffel, N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li, and P. J. Liu, "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer," arXiv:1910.10683, 2019.

  9. [9] C.-Y. Lin, "ROUGE: A Package for Automatic Evaluation of Summaries," in *Proceedings of the ACL Workshop: Text Summarization Branches Out*, 2004, pp. 74-81.

  10. [10] I. Beltagy, K. Lo, and A. Cohan, "SciBERT: A Pretrained Language Model for Scientific Text," arXiv:1903.10676, 2019.

Ethical Compliance & Review Process

  • All submissions are screened using plagiarism detection software.
  • Review follows editorial policy.
  • Authors retain copyright.
  • Peer Review Type: Double-Blind Peer Review
  • Published on: May 03, 2026

This article is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. You are free to share and adapt this work for non-commercial purposes with proper attribution.
