Hybrid Transformer-Based Resume Parsing and Job Matching Using TextRank, SBERT, and DeBERTa

International Journal of Electronics and Communication Engineering
© 2025 by SSRG - IJECE Journal
Volume 12, Issue 9
Year of Publication: 2025
Authors: Madhu Bala Myneni, Fathima Sha Quadhari
How to Cite?
Madhu Bala Myneni, Fathima Sha Quadhari, "Hybrid Transformer-Based Resume Parsing and Job Matching Using TextRank, SBERT, and DeBERTa," SSRG International Journal of Electronics and Communication Engineering, vol. 12, no. 9, pp. 63-71, 2025. Crossref, https://doi.org/10.14445/23488549/IJECE-V12I9P105
Abstract:
As job markets have become highly competitive, distinguishing oneself has become increasingly important, and resume summarization and ranking have emerged as crucial tasks for processing large volumes of resumes. With the advent of Natural Language Processing (NLP) techniques, automated resume summarization has gained significant attention for its potential to expedite the hiring process while ensuring fairness and objectivity. This paper presents a hybrid transformer-based approach to resume parsing and job matching that combines TextRank, SBERT, and DeBERTa. DeBERTa, an advanced transformer, uses a disentangled attention mechanism for contextual understanding of words, while the TextRank, SBERT, and PageRank algorithms are applied for extractive summarization. Candidates are ranked by a composite score that combines cosine similarity for context-aware matching against the job description, a difflib sequence matcher for experience fit, and Jaccard similarity for skills match. These scores are weighted according to their importance to the job, producing a balanced and tailored ranking. This approach saves time, reduces labour costs, and makes recruitment more efficient by identifying the best matches for each position.
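The composite ranking described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and the example weights (0.5 / 0.2 / 0.3) are assumptions, and in the paper the description-match term would come from cosine similarity over SBERT embeddings rather than the plain placeholder value used here. The experience-fit and skills-match terms use Python's standard-library `difflib.SequenceMatcher` and a set-based Jaccard coefficient, as named in the abstract.

```python
from difflib import SequenceMatcher

def jaccard_similarity(skills_a, skills_b):
    """Jaccard coefficient between two skill sets: |A ∩ B| / |A ∪ B|."""
    sa, sb = set(skills_a), set(skills_b)
    union = sa | sb
    return len(sa & sb) / len(union) if union else 0.0

def experience_fit(resume_exp, jd_exp):
    """difflib sequence-matcher ratio between experience descriptions (0..1)."""
    return SequenceMatcher(None, resume_exp.lower(), jd_exp.lower()).ratio()

def composite_score(desc_sim, exp_fit, skill_sim, weights=(0.5, 0.2, 0.3)):
    """Weighted sum of the three match scores; weights are illustrative
    and would be tuned per job posting."""
    w_desc, w_exp, w_skill = weights
    return w_desc * desc_sim + w_exp * exp_fit + w_skill * skill_sim

# Hypothetical candidate vs. job description:
skills_resume = {"python", "nlp", "sql"}
skills_jd = {"python", "nlp", "docker"}
score = composite_score(
    desc_sim=0.82,  # placeholder for SBERT cosine similarity
    exp_fit=experience_fit("3 years NLP engineer", "NLP engineer role"),
    skill_sim=jaccard_similarity(skills_resume, skills_jd),
)
```

Because each component is normalised to [0, 1] and the weights sum to 1, the composite score is itself bounded in [0, 1], which makes candidate rankings directly comparable across postings.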
Keywords:
Natural Language Processing, Recruitment automation, Resume, Sentence-BERT, TextRank.