International Journal For Multidisciplinary Research (IJFMR)
E-ISSN: 2582-2160 • Impact Factor: 9.24

Fine-Tuning Pre-Trained Language Models for Improved Retrieval in RAG Systems for Domain-Specific Use

| Author(s) | Syed Arham Akheel |
| --- | --- |
| Country | USA |
| Abstract | Large Language Models (LLMs) have significantly advanced natural language understanding and generation, but domain-specific applications often require supplementation with current, external information to fill knowledge gaps and reduce hallucinations. Retrieval-Augmented Generation (RAG) has emerged as an effective solution, dynamically integrating up-to-date information through retrieval mechanisms. Fine-tuning pre-trained LLMs on domain-specific data to optimize retrieval queries has become an essential strategy for enhancing RAG systems, especially for ensuring that highly relevant information is retrieved from vector databases for response generation. This paper provides a comprehensive review of the literature on fine-tuning LLMs to optimize retrieval in RAG systems. We discuss advancements such as query optimization, Retrieval-Augmented Fine-Tuning (RAFT), and Retrieval-Augmented Dual Instruction Tuning (RA-DIT), as well as frameworks such as RaLLe and DPR and ensembles of retrieval-based and generation-based systems, all of which strengthen the synergy between retrievers and LLMs. |
| Keywords | Retrieval-Augmented Generation, Large Language Models, Domain-Specific Fine-Tuning, Information Retrieval, RAFT, RA-DIT |
| Field | Computer Applications |
| Published In | Volume 6, Issue 5, September-October 2024 |
| Published On | 2024-10-22 |
| Cite This | Fine-Tuning Pre-Trained Language Models for Improved Retrieval in RAG Systems for Domain-Specific Use - Syed Arham Akheel - IJFMR Volume 6, Issue 5, September-October 2024. DOI: 10.36948/ijfmr.2024.v06i05.22581 |
| DOI | https://doi.org/10.36948/ijfmr.2024.v06i05.22581 |
| Short DOI | https://doi.org/g82hrk |
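
The abstract above centers on a retrieve-then-generate pipeline: an encoder (possibly fine-tuned) embeds the user query, the closest passages are pulled from a vector store, and the LLM answers conditioned on those passages. The snippet below is a minimal, self-contained sketch of that pattern, not code from the paper: `embed()` is a hypothetical stand-in for a fine-tuned bi-encoder (here a seeded random projection so the example runs without external models), and `generate()` is a placeholder for the LLM call.

```python
# Minimal retrieve-then-generate sketch. embed() and generate() are hypothetical
# placeholders, not the paper's method or any specific library's API.
import numpy as np


def embed(text: str) -> np.ndarray:
    """Stand-in for a (fine-tuned) query/document encoder.

    A real system would call a bi-encoder, e.g. one trained DPR-style;
    a seeded random vector keeps this sketch runnable on its own.
    """
    rng = np.random.default_rng(abs(hash(text)) % 2**32)
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)


def retrieve(query: str, corpus: list[str], k: int = 3) -> list[str]:
    """Return the k passages whose embeddings are most similar to the query."""
    q = embed(query)
    doc_vecs = np.stack([embed(d) for d in corpus])  # in practice: a vector DB index
    scores = doc_vecs @ q                            # unit vectors -> cosine similarity
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]


def generate(prompt: str) -> str:
    """Placeholder for the LLM call that produces the grounded answer."""
    return f"[LLM answer conditioned on]\n{prompt}"


corpus = [
    "Policy: reimbursement claims must be filed within 30 days.",
    "Handbook: remote work requires written manager approval.",
    "FAQ: the VPN client is mandatory on public networks.",
]
query = "How long do employees have to file reimbursement claims?"
context = "\n".join(retrieve(query, corpus, k=2))
print(generate(f"Context:\n{context}\n\nQuestion: {query}"))
```

In a deployed RAG system the brute-force similarity scan would be replaced by an approximate nearest-neighbour index in the vector database, and the approaches surveyed in the paper (e.g. RAFT or RA-DIT) would fine-tune the encoder and/or the generator so that the retrieved passages better match the target domain.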
A CrossRef DOI is assigned to each research paper published in our journal; the IJFMR DOI prefix is 10.36948/ijfmr.
All research papers published on this website are licensed under the Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.