International Journal For Multidisciplinary Research
E-ISSN: 2582-2160
Machine Translation with Neural Networks Based on a Transformer
| Author(s) | Chaali Kaoutar, Lachhab Yasser, Sefiani Maha |
| --- | --- |
| Country | Morocco |
| Abstract | In recent years, deep neural networks have driven major advances in natural language processing (NLP), particularly in machine translation. Traditional systems based on statistical models or translation memories handle context poorly and struggle to produce fluent translations. The Transformer model, introduced by Vaswani et al. in 2017, brought large gains in both accuracy and performance [28]. In this paper, we explore how to adapt pre-trained Transformer models to low-resource languages. We introduce hybrid machine translation methods for these languages that combine multi-task learning and transfer learning, evaluated with several metrics. To this end, we train on high-resource languages and apply vocabulary reduction techniques to better handle low-resource cases. Our results show that multi-task training improves BLEU scores by a large margin, especially for scarce-data languages such as Swahili and Amharic, and demonstrate the power of combining effective multi-task training with transfer learning for low-resource language translation. |
| Keywords | Natural Language Processing (NLP), Machine Translation, Transformer Models, Low-Resource Languages, Multi-Task Learning, Transfer Learning, Vocabulary Reduction, Neural Networks, BLEU Score, Language Models |
| Field | Computer > Artificial Intelligence / Simulation / Virtual Reality |
| Published In | Volume 6, Issue 5, September-October 2024 |
| Published On | 2024-09-04 |
| Cite This | Machine Translation with Neural Networks Based on a Transformer - Chaali Kaoutar, Lachhab Yasser, Sefiani Maha - IJFMR Volume 6, Issue 5, September-October 2024. DOI 10.36948/ijfmr.2024.v06i05.26674 |
| DOI | https://doi.org/10.36948/ijfmr.2024.v06i05.26674 |
| Short DOI | https://doi.org/gt9kvb |
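As a rough illustration of the transfer-learning setup the abstract describes — reusing a pre-trained Transformer for translation and scoring its output with BLEU — here is a minimal Python sketch. It is not taken from the paper; it assumes the Hugging Face `transformers` and `sacrebleu` packages and the public `Helsinki-NLP/opus-mt-en-sw` (English→Swahili) checkpoint, and the example sentences are hypothetical.

```python
# Minimal sketch (not the authors' code): translate with a pre-trained
# Transformer and score the output with corpus-level BLEU.
# Assumes: pip install transformers sacrebleu sentencepiece torch
from transformers import MarianMTModel, MarianTokenizer
import sacrebleu

# Hypothetical choice of checkpoint: an English->Swahili Transformer
# pre-trained on OPUS parallel data.
model_name = "Helsinki-NLP/opus-mt-en-sw"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

sources = ["The weather is nice today."]
references = [["Hali ya hewa ni nzuri leo."]]  # one reference set, aligned with sources

# Translate: tokenize, generate, and decode back to text.
batch = tokenizer(sources, return_tensors="pt", padding=True)
outputs = model.generate(**batch)
hypotheses = tokenizer.batch_decode(outputs, skip_special_tokens=True)

# BLEU over the (tiny) test set, as used to compare MT systems.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```

In the setting the abstract describes, the same pre-trained weights would be further fine-tuned on the low-resource pair (transfer learning), possibly alongside auxiliary tasks (multi-task learning), before being evaluated this way.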
All research papers published on this website are licensed under the Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.