International Journal For Multidisciplinary Research


Machine Translation for Low Resource Language using NLP Attention Mechanism

Author(s): Vindya B J, Harish T A
Country: India
Abstract: Machine translation (MT) systems play a major role in overcoming language barriers, particularly for low-resource language pairs such as Bangla-English. This research compares the effectiveness of Neural Machine Translation (NMT) with Statistical Machine Translation (SMT) for translating between Bangla and English, using a methodology that covers model training, validation, and inference on publicly available corpora. We developed and trained an NMT system using an encoder-decoder architecture with an attention mechanism. The system was trained on a Tatoeba Project dataset split into training and validation sets. The models were evaluated and tuned using standard metrics, and the trained model checkpoints were saved and restored for inference. Our experiments demonstrate that, when translating between Bangla and English, NMT outperforms SMT by a wide margin in terms of BLEU score. (An illustrative sketch of such an attention-based encoder-decoder follows the article details below.)
Keywords: Machine Translation, Bangla-to-English, English-to-Bangla, Statistical Machine Translation, Neural Machine Translation, Subword Segmentation, Byte Pair Encoding, Tatoeba Dataset
Field: Computer > Artificial Intelligence / Simulation / Virtual Reality
Published In: Volume 6, Issue 4, July-August 2024
Published On: 2024-07-25
Cite This: Machine Translation for Low Resource Language using NLP Attention Mechanism - Vindya B J, Harish T A - IJFMR Volume 6, Issue 4, July-August 2024.
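
The abstract describes an encoder-decoder NMT system with an attention mechanism but does not specify the exact architecture or framework. The sketch below shows a minimal Bahdanau-style additive-attention encoder-decoder in PyTorch; the GRU layers, hidden sizes, and vocabulary sizes are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of an attention-based encoder-decoder for NMT.
# All hyperparameters and the GRU/additive-attention choices are assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) -> outputs: (batch, src_len, hid_dim)
        embedded = self.embedding(src)
        outputs, hidden = self.rnn(embedded)
        return outputs, hidden

class AdditiveAttention(nn.Module):
    def __init__(self, hid_dim=512):
        super().__init__()
        self.W = nn.Linear(hid_dim * 2, hid_dim)
        self.v = nn.Linear(hid_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hid_dim); enc_outputs: (batch, src_len, hid_dim)
        src_len = enc_outputs.size(1)
        dec = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        energy = torch.tanh(self.W(torch.cat((dec, enc_outputs), dim=2)))
        scores = self.v(energy).squeeze(2)        # (batch, src_len)
        weights = torch.softmax(scores, dim=1)    # attention distribution over source
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.attention = AdditiveAttention(hid_dim)
        self.rnn = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token, hidden, enc_outputs):
        # token: (batch, 1) previous target token; hidden: (1, batch, hid_dim)
        embedded = self.embedding(token)
        context, weights = self.attention(hidden[-1], enc_outputs)
        rnn_input = torch.cat((embedded, context.unsqueeze(1)), dim=2)
        output, hidden = self.rnn(rnn_input, hidden)
        logits = self.out(output.squeeze(1))      # scores for the next target token
        return logits, hidden, weights

# Toy forward pass: encode a source batch, then take one decoding step.
enc, dec = Encoder(8000), Decoder(8000)
src = torch.randint(0, 8000, (2, 7))
enc_out, hidden = enc(src)
logits, hidden, attn = dec(torch.zeros(2, 1, dtype=torch.long), hidden, enc_out)
```

In a full training loop, the decoder would be unrolled over the target sequence with teacher forcing and trained with cross-entropy loss; subword segmentation via Byte Pair Encoding, as listed in the keywords, would be applied to both languages before building the vocabularies.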
