International Journal For Multidisciplinary Research
E-ISSN: 2582-2160
•
Impact Factor: 9.24
A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal
# Image Caption Generator by using CNN and LSTM
| Author(s) | S. Pasupathy |
|---|---|
| Country | India |
| Abstract | In this article, we systematically analyze an image caption generation method based on deep neural networks. Image captioning aims to automatically generate a sentence describing an image. Our model takes an image as input and generates an English sentence as output, describing the contents of the image. The task has attracted much research attention in cognitive computing in recent years and is rather complex, as it combines concepts from both the computer vision and natural language processing domains. We developed a model using a Convolutional Neural Network (CNN) and a Long Short-Term Memory (LSTM) network, and built a working image caption generator by implementing the CNN and LSTM. After the caption generation phase, we use BLEU scores to evaluate the efficiency of our model. Thus, our system helps the user obtain a descriptive caption for a given input image. |
| Keywords | Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), Bilingual Evaluation Understudy (BLEU) |
| Field | Computer > Artificial Intelligence / Simulation / Virtual Reality |
| Published In | Volume 5, Issue 2, March-April 2023 |
| Published On | 2023-04-23 |
| Cite This | Image Caption Generator by using CNN and LSTM - S. Pasupathy - IJFMR Volume 5, Issue 2, March-April 2023. DOI 10.36948/ijfmr.2023.v05i02.2501 |
| DOI | https://doi.org/10.36948/ijfmr.2023.v05i02.2501 |
| Short DOI | https://doi.org/gr6h89 |
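The abstract evaluates generated captions with BLEU scores. As an illustration only (this is not the authors' implementation), the following is a minimal sentence-level BLEU sketch in plain Python, assuming a single reference caption, uniform n-gram weights, and no smoothing:

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=2):
    """Sentence-level BLEU for a single reference: geometric mean of
    modified n-gram precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # unsmoothed BLEU collapses if any precision is zero
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(candidate) > len(reference) else \
        math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A perfect match scores 1.0 (e.g. `bleu(["a", "dog", "runs"], ["a", "dog", "runs"])`); in practice, library implementations such as NLTK's `sentence_bleu` add smoothing and multi-reference support.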
All research papers published on this website are licensed under Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.