
International Journal For Multidisciplinary Research
E-ISSN: 2582-2160
Investigating the Optimal Cloud Computing Infrastructure for Training Large-Scale Generative Models
| Author(s) | Abdul Sajid Mohammed, Shalmali Patil |
|---|---|
| Country | USA |
| Abstract | The training of large-scale generative models, such as Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs), presents unique challenges due to their computational intensity and memory requirements. These models often require significant hardware resources, distributed frameworks, and scalable environments to manage vast datasets and extensive neural architectures. Cloud computing has emerged as a vital infrastructure for addressing these demands, offering scalable and flexible platforms that support high-performance computing, on-demand resource allocation, and specialized services. This survey explores the interplay between cloud computing and generative model training, highlighting key requirements, state-of-the-art solutions, optimization strategies, and cost and energy efficiency considerations. Furthermore, it identifies the prevailing challenges in cloud-based training environments and outlines potential future directions. The findings provide a comprehensive foundation for researchers and practitioners aiming to enhance the efficiency and scalability of generative model training through optimal cloud infrastructure. |
| Keywords | Generative AI, Cloud Computing, Scalability, Distributed Computing, GPU Acceleration, TPU Pods, Federated Learning, Energy-Efficient Computing, Cost Optimization, AI Infrastructure, Data Security, Sustainable AI, Machine Learning |
| Field | Computer > Data / Information |
| Published In | Volume 4, Issue 6, November-December 2022 |
| Published On | 2022-11-29 |
| DOI | https://doi.org/10.36948/ijfmr.2022.v04i06.30908 |
| Short DOI | https://doi.org/g8tzzg |
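
As a concrete illustration of the distributed, cloud-based training workflow the abstract surveys, the sketch below shows data-parallel GAN training with PyTorch's DistributedDataParallel, as it might be launched across cloud GPU nodes with torchrun. This is not code from the paper: the toy generator and discriminator, the random stand-in data, and the launch topology are all hypothetical, chosen only to make the pattern minimal and runnable.

```python
# Minimal sketch of data-parallel GAN training on cloud GPUs.
# Hypothetical launch on a 2-node, 8-GPU-per-node cluster:
#   torchrun --nnodes=2 --nproc_per_node=8 train.py
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun starts one process per GPU and sets LOCAL_RANK for each.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy generator/discriminator; a real setup would use deep conv nets.
    G = DDP(nn.Sequential(nn.Linear(128, 784), nn.Tanh()).cuda(),
            device_ids=[local_rank])
    D = DDP(nn.Sequential(nn.Linear(784, 1)).cuda(),
            device_ids=[local_rank])

    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    for step in range(1000):
        real = torch.rand(64, 784, device="cuda")   # stand-in for a sharded dataset
        noise = torch.randn(64, 128, device="cuda")
        fake = G(noise)

        # Discriminator step: real -> 1, fake -> 0. DDP all-reduces the
        # gradients across every GPU during backward().
        opt_d.zero_grad()
        loss_d = (bce(D(real), torch.ones(64, 1, device="cuda")) +
                  bce(D(fake.detach()), torch.zeros(64, 1, device="cuda")))
        loss_d.backward()
        opt_d.step()

        # Generator step: try to fool the discriminator.
        opt_g.zero_grad()
        loss_g = bce(D(fake), torch.ones(64, 1, device="cuda"))
        loss_g.backward()
        opt_g.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Under this data-parallel pattern, each process holds a full model replica and gradients are averaged across GPUs via NCCL all-reduce, which maps naturally onto the on-demand, horizontally scalable resource allocation the abstract attributes to cloud platforms. For models too large for a single accelerator, the model-parallel and TPU-pod approaches the paper's keywords mention would be used instead.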
All research papers published on this website are licensed under the Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.
