International Journal For Multidisciplinary Research (IJFMR)
E-ISSN: 2582-2160 • Impact Factor: 9.24
A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal
Minimizing Communication Overhead in Decentralized Deep Neural Networks
| Author(s) | Sirajddola Nadaf, Shivaji Lamani |
|---|---|
| Country | India |
| Abstract | Minimizing communication overhead in decentralized deep neural network (DNN) training has become a critical challenge, particularly with the increasing adoption of distributed systems for large-scale machine learning tasks. This paper introduces advanced techniques that leverage gradient compression, adaptive sparsification, and hybrid aggregation to optimize communication efficiency while maintaining model accuracy and convergence rates. Experimental results on benchmark datasets such as CIFAR-10 and ImageNet show that the proposed methods reduce communication costs by up to 70% compared to standard approaches while achieving comparable or superior model accuracy. Additionally, scalability tests on diverse neural network architectures highlight the robustness of the approach, demonstrating efficient performance across varying network sizes and computational setups. These findings underscore the potential of the proposed strategies to enable faster, cost-effective, and sustainable decentralized deep learning systems. |
| Keywords | Decentralized Deep Neural Networks (DNNs), Distributed Machine Learning, Communication Overhead, Gradient Compression, Internet of Things (IoT) |
| Field | Engineering |
| Published In | Volume 6, Issue 6, November-December 2024 |
| Published On | 2024-12-05 |
| Cite This | Minimizing Communication Overhead in Decentralized Deep Neural Networks - Sirajddola Nadaf, Shivaji Lamani - IJFMR Volume 6, Issue 6, November-December 2024. DOI 10.36948/ijfmr.2024.v06i06.32530 |
| DOI | https://doi.org/10.36948/ijfmr.2024.v06i06.32530 |
| Short DOI | https://doi.org/g8t3jm |
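The abstract refers to gradient compression and adaptive sparsification, but this record carries no implementation details. As a rough, hypothetical illustration of the general idea only (top-k gradient sparsification with error feedback, a standard technique in this literature, and not necessarily the authors' method), here is a minimal NumPy sketch:

```python
import numpy as np

def topk_sparsify(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns (indices, values): the compressed representation a worker
    would transmit instead of the dense gradient.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of top-k entries
    return idx, flat[idx]

def desparsify(idx, vals, shape):
    """Rebuild a dense gradient from the transmitted (indices, values)."""
    dense = np.zeros(int(np.prod(shape)))
    dense[idx] = vals
    return dense.reshape(shape)

# Error feedback: accumulate dropped entries so they are sent eventually.
rng = np.random.default_rng(0)
grad = rng.normal(size=(256, 128))       # stand-in for one layer's gradient
residual = np.zeros_like(grad)

for step in range(3):
    corrected = grad + residual          # add back previously dropped mass
    idx, vals = topk_sparsify(corrected, ratio=0.01)
    sent = desparsify(idx, vals, grad.shape)
    residual = corrected - sent          # remember what was not transmitted
    saved = 1 - (2 * idx.size) / grad.size  # indices + values vs. dense
    print(f"step {step}: transmitted {idx.size} of {grad.size} entries "
          f"(~{saved:.0%} smaller)")
```

At a 1% keep ratio, each worker transmits roughly 2% of the dense gradient's volume (indices plus values), which is the kind of reduction behind communication-cost figures like the up-to-70% savings reported in the abstract; the actual techniques and numbers are the paper's own.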
A CrossRef DOI is assigned to each research paper published in our journal; the IJFMR DOI prefix is 10.36948/ijfmr.
All research papers published on this website are licensed under the Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.