International Journal For Multidisciplinary Research

E-ISSN: 2582-2160     Impact Factor: 9.24

A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal


Minimizing Communication Overhead in Decentralized Deep Neural Networks

Author(s) Sirajddola Nadaf, Shivaji Lamani
Country India
Abstract Minimizing communication overhead in decentralized deep neural network (DNN) training has become a critical challenge, particularly with the increasing adoption of distributed systems for large-scale machine learning tasks. This paper introduces advanced techniques that leverage gradient compression, adaptive sparsification, and hybrid aggregation to optimize communication efficiency while maintaining model accuracy and convergence rates. Experimental results on benchmark datasets such as CIFAR-10 and ImageNet show that the proposed methods reduce communication costs by up to 70% compared to standard approaches while achieving comparable or superior model accuracy. Additionally, scalability tests on diverse neural network architectures highlight the robustness of the approach, demonstrating efficient performance across varying network sizes and computational setups. These findings underscore the potential of the proposed strategies to enable faster, cost-effective, and sustainable decentralized deep learning systems.
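The abstract's core idea — sending only a compressed, sparsified gradient to cut communication cost — can be illustrated with a minimal top-k sparsification sketch. This is not the paper's implementation; the function name `topk_sparsify`, the 1% keep ratio, and the error-feedback residual are assumptions chosen to show the general technique.

```python
# Illustrative sketch of top-k gradient sparsification with error feedback
# (a common communication-reduction technique; not the authors' exact method).
import numpy as np

def topk_sparsify(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns the indices and values a worker would transmit, plus the
    residual it keeps locally and adds back into the next step's gradient.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k entries with the largest absolute values.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    values = flat[idx]
    # Entries that are not sent are retained as a residual (error feedback),
    # which preserves convergence despite the aggressive compression.
    residual = flat.copy()
    residual[idx] = 0.0
    return idx, values, residual.reshape(grad.shape)

# Example: a worker transmits ~1% of a 1,000,000-entry gradient each round.
rng = np.random.default_rng(0)
g = rng.standard_normal((1000, 1000))
idx, vals, res = topk_sparsify(g, ratio=0.01)
print(len(vals))  # 10000 entries sent instead of 1,000,000
```

With a 1% keep ratio, each worker transmits two orders of magnitude less data per round; the residual carried between rounds is what allows such sparsified schemes to match dense-gradient accuracy in practice.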
Keywords Decentralized Deep Neural Networks (DNNs), Distributed Machine Learning, Communication Overhead, Gradient Compression, Internet of Things (IoT)
Field Engineering
Published In Volume 6, Issue 6, November-December 2024
Published On 2024-12-05
Cite This Minimizing Communication Overhead in Decentralized Deep Neural Networks - Sirajddola Nadaf, Shivaji Lamani - IJFMR Volume 6, Issue 6, November-December 2024. DOI 10.36948/ijfmr.2024.v06i06.32530
DOI https://doi.org/10.36948/ijfmr.2024.v06i06.32530
Short DOI https://doi.org/g8t3jm
