DynaComm: Accelerating Distributed CNN Training between Edges and Clouds through Dynamic Communication Scheduling
2022 (English) In: IEEE Journal on Selected Areas in Communications, ISSN 0733-8716, E-ISSN 1558-0008, Vol. 40, no 2, p. 611-625. Article in journal (Refereed). Published
Abstract [en]
To reduce the bandwidth needed for uploading data and to address privacy concerns, deep learning at the network edge has emerged as an important topic. Typically, edge devices collaboratively train a shared model on data generated in real time under the Parameter Server framework. Although the edge devices share the computing workload, distributed training over edge networks remains time-consuming because of the parameter and gradient transmission between parameter servers and edge devices. Focusing on accelerating distributed Convolutional Neural Network (CNN) training at the network edge, we present DynaComm, a novel scheduler that dynamically decomposes each transmission procedure into several segments to achieve optimal layer-wise overlapping of communication and computation at run-time. Through experiments, we verify that DynaComm achieves optimal layer-wise scheduling in all cases compared to competing strategies, while model accuracy remains unaffected.
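As background for the mechanism the abstract refers to, the following is a minimal sketch of layer-wise overlapping of gradient transmission with backward computation in a parameter-server setting. It is not the authors' implementation: the fixed SEGMENTS_PER_LAYER split, the thread-based stand-in for the parameter-server link, and the names sender and backward_and_push are illustrative assumptions; DynaComm's contribution is choosing the segment decomposition dynamically at run-time.

```python
import queue
import threading

import numpy as np

SEGMENTS_PER_LAYER = 4  # fixed, illustrative split; DynaComm decides the decomposition at run-time
send_queue: queue.Queue = queue.Queue()


def sender() -> None:
    """Background thread standing in for the link to the parameter server."""
    while True:
        item = send_queue.get()
        if item is None:  # shutdown sentinel
            break
        layer_id, seg_id, segment = item
        _ = segment.nbytes  # placeholder for pushing the segment over the network
        send_queue.task_done()


def backward_and_push(grads_per_layer):
    """Walk the layers deepest-first (as backpropagation does) and enqueue each
    layer's gradient in segments as soon as it is ready, so transmission of
    layer L overlaps with the gradient computation of layer L-1."""
    for layer_id in reversed(range(len(grads_per_layer))):
        grad = grads_per_layer[layer_id]  # stands in for this layer's backward pass
        for seg_id, segment in enumerate(np.array_split(grad, SEGMENTS_PER_LAYER)):
            send_queue.put((layer_id, seg_id, segment))


threading.Thread(target=sender, daemon=True).start()
dummy_grads = [np.random.randn(1_000) for _ in range(5)]  # five layers of dummy gradients
backward_and_push(dummy_grads)
send_queue.join()     # wait until every segment has been "transmitted"
send_queue.put(None)  # stop the sender thread
```

The part this sketch hard-codes is exactly the design question the paper addresses: how finely each layer's transmission should be segmented so that communication hides behind the remaining computation.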
Place, publisher, year, edition, pages: IEEE, 2022. Vol. 40, no 2, p. 611-625
Keywords [en]
Edge computing, deep learning training, dynamic scheduling, convolutional neural network
National Category
Computer Systems
Research subject: Pervasive Mobile Computing
Identifiers: URN: urn:nbn:se:ltu:diva-87417 DOI: 10.1109/jsac.2021.3118419 ISI: 000742724700013 Scopus ID: 2-s2.0-85119617658 OAI: oai:DiVA.org:ltu-87417 DiVA, id: diva2:1601138
Note: Validated; 2022; Level 2; 2022-03-07 (joosat);
Funder: National Key R&D Program of China (2019YFB2101700, 2018YFB0804402); National Science Foundation of China (U1736115); Key Research and Development Project of Sichuan Province (21SYSX0082)
2021-10-07; 2021-10-07; 2022-07-04. Bibliographically approved