Abstract
Generative adversarial networks (GANs) have shown great success in deep representation learning, data generation, and security enhancement. With the development of the Internet of Things, 5th-generation wireless systems (5G), and related technologies, the large volume of data collected at the network edge offers a new way to improve the capabilities of GANs. Due to privacy, bandwidth, and legal constraints, however, it is often inappropriate to upload all of this data to the cloud or to central servers for processing. This article therefore focuses on deploying and training GANs at the edge rather than aggregating edge data at a central node. To this end, we design DANCE, a novel distributed learning architecture for GANs. DANCE adaptively compresses communication according to the available bandwidth while supporting both data-parallel and model-parallel training of GANs. In addition, inspired by the gossip mechanism and the Stackelberg game, we propose a compatible algorithm, AC-GAN. Our theoretical analysis guarantees model convergence and the existence of an approximate equilibrium in AC-GAN. Both simulation and prototype-system experiments show that AC-GAN achieves better training effectiveness with less communication overhead than state-of-the-art algorithms, i.e., FL-GAN and MD-GAN.
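The "adaptive communication compression" described above can be illustrated, in spirit, by bandwidth-aware top-k gradient sparsification in the style of deep gradient compression and sparsified SGD. The sketch below is a minimal illustration under our own assumptions, not DANCE's actual algorithm; the function names and parameters (`adaptive_compress`, `budget_s`, and so on) are hypothetical.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a gradient; zero the rest."""
    flat = grad.ravel().copy()
    if k < flat.size:
        # Indices of everything except the k largest absolute values.
        idx = np.argpartition(np.abs(flat), -k)[:-k]
        flat[idx] = 0.0
    return flat.reshape(grad.shape)

def adaptive_compress(grad, bandwidth_bps, budget_s, bytes_per_value=4):
    """Choose the sparsity level so the transmitted values fit the
    available bandwidth within a per-round time budget (hypothetical rule)."""
    max_values = max(1, int(bandwidth_bps * budget_s / 8 / bytes_per_value))
    k = min(grad.size, max_values)
    return topk_sparsify(grad, k), k

# Example: at 64 bps and a 1 s budget, only 2 float32 values fit,
# so only the two largest-magnitude gradient entries survive.
g = np.array([0.5, -2.0, 0.1, 3.0, -0.2, 0.05], dtype=np.float32)
compressed, k = adaptive_compress(g, bandwidth_bps=64, budget_s=1.0)
```

In a real system, each edge node would also need to transmit the indices of the surviving entries and typically accumulates the zeroed-out residual locally for later rounds, as in sparsified SGD with memory.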
- [1] 2018. Oboe: Auto-tuning video ABR algorithms to network conditions. In Proceedings of the Conference of the ACM Special Interest Group on Data Communication. 44–58.
- [2] 2017. QSGD: Communication-efficient SGD via gradient quantization and encoding. In Advances in Neural Information Processing Systems, Vol. 30. 1709–1720.
- [3] 2006. Randomized gossip algorithms. IEEE Trans. Inf. Theory 52, 6 (2006), 2508–2530.
- [4] 2020. Edge federation: Towards an integrated service provisioning model. IEEE/ACM Trans. Netw. 28, 3 (2020), 1116–1129.
- [5] 2017. SCAN: Structure correcting adversarial network for chest X-rays organ segmentation. arXiv:1703.08770. Retrieved from https://arxiv.org/abs/1703.08770.
- [6] 2019. Unbiased estimators for random design regression. arXiv:1907.03411. Retrieved from https://arxiv.org/abs/1907.03411.
- [7] 2017. Generative multi-adversarial networks. In Proceedings of the International Conference on Learning Representations.
- [8] 2019. Generative adversarial networks for distributed intrusion detection in the Internet of Things. In Proceedings of the IEEE Global Communications Conference (GLOBECOM'19). IEEE, 1–6.
- [9] 2018. Multi-agent diverse generative adversarial networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 8513–8521.
- [10] 2014. Generative adversarial nets. In Advances in Neural Information Processing Systems, Vol. 27. 2672–2680.
- [11] 2018. Gossiping GANs position paper. In Proceedings of the 2nd Workshop on Distributed Infrastructures for Deep Learning (Middleware). 25–28.
- [12] 2019. MD-GAN: Multi-discriminator generative adversarial networks for distributed datasets. In Proceedings of the IEEE 33rd International Parallel and Distributed Processing Symposium. 866–877.
- [13] 2017. GANs trained by a two time-scale update rule converge to a local Nash equilibrium. In Advances in Neural Information Processing Systems. 6626–6637.
- [14] 2017. Multi-generator generative adversarial nets. arXiv:1708.02556. Retrieved from https://arxiv.org/abs/1708.02556.
- [15] 2018. Progressive growing of GANs for improved quality, stability, and variation. In Proceedings of the International Conference on Learning Representations.
- [16] 2014. Adam: A method for stochastic optimization. arXiv:1412.6980. Retrieved from https://arxiv.org/abs/1412.6980.
- [17] 2019. Decentralized stochastic optimization and gossip algorithms with compressed communication. In Proceedings of the International Conference on Machine Learning. 3478–3487.
- [18] 2017. Federated learning: Strategies for improving communication efficiency. arXiv:1610.05492. Retrieved from https://arxiv.org/abs/1610.05492.
- [19] 2014. The CIFAR-10 dataset. Retrieved from http://www.cs.toronto.edu/kriz/cifar.html.
- [20] 1998. Gradient-based learning applied to document recognition. Proc. IEEE 86, 11 (1998), 2278–2324.
- [21] 2020. Task scheduling with UAV-assisted vehicular cloud for road detection in highway scenario. IEEE Internet of Things Journal 7, 8 (2020), 7702–7713.
- [22] 2014. Scaling distributed machine learning with the parameter server. In Proceedings of the 11th Symposium on Operating Systems Design and Implementation. 583–598.
- [23] 2018. Deep gradient compression: Reducing the communication bandwidth for distributed training. In Proceedings of the International Conference on Learning Representations.
- [24] 2017. Least squares generative adversarial networks. In Proceedings of the IEEE International Conference on Computer Vision. 2794–2802.
- [25] 2018. On the effectiveness of least squares generative adversarial networks. IEEE Trans. Pattern Anal. Mach. Intell. 41, 12 (2018), 2947–2960.
- [26] 2016. Federated learning of deep networks using model averaging. arXiv:1602.05629. Retrieved from https://arxiv.org/abs/1602.05629.
- [27] 2019. Exploiting unintended feature leakage in collaborative learning. In Proceedings of the IEEE Symposium on Security and Privacy (SP'19). IEEE, 691–706.
- [28] 2017. Dual discriminator generative adversarial nets. In Advances in Neural Information Processing Systems. 2670–2680.
- [29] 2017. Conditional image synthesis with auxiliary classifier GANs. In Proceedings of the 34th International Conference on Machine Learning, Vol. 70. 2642–2651.
- [30] 2017. Performability analysis of cloudlet in mobile cloud computing. Inf. Sci. 388 (2017), 99–117.
- [31] 1951. A stochastic approximation method. The Annals of Mathematical Statistics, 400–407.
- [32] 2016. Improved techniques for training GANs. In Advances in Neural Information Processing Systems. 2234–2242.
- [33] 2014. 1-bit stochastic gradient descent and its application to data-parallel distributed training of speech DNNs. In Proceedings of the Annual Conference of the International Speech Communication Association. 1058–1062.
- [34] 2017. SSGAN: Secure steganography based on generative adversarial networks. In Proceedings of the Pacific Rim Conference on Multimedia. Springer, 534–544.
- [35] 2019. Speedtest report of the U.S. Retrieved February 14, 2020 from https://www.speedtest.net/reports/united-states/.
- [36] 2018. Sparsified SGD with memory. In Advances in Neural Information Processing Systems. 4447–4458.
- [37] 2018. Perceptual adversarial networks for image-to-image transformation. IEEE Trans. Image Process. 27, 8 (2018), 4066–4079.
- [38] 2017. Fashion-MNIST: A novel image dataset for benchmarking machine learning algorithms. arXiv:1708.07747. Retrieved from https://arxiv.org/abs/1708.07747.
- [39] 2019. Decentralized learning of generative adversarial networks from multi-client non-iid data. arXiv:1905.09684. Retrieved from https://arxiv.org/abs/1905.09684.
- [40] 2019. Self-attention generative adversarial networks. In Proceedings of the International Conference on Machine Learning. 7354–7363.
- [41] 2018. Stackelberg GAN: Towards provable minimax equilibrium via multi-generator architectures. arXiv:1811.08010. Retrieved from https://arxiv.org/abs/1811.08010.
DANCE: Distributed Generative Adversarial Networks with Communication Compression