Transfer Learning for Convolutional Neural Networks in Tiny Deep Learning Environments

ABSTRACT
Tiny Machine Learning (TinyML) and Transfer Learning (TL) are two widespread methods for successfully deploying ML models on resource-constrained devices. TinyML provides compact models that can run in constrained environments, while TL improves model performance by exploiting pre-existing knowledge. In this work we propose a simple yet efficient TL method, applied to three types of Convolutional Neural Networks (CNNs), which retrains more than the last fully connected layer of a CNN on the target device, namely one or more of the last convolutional layers as well. Our results show that the proposed method (FxC1) achieves an increase in accuracy and in convergence speed, while incurring a slightly larger energy consumption overhead, compared to two baseline techniques: one that retrains only the last fully connected layer, and another that retrains the whole network.
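To make the setup concrete, the sketch below shows one way such an FxC1-style configuration could look, assuming PyTorch and torchvision with a MobileNetV2 backbone; the abstract does not name a framework or the three CNN architectures, so the backbone, the learning rate, and the 6-class target task are illustrative assumptions, not the paper's exact method.

# A minimal sketch of an FxC1-style transfer-learning setup: freeze the
# pretrained backbone except its last convolutional block, and retrain that
# block together with a new fully connected head on the target-device data.
# Framework (PyTorch/torchvision) and backbone (MobileNetV2) are assumptions.

import torch
import torch.nn as nn
from torchvision import models

# Load a CNN pretrained on ImageNet (stand-in for the paper's source model).
model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)

# Freeze every parameter first.
for param in model.parameters():
    param.requires_grad = False

# Unfreeze the last convolutional block (features[-1] in MobileNetV2):
# the "C1" part, i.e. one trailing convolutional layer retrained on-device.
for param in model.features[-1].parameters():
    param.requires_grad = True

# Replace the fully connected head for the target task (the "Fx" part).
# The new Linear layer is trainable by default; 6 classes is illustrative.
num_target_classes = 6
model.classifier[1] = nn.Linear(model.last_channel, num_target_classes)

# Hand only the unfrozen parameters to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)

The design point argued for in the abstract is visible in the parameter split: only the last convolutional block and the new head reach the optimizer, so the on-device gradient computation stays small while the features closest to the task can still adapt.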