DOI: 10.1145/3384419.3430782

research-article

ePerceptive: energy reactive embedded intelligence for batteryless sensors

Published: 16 November 2020

ABSTRACT

We have long studied tiny energy harvesters to liberate sensors from batteries. With remarkable progress in embedded deep learning, we are now re-imagining these sensors as intelligent compute nodes. Naturally, we are approaching a crossroads where sensor intelligence meets energy autonomy, enabling maintenance-free swarm intelligence and unleashing a plethora of applications ranging from precision agriculture to ubiquitous asset tracking and infrastructure monitoring. One of the critical challenges, however, is to adapt intelligence fidelity in response to the available energy so as to maximise overall system availability. To this end, we present the design and implementation of ePerceptive: a novel framework for best-effort embedded intelligence, in which inference fidelity varies in proportion to the instantaneous energy supply. ePerceptive operates on two core principles. First, it enables training a single deep neural network (DNN) to operate on multiple input resolutions without compromising accuracy or incurring memory overhead. Second, it modifies a DNN architecture by injecting multiple exits to guarantee valid, albeit lower-fidelity, inferences in the event of an energy interruption. The combination of these techniques offers a smooth trade-off between inference latency and recognition accuracy while matching the computational load to the available power budget. We demonstrate ePerceptive by designing batteryless cameras and microphones built with a TI MSP430 MCU and off-the-shelf RF and solar energy harvesters. Our evaluation of these batteryless sensors on multiple vision and acoustic workloads suggests that the dynamic adaptation of ePerceptive can increase inference throughput by up to 80% compared to a static baseline while keeping the accuracy drop below 6%.
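To make the two core principles concrete, the snippet below is a minimal, illustrative sketch only (not the authors' implementation): a Keras-style Python model in which lightweight early-exit heads are injected into a small convolutional classifier, and global average pooling makes the classifier heads independent of the input resolution, so one set of weights can serve multiple resolutions. All layer sizes, exit placements, names and loss weights are assumptions for illustration.

```python
# Illustrative sketch of an early-exit, multi-resolution CNN (assumed architecture,
# not the ePerceptive model). Fully convolutional layers plus global pooling let the
# same weights accept several input resolutions; each exit yields a valid prediction.
import tensorflow as tf
from tensorflow.keras import layers, Model

def exit_branch(features, num_classes, name):
    """Lightweight classifier head attached to an intermediate feature map."""
    pooled = layers.GlobalAveragePooling2D()(features)   # resolution-independent
    return layers.Dense(num_classes, activation="softmax", name=name)(pooled)

def build_early_exit_cnn(num_classes=10):
    # `None` spatial dimensions allow multiple input resolutions with one model.
    inputs = layers.Input(shape=(None, None, 1))
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D()(x)
    exit1 = exit_branch(x, num_classes, "exit1")          # earliest, cheapest exit

    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    exit2 = exit_branch(x, num_classes, "exit2")          # intermediate exit

    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    final = exit_branch(x, num_classes, "final")          # full-fidelity exit
    return Model(inputs, [exit1, exit2, final])

model = build_early_exit_cnn()
# Joint training: every exit contributes to the loss, so whichever exit is reached
# before the energy buffer is exhausted still produces a usable classification.
model.compile(optimizer="adam",
              loss={"exit1": "sparse_categorical_crossentropy",
                    "exit2": "sparse_categorical_crossentropy",
                    "final": "sparse_categorical_crossentropy"},
              loss_weights={"exit1": 0.3, "exit2": 0.3, "final": 1.0})
```

At run time, a harvesting-aware scheduler would pick the deepest exit that fits the current energy budget; the weighting between exits during training is one of several design choices that trade early-exit accuracy against final-exit accuracy.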


Published in

SenSys '20: Proceedings of the 18th Conference on Embedded Networked Sensor Systems
November 2020, 852 pages
ISBN: 9781450375900
DOI: 10.1145/3384419
Copyright © 2020 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


Qualifiers

research-article

Acceptance Rates

Overall Acceptance Rate: 174 of 867 submissions, 20%
