Abstract
In this paper, we present an adaptive power manager for solar energy harvesting sensor nodes. We use a simplified system model consisting of a solar panel, an ideal battery, and a general sensor node with a variable duty cycle. Our power manager uses Reinforcement Learning (RL), specifically SARSA(λ) learning, to train itself from historical data. Once trained, the power manager adapts to changes in weather, climate, device parameters, and battery degradation while maintaining near-optimal performance without depleting or overcharging its battery. Our approach uses a simple but novel general reward function and leverages weather forecast data to enhance performance. We show that our method achieves near-perfect energy neutral operation (ENO), with less than 6% root mean square deviation from ENO, compared to more than 23% deviation that occurs when using other approaches.
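The core mechanism named in the abstract, SARSA(λ) with eligibility traces driving a duty-cycle choice, can be sketched as follows. This is a minimal illustrative sketch only: the state space (discretized battery level), action space (discrete duty-cycle settings), toy environment, and reward shape (penalizing deviation from a mid-range battery level as a stand-in for energy neutrality) are all assumptions for demonstration, not the paper's actual formulation or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 10      # assumed: discretized battery level
N_ACTIONS = 5      # assumed: discrete duty-cycle settings
ALPHA, GAMMA, LAM, EPS = 0.1, 0.95, 0.9, 0.1

Q = np.zeros((N_STATES, N_ACTIONS))   # action-value table
E = np.zeros_like(Q)                  # eligibility traces

def policy(s):
    """Epsilon-greedy action selection over the Q-table."""
    if rng.random() < EPS:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(Q[s]))

def step(s, a):
    """Toy environment (assumption): battery drifts with the chosen
    duty cycle plus noise; reward penalizes distance from mid-range,
    a crude stand-in for an energy-neutrality reward."""
    s_next = int(np.clip(s + a - 2 + rng.integers(-1, 2), 0, N_STATES - 1))
    reward = -abs(s_next - N_STATES // 2)
    return s_next, reward

s = 5
a = policy(s)
for _ in range(1000):
    s_next, r = step(s, a)
    a_next = policy(s_next)
    # SARSA(lambda): on-policy TD error, accumulating traces
    delta = r + GAMMA * Q[s_next, a_next] - Q[s, a]
    E[s, a] += 1.0
    Q += ALPHA * delta * E            # credit all recently visited pairs
    E *= GAMMA * LAM                  # decay traces
    s, a = s_next, a_next
```

The eligibility trace matrix `E` is what distinguishes SARSA(λ) from plain SARSA: each TD error updates not only the current state-action pair but all recently visited pairs, weighted by how recently they were visited, which speeds credit assignment when rewards (such as a battery reaching a healthy level) arrive several duty-cycle decisions after the actions that caused them.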
Adaptive Power Management in Solar Energy Harvesting Sensor Node Using Reinforcement Learning