Residential Demand Response Using Reinforcement Learning
Friday, September 24, Giedt Hall 1003, 12:00pm-1:00pm
Prof. Dan O'Neill
Consulting Professor, Stanford University
Host: Professor Anna Scaglione
Demand response (DR) systems enable the dynamic adjustment of electrical demand in response to pricing signals. By suitably adjusting energy prices, electrical load can be shifted from periods of high or peak demand to other periods, thereby improving operational efficiency, reducing operating costs, improving capital efficiency, and reducing harmful emissions and risk of outages.
Residential DR faces several challenges. With real-time variable pricing, consumers face an endless sequence of decisions: consume energy at current (known) prices, or defer using the device until later, at possibly unknown prices. Each decision implicitly requires the consumer to estimate future energy prices and weigh this cost differential against the dis-utility of waiting. The decision is further complicated when consumers use devices in a correlated manner. We hypothesize that few consumers will be willing to continuously make a sequence of decisions of this type, especially when many of these decisions have limited short-term financial impact on the consumer. Under this hypothesis, fully automated energy management algorithms are a necessary part of residential DR.
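The defer-or-run decision described above can be sketched as a simple expected-cost comparison. The function and all prices below are hypothetical illustrations, not part of the talk: it defers a device whenever the consumer's estimate of the next-period price, plus the dis-utility of waiting, undercuts the current known price.

```python
# Hypothetical one-step defer-or-run rule. All prices ($/kWh) and the
# wait dis-utility are made-up numbers for illustration only.
def should_defer(price_now, expected_price_next, wait_disutility):
    """Defer the device if waiting looks cheaper than running now."""
    return expected_price_next + wait_disutility < price_now

print(should_defer(0.30, 0.18, 0.05))  # peak price now: deferring looks cheaper
print(should_defer(0.12, 0.18, 0.05))  # off-peak now: run immediately
```

In practice the expected next-period price is itself unknown, which is what makes the full sequential problem hard and motivates the learning approach below.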
The presentation focuses on a novel energy management system for residential demand response. The algorithm, named the Consumer Automated Energy management System (CAES), reduces residential energy costs and smooths energy usage. CAES is an online learning algorithm that implicitly estimates the impact of future energy prices and of consumer decisions on long-term costs, and schedules residential device usage accordingly. CAES models both energy prices and residential device usage as Markov processes, but does not assume knowledge of the structure or transition probabilities of these Markov chains. CAES learns continuously and adapts to individual consumer preferences and pricing modifications over time. In numerical simulations, CAES reduced average end-user financial costs by 16% to 40% relative to a price-unaware energy allocation.
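To make the learning setup concrete, here is a minimal tabular Q-learning sketch of the same idea: scheduling one deferrable device against a two-level Markov price process whose transition probabilities the learner never sees. This is not the CAES algorithm itself (the abstract does not specify it); the price levels, transition probability, wait cost, and horizon are all assumed values for illustration.

```python
import random

random.seed(0)

# Illustrative 2-level Markov price model (hidden from the learner).
# All numbers are hypothetical, not taken from the talk.
PRICES = [1.0, 5.0]   # cost of running the device at the LOW / HIGH level
P_STAY = 0.8          # probability the price level persists to the next step
WAIT_COST = 0.2       # per-step dis-utility of deferring the device
MAX_WAIT = 5          # device must run within this many deferrals

def step_price(level):
    """Sample the next level of the (unknown) price Markov chain."""
    return level if random.random() < P_STAY else 1 - level

# Tabular Q over states (price level, steps waited); actions: 0 = run, 1 = wait.
Q = {(p, w): [0.0, 0.0] for p in (0, 1) for w in range(MAX_WAIT + 1)}
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

def greedy(state):
    """Action with the lowest learned expected cost."""
    return min((0, 1), key=lambda a: Q[state][a])

for _ in range(20000):
    level, waited = random.randint(0, 1), 0
    while True:
        state = (level, waited)
        action = random.randint(0, 1) if random.random() < EPS else greedy(state)
        if action == 0 or waited == MAX_WAIT:   # run now (forced at the horizon)
            cost = PRICES[level]
            Q[state][action] += ALPHA * (cost - Q[state][action])
            break
        level, waited = step_price(level), waited + 1
        target = WAIT_COST + GAMMA * min(Q[(level, waited)])
        Q[state][1] += ALPHA * (target - Q[state][1])
```

After training, `greedy((0, 0))` runs the device immediately at the low price, while `greedy((1, 0))` defers at the high price, even though the learner was never told the transition probabilities; this captures the model-free flavor of the approach described above.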
Dan O'Neill is currently a Consulting Professor at Stanford University. His research focuses on finding, analyzing, and implementing algorithms to manage, optimize, and control complex systems in stochastic environments where the statistics of the underlying randomness are unknown or can change dramatically, with applications in communications networks, smart grid systems, and demand-driven power control. He received best paper awards at ICC 2008 and Globecom 2010. Prior to Stanford, he was CEO of Clearwater Networks, a General Manager at National Semiconductor, and a VP and Partner at Merrill Lynch Venture Capital.