Distributed energy storage benefits both energy consumers and system operators, but it is expensive to install and requires careful management. To improve the viability of demand-side energy storage, this study presents a new method for obtaining optimal control actions for a storage system operating in the presence of distributed generation. The optimal usage policy is determined by framing the selection of charge and discharge actions as a stochastic optimization problem, conveniently represented as a Markov decision process (MDP). The proposed method uses short-term forecasts of loads and local generation to capture the time dependence and nonstationarity of the net load profile. In contrast to previous studies, the forecast information is used to restrict the problem state space, reducing the computational complexity of the policy calculation. The method is tested on historical load and generation data with real time-of-use rate schedules and achieves substantial reductions in energy cost compared with similar existing methods.
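For illustration only, the following Python sketch shows one way a finite-horizon decision problem of the kind described above could be set up: backward induction (dynamic programming) over a discretized battery state of charge, with a short-term net-load forecast and time-of-use prices defining the stage costs. This is a deterministic simplification under assumed parameters (perfect round-trip efficiency, no export credit); the function name, grid sizes, and numerical values are hypothetical and are not taken from the paper.

```python
# Illustrative sketch, not the paper's implementation: backward induction over a
# discretized battery state of charge (SOC) using a forecast net-load profile
# (load minus local generation) and time-of-use prices. All names and values
# below are assumptions chosen for demonstration.
import numpy as np

def plan_storage(net_load_forecast, prices, capacity_kwh=10.0, power_kw=2.0,
                 dt_h=1.0, soc_levels=21):
    """Return the SOC grid and a cost-minimizing action table (policy)."""
    horizon = len(net_load_forecast)
    soc_grid = np.linspace(0.0, capacity_kwh, soc_levels)   # discretized state space
    actions = np.linspace(-power_kw, power_kw, 9)           # kW; + = charge, - = discharge
    cost_to_go = np.zeros(soc_levels)                        # terminal cost assumed zero
    best_action = np.zeros((horizon, soc_levels))

    for t in reversed(range(horizon)):
        new_cost = np.full(soc_levels, np.inf)
        for i, soc in enumerate(soc_grid):
            for a in actions:
                next_soc = soc + a * dt_h
                if not (0.0 <= next_soc <= capacity_kwh):
                    continue                                  # infeasible transition
                # Grid import = forecast net load plus battery charging power;
                # no export credit is assumed, so import is floored at zero.
                grid_kw = max(net_load_forecast[t] + a, 0.0)
                stage_cost = prices[t] * grid_kw * dt_h
                j = int(np.argmin(np.abs(soc_grid - next_soc)))  # nearest-state rounding
                total = stage_cost + cost_to_go[j]
                if total < new_cost[i]:
                    new_cost[i], best_action[t, i] = total, a
        cost_to_go = new_cost
    return soc_grid, best_action

# Example: a hypothetical 6-hour forecast window with an evening on-peak price.
soc_grid, policy = plan_storage(
    net_load_forecast=[1.5, 1.0, 0.2, 2.5, 3.0, 2.0],   # kW, assumed values
    prices=[0.10, 0.10, 0.10, 0.30, 0.30, 0.10])        # $/kWh, assumed TOU rates
print(policy[:, 0])   # recommended actions at each hour when starting fully discharged
```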
- Cost functions,
- Digital storage,
- Distributed power generation,
- Dynamic programming,
- Energy storage,
- Forecasting,
- Markov processes,
- Charge and discharge,
- Distributed energy storage,
- Energy storage systems,
- Forecast information,
- Markov Decision Processes,
- Power system dynamics,
- Stochastic optimization problems,
- Substantial reduction,
- Distributed computer systems
Available at: http://works.bepress.com/jonathan-kimball/97/