Use of deep reinforcement learning to optimally control a battery in a house in Australia

Battery Optimisation

This project looks at the optimisation of domestic batteries paired with domestic solar photovoltaics.

The project's aim is to use reinforcement learning to dispatch a domestic battery optimally.
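As a rough illustration of the dispatch problem (not the project's actual simulator), a simplified battery state-of-charge update, assuming half-hourly steps and a single round-trip efficiency figure, might look like:

```python
def step_battery(soc_kwh, capacity_kwh, power_kw, hours=0.5, efficiency=0.9):
    """Hypothetical state-of-charge update for a domestic battery.

    Positive power_kw charges the battery (losses applied on the way in);
    negative power_kw discharges it (extra energy drawn from the cell).
    The result is clipped to the battery's physical limits.
    """
    if power_kw >= 0:
        delta = power_kw * hours * efficiency   # charging: some energy is lost
    else:
        delta = power_kw * hours / efficiency   # discharging: draw extra from the cell
    return min(max(soc_kwh + delta, 0.0), capacity_kwh)
```

An RL agent would pick `power_kw` at each step; the names and efficiency model here are illustrative assumptions, not taken from the project code.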

Publication

Kell, A.J., Stephen McGough, A. and Forshaw, M. (2022) ‘Optimizing a domestic battery and solar photovoltaic system with Deep Reinforcement Learning’, 2022 IEEE International Conference on Big Data (Big Data) [Preprint]. doi:10.1109/bigdata55660.2022.10021028.

Abstract

This study introduces a state-of-the-art method for optimizing home battery and solar systems using deep reinforcement learning. It focuses on improving battery performance in solar-battery systems, significantly reducing household electricity costs.

The deep learning algorithm adapts to changing energy needs and solar outputs, enhancing energy management. The Figures below highlight the impact of different battery sizes and the algorithm's effectiveness.

Installation

Pre-requisite: A virtual environment

Although not strictly necessary, creating a conda virtual environment is highly recommended: it isolates users and developers from changes occurring on their operating system and from conflicts between Python packages, and it ensures day-to-day reproducibility.

Create a virtual environment, including Python, with:

> conda create -n battery-optimisation python=3.7

Activate the environment with:

> conda activate battery-optimisation

Later, to recover the system-wide "normal" python, deactivate the environment with:

> conda deactivate

Installation of packages

Next, to install the required Python packages, run:

> pip install -r requirements.in

Usage

To run the reinforcement learning algorithm, run a single entry-point file, for example:

> python3 src/models/run_model.py

Training

Training progress can be visualised in real time with TensorBoard. To start TensorBoard, first locate your ray_results/ folder, which is usually at ~/ray_results/. The following command should start TensorBoard:

> tensorboard --logdir=~/ray_results/

You can then view the training by navigating to the address TensorBoard reports (by default http://localhost:6006/) in a browser.

Important features

  • Weather
  • Historical load
  • Historical weather irradiance
  • Generator capacity

Reward

  • Inverse of electricity price
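As a hedged sketch of this reward (the exact formulation is in the paper), one way to reward the agent for reducing the household's bill is to use the negative of the electricity cost at each step; the variable names below are illustrative assumptions, not the project's actual code:

```python
def step_reward(grid_import_kwh, price_per_kwh):
    """Hypothetical reward: the negative of the electricity cost for the
    step, so lower household electricity spend yields higher reward.
    grid_import_kwh and price_per_kwh are illustrative names."""
    return -grid_import_kwh * price_per_kwh
```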

Observations

  • State of battery charge
  • Battery size
  • Previous data points
  • Time
  • Day
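A flattened observation vector built from the fields above might be assembled as follows; the field names, ordering, and window of previous data points are illustrative assumptions rather than the project's actual encoding:

```python
def make_observation(soc, battery_kwh, previous_loads, hour, day):
    """Hypothetical flat observation: state of battery charge, battery
    size, a window of previous data points, and time/day features."""
    return [soc, battery_kwh, *previous_loads, hour, day]
```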

Repository Structure

├── LICENSE 
├── README.md
├── data
│   ├── interim
│   ├── models
│   ├── processed
│   ├── raw
│   └── results
├── notebooks
│   ├── data_munging
│   ├── exploration
│   ├── features
│   ├── modelling
│   └── results
├── references
│   └── Ausgrid solar home electricity data notes (Aug 2014).pdf
├── reports
│   └── figures
├── requirements.in
├── requirements.txt
├── src <--------------------------------- Main source code folder
│   ├── __init__.py
│   ├── __pycache__ 
│   ├── data <----------------------------- Code to generate data
│   └── models <--------------------------- Code for simulation, modelling, training and testing of algorithms
└── test
    ├── __init__.py
    └── test_model.py <------------------- Unit tests to ensure code is working as expected
