A simulation environment for training a reinforcement learning agent trading a battery storage
Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland.
Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland.
Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland.
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science; Department of Electrical Engineering and Automation, School of Electrical Engineering, Aalto University, FI-00076 Espoo, Finland; International Research Laboratory of Computer Technologies, ITMO University, 197101 St. Petersburg, Russia. ORCID iD: 0000-0002-9315-9920
2021 (English). In: Energies, E-ISSN 1996-1073, Vol. 14, no 17, article id 5587. Article in journal (Refereed). Published.
Abstract [en]

Battery storage systems are an essential element of the emerging smart grid. Compared to other distributed intelligent energy resources, batteries have the advantage of being able to react rapidly to events such as fluctuations in renewable generation or grid disturbances. There is, however, a lack of research on how to exploit this ability profitably. Any solution needs to consider rapid electrical phenomena as well as the much slower dynamics of the relevant electricity markets. Reinforcement learning is a branch of artificial intelligence that has shown promise in optimizing complex problems involving uncertainty. This article applies reinforcement learning to the problem of trading battery storage. The problem involves two timescales, both of which are important for profitability. First, the battery capacity must be traded on the timescale of the chosen electricity markets. Second, the real-time operation of the battery must ensure that no financial penalties are incurred from failing to meet the technical specification. The trading decisions must be made under uncertainty, such as unknown future market prices and unpredictable power grid disturbances. In this article, a simulation model of a battery system is proposed as the environment in which a reinforcement learning agent is trained to make such decisions. The system is demonstrated by applying the battery to the Finnish primary frequency reserve markets.
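For illustration, the environment described in the abstract could be sketched as a small Gym-style simulator in Python, with an hourly bidding decision (the slow market timescale) and a per-minute inner loop approximating frequency-response operation within the hour (the fast electrical timescale). The class name, the price and penalty models, and all parameter values below are illustrative assumptions, not the authors' implementation.

import numpy as np

class BatteryReserveEnv:
    """Toy two-timescale environment: an hourly capacity bid (slow market
    timescale) followed by simulated per-minute frequency response within
    the hour (fast electrical timescale). Illustrative sketch only."""

    def __init__(self, capacity_kwh=100.0, power_kw=50.0, hours=24, seed=0):
        self.rng = np.random.default_rng(seed)
        self.capacity_kwh = capacity_kwh  # energy capacity of the battery
        self.power_kw = power_kw          # rated charge/discharge power
        self.hours = hours                # length of one trading episode
        self.reset()

    def reset(self):
        self.hour = 0
        self.soc = 0.5  # state of charge, as a fraction of capacity
        # Assumed random hourly reserve prices in EUR/MW/h (not real market data).
        self.prices = 10.0 + 5.0 * self.rng.standard_normal(self.hours)
        return self._obs()

    def _obs(self):
        return np.array([self.soc, self.hour / self.hours, self.prices[self.hour]])

    def step(self, bid_fraction):
        # Slow timescale: the agent chooses what fraction of rated power to
        # offer to the frequency reserve market for the coming hour.
        bid_kw = float(np.clip(bid_fraction, 0.0, 1.0)) * self.power_kw
        revenue = self.prices[self.hour] * bid_kw / 1000.0  # capacity payment

        # Fast timescale: per-minute frequency deviations activate part of the
        # bid; failing to deliver because of SoC limits incurs a penalty.
        penalty = 0.0
        for _ in range(60):
            activation = float(np.clip(0.3 * self.rng.standard_normal(), -1.0, 1.0))
            energy_kwh = activation * bid_kw / 60.0  # energy requested this minute
            new_soc = self.soc - energy_kwh / self.capacity_kwh
            if not 0.0 <= new_soc <= 1.0:
                penalty += abs(energy_kwh) * 0.5  # assumed non-delivery penalty
            self.soc = float(np.clip(new_soc, 0.0, 1.0))

        self.hour += 1
        done = self.hour >= self.hours
        obs = self._obs() if not done else None
        return obs, revenue - penalty, done, {}

An RL agent (for example, a DQN or policy-gradient learner) would treat the bid fraction as its action and the hourly net revenue as its reward, learning a bidding policy over repeated simulated trading days.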

Place, publisher, year, edition, pages
MDPI, 2021. Vol. 14, no 17, article id 5587
Keywords [en]
battery, reinforcement learning, simulation, frequency reserve, frequency containment reserve, timescale, artificial intelligence, real-time, electricity market
National Category
Energy Engineering
Research subject
Dependable Communication and Computation Systems
Identifiers
URN: urn:nbn:se:ltu:diva-87125
DOI: 10.3390/en14175587
ISI: 000694168100001
Scopus ID: 2-s2.0-85114611521
OAI: oai:DiVA.org:ltu-87125
DiVA, id: diva2:1595501
Note

Validated; 2021; Level 2; 2021-09-20 (beamah);

Research funder: Business Finland (7439/31/2018)

Available from: 2021-09-20. Created: 2021-09-20. Last updated: 2023-08-28. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Vyatkin, Valeriy
