Integer Echo State Networks: Efficient Reservoir Computing for Digital Hardware
Redwood Center for Theoretical Neuroscience, University of California at Berkeley, Berkeley, CA 94720, USA; Intelligent Systems Lab, Research Institutes of Sweden, 164 40 Kista, Sweden.
Neuromorphic Computing Lab, Intel Labs, Santa Clara, CA 95054 USA; Redwood Center for Theoretical Neuroscience, University of California at Berkeley, Berkeley, CA 94720, USA.
Netlight Consulting AB, 111 53 Stockholm, Sweden.
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. ORCID iD: 0000-0003-0069-640X
2022 (English). In: IEEE Transactions on Neural Networks and Learning Systems, ISSN 2162-237X, E-ISSN 2162-2388, Vol. 33, no. 4, p. 1688-1701. Article in journal (Refereed). Published.
Abstract [en]

We propose an approximation of echo state networks (ESNs) that can be efficiently implemented on digital hardware based on the mathematics of hyperdimensional computing. The reservoir of the proposed integer ESN (intESN) is a vector containing only n-bit integers (where n < 8 is normally sufficient for satisfactory performance). The recurrent matrix multiplication is replaced with an efficient cyclic shift operation. The proposed intESN approach is verified with typical tasks in reservoir computing: memorizing a sequence of inputs, classifying time series, and learning dynamic processes. Such an architecture results in dramatic improvements in memory footprint and computational efficiency, with minimal performance loss. The experiments on a field-programmable gate array confirm that the proposed intESN approach is much more energy efficient than the conventional ESN.
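
To make the abstract's description concrete, below is a minimal Python sketch of the reservoir update it describes: the state is a vector of small integers, the recurrent matrix multiplication is replaced by a cyclic shift, and values are clipped so they stay within a few bits. The reservoir dimension N, the clipping limit KAPPA, the input quantization, and the bipolar input projection are illustrative assumptions, not the authors' exact parameterization.

    import numpy as np

    # Sketch of an integer ESN (intESN) reservoir step, following the abstract:
    # integer state vector, cyclic shift instead of a recurrent weight matrix,
    # and clipping to keep every element a small (roughly n-bit) integer.
    # N, KAPPA, and the input encoding below are assumptions for illustration.

    N = 1000      # reservoir dimension (assumed)
    KAPPA = 7     # clip reservoir values to [-KAPPA, KAPPA]

    rng = np.random.default_rng(0)
    # Random bipolar (+1/-1) vector projecting a scalar input into the reservoir.
    input_proj = rng.choice([-1, 1], size=N).astype(np.int8)

    def quantize_input(u, levels=KAPPA):
        # Map a real-valued input in [-1, 1] to a small integer (assumed scheme).
        return int(np.clip(np.round(u * levels), -levels, levels))

    def update(state, u):
        """One intESN step: cyclic shift + integer input encoding + clipping."""
        recurrent = np.roll(state, 1)                     # cyclic shift replaces W*x
        new_state = recurrent + quantize_input(u) * input_proj
        return np.clip(new_state, -KAPPA, KAPPA).astype(np.int8)

    # Drive the reservoir with a short input sequence.
    x = np.zeros(N, dtype=np.int8)
    for u in [0.3, -0.7, 0.5, 0.1]:
        x = update(x, u)
    print(x[:10])

The clipping keeps every reservoir element within a few bits, which is what yields the memory-footprint and energy advantages on digital hardware that the abstract reports.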

Place, publisher, year, edition, pages
IEEE, 2022. Vol. 33, no 4, p. 1688-1701
Keywords [en]
Dynamic systems modeling, echo state networks (ESNs), hyperdimensional computing (HDC), memory capacity, reservoir computing (RC), time-series classification, vector symbolic architectures
National Category
Computer Sciences
Research subject
Dependable Communication and Computation Systems
Identifiers
URN: urn:nbn:se:ltu:diva-82337
DOI: 10.1109/TNNLS.2020.3043309
ISI: 000778930100029
PubMedID: 33351770
Scopus ID: 2-s2.0-85098778990
OAI: oai:DiVA.org:ltu-82337
DiVA, id: diva2:1517034
Funder
Swedish Research Council, 2015-04677; EU, Horizon 2020, 839179
Note

Validated; 2022; Level 2; 2022-04-13 (sofila)

Funder: Defense Advanced Research Projects Agency

Available from: 2021-01-13. Created: 2021-01-13. Last updated: 2023-10-28. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | PubMed | Scopus

Authority records

Osipov, Evgeny
