Publications (7 of 7)
Kleyko, D., Rachkovskij, D. A., Osipov, E. & Rahimi, A. (2023). A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations. ACM Computing Surveys, 55(6), Article ID 130.
2023 (English). In: ACM Computing Surveys, ISSN 0360-0300, E-ISSN 1557-7341, Vol. 55, no. 6, article id 130. Article in journal (Refereed). Published.
Abstract [en]

This two-part comprehensive survey is devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and distributed vector representations. Notable models in the HDC/VSA family are Tensor Product Representations, Holographic Reduced Representations, Multiply-Add-Permute, Binary Spatter Codes, and Sparse Binary Distributed Representations, among others. HDC/VSA is a highly interdisciplinary field with connections to computer science, electrical engineering, artificial intelligence, mathematics, and cognitive science, which makes a thorough overview of the field challenging to create. However, with the surge of new researchers joining the field in recent years, a comprehensive survey has become essential. Therefore, this Part I covers important aspects of the field such as the known computational models of HDC/VSA and the transformations of various input data types into high-dimensional distributed representations. Part II of this survey [84] is devoted to applications, cognitive computing and architectures, as well as directions for future work. The survey is written to be useful for both newcomers and practitioners.
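To give a concrete feel for the kind of operations the survey catalogues, the sketch below implements the three core HDC/VSA operations (binding, bundling/superposition, and permutation) in a bipolar Multiply-Add-Permute-style model and uses them to encode and query a toy role-filler record. It is a minimal illustration with assumed dimensionality and toy data, not code from the survey itself.

```python
# Minimal sketch of core HDC/VSA operations with a bipolar MAP-style model.
# Dimensionality, tie-breaking, and the toy record are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

def random_hv():
    """Random bipolar hypervector; random hypervectors are quasi-orthogonal."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding: elementwise multiplication (self-inverse in this model)."""
    return a * b

def bundle(*hvs):
    """Bundling (superposition): elementwise addition followed by sign."""
    s = np.sum(hvs, axis=0)
    return np.sign(s) + (s == 0)  # break ties toward +1

def permute(a, k=1):
    """Permutation (cyclic shift), e.g., for protecting or positioning items."""
    return np.roll(a, k)

def sim(a, b):
    """Normalized similarity (cosine for bipolar hypervectors)."""
    return float(a @ b) / D

# Encode a record of role-filler pairs as a superposition of bindings,
# then recover a filler by unbinding the record with its role.
roles = {r: random_hv() for r in ("name", "age")}
fillers = {f: random_hv() for f in ("alice", "bob", "young", "old")}
record = bundle(bind(roles["name"], fillers["alice"]),
                bind(roles["age"], fillers["young"]))

probe = bind(record, roles["name"])  # unbind the "name" role
best = max(fillers, key=lambda f: sim(probe, fillers[f]))
print(best)  # expected: 'alice'
```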

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2023
National Category
Computer Sciences
Research subject
Dependable Communication and Computation Systems
Identifiers
urn:nbn:se:ltu:diva-95133 (URN); 10.1145/3538531 (DOI); 000893245700022 ()
Funder
EU, Horizon 2020, 839179; Swedish Foundation for Strategic Research, UKR22-0024
Note

Validated; 2023; Level 2; 2023-01-03 (joosat)

Funder: AFOSR (FA9550-19-1-0241); National Academy of Sciences of Ukraine (0120U000122, 0121U000016, 0117U002286); Ministry of Education and Science of Ukraine (0121U000228, 0122U000818);

Available from: 2023-01-03. Created: 2023-01-03. Last updated: 2023-01-03. Bibliographically approved.
Kleyko, D., Rachkovskij, D., Osipov, E. & Rahimi, A. (2023). A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges. ACM Computing Surveys, 55(9), Article ID 175.
2023 (English). In: ACM Computing Surveys, ISSN 0360-0300, E-ISSN 1557-7341, Vol. 55, no. 9, article id 175. Article in journal (Refereed). Published.
Abstract [en]

This is Part II of the two-part comprehensive survey devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and vector distributed representations. Holographic Reduced Representations [321, 326] is an influential HDC/VSA model that is well known in the machine learning domain and often used to refer to the whole family. However, for the sake of consistency, we use HDC/VSA to refer to the field. Part I of this survey [222] covered foundational aspects of the field, such as the historical context leading to the development of HDC/VSA, key elements of any HDC/VSA model, known HDC/VSA models, and the transformation of input data of various types into high-dimensional vectors suitable for HDC/VSA. This second part surveys existing applications, the role of HDC/VSA in cognitive computing and architectures, as well as directions for future work. Most of the applications lie within the Machine Learning/Artificial Intelligence domain; however, we also cover other applications to provide a complete picture. The survey is written to be useful for both newcomers and practitioners.

Place, publisher, year, edition, pages
Association for Computing Machinery, 2023
Keywords
analogical reasoning, applications, Artificial intelligence, binary spatter codes, cognitive architectures, cognitive computing, distributed representations, geometric analogue of holographic reduced representations, holographic reduced representations, hyperdimensional computing, machine learning, matrix binding of additive terms, modular composite representations, multiply-add-permute, sparse binary distributed representations, sparse block codes, tensor product representations, vector symbolic architectures
National Category
Computer Sciences
Research subject
Dependable Communication and Computation Systems
Identifiers
urn:nbn:se:ltu:diva-95673 (URN); 10.1145/3558000 (DOI); 000924882300001 (); 2-s2.0-85147845869 (Scopus ID)
Funder
EU, Horizon 2020, 839179; Swedish Foundation for Strategic Research, UKR22-0024
Note

Validated; 2023; Level 2; 2023-02-21 (joosat)

Funder: AFOSR (FA9550-19-1-0241); National Academy of Sciences of Ukraine (grant no. 0120U000122, 0121U000016, 0122U002151, 0117U002286); Ministry of Education and Science of Ukraine (grant no. 0121U000228, 0122U000818)

Available from: 2023-02-21. Created: 2023-02-21. Last updated: 2023-05-08. Bibliographically approved.
Tyshchuk, O. V., Desiateryk, O. O., Volkov, O. E., Revunova, E. G. & Rachkovskij, D. (2022). A Linear System Output Transformation for Sparse Approximation*. Cybernetics and Systems Analysis, 58(5), 840-850
2022 (English). In: Cybernetics and Systems Analysis, ISSN 1060-0396, E-ISSN 1573-8337, Vol. 58, no. 5, p. 840-850. Article in journal (Refereed). Published.
Abstract [en]

We propose an approach that provides a stable transformation of the output of a linear system into the output of a system with a desired basis. The matrix of basis functions of the linear system has a large condition number, and its singular values gradually decay to zero. Two types of methods for stable output transformation are developed, based on matrix approximation by the truncated Singular Value Decomposition and by Random Projection with different types of random matrices. It is shown that using the output transformation as preprocessing increases the accuracy of solving sparse approximation problems. An example of applying the method to determine the activity of weak radiation sources is considered.
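One plausible reading of the transformation described above can be sketched as follows: the noisy output y = A x of an ill-conditioned system A is mapped to the output B x of a system with a desired basis B through a truncated-SVD regularized pseudoinverse of A. The matrix sizes, rank, noise level, and the assumption that the input lies in the leading singular subspace are illustrative; the article's exact formulation, and its Random Projection variant, may differ.

```python
# Hedged sketch of an output transformation via truncated SVD (illustrative
# assumptions throughout; not the article's exact formulation).
import numpy as np

rng = np.random.default_rng(1)
n, m = 60, 60

# Ill-conditioned basis A: singular values decaying gradually toward zero.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((m, m)))
s = 10.0 ** -np.linspace(0, 8, min(n, m))   # large condition number
A = U @ np.diag(s) @ V.T

B = rng.standard_normal((n, m))             # desired, well-conditioned basis
x = V[:, :10] @ rng.standard_normal(10)     # input dominated by leading singular
                                            # directions, as is typical for
                                            # discrete ill-posed problems
y = A @ x + 1e-6 * rng.standard_normal(n)   # observed noisy output

def transform_output(y, A, B, k):
    """Map y = A x (approximately) to B x using a rank-k truncated SVD of A."""
    Uk, sk, Vtk = np.linalg.svd(A, full_matrices=False)
    A_pinv_k = Vtk[:k].T @ np.diag(1.0 / sk[:k]) @ Uk[:, :k].T
    return B @ (A_pinv_k @ y)

z_hat = transform_output(y, A, B, k=10)
print(np.linalg.norm(z_hat - B @ x) / np.linalg.norm(B @ x))  # small relative error
```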

Place, publisher, year, edition, pages
Springer, 2022
Keywords
discrete ill-posed problem, random projection, singular value decomposition, sparse approximation
National Category
Computational Mathematics; Control Engineering
Research subject
Dependable Communication and Computation Systems
Identifiers
urn:nbn:se:ltu:diva-94998 (URN); 10.1007/s10559-022-00517-3 (DOI); 000895709900018 (); 2-s2.0-85143236339 (Scopus ID)
Note

Validated; 2023; Level 2; 2023-01-01 (marisr)

Translated from Kibernetyka ta Systemnyi Analiz, No. 5, September–October, 2022, pp. 189–202.

Available from: 2022-12-27. Created: 2022-12-27. Last updated: 2023-04-25. Bibliographically approved.
Rachkovskij, D. A. & Kleyko, D. (2022). Recursive Binding for Similarity-Preserving Hypervector Representations of Sequences. In: 2022 International Joint Conference on Neural Networks (IJCNN): 2022 Conference Proceedings. Paper presented at IEEE World Congress on Computational Intelligence (WCCI 2022), International Joint Conference on Neural Networks (IJCNN 2022), Padua, Italy, July 18-23, 2022. IEEE
2022 (English). In: 2022 International Joint Conference on Neural Networks (IJCNN): 2022 Conference Proceedings, IEEE, 2022. Conference paper, Published paper (Refereed).
Abstract [en]

Hyperdimensional computing (HDC), also known as vector symbolic architectures (VSA), is a computing framework used within artificial intelligence and cognitive computing that operates with distributed vector representations of large fixed dimensionality. A critical step in designing HDC/VSA solutions is to obtain such representations from the input data. Here, we focus on sequences, a widespread data type, and propose their transformation to distributed representations that both preserve the similarity of identical sequence elements at nearby positions and are equivariant with respect to sequence shift. These properties are enabled by forming the representations of sequence positions using recursive binding and superposition operations. The proposed transformation was experimentally investigated with symbolic strings used for modeling human perception of word similarity. The obtained results are on a par with those of more sophisticated approaches from the literature. The proposed transformation was designed for the HDC/VSA model known as Fourier Holographic Reduced Representations; however, it can be adapted to some other HDC/VSA models.
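The sketch below illustrates one way such position codes can be built in the Fourier Holographic Reduced Representations model: a fixed "step" phasor hypervector with small random phases is recursively bound to itself to encode successive positions, so nearby positions remain similar and a shifted sequence differs only by one extra binding. The dimensionality, phase spread, and toy strings are assumptions for illustration; the construction in the paper may differ in detail.

```python
# Hedged FHRR-style sketch of similarity-preserving, shift-tolerant
# sequence hypervectors (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(2)
D = 4096     # hypervector dimensionality
EPS = 0.3    # phase spread; controls how fast positional similarity decays

def random_phasor():
    """Random FHRR hypervector: unit-magnitude complex phasors."""
    return np.exp(1j * rng.uniform(-np.pi, np.pi, D))

step = np.exp(1j * rng.uniform(-EPS, EPS, D))   # slowly varying "step" phasor
alphabet = {c: random_phasor() for c in "abcdefghijklmnopqrstuvwxyz"}

def encode(seq):
    """Bind each symbol to its position code step**k and superimpose."""
    hv = np.zeros(D, dtype=complex)
    pos = np.ones(D, dtype=complex)
    for c in seq:
        hv += alphabet[c] * pos   # binding in FHRR = elementwise product
        pos = pos * step          # recursive binding gives the next position code
    return hv

def sim(a, b):
    """Cosine-like similarity between FHRR hypervectors."""
    return float(np.real(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(round(sim(encode("sequence"), encode("seqeunce")), 2))   # transposed letters: high
print(round(sim(encode("sequence"), encode("xsequence")), 2))  # shifted by one symbol: still high
print(round(sim(encode("sequence"), encode("birthday")), 2))   # no shared symbols: near zero
```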

Place, publisher, year, edition, pages
IEEE, 2022
Keywords
data structures, distributed representation, hyperdimensional computing, hypervector, recursive binding, sequence representation, shift equivariance, similarity preserving transformation, vector symbolic architectures
National Category
Computer Sciences; Information Systems
Research subject
Dependable Communication and Computation Systems
Identifiers
urn:nbn:se:ltu:diva-94150 (URN); 10.1109/IJCNN55064.2022.9892462 (DOI); 000867070904096 (); 2-s2.0-85137519384 (Scopus ID)
Conference
IEEE World Congress on Computational Intelligence (WCCI 2022), International Joint Conference on Neural Networks (IJCNN 2022), Padua, Italy, July 18-23, 2022
Funder
EU, Horizon 2020, 839179; Swedish Foundation for Strategic Research, UKR22-0024
Note

Funder: AFOSR (FA9550-19-1-0241), Intel’s THWAI program, National Academy of Sciences of Ukraine (0121U000016), Ministry of Education and Science of Ukraine (0121U000228, 0122U000818);

ISBN for host publication: 978-1-7281-8671-9

Available from: 2022-11-18. Created: 2022-11-18. Last updated: 2023-05-08. Bibliographically approved.
Rachkovskij, D. A. (2022). Representation of spatial objects by shift-equivariant similarity-preserving hypervectors. Neural Computing & Applications, 34(24), 22387-22403
2022 (English). In: Neural Computing & Applications, ISSN 0941-0643, E-ISSN 1433-3058, Vol. 34, no. 24, p. 22387-22403. Article in journal (Refereed). Published.
Abstract [en]

Hyperdimensional Computing (HDC), also known as Vector Symbolic Architectures (VSA), is an approach that has been proposed to combine the advantages of distributed vector representations and symbolic structured data representations in Artificial Intelligence, Machine Learning, and Pattern Recognition problems. HDC/VSA operate with hypervectors, i.e., brain-like distributed representations of large fixed dimension. A key problem of HDC/VSA is how to transform data of various types into hypervectors. In this paper, we propose a novel approach for forming hypervectors of spatial objects, such as images, that both provides equivariance with respect to the shift of objects and preserves the similarity of objects described by similar features at nearby positions. In contrast to known hypervector formation methods, we represent the features by compositional hypervectors and exploit permutations of hypervectors to represent the position of features. We experimentally explored the proposed approach in tasks that exploit various descriptions of two-dimensional (2D) images. In terms of standard accuracy measures such as error rate or mean average precision, our results are on a par with or better than those of other methods and are obtained without feature learning. The proposed techniques were designed for the HDC/VSA model known as Sparse Binary Distributed Representations; however, they can be adapted to hypervectors in the formats of other HDC/VSA models, as well as to representing spatial objects other than 2D images.
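As a toy illustration of position coding by permutations, the sketch below rolls each feature hypervector by an amount determined by its pixel coordinates and superimposes the results, so that a cyclic shift of the image corresponds exactly to a permutation of the image hypervector. It uses dense bipolar vectors and a structured 2D roll chosen for exact equivariance; the paper's compositional sparse hypervectors, which also make nearby positions similar, are not reproduced here, and all sizes are illustrative assumptions.

```python
# Hedged sketch of shift-equivariant image hypervectors via permutations
# (cyclic rolls) of feature hypervectors; illustrative assumptions throughout.
import numpy as np

rng = np.random.default_rng(3)
H, W, D0 = 8, 8, 128     # toy image size and per-cell hypervector depth
N_FEATURES = 4           # number of distinct local features

# Bipolar feature hypervectors stored as (H, W, D0) blocks so that a pixel
# position (y, x) can be represented by rolling along the first two axes.
features = rng.choice([-1, 1], size=(N_FEATURES, H, W, D0))

def encode(image):
    """Superimpose feature hypervectors permuted by their pixel positions."""
    hv = np.zeros((H, W, D0))
    for y in range(H):
        for x in range(W):
            hv += np.roll(features[image[y, x]], shift=(y, x), axis=(0, 1))
    return hv

image = rng.integers(0, N_FEATURES, size=(H, W))
shifted = np.roll(image, shift=(1, 2), axis=(0, 1))   # cyclic image shift by (1, 2)

# Equivariance check: encoding the shifted image equals permuting (rolling)
# the encoding of the original image by the same shift.
lhs = encode(shifted)
rhs = np.roll(encode(image), shift=(1, 2), axis=(0, 1))
print(np.allclose(lhs, rhs))  # True: encoding commutes with cyclic shifts
```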

Place, publisher, year, edition, pages
Springer Nature, 2022
Keywords
Hyperdimensional computing, Vector symbolic architectures, Spatial object representation, Neural-like distributed representations, Shift equivariance, Image classification and retrieval
National Category
Information Systems; Computer Sciences
Research subject
Dependable Communication and Computation Systems
Identifiers
urn:nbn:se:ltu:diva-93082 (URN); 10.1007/s00521-022-07619-1 (DOI); 000850761000006 (); 2-s2.0-85137526660 (Scopus ID)
Funder
Swedish Foundation for Strategic Research, UKR22-0024
Note

Validated; 2022; Level 2; 2022-11-29 (hanlid)

Funder: National Academy of Sciences of Ukraine (0120U000122, 0121U000016, 0117U002286); Ministry of Education and Science of Ukraine (0121U000228, 0122U000818)

Available from: 2022-09-19. Created: 2022-09-19. Last updated: 2022-11-29. Bibliographically approved.
Volkov, O., Komar, M., Rachkovskij, D. & Volosheniuk, D. (2022). Technology of Autonomous Take-Off and Landing for the Modern Flight and Navigation Complex of an Unmanned Aerial Vehicle. Cybernetics and Systems Analysis, 58(6), 882-888
2022 (English). In: Cybernetics and Systems Analysis, ISSN 1060-0396, E-ISSN 1573-8337, Vol. 58, no. 6, p. 882-888. Article in journal (Refereed). Published.
Place, publisher, year, edition, pages
Springer Nature, 2022
National Category
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-96473 (URN); 10.1007/s10559-023-00521-1 (DOI); 000927817900001 (); 2-s2.0-85145743550 (Scopus ID)
Available from: 2023-04-13. Created: 2023-04-13. Last updated: 2023-07-04. Bibliographically approved.
Kleyko, D., Davies, M., Frady, E. P., Kanerva, P., Kent, S. J., Olshausen, B. A., . . . Sommer, F. T. (2022). Vector Symbolic Architectures as a Computing Framework for Emerging Hardware. Proceedings of the IEEE, 110(10), 1538-1571
2022 (English). In: Proceedings of the IEEE, ISSN 0018-9219, E-ISSN 1558-2256, Vol. 110, no. 10, p. 1538-1571. Article in journal (Refereed). Published.
Abstract [en]

This article reviews recent progress in the development of the computing framework vector symbolic architectures (VSA), also known as hyperdimensional computing. This framework is well suited for implementation in stochastic, emerging hardware, and it naturally expresses the types of cognitive operations required for artificial intelligence (AI). We demonstrate in this article that the field-like algebraic structure of VSA offers simple but powerful operations on high-dimensional vectors that can support all data structures and manipulations relevant to modern computing. In addition, we illustrate the distinguishing feature of VSA, “computing in superposition,” which sets it apart from conventional computing and opens the door to efficient solutions of the difficult combinatorial search problems inherent in AI applications. We sketch ways of demonstrating that VSA are computationally universal. We see them acting as a framework for computing with distributed representations that can play the role of an abstraction layer for emerging computing hardware. This article serves as a reference for computer architects by illustrating the philosophy behind VSA, techniques of distributed computing with them, and their relevance to emerging computing hardware, such as neuromorphic computing.
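The sketch below gives a toy illustration of "computing in superposition": an entire key-value table is held in one hypervector of bindings, and a superposition of query keys retrieves (approximately) the superposition of the corresponding values with a single binding operation. The bipolar model and sizes below are illustrative assumptions, not the article's specific formulation.

```python
# Hedged sketch of "computing in superposition" with a bipolar model
# (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(4)
D = 10_000  # hypervector dimensionality

def hv():
    """Random bipolar hypervector."""
    return rng.choice([-1, 1], size=D)

def sim(a, b):
    """Cosine similarity."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

keys = {k: hv() for k in "abcd"}
values = {k: hv() for k in "abcd"}

# One hypervector holds the whole table: sum of key (*) value bindings.
table = sum(keys[k] * values[k] for k in keys)

# Query two keys at once by binding the table with their superposition;
# the result is close to the superposition of the two associated values.
answer = table * (keys["a"] + keys["c"])

for k in "abcd":
    print(k, round(sim(answer, values[k]), 2))  # 'a' and 'c' stand out
```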

Place, publisher, year, edition, pages
IEEE, 2022
Keywords
Computing framework, computing in superposition, data structures, distributed representations, emerging hardware, holographic reduced representation (HRR), hyperdimensional (HD) computing, Turing completeness, vector symbolic architectures (VSA)
National Category
Computer Sciences; Computer Systems
Research subject
Dependable Communication and Computation Systems
Identifiers
urn:nbn:se:ltu:diva-93971 (URN); 10.1109/JPROC.2022.3209104 (DOI); 000870302900008 (); 2-s2.0-85141794287 (Scopus ID)
Projects
Defense Advanced Research Projects Agency's (DARPA) VIP (Super-HD Project); AIE (HyDDENN Project); Intel's THWAI
Funder
EU, Horizon 2020, 839179; Swedish Foundation for Strategic Research, UKR22-0024
Note

Validated; 2022; Level 2; 2022-11-10 (hanlid)

Funder: Air Force Office of Scientific Research (AFOSR) (FA9550-19-1-0241); National Academy of Sciences of Ukraine (0120U000122, 0121U000016, 0122U002151 and 0117U002286); Ministry of Education and Science of Ukraine (0121U000228 and 0122U000818); NIH (R01-EB026955); NSF (IIS-1718991)

Available from: 2022-11-10. Created: 2022-11-10. Last updated: 2022-12-06. Bibliographically approved.
Organisations
Identifiers
ORCID iD: orcid.org/0000-0002-3414-5334
