Compressed Superposition of Neural Networks for Deep Learning in Edge Computing
University of Ljubljana, Faculty of Computer and Information Science, Ljubljana, Slovenia.
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. ORCID iD: 0000-0003-0069-640X
University of Ljubljana, Faculty of Computer and Information Science, Ljubljana, Slovenia.
2021 (English). In: 2021 International Joint Conference on Neural Networks (IJCNN) Proceedings, IEEE, 2021. Conference paper, Published paper (Refereed).
Abstract [en]

This paper investigates a combination of two recently proposed techniques: superposition of multiple neural networks into one and neural network compression. We show that these two techniques can be successfully combined, offering considerable potential for trimming down deep convolutional neural networks. The work is relevant to implementing deep learning on low-end computing devices, as it enables neural networks to fit edge devices with constrained computational resources (e.g., sensors, mobile devices, controllers). We study the trade-offs between the model compression rate and the accuracy of the superimposed tasks, and present a CNN pipeline in which the fully connected layers are isolated from the convolutional layers and serve as a general-purpose neural processing unit for several CNN models. We show how deep models can be highly compressed with limited accuracy degradation when additional compression is performed within the superposition principle.
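The superposition idea referenced in the abstract can be illustrated with a minimal sketch: several task-specific weight matrices are bound into one shared matrix using random binary context keys and recovered by element-wise unbinding. This is a hedged illustration only; the paper's actual CNN pipeline, compression scheme, and key construction are not reproduced here, and all names and dimensions below (task_weights, forward, in_dim, out_dim) are assumptions made for the example.

```python
# Hedged sketch: parameter-superposition-style storage of several linear
# (fully connected) layers in one shared weight matrix, retrieved with
# random +/-1 context keys. Illustrative only; not the paper's pipeline.
import torch

torch.manual_seed(0)

in_dim, out_dim, num_tasks = 64, 32, 3  # assumed toy dimensions

# One shared matrix that superimposes the weights of all tasks.
W = torch.randn(out_dim, in_dim) * 0.01

# One random binary (+1/-1) context key per task, used for binding/unbinding.
keys = [torch.sign(torch.randn(out_dim, in_dim)) for _ in range(num_tasks)]

def task_weights(task_id: int) -> torch.Tensor:
    # Element-wise unbinding: recover the task-specific weights from W.
    return W * keys[task_id]

def forward(x: torch.Tensor, task_id: int) -> torch.Tensor:
    # Forward pass through the retrieved task-specific linear layer.
    return x @ task_weights(task_id).T

x = torch.randn(8, in_dim)
print(forward(x, task_id=1).shape)  # torch.Size([8, 32])
```

In this setting, any further compression would act on the single shared matrix rather than on each task's weights separately, which is what makes combining superposition with compression attractive for constrained edge devices.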

Place, publisher, year, edition, pages
IEEE, 2021.
Series
International Joint Conference on Neural Networks (IJCNN), E-ISSN 2161-4407
National Category
Computer Systems
Research subject
Dependable Communication and Computation Systems
Identifiers
URN: urn:nbn:se:ltu:diva-87190
DOI: 10.1109/IJCNN52387.2021.9533602
ISI: 000722581702063
Scopus ID: 2-s2.0-85109547685
OAI: oai:DiVA.org:ltu-87190
DiVA, id: diva2:1596605
Conference
The International Joint Conference on Neural Networks (IJCNN 2021), virtual, July 18-22, 2021
Note

ISBN for host publication: 978-1-6654-3900-8

Available from: 2021-09-23. Created: 2021-09-23. Last updated: 2022-01-28. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Osipov, Evgeny
