Gesture-Timbre Space: Multidimensional Feature Mapping Using Machine Learning and Concatenative Synthesis
Zbyszyński, Michael (Department of Computing, Goldsmiths, University of London, London, SE14 6NW, UK). ORCID iD: 0000-0003-3771-0869
Di Donato, Balandino (Informatics Department, University of Leicester, University Road, Leicester, LE1 7RH, UK). ORCID iD: 0000-0001-6993-2445
Visi, Federico Ghelli (Luleå University of Technology, Department of Social Sciences, Technology and Arts, Music, Media and Theater). ORCID iD: 0000-0001-9685-4702
Tanaka, Atau (Department of Computing, Goldsmiths, University of London, London, SE14 6NW, UK). ORCID iD: 0000-0003-2521-1296
2021 (English). In: Perception, Representations, Image, Sound, Music: 14th International Symposium CMMR 2019, Marseille, France, October 14–18, 2019, Revised Selected Papers / [ed] Richard Kronland-Martinet; Sølvi Ystad; Mitsuko Aramaki, Springer Nature, 2021, p. 600–622. Conference paper, Published paper (Refereed).
Abstract [en]

This chapter explores three systems for mapping embodied gesture, acquired with electromyography and motion sensing, to sound synthesis. A pilot study using granular synthesis is presented, followed by studies employing corpus-based concatenative synthesis, where small sound units are organized by derived timbral features. We use interactive machine learning in a mapping-by-demonstration paradigm to create regression models that map high-dimensional gestural data to timbral data without dimensionality reduction in three distinct workflows. First, by directly associating individual sound units and static poses (anchor points) in static regression. Second, in whole regression a sound tracing method leverages our intuitive associations between time-varying sound and embodied movement. Third, we extend interactive machine learning through the use of artificial agents and reinforcement learning in an assisted interactive machine learning workflow. We discuss the benefits of organizing the sound corpus using self-organizing maps to address corpus sparseness, and the potential of regression-based mapping at different points in a musical workflow: gesture design, sound design, and mapping design. These systems support expressive performance by creating gesture-timbre spaces that maximize sonic diversity while maintaining coherence, enabling reliable reproduction of target sounds as well as improvisatory exploration of a sonic corpus. They have been made available to the research community, and have been used by the authors in concert performance.
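The following sketch illustrates, in broad strokes, the mapping-by-demonstration idea described in the abstract: gesture feature vectors are paired with timbral descriptors, a regression model is trained on the pairs, and at performance time the predicted timbre is used to retrieve the nearest sound unit in the corpus. All shapes, feature choices, and tools here (scikit-learn, random placeholder data) are illustrative assumptions, not the authors' implementation or workflow.

    # Minimal sketch (assumed data shapes and tools, not the authors' code):
    # regress gesture features to timbral descriptors, then select the
    # nearest sound unit in a concatenative-synthesis corpus.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)

    # Demonstration data from a hypothetical mapping-by-demonstration session:
    # 17-D gesture vectors (e.g. 8 EMG amplitudes + 9 motion-sensor values)
    # paired with 5-D timbral descriptors (e.g. loudness, centroid, ...).
    gestures = rng.random((200, 17))          # placeholder gesture recordings
    timbre_targets = rng.random((200, 5))     # placeholder timbral targets

    # Regression model mapping high-dimensional gesture data to timbral data,
    # with no dimensionality reduction on either side.
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000)
    model.fit(gestures, timbre_targets)

    # Corpus of analysed sound units, each described by the same 5 features.
    corpus_features = rng.random((1000, 5))   # placeholder corpus analysis
    index = NearestNeighbors(n_neighbors=1).fit(corpus_features)

    # Performance time: incoming gesture -> predicted timbre -> unit to play.
    live_gesture = rng.random((1, 17))
    predicted_timbre = model.predict(live_gesture)
    _, unit_id = index.kneighbors(predicted_timbre)
    print("play corpus unit", int(unit_id[0, 0]))

In the chapter itself the corpus can also be organised with a self-organizing map to mitigate sparseness; the nearest-neighbour lookup above stands in for that step only as a simplification.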

Place, publisher, year, edition, pages
Springer Nature, 2021. p. 600-622
Series
Lecture Notes in Computer Science (LNISA), ISSN 0302-9743, E-ISSN 1611-3349 ; 12631
Keywords [en]
Concatenative synthesis, Gestural interaction, Human-computer interaction, Interactive machine learning, Reinforcement learning, Sonic interaction design
National Category
Computer Sciences; Musicology
Research subject
Musical Performance
Identifiers
URN: urn:nbn:se:ltu:diva-94864
DOI: 10.1007/978-3-030-70210-6_39
Scopus ID: 2-s2.0-85103451121
OAI: oai:DiVA.org:ltu-94864
DiVA, id: diva2:1719962
Conference
14th International Symposium on Computer Music Multidisciplinary Research (CMMR), Marseille, France, October 14-18, 2019
Funder
EU, Horizon 2020, 789825
Note

ISBN for host publication: 978-3-030-70209-0; 978-3-030-70210-6

Available from: 2022-12-16. Created: 2022-12-16. Last updated: 2022-12-16. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus
