LOWRECORP: the Low-Resource NLG Corpus Building Challenge
2023 (English). In: Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges / [ed] Simon Mille, Association for Computational Linguistics, 2023, p. 1-9. Conference paper, Published paper (Refereed)
Abstract [en]
Most languages in the world do not have sufficient data available to develop neural-network-based natural language generation (NLG) systems. To alleviate this resource scarcity, we propose a novel challenge for the NLG community: low-resource language corpus development (LOWRECORP). We present an innovative framework to collect a single dataset with dual tasks, maximizing the efficiency of data collection efforts and respecting language consultants' time. Specifically, we focus on a text-chat-based interface for two generation tasks – conversational response generation grounded in a source document and/or image, and dialogue summarization (of the conversations from the former task). The goal of this shared task is to collectively develop grounded datasets for local and low-resource languages. To enable data collection, we make available web-based software that can be used to collect these grounded conversations and summaries. Submissions will be assessed for the size, complexity, and diversity of the corpora to ensure quality control of the datasets, as well as for any enhancements to the interface or novel approaches to grounding conversations.
Place, publisher, year, edition, pages
Association for Computational Linguistics, 2023. p. 1-9
National Category
Natural Language Processing
Research subject
Machine Learning
Identifiers
URN: urn:nbn:se:ltu:diva-110853
DOI: 10.18653/v1/2023.inlg-genchal.1
Scopus ID: 2-s2.0-105016340598
OAI: oai:DiVA.org:ltu-110853
DiVA, id: diva2:1916422
Conference
16th International Natural Language Generation Conference: Generation Challenges, Prague, Czechia, September 11–15, 2023
Note
Funder: EPSRC (EP/T024917/1);
ISBN for host publication: 979-8-89176-003-5;
Available from: 2024-11-27. Created: 2024-11-27. Last updated: 2025-10-21. Bibliographically approved.