Dataset collection from a SubT environment
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems. ORCID iD: 0000-0001-8235-2728
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems. ORCID iD: 0000-0002-1046-0305
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems. ORCID iD: 0000-0001-7631-002x
Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems. ORCID iD: 0000-0001-8870-6718
Show others and affiliations
2022 (English). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 155, article id 104168. Article in journal (Refereed). Published.
Abstract [en]

This article presents a dataset collected from a subterranean (SubT) environment with current state-of-the-art sensors required for autonomous navigation. The dataset includes sensor measurements collected with RGB, RGB-D, event-based, and thermal cameras, 2D and 3D lidars, an inertial measurement unit (IMU), and an ultra-wideband (UWB) positioning system, all mounted on a mobile robot. This overall sensor setup is referred to throughout the article as the data collection platform. The dataset contains synchronized raw measurements from all the sensors in the robot operating system (ROS) message format, together with video feeds collected with action and 360 cameras. A detailed description of the sensors embedded in the data collection platform and of the data collection process is provided. The collected dataset is aimed at evaluating navigation, localization, and mapping algorithms in SubT environments. This article is accompanied by the public release of all collected datasets from the SubT environment. Link: Dataset. (C) 2022 The Author(s). Published by Elsevier B.V.
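The record notes that the raw sensor streams are released as synchronized ROS-format measurements. For readers who want to re-align streams recorded at different rates (e.g. camera frames against a faster IMU), a minimal nearest-timestamp matching sketch is shown below. This is an illustrative assumption about how one might consume such data, not part of the authors' tooling; the function name and the nanosecond timestamps are hypothetical.

```python
from bisect import bisect_left

def nearest_match(ref_stamps, sensor_stamps, tolerance):
    """For each reference timestamp, return the closest sensor timestamp
    within `tolerance`, or None if no reading is close enough.
    Both stamp lists must be sorted in ascending order."""
    matches = []
    for t in ref_stamps:
        i = bisect_left(sensor_stamps, t)
        # Only the stamps immediately below and above t can be nearest.
        candidates = sensor_stamps[max(i - 1, 0):i + 1]
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        matches.append(best if best is not None and abs(best - t) <= tolerance else None)
    return matches

# Example: a 10 Hz camera aligned against a 100 Hz IMU (stamps in nanoseconds).
camera = [0, 100_000_000, 200_000_000]
imu = [10_000_000 * k for k in range(25)]
print(nearest_match(camera, imu, tolerance=5_000_000))
# → [0, 100000000, 200000000] (each camera stamp has an exact IMU hit)
```

The same idea underlies ROS's approximate-time message synchronization: each reference message is paired with the temporally closest reading from every other stream, and pairs outside the tolerance window are dropped.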

Place, publisher, year, edition, pages
Elsevier, 2022. Vol. 155, article id 104168
Publication channel
2-s2.0-85132735325
Keywords [en]
Dataset, SubT, RGB, RGB-D, Event-based and thermal cameras, 2D and 3D lidars
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Robotics and Artificial Intelligence
Identifiers
URN: urn:nbn:se:ltu:diva-92577
DOI: 10.1016/j.robot.2022.104168
ISI: 000833416900009
Scopus ID: 2-s2.0-85132735325
OAI: oai:DiVA.org:ltu-92577
DiVA id: diva2:1688919
Funder
EU, Horizon Europe, 869379 illuMINEation
Note

Validated; 2022; Level 2; 2022-08-19 (hanlid)

Available from: 2022-08-19. Created: 2022-08-19. Last updated: 2024-03-26. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Koval, Anton; Karlsson, Samuel; Mansouri, Sina Sharif; Kanellakis, Christoforos; Tevetzidis, Ilias; Haluska, Jakub; Nikolakopoulos, George
