Kanellakis, Christoforos
Publications (10 of 16)
Mansouri, S. S., Arranz, M. C., Kanellakis, C. & Nikolakopoulos, G. (2019). Autonomous MAV Navigation in Underground Mines Using Darkness Contours Detection. Paper presented at 12th International Conference on Computer Vision Systems (ICVS 2019).
Autonomous MAV Navigation in Underground Mines Using Darkness Contours Detection
2019 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This article considers a low-cost and lightweight platform for autonomous flight and inspection in underground mine tunnels. The main contribution of this paper is the integration of simple, efficient and well-established computer vision methods into a state-of-the-art vision-based system for Micro Aerial Vehicle (MAV) navigation in dark tunnels. These methods include Otsu's threshold and Moore-Neighborhood object tracing. The vision system detects the position of low-illuminated tunnel openings in the image frame by exploiting the inherent darkness in the longitudinal direction. This position is then converted from pixel coordinates into a heading-rate command that steers the MAV towards the center of the tunnel. The efficacy of the proposed framework has been evaluated in multiple experimental field trials in an underground mine in Sweden, demonstrating the capability of low-cost and resource-constrained aerial vehicles to fly autonomously through confined tunnel spaces.
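As a rough illustration of the pipeline the abstract describes, the sketch below (not the authors' code) applies Otsu's threshold to a grayscale frame, takes the centroid of the dark region, and maps its horizontal offset to a heading-rate command. The flat pixel list, the proportional gain `k` and the sign convention are assumptions, and the Moore-Neighborhood contour-tracing step is omitted for brevity.

```python
def otsu_threshold(gray):
    """Otsu's method: pick the threshold that maximises the between-class
    variance of the grayscale histogram (pixel values 0..255)."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    total_sum = sum(i * h for i, h in enumerate(hist))
    w_bg = sum_bg = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def heading_rate(image, width, k=0.01):
    """Segment the dark tunnel region, take its centroid, and map the
    horizontal pixel offset from the image centre to a yaw-rate command
    (gain k and sign convention are hypothetical)."""
    t = otsu_threshold(image)
    dark_cols = [i % width for i, v in enumerate(image) if v <= t]
    if not dark_cols:
        return 0.0
    centroid_x = sum(dark_cols) / len(dark_cols)
    return -k * (centroid_x - width / 2)  # steer toward the dark centroid
```

On a toy 4x2 frame that is bright on the left and dark on the right, the dark centroid lies right of centre and the command steers accordingly.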

Keywords
Micro Aerial Vehicles (MAVs), Vision-based Navigation, Autonomous Drift Inspection, Otsu's Threshold, Moore-Neighborhood Tracing
National Category
Control Engineering Other Civil Engineering
Research subject
Control Engineering; Operation and Maintenance
Identifiers
urn:nbn:se:ltu:diva-75270 (URN)
Conference
12th International Conference on Computer Vision Systems (ICVS 2019)
Funder
EU, Horizon 2020, 730302
Available from: 2019-07-09 Created: 2019-07-09 Last updated: 2019-08-13
Kanellakis, C. & Nikolakopoulos, G. (2019). Guidance for Autonomous Aerial Manipulator Using Stereo Vision. Journal of Intelligent and Robotic Systems
Guidance for Autonomous Aerial Manipulator Using Stereo Vision
2019 (English). In: Journal of Intelligent and Robotic Systems, ISSN 0921-0296, E-ISSN 1573-0409. Article in journal (Refereed), Epub ahead of print
Abstract [en]

Combining the agility of Micro Aerial Vehicles (MAVs) with the dexterity of robotic arms leads to a new era of Aerial Robotic Workers (ARWs) targeting infrastructure inspection and maintenance tasks. Towards this vision, this work focuses on the autonomous guidance of the aerial end-effector to either reach, or keep a desired distance from, areas/objects of interest. The proposed system: 1) is structured around a real-time object tracker, 2) employs stereo depth perception to extract the target location within the surrounding scene, and 3) generates feasible poses for both the arm and the MAV relative to the target. The performance of the proposed scheme is experimentally demonstrated in multiple scenarios of increasing complexity.
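The stereo depth step the abstract relies on follows the standard rectified-stereo relations (Z = f·B/disparity). The sketch below is an illustration, not the paper's implementation: it triangulates a tracked target and places a goal a fixed standoff short of it; `approach_waypoint` and all parameter values are hypothetical.

```python
def triangulate(u_l, u_r, v, fx, fy, cx, cy, baseline):
    """Recover a 3D point (camera frame) from a rectified stereo
    correspondence using the pinhole model: Z = fx * B / disparity."""
    disparity = u_l - u_r
    if disparity <= 0:
        raise ValueError("target must have positive disparity")
    z = fx * baseline / disparity
    x = (u_l - cx) * z / fx
    y = (v - cy) * z / fy
    return x, y, z

def approach_waypoint(target_xyz, standoff):
    """Hypothetical stand-in for the pose generation in the abstract:
    place the goal on the camera-target ray, `standoff` metres short."""
    x, y, z = target_xyz
    norm = (x * x + y * y + z * z) ** 0.5
    scale = max(norm - standoff, 0.0) / norm
    return x * scale, y * scale, z * scale
```

With f = 500 px, a 0.1 m baseline and 50 px of disparity, the target sits 1 m ahead, and a 0.5 m standoff halves the commanded approach along the ray.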

Place, publisher, year, edition, pages
Springer, 2019
Keywords
Vision based guidance, Aerial manipulator, MAV
National Category
Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-75822 (URN) 10.1007/s10846-019-01060-8 (DOI)
Available from: 2019-09-03 Created: 2019-09-03 Last updated: 2019-09-03. Bibliographically approved
Kanellakis, C., Mansouri, S. S., Georgoulas, G. & Nikolakopoulos, G. (2019). Towards Autonomous Surveying of Underground Mine using MAVs. Paper presented at 27th International Conference on Robotics in Alpe-Adria-Danube Region, Patras, Greece, June 6-8, 2018 (pp. 173-180). Springer, 67
Towards Autonomous Surveying of Underground Mine using MAVs
2019 (English). Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

Micro Aerial Vehicles (MAVs) are platforms that have received great attention during the last decade, and the mining industry has recently been considering the use of autonomous aerial platforms in its processes. This article initially investigates potential application scenarios for this technology in mining, one of the main tasks being the surveillance and maintenance of infrastructure assets. Employing these robots for underground surveillance of areas like shafts, tunnels or large voids after blasting requires, among others, the development of elaborate navigation modules. This paper proposes a method to assist the navigation capabilities of MAVs in challenging mine environments, like tunnels and vertical shafts. The proposed method uses the Potential Fields method, tailored to implement a sense-and-avoid system with a minimal ultrasound-based sensory system. Simulation results demonstrate the effectiveness of the proposed strategy.
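A minimal sketch of the Potential Fields sense-and-avoid idea, assuming a set of ultrasound ranges with known beam bearings in the body frame. The Khatib-style repulsion formula is a common choice; the gain `k` and influence distance `d0` are hypothetical, not the paper's tuning.

```python
import math

def repulsive_command(ranges, angles, d0=1.5, k=0.5):
    """Classic potential-field repulsion: every sonar return closer than
    the influence distance d0 pushes the vehicle away along its beam.
    This is the gradient of U = 0.5*k*(1/d - 1/d0)^2 for d < d0."""
    fx = fy = 0.0
    for d, a in zip(ranges, angles):
        if d < d0:
            mag = k * (1.0 / d - 1.0 / d0) / (d * d)
            fx -= mag * math.cos(a)  # push opposite to the obstacle bearing
            fy -= mag * math.sin(a)
    return fx, fy
```

An obstacle 0.5 m dead ahead produces a purely backward command; returns beyond `d0` contribute nothing, so open tunnel sections leave the vehicle unperturbed.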

Place, publisher, year, edition, pages
Springer, 2019
Series
Mechanisms and Machine Science, ISSN 2211-0984
Keywords
MAV, Underground Mines, Navigation
National Category
Engineering and Technology Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-70113 (URN) 10.1007/978-3-030-00232-9_18 (DOI) 000465020800018 () 2-s2.0-85054305469 (Scopus ID)
Conference
27th International Conference on Robotics in Alpe-Adria-Danube Region, Patras, Greece, June 6-8, 2018
Available from: 2018-07-12 Created: 2018-07-12 Last updated: 2019-05-02. Bibliographically approved
Mansouri, S. S., Karvelis, P., Kanellakis, C., Kominiak, D. & Nikolakopoulos, G. (2019). Vision-based MAV Navigation in Underground Mine Using Convolutional Neural Network. Paper presented at IEEE Industrial Electronics Society.
Vision-based MAV Navigation in Underground Mine Using Convolutional Neural Network
2019 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This article presents a Convolutional Neural Network (CNN) method to enable autonomous navigation of low-cost Micro Aerial Vehicle (MAV) platforms along dark underground mine environments. The proposed CNN component provides on-line heading rate commands for the MAV by utilising the image stream from the on-board camera, thus allowing the platform to follow a collision-free path along the tunnel axis. A novel part of the developed method is the generation of the data-set used for training the CNN. More specifically, inspired by single-image haze removal algorithms, various image data-sets collected from real tunnel environments have been processed offline to provide an estimation of the depth information of the scene, where ground truth is not available. The calculated depth map is used to extract the open space in the tunnel, expressed through its area centroid, which is finally provided for training the CNN. The method considers the MAV as a floating object, thus accurate pose estimation is not required. Finally, the capability of the proposed method has been successfully evaluated in experimental field trials in an underground mine in Sweden.
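The haze-inspired depth labelling can be illustrated with the dark-channel prior, under which transmission decays exponentially with depth. This is only a sketch under assumed parameters (airlight `A`, `omega`, scattering coefficient `beta` are all hypothetical), with the prior's local patch minimum reduced to a single pixel for brevity.

```python
import math

def dark_channel(rgb_pixels):
    """Per-pixel dark channel: minimum over colour channels (the original
    prior takes the minimum over a local patch as well)."""
    return [min(p) for p in rgb_pixels]

def depth_proxy(rgb_pixels, airlight=255.0, omega=0.95, beta=1.0):
    """Haze-removal style depth estimate: transmission t = 1 - omega*dc/A,
    and under the haze model t = exp(-beta*d), so d = -ln(t)/beta."""
    depths = []
    for dc in dark_channel(rgb_pixels):
        t = max(1.0 - omega * dc / airlight, 1e-6)  # clamp to avoid log(0)
        depths.append(-math.log(t) / beta)
    return depths
```

Pixels with a large dark channel (hazy, washed-out regions) map to larger depths, which is exactly the cue used to locate the open space along the tunnel axis.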

Keywords
Mining Aerial Robotics, Deep Learning for Navigation, MAV
National Category
Robotics
Identifiers
urn:nbn:se:ltu:diva-75674 (URN)
Conference
IEEE Industrial Electronics Society
Available from: 2019-08-23 Created: 2019-08-23 Last updated: 2019-08-23
Mansouri, S. S., Karvelis, P., Kanellakis, C., Koval, A. & Nikolakopoulos, G. (2019). Visual Subterranean Junction Recognition for MAVs based on Convolutional Neural Networks. Paper presented at IEEE 45th Annual Conference of the Industrial Electronics Society (IECON 2019).
Visual Subterranean Junction Recognition for MAVs based on Convolutional Neural Networks
2019 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This article proposes a novel visual framework for detecting tunnel crossings/junctions in underground mine areas towards the autonomous navigation of Micro Aerial Vehicles (MAVs). Mine environments usually have complex geometries, including multiple crossings between different tunnels, that challenge the autonomous planning of aerial robots. Towards the envisioned scenario of autonomous or semi-autonomous deployment of MAVs with limited Line-of-Sight in subterranean environments, the proposed module acknowledges the existence of junctions by providing crucial information to the autonomy and planning layers of the aerial vehicle. The capability of junction detection is necessary in the majority of mission scenarios, including unknown area exploration, known area inspection and robot homing missions. The proposed method feeds the image stream from the vehicle's on-board forward-facing camera into a Convolutional Neural Network (CNN) classification architecture, expressed in four categories: 1) left junction, 2) right junction, 3) left & right junction, and 4) no junction in the local vicinity of the vehicle. The core contribution stems from the incorporation of AlexNet in a transfer learning scheme for detecting multiple branches in a subterranean environment. The proposed method has been validated on multiple data-sets collected from real underground environments, demonstrating its performance and merits.
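The transfer-learning recipe — freeze the pretrained backbone (AlexNet in the paper) and fit only a small classifier head on its feature vectors — can be shown in miniature with plain softmax regression over four classes. The toy one-hot features below are hypothetical stand-ins for AlexNet activations; nothing here is the authors' training code.

```python
import math

def train_linear_head(features, labels, n_classes=4, lr=0.5, epochs=200):
    """Fit a linear softmax head on frozen feature vectors by plain
    gradient descent on the cross-entropy loss."""
    dim = len(features[0])
    w = [[0.0] * dim for _ in range(n_classes)]
    for _ in range(epochs):
        for x, y in zip(features, labels):
            logits = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
            m = max(logits)                       # stabilised softmax
            exps = [math.exp(l - m) for l in logits]
            s = sum(exps)
            probs = [e / s for e in exps]
            for c in range(n_classes):
                grad = probs[c] - (1.0 if c == y else 0.0)
                for i in range(dim):
                    w[c][i] -= lr * grad * x[i]
    return w

def predict(w, x):
    logits = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
    return logits.index(max(logits))
```

The four labels would correspond to the paper's categories (left, right, left & right, no junction); only the head's weights change, mirroring the frozen-backbone scheme.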

National Category
Engineering and Technology
Identifiers
urn:nbn:se:ltu:diva-75555 (URN)
Conference
IEEE 45th Annual Conference of the Industrial Electronics Society (IECON 2019)
Available from: 2019-08-16 Created: 2019-08-16 Last updated: 2019-08-16
Mansouri, S. S., Kanellakis, C., Georgoulas, G., Kominiak, D., Gustafsson, T. & Nikolakopoulos, G. (2018). 2D visual area coverage and path planning coupled with camera footprints. Control Engineering Practice, 75, 1-16
2D visual area coverage and path planning coupled with camera footprints
2018 (English). In: Control Engineering Practice, ISSN 0967-0661, E-ISSN 1873-6939, Vol. 75, p. 1-16. Article in journal (Refereed), Published
Abstract [en]

Unmanned Aerial Vehicles (UAVs) equipped with visual sensors are widely used in area coverage missions, where guaranteeing full coverage coupled with the camera footprint is one of the most challenging tasks. In the presented novel approach, a coverage path planner for the inspection of 2D areas is established: a 3 Degree of Freedom (DoF) camera movement is considered, and the shortest path from the take-off to the landing station is generated while covering the target area. The proposed scheme requires a priori information about the boundaries of the target area and generates the paths in an offline process. The efficacy and the overall performance of the proposed method have been experimentally evaluated in multiple indoor inspection experiments with convex and non-convex areas. Furthermore, the image streams collected during the coverage tasks were post-processed using image stitching to obtain a single overview of the covered scene.
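The footprint coupling can be illustrated with a boustrophedon sweep of a rectangle, spacing tracks one camera-footprint apart. The paper handles general convex and non-convex areas with a 3-DoF camera, so this is only a minimal sketch of the spacing idea, not the published planner.

```python
def lawnmower(width, height, footprint):
    """Boustrophedon sweep of a width x height rectangle: parallel tracks
    spaced one camera footprint apart, alternating direction so adjacent
    passes tile the area with full coverage and a short transit path."""
    waypoints = []
    x, up = footprint / 2.0, True
    while x <= width - footprint / 2.0 + 1e-9:
        y0, y1 = (0.0, height) if up else (height, 0.0)
        waypoints += [(x, y0), (x, y1)]
        x += footprint
        up = not up
    return waypoints
```

For a 4 m x 2 m area and a 2 m footprint, two tracks at x = 1 and x = 3 suffice, with the second flown in the opposite direction.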

Place, publisher, year, edition, pages
Elsevier, 2018
National Category
Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-68057 (URN) 10.1016/j.conengprac.2018.03.011 (DOI) 000433648100001 () 2-s2.0-85044107984 (Scopus ID)
Projects
Collaborative Aerial Robotic Workers, AEROWORKS
Funder
EU, Horizon 2020, 644128
Note

Validated; 2018; Level 2; 2018-03-26 (andbra)

Available from: 2018-03-26 Created: 2018-03-26 Last updated: 2018-08-09. Bibliographically approved
Kanellakis, C., Mansouri, S. S., Fresk, E., Kominiak, D. & Nikolakopoulos, G. (2018). Cooperative UAVs as a Tool for Aerial Inspection of Large Scale Aging Infrastructure. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Paper presented at 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1-5 Oct. 2018 (pp. 5040-5040). Piscataway, NJ: IEEE
Cooperative UAVs as a Tool for Aerial Inspection of Large Scale Aging Infrastructure
2018 (English). In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Piscataway, NJ: IEEE, 2018, p. 5040-5040. Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

This work presents an aerial tool towards the autonomous cooperative coverage and inspection of large-scale 3D infrastructure using multiple Unmanned Aerial Vehicles (UAVs). In the presented approach the UAVs rely only on their onboard computer and sensory system, deployed for inspection of the 3D structure. In this application each agent covers a different part of the scene autonomously, while avoiding collisions. The autonomous navigation of each platform along the designed path is enabled by a localization system that fuses Ultra Wideband with inertial measurements through an Error-State Kalman Filter. The visual information collected from the aerial team is collaboratively processed to create the 3D model. The performance of the overall setup has been experimentally evaluated in realistic wind turbine inspection experiments, providing dense 3D reconstruction of the inspected structures.
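The fusion step can be illustrated with a 1D Kalman filter: the accelerometer drives the prediction and each UWB-derived position corrects it. A real Error-State Kalman Filter estimates the error of a nonlinear nominal state rather than the state itself, but the gain arithmetic below is the same; the noise values `q` and `r` are hypothetical.

```python
def kalman_1d(z_uwb, u_acc, dt, q=0.1, r=0.2):
    """Toy 1D analogue of UWB/inertial fusion: propagate position with the
    accelerometer (prediction) and correct it with each UWB measurement
    (update). Returns the final state estimate and its variance."""
    x, v, p = 0.0, 0.0, 1.0
    for z, a in zip(z_uwb, u_acc):
        v += a * dt                # IMU-driven prediction
        x += v * dt
        p += q                     # process noise inflates uncertainty
        k = p / (p + r)            # Kalman gain
        x += k * (z - x)           # UWB position correction
        p *= (1.0 - k)
    return x, p
```

With a stationary vehicle measured at 5 m, the estimate converges to the measurement and the variance settles near its steady-state value.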

Place, publisher, year, edition, pages
Piscataway, NJ: IEEE, 2018
Series
IEEE International Conference on Intelligent Robots and Systems, ISSN 2153-0858, E-ISSN 2153-0866
National Category
Robotics Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-72850 (URN) 10.1109/IROS.2018.8593996 (DOI) 000458872704097 () 978-1-5386-8095-7 (ISBN) 978-1-5386-8094-0 (ISBN) 978-1-5386-8093-3 (ISBN)
Conference
2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1-5 Oct. 2018
Funder
EU, Horizon 2020
Note

abstract + video

Available from: 2019-02-12 Created: 2019-02-12 Last updated: 2019-03-27. Bibliographically approved
Mansouri, S. S., Kanellakis, C., Fresk, E., Kominiak, D. & Nikolakopoulos, G. (2017). Cooperative UAVs as a tool for Aerial Inspection of the Aging Infrastructure. In: Marco Hutter, Roland Siegwart (Ed.), Field and Service Robotics: Results of the 11th International Conference. Paper presented at 11th Conference on Field and Service Robotics, FSR 2017, Zürich, 12.-15.9.2017 (pp. 177-189). Cham: Springer
Cooperative UAVs as a tool for Aerial Inspection of the Aging Infrastructure
2017 (English). In: Field and Service Robotics: Results of the 11th International Conference / [ed] Marco Hutter, Roland Siegwart, Cham: Springer, 2017, p. 177-189. Conference paper, Published paper (Refereed)
Abstract [en]

This article presents an aerial tool towards the autonomous cooperative coverage and inspection of a 3D infrastructure using multiple Unmanned Aerial Vehicles (UAVs). In the presented approach the UAVs rely only on their onboard computer and sensory system, deployed for inspection of the 3D structure. In this application each agent covers a different part of the scene autonomously, while avoiding collisions. The visual information collected from the aerial team is collaboratively processed to create the 3D model. The performance of the overall setup has been experimentally evaluated in realistic outdoor infrastructure inspection experiments, providing sparse and dense 3D reconstruction of the inspected structures.

Place, publisher, year, edition, pages
Cham: Springer, 2017
Series
Springer Proceedings in Advanced Robotics, ISSN 2511-1256 ; 5
National Category
Robotics Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-66211 (URN) 10.1007/978-3-319-67361-5_12 (DOI) 978-3-319-67360-8 (ISBN) 978-3-319-67361-5 (ISBN)
Conference
11th Conference on Field and Service Robotics, FSR 2017, Zürich, 12.-15.9.2017
Projects
Collaborative Aerial Robotic Workers, AEROWORKS
Funder
EU, Horizon 2020, 644128
Available from: 2017-10-22 Created: 2017-10-22 Last updated: 2018-05-29. Bibliographically approved
Kanellakis, C., Mansouri, S. S. & Nikolakopoulos, G. (2017). Dynamic visual sensing based on MPC controlled UAVs. In: 2017 25th Mediterranean Conference on Control and Automation, MED 2017. Paper presented at 25th Mediterranean Conference on Control and Automation, MED 2017, University of Malta, Valletta, Malta, 3-6 July 2017 (pp. 1201-1206). Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), Article ID 7984281.
Dynamic visual sensing based on MPC controlled UAVs
2017 (English). In: 2017 25th Mediterranean Conference on Control and Automation, MED 2017, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 1201-1206, article id 7984281. Conference paper, Published paper (Refereed)
Abstract [en]

This article considers the establishment of a dynamic visual sensor from monocular cameras to enable a reconfigurable environmental perception. The cameras are mounted on Micro Aerial Vehicles (MAVs) which are coordinated by a Model Predictive Control (MPC) scheme to retain overlapping fields of view and form a global sensor with a varying baseline. The specific merits of the proposed scheme are: a) the ability to form a configurable stereo rig according to the application needs, and b) the simple design, the reduction of the payload and the corresponding cost. Moreover, the proposed configurable sensor provides a global 3D reconstruction of the surrounding area, based on a modified Structure from Motion approach. The efficiency of the suggested flexible visual sensor is demonstrated in simulation results that highlight the novel concept of cooperative flying cameras and their 3D reconstruction capabilities.
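The benefit of a varying baseline follows from the standard stereo depth-error relation σ_z ≈ z²·σ_d/(f·b): inverting it gives the inter-MAV baseline needed to hold a desired depth accuracy at the working range. The function below is a sketch of this relation; symbols and values are illustrative, not taken from the paper.

```python
def baseline_for_depth_accuracy(z, f_px, sigma_d_px, sigma_z):
    """Stereo depth error grows as sigma_z ~ z^2 * sigma_d / (f * b);
    invert it to pick the baseline b that keeps the depth error at the
    working depth z below sigma_z (all quantities in consistent units)."""
    return z * z * sigma_d_px / (f_px * sigma_z)
```

At 10 m range with a 500 px focal length and half a pixel of disparity noise, holding 0.1 m depth accuracy needs a 1 m baseline — well beyond a single airframe, which is exactly what a pair of coordinated MAVs can provide.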

Place, publisher, year, edition, pages
Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017
Series
Mediterranean Conference on Control and Automation, ISSN 2325-369X
National Category
Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-65606 (URN) 10.1109/MED.2017.7984281 (DOI) 000426926300196 () 2-s2.0-85028515340 (Scopus ID) 9781509045334 (ISBN)
Conference
25th Mediterranean Conference on Control and Automation, MED 2017, University of Malta, Valletta, Malta, 3-6 July 2017
Projects
Collaborative Aerial Robotic Workers, AEROWORKS
Funder
EU, Horizon 2020, 644128
Available from: 2017-09-12 Created: 2017-09-12 Last updated: 2018-05-29. Bibliographically approved
Kanellakis, C. (2017). On Visual Perception for an Aerial Robotic Worker. (Licentiate dissertation). Luleå tekniska universitet
On Visual Perception for an Aerial Robotic Worker
2017 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Micro Aerial Vehicles, and especially multi-rotors, are gaining more and more attention for accomplishing complex tasks, considering their simple mechanical design and their versatile movement. MAVs are ideal candidates to perform tasks autonomously, to work safely in close proximity and in collaboration with humans, and to operate safely and effectively in natural human environments, like infrastructure inspection-maintenance, underground mine operations and surveillance missions. Adopting this vision, this thesis contributes to the aerial platform ecosystem summarized by the term Aerial Robotic Worker (ARW). An ARW is characterized, among others, by its advanced capabilities in environmental perception, 3D reconstruction and active aerial manipulation. Using cameras for localization and mapping of an ARW, as well as for guidance of aerial manipulation, is appealing mainly because of the small size and cost of such sensors. Nevertheless, the visual information provided by the cameras is enormous, posing significant challenges in real-time data processing while meeting the constraints of these platforms. An additional challenge in visual perception concerns the use of multiple agents that collaboratively perceive their surroundings, forming an aerial sensor. This thesis also investigates the applicability of visual SLAM algorithms in uncontrolled and cluttered environments. Furthermore, work is presented on visual guidance for an aerial manipulator, which is challenging regarding object detection, tracking and the platform approaching strategies. The first contribution is the establishment of a flexible virtual stereo rig consisting of MPC-controlled MAVs. The advantage of this approach is a varying-baseline sensor composed of independently moving cameras, adjusting the depth perception accordingly. This method is able to provide a 3D reconstruction of the environment as a sparse pointcloud.
The second contribution of this thesis examines single agents in two different scenarios. Initially, experimental trials of commonly used visual sensors in hard and challenging environments are presented in a real-scale underground ore mine, to evaluate the localization and mapping performance of such technology for potential usage in UAVs. Secondly, theoretical work is performed regarding the attitude regulation of a hexacopter for stable hovering based on visual localization. In this work the time delays induced by the processing are compensated with a switching control scheme which is able to maintain the stability of the platform. Finally, the third contribution of this thesis is vision for aerial manipulation. The developed system includes a stereo camera that is attached to the end-effector of the aerial manipulator and is used to provide robust target detection and tracking. The visual feedback is processed to co-localize the aerial agent with the target and to generate a waypoint that allows the platform to approach the target.

Place, publisher, year, edition, pages
Luleå tekniska universitet, 2017
Series
Licentiate thesis / Luleå University of Technology, ISSN 1402-1757
National Category
Control Engineering
Research subject
Control Engineering
Identifiers
urn:nbn:se:ltu:diva-65535 (URN) 978-91-7583-955-4 (ISBN) 978-91-7583-956-1 (ISBN)
Presentation
2017-10-10, D2223, Luleå, 09:00 (English)
Funder
EU, Horizon 2020
Available from: 2017-09-11 Created: 2017-09-08 Last updated: 2017-11-24. Bibliographically approved