  • 1.
    Fresk, Emil
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Mansouri, Sina Sharif
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Halén, Erik
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Reduced complexity calibration of MEMS IMUs (2017). In: 2017 25th Mediterranean Conference on Control and Automation, MED 2017, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 1316-1320, article id 7984300. Conference paper (Refereed)
    Abstract [en]

    In this article a reduced complexity calibration method for Micro-Electro-Mechanical Systems (MEMS) Inertial Measurement Units (IMUs) is presented, which does not need the rotating reference tables commonly used in gyroscope calibration. In the proposed novel scheme, fixed-angle rotations are utilized to observe the integral of the gyroscope signals and find the corresponding sensitivity, axis misalignment and acceleration sensitivity matrices. This approach has the significant merits of high norm accuracy, ease of use, low cost and simplicity of construction, thus allowing anyone with basic electronics knowledge to calibrate an IMU.
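
    The core step described above — recovering a combined sensitivity/misalignment matrix from the integrated gyroscope signals over known fixed-angle rotations — can be illustrated with a small least-squares sketch. This is not the paper's implementation; the array layout, the assumption that the applied rotations are known exactly, and the function name are all illustrative.

```python
import numpy as np

def gyro_calibration_matrix(raw_rates, dt, true_rotations):
    """Estimate a 3x3 gyro calibration matrix (sensitivity + axis misalignment)
    from a set of fixed-angle rotations, without a rate table.

    raw_rates      : list of (T_i, 3) arrays of raw gyro samples, one per rotation
    dt             : sample period [s]
    true_rotations : (N, 3) known rotation angles [rad] applied about the rig axes,
                     e.g. +/- pi/2 about each axis of a simple fixture
    """
    # Integrate each raw rate sequence to get the measured rotation per experiment.
    measured = np.array([np.sum(r, axis=0) * dt for r in raw_rates])      # (N, 3)

    # Model: measured = true_rotations @ S.T, where S lumps sensitivity and
    # misalignment. Solve for S in a least-squares sense over all rotations.
    S_T, *_ = np.linalg.lstsq(true_rotations, measured, rcond=None)

    # The calibration matrix maps raw rates to calibrated ones: omega_cal = M @ omega_raw.
    return np.linalg.inv(S_T.T)
```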

  • 2.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    On Visual Perception for an Aerial Robotic Worker (2017). Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Micro Aerial Vehicles (MAVs), and especially multirotors, are gaining more and more attention for accomplishing complex tasks, considering their simple mechanical design and their versatile movement. MAVs are ideal candidates to perform tasks autonomously, to work safely in close proximity and in collaboration with humans, and to operate safely and effectively in natural human environments, such as infrastructure inspection and maintenance, underground mine operations and surveillance missions. Adopting this vision, this thesis contributes to the aerial platform ecosystem that can be summarized by the term Aerial Robotic Worker (ARW). An ARW is characterized, among other things, by advanced capabilities in environmental perception, 3D reconstruction and active aerial manipulation. Using cameras for localization and mapping of an ARW, as well as for guidance of aerial manipulation, is appealing mainly because of the small size and cost of such sensors. Nevertheless, the amount of visual information provided by the cameras is enormous, posing significant challenges in real-time data processing while meeting the constraints of these platforms. An additional challenge in visual perception concerns the use of multiple agents that collaboratively perceive their surroundings, forming an aerial sensor. This thesis also investigates the applicability of visual SLAM algorithms in uncontrolled and cluttered environments. Furthermore, work is presented on visual guidance for an aerial manipulator, which is challenging with respect to object detection, tracking and the platform approaching strategies.

    The first contribution is the establishment of a flexible virtual stereo rig consisting of MPC-controlled MAVs. The advantage of this approach is the varying-baseline sensor composed of independently moving cameras, adjusting the depth perception accordingly. This method is able to provide a 3D reconstruction of the environment as a sparse point cloud. The second contribution of this thesis examines single agents in two different scenarios. Initially, experimental trials of commonly used visual sensors in hard and challenging environments are presented in a real-scale underground ore mine, to evaluate the localization and mapping performance of such technology for potential usage on UAVs. Secondly, theoretical work is performed regarding attitude regulation of a hexacopter for stable hovering based on visual localization. In this work the time delays induced by the processing are compensated by a switching control scheme that is able to maintain the stability of the platform. Finally, the third contribution of this thesis is vision for aerial manipulation. The developed system includes a stereo camera that is attached to the end-effector of the aerial manipulator and is used to provide robust target detection and tracking. The visual feedback is processed to co-localize the aerial agent with the target and generate a waypoint that allows the platform to approach the target.

  • 3.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Perception Aware Guidance Framework for Micro Aerial Vehicles (2020). Doctoral thesis, monograph (Other academic)
    Abstract [en]

    Micro Aerial Vehicles (MAVs) are platforms that have received significant research attention within the robotics community, since they are characterized by a simple mechanical design and versatile movement. These platforms possess capabilities that are suitable for complex task execution in situations which are impossible or dangerous for a human operator, and they can reduce operating costs and increase the overall efficiency of the operation. Until now they have been integrated in the photography and filming industry, but more and more efforts are directed towards remote reconnaissance and inspection applications. Moreover, instead of carrying only sensors, these platforms can be endowed with lightweight dexterous robotic arms, expanding their operational workspace and allowing active interaction with the environment, capabilities that can be vital for applications like payload transportation and infrastructure maintenance. The main objective of this thesis is to establish the concept of the resource-constrained aerial robotic scout and present perception-aware frameworks for guidance of the platform and the aerial manipulator, as part of the enabling technology towards fully autonomous capabilities. The majority of the work has been developed with the application scenario of MAV deployments in subterranean environments in mind, for search and rescue missions, infrastructure inspection and other tasks.

    A key factor when deploying aerial platforms in dark and cluttered underground tunnels is the lack of illumination, which degrades the performance of the visual sensor. It is essential for the inspection or reconnaissance task to get visual feedback from the robot, and therefore this thesis evaluates methods for low-light image enhancement in real environments and with datasets collected from flying vehicles, while proposing a preprocessing methodology for the visual dataset that enhances the 3D mapping of the area. Another capability required when deploying the platforms is navigation along the tunnel. This thesis establishes a robocentric Nonlinear Model Predictive Control (NMPC) framework for fast, fully autonomous navigation of quadrotors in featureless dark tunnel environments. Additionally, this work leverages the processing of a single camera to generate direction commands along the tunnel axis, while regulating the platform's altitude. Finally, combining the agility of MAVs with the dexterity of robotic arms leads to a new era of Aerial Robotic Workers (ARWs) with advanced capabilities, suitable for complex task execution. This technology has the potential to revolutionize infrastructure maintenance tasks. The development of efficient and reliable perception modules to guide the aerial platform to the desired target areas and perform the respective manipulation tasks is, among others, an essential step towards the envisioned goal. Thus, the aim of this work is the establishment of a visual guidance system to assist the aerial platform before applying any physical interaction. The proposed system is structured around a robust object tracker and is characterized by stereo vision capabilities for target position extraction, towards an autonomous aerial robotic worker.

  • 4.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Karvelis, Petros
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    On Image based Enhancement for 3D Dense Reconstruction of Low Light Aerial Visual Inspected Environments (2019). Conference paper (Refereed)
  • 5.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Karvelis, Petros
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Open Space Attraction Based Navigation in Dark Tunnels for MAVs (2019). Conference paper (Refereed)
  • 6.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kyritsis, George
    Electrical and Computer Engineering Department, University of Patras.
    Tsilomitrou, Ourania
    Electrical and Computer Engineering Department, University of Patras.
    Manesis, Stamatis
    Electrical and Computer Engineering Department, University of Patras.
    A low-cost stereoscopic µP-based vision system for industrial light objects grasping (2015). In: IEEE Mediterranean Conference on Control and Automation, Torremolinos, Spain, June 16-19, 2015 / [ed] V. Nunoz, Piscataway, NJ: IEEE Communications Society, 2015, p. 759-765, article id 7158837. Conference paper (Refereed)
    Abstract [en]

    This article describes a vision-based manipulation process for real-time tracking and grasping of moving objects, aimed at industrial manufacturing and assembly applications. Object recognition is implemented by means of a stereoscopic system using color-based methods built on the OpenCV libraries. The visual software is directly coupled with the control software of the Katana 6M90G robotic arm manufactured by Neuronics AG, running under the Linux-based distribution Lubuntu on a low-cost yet powerful Odroid U3 microprocessor board. Experimental studies validate the effectiveness of the implementation, while highlighting the benefits of the 3D pose estimation process.
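
    As a rough sketch of the kind of pipeline the abstract describes (color-based detection followed by stereo 3D pose estimation), the snippet below thresholds an object by an HSV color range in each camera and triangulates its centroid with OpenCV. The HSV bounds and the projection matrices from stereo calibration are placeholders; this is not the authors' actual code.

```python
import cv2
import numpy as np

def color_centroid(bgr, hsv_low, hsv_high):
    """Pixel centroid of the largest blob inside an HSV color range (or None)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_low, hsv_high)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def target_position(left_bgr, right_bgr, P_left, P_right, hsv_low, hsv_high):
    """3D position of the colored target in the left-camera frame, from one stereo pair."""
    pl = color_centroid(left_bgr, hsv_low, hsv_high)
    pr = color_centroid(right_bgr, hsv_low, hsv_high)
    if pl is None or pr is None:
        return None
    X = cv2.triangulatePoints(P_left, P_right, pl.reshape(2, 1), pr.reshape(2, 1))
    return (X[:3] / X[3]).ravel()
```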

  • 7.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Mansouri, Sina Sharif
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Fresk, Emil
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kominiak, Dariusz
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Cooperative UAVs as a Tool for Aerial Inspection of Large Scale Aging Infrastructure (2018). In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Piscataway, NJ: IEEE, 2018, p. 5040-5040. Conference paper (Refereed)
    Abstract [en]

    This work presents an aerial tool towards the autonomous cooperative coverage and inspection of a large-scale 3D infrastructure using multiple Unmanned Aerial Vehicles (UAVs). In the presented approach the UAVs rely only on their onboard computer and sensory system while deployed for inspection of the 3D structure. In this application each agent covers a different part of the scene autonomously, while avoiding collisions. The autonomous navigation of each platform along the designed path is enabled by a localization system that fuses Ultra-Wideband with inertial measurements through an Error-State Kalman Filter. The visual information collected by the aerial team is collaboratively processed to create the 3D model. The performance of the overall setup has been experimentally evaluated in realistic wind turbine inspection experiments, providing dense 3D reconstruction of the inspected structures.
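
    The localization described above fuses Ultra-Wideband measurements with inertial data in an Error-State Kalman Filter. A full ESKF (attitude, velocity, biases) is beyond a short sketch, but the essence of the UWB correction can be illustrated as an EKF-style position update against a single anchor at a known location; the reduction to a position-only state, the symbols and the function name are all assumptions made for illustration.

```python
import numpy as np

def uwb_range_update(p_pred, P_pred, anchor, z_range, sigma_r):
    """One EKF-style correction of an IMU-predicted position with a single
    UWB range measurement to an anchor at a known location.

    p_pred : (3,) predicted position        P_pred : (3, 3) its covariance
    anchor : (3,) anchor position           z_range: measured range [m]
    """
    diff = p_pred - anchor
    r_hat = np.linalg.norm(diff)              # predicted range
    H = (diff / r_hat).reshape(1, 3)          # Jacobian of the range w.r.t. position
    S = H @ P_pred @ H.T + sigma_r ** 2       # innovation covariance (1x1)
    K = P_pred @ H.T / S                      # Kalman gain (3x1)
    p_upd = p_pred + (K * (z_range - r_hat)).ravel()
    P_upd = (np.eye(3) - K @ H) @ P_pred
    return p_upd, P_upd
```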

  • 8.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Mansouri, Sina Sharif
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Georgoulas, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Towards Autonomous Surveying of Underground Mine using MAVs (2019). Conference paper (Refereed)
    Abstract [en]

    Micro Aerial Vehicles (MAVs) are platforms that have received great attention during the last decade. Recently, the mining industry has been considering the usage of aerial autonomous platforms in its processes. This article initially investigates potential application scenarios for this technology in mining. One of the main tasks refers to the surveillance and maintenance of infrastructure assets. Employing these robots for underground surveillance of areas like shafts, tunnels or large voids after blasting requires, among other things, the development of elaborate navigation modules. This paper proposes a method to assist the navigation capabilities of MAVs in challenging mine environments, like tunnels and vertical shafts. The proposed method considers the use of the Potential Fields method, tailored to implement a sense-and-avoid system with a minimal ultrasound-based sensor suite. Simulation results demonstrate the effectiveness of the proposed strategy.
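
    A minimal sketch of the Potential Fields idea mentioned above, assuming a ring of ultrasound sensors with known pointing directions: each obstacle closer than a safety radius adds a repulsive term, and the sum gives a 2D velocity command pushing the MAV toward open space. The gains and safety radius are placeholders, not values from the paper.

```python
import numpy as np

def repulsive_command(ranges, directions, d_safe=1.5, gain=0.8):
    """Sum classical repulsive potential-field contributions from a ring of
    ultrasound sensors and return a 2D velocity command away from obstacles.

    ranges     : (N,) measured distances [m]
    directions : (N, 2) unit vectors from the body centre toward each sensor
    """
    cmd = np.zeros(2)
    for d, u in zip(ranges, directions):
        if d < d_safe:                     # only obstacles inside the safety radius repel
            cmd -= gain * (1.0 / d - 1.0 / d_safe) * u / d ** 2
    return cmd
```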

  • 9.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Mansouri, Sina Sharif
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Dynamic visual sensing based on MPC controlled UAVs (2017). In: 2017 25th Mediterranean Conference on Control and Automation, MED 2017, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 1201-1206, article id 7984281. Conference paper (Refereed)
    Abstract [en]

    This article considers the establishment of a dynamic visual sensor from monocular cameras to enable a reconfigurable environmental perception. The cameras are mounted on Micro Aerial Vehicles (MAVs) which are coordinated by a Model Predictive Control (MPC) scheme to retain overlapping fields of view and form a global sensor with varying baseline. The specific merits of the proposed scheme are: a) the ability to form a configurable stereo rig according to the application needs, and b) the simple design, the reduction of the payload and the corresponding cost. Moreover, the proposed configurable sensor provides a global 3D reconstruction of the surrounding area, based on a modified Structure from Motion approach. The efficiency of the suggested flexible visual sensor is demonstrated in simulation results that highlight the novel concept of cooperative flying cameras and their 3D reconstruction capabilities.
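
    The benefit of a varying baseline can be seen from the standard stereo depth-error model (not taken from the paper): depth uncertainty grows with the square of the range and shrinks with the baseline, so two MAVs can widen their separation to keep a target depth accuracy at longer ranges. The sketch below applies that textbook relation, assuming a known focal length in pixels and a fixed disparity noise.

```python
def depth_sigma(z, baseline, focal_px, disp_sigma_px=0.5):
    """Textbook stereo depth error: sigma_z ~ z**2 * sigma_disparity / (f * b)."""
    return z ** 2 * disp_sigma_px / (focal_px * baseline)

def baseline_for_accuracy(z, sigma_z_target, focal_px, disp_sigma_px=0.5):
    """Baseline the two cameras should keep to hit a desired depth accuracy at range z."""
    return z ** 2 * disp_sigma_px / (focal_px * sigma_z_target)

# Example: at 20 m range with f = 600 px, reaching sigma_z = 0.25 m needs a ~1.3 m baseline.
```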

  • 10.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    A robust reconfigurable control scheme against pose estimation induced time delays (2016). In: 2016 IEEE Conference on Control Applications (CCA): 19-22 Sept. 2016, Piscataway, NJ: IEEE Communications Society, 2016, p. 581-586, article id 7587892. Conference paper (Refereed)
    Abstract [en]

    Time delays are one of the most common problems when utilizing a visual sensor for pose estimation or navigation in aerial robotics. Such time delays can grow exponentially as a function of the scene's complexity and the size of the map in classical Simultaneous Localization and Mapping (SLAM) strategies. In this article, a robust reconfigurable control scheme against pose estimation induced time delays is presented. Initially, an experimental verification of the time delays induced by pose estimation is performed for the attitude problem of a hexacopter, while a switching time-delay-dependent modeling approach is formulated. In addition, a stability analysis algorithm is introduced in order to evaluate the maximum allowable time delays that the target system can handle for a given LQR controller. The varying nature of the time delays results in a switching system with the latency playing the role of the switching rule, while simulation results are presented to outline the effects of the induced time delays in hexarotor-based systems and finally evaluate the overall efficiency of the proposed control scheme.
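
    One way to phrase the stability analysis sketched above — finding the maximum delay a given LQR gain tolerates — is to augment the discrete-time state with its delayed copies and check the spectral radius of the closed loop. The sketch below is a generic construction assuming a constant integer delay, not the switching-system analysis of the paper.

```python
import numpy as np

def delayed_loop_stable(A, B, K, delay_steps):
    """Closed-loop stability of x[k+1] = A x[k] - B K x[k-d], checked by augmenting
    the state with its d delayed copies and testing the spectral radius."""
    n = A.shape[0]
    d = delay_steps
    Phi = np.zeros((n * (d + 1), n * (d + 1)))
    Phi[:n, :n] = A                       # dynamics of the current state
    Phi[:n, n * d:] -= B @ K              # delayed feedback term (handles d = 0 too)
    Phi[n:, :n * d] = np.eye(n * d)       # shift register for the past states
    return np.max(np.abs(np.linalg.eigvals(Phi))) < 1.0

def max_allowable_delay(A, B, K, d_max=50):
    """Largest delay d (in samples) such that the loop is stable for every delay up to d."""
    for d in range(d_max + 1):
        if not delayed_loop_stable(A, B, K, d):
            return d - 1 if d > 0 else None
    return d_max
```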

  • 11.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Evaluation of Visual Localization Systems in Underground Mining (2016). In: 24th Mediterranean Conference on Control and Automation (MED): June 21-24, Athens, Greece, 2016, Piscataway, NJ: IEEE Communications Society, 2016, p. 539-544, article id 7535853. Conference paper (Refereed)
    Abstract [en]

    In this article an evaluation of the current technology on visual localization systems for underground mining is presented. The proposed study is considered to be the first step, among others, towards enabling the vision of underground localization for Unmanned Micro Aerial Vehicles. Furthermore, the aim of this article is to verify applicable and reliable low-cost existing methods and technologies for the problem of UAV localization in underground, harsh mining environments and, more specifically, in one of the biggest mines in Europe, the iron ore mine of LKAB in Kiruna, Sweden. In the experimental trials, the sensors employed were an RGB-D camera, a Kinect 2, and a PlayStation 3 Eye web camera used in two configurations, as a stereo rig and as a monocular visual sensor. The processing of the data stored during the experiments provides insight into the applicability of these sensors, while identifying what further technological and research developments are required in order to develop affordable autonomous UAV solutions for improving underground mining production processes.

  • 12.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Guidance for Autonomous Aerial Manipulator Using Stereo Vision (2019). In: Journal of Intelligent and Robotic Systems, ISSN 0921-0296, E-ISSN 1573-0409. Article in journal (Refereed)
    Abstract [en]

    Combining the agility of Micro Aerial Vehicles (MAVs) with the dexterity of robotic arms leads to a new era of Aerial Robotic Workers (ARWs) targeting infrastructure inspection and maintenance tasks. Towards this vision, this work focuses on the autonomous guidance of the aerial end-effector to either reach or keep a desired distance from areas/objects of interest. The proposed system: 1) is structured around a real-time object tracker, 2) employs stereo depth perception to extract the target location within the surrounding scene, and finally 3) generates feasible poses for both the arm and the MAV relative to the target. The performance of the proposed scheme is experimentally demonstrated in multiple scenarios of increasing complexity.
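
    A hedged sketch of step 3 above: turning a stereo-triangulated target position (expressed in the end-effector camera frame) into a world-frame waypoint that keeps a standoff distance from the target. The frame names, the standoff value and the simple straight-line approach direction are illustrative assumptions, not the paper's pose-generation scheme.

```python
import numpy as np

def approach_waypoint(target_cam, R_body_cam, t_body_cam,
                      R_world_body, p_world_body, standoff=0.5):
    """World-frame waypoint that keeps a standoff distance from a target whose
    position was triangulated in the end-effector camera frame.

    target_cam                 : (3,) target position in the camera frame
    R_body_cam, t_body_cam     : camera extrinsics in the MAV body frame
    R_world_body, p_world_body : current MAV pose in the world frame
    """
    target_body = R_body_cam @ target_cam + t_body_cam
    target_world = R_world_body @ target_body + p_world_body
    approach = target_world - p_world_body          # straight-line approach direction
    approach /= np.linalg.norm(approach)
    return target_world - standoff * approach       # hover point short of the target
```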

  • 13.
    Kanellakis, Christoforos
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Survey on Computer Vision for UAVs: Current Developments and Trends (2017). In: Journal of Intelligent and Robotic Systems, ISSN 0921-0296, E-ISSN 1573-0409, Vol. 87, no 1, p. 141-168. Article in journal (Refereed)
    Abstract [en]

    During the last decade, scientific research on Unmanned Aerial Vehicles (UAVs) has increased spectacularly and led to the design of multiple types of aerial platforms. The major challenge today is the development of autonomously operating aerial agents capable of completing missions independently of human interaction. To this end, visual sensing techniques have been integrated in the control pipeline of UAVs in order to enhance their navigation and guidance skills. The aim of this article is to present a comprehensive literature review on vision-based applications for UAVs, focusing mainly on current developments and trends. These applications are sorted into different categories according to the research topics among various research groups. More specifically, vision-based position-attitude control, pose estimation and mapping, obstacle detection and target tracking are the identified components towards autonomous agents. Aerial platforms could reach a greater level of autonomy by integrating all these technologies onboard. Additionally, throughout this article the concept of fusing multiple sensors is highlighted, while an overview of the challenges addressed and future trends in autonomous agent development is also provided.

  • 14.
    Mansouri, Sina Sharif
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Arranz, Miguel Castano
    Luleå University of Technology, Department of Civil, Environmental and Natural Resources Engineering, Operation, Maintenance and Acoustics.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Autonomous MAV Navigation in Underground Mines Using Darkness Contours Detection (2019). Conference paper (Refereed)
    Abstract [en]

    This article considers a low-cost and lightweight platform for the task of autonomous flying for inspection in underground mine tunnels. The main contribution of this paper is the integration of simple, efficient and well-established computer vision methods into a state-of-the-art vision-based system for Micro Aerial Vehicle (MAV) navigation in dark tunnels. These methods include Otsu's threshold and Moore-Neighborhood object tracing. The vision system can detect the position of low-illuminated tunnels in the image frame by exploiting the inherent darkness in the longitudinal direction. This position is then converted from pixel coordinates to a heading rate command for the MAV, adjusting the heading towards the center of the tunnel. The efficacy of the proposed framework has been evaluated in multiple experimental field trials in an underground mine in Sweden, thus demonstrating the capability of low-cost and resource-constrained aerial vehicles to fly autonomously through confined tunnel spaces.
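
    The vision pipeline above can be approximated in a few lines of OpenCV: Otsu's threshold isolates the dark tunnel opening, the largest contour stands in for the Moore-Neighborhood tracing step, and the horizontal offset of its centroid is mapped to a yaw-rate command. The gain and the sign convention are placeholders, and this is only a rough stand-in for the paper's implementation.

```python
import cv2
import numpy as np

def heading_rate_from_darkness(gray, k_yaw=0.8):
    """Yaw-rate command steering toward the dark (open) part of the tunnel image."""
    # Otsu picks the threshold separating the dark tunnel from the lit walls;
    # THRESH_BINARY_INV makes the dark region the foreground.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    m = cv2.moments(max(contours, key=cv2.contourArea))
    cx = m["m10"] / max(m["m00"], 1e-6)
    # Normalized horizontal offset of the dark-region centroid from the image centre.
    offset = (cx - gray.shape[1] / 2.0) / (gray.shape[1] / 2.0)
    return -k_yaw * offset     # sign convention (positive yaw = turn left) is an assumption
```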

  • 15.
    Mansouri, Sina Sharif
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Fresk, Emil
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kominiak, Dariusz
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Cooperative UAVs as a tool for Aerial Inspection of the Aging Infrastructure (2017). In: Field and Service Robotics: Results of the 11th International Conference / [ed] Marco Hutter, Roland Siegwart, Cham: Springer, 2017, p. 177-189. Conference paper (Refereed)
    Abstract [en]

    This article presents an aerial tool towards the autonomous cooperative coverage and inspection of a 3D infrastructure using multiple Unmanned Aerial Vehicles (UAVs). In the presented approach the UAVs rely only on their onboard computer and sensory system while deployed for inspection of the 3D structure. In this application each agent covers a different part of the scene autonomously, while avoiding collisions. The visual information collected by the aerial team is collaboratively processed to create the 3D model. The performance of the overall setup has been experimentally evaluated in realistic outdoor infrastructure inspection experiments, providing sparse and dense 3D reconstruction of the inspected structures.

  • 16.
    Mansouri, Sina Sharif
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Georgoulas, Georgios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kominiak, Dariusz
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Gustafsson, Thomas
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    2D visual area coverage and path planning coupled with camera footprints (2018). In: Control Engineering Practice, ISSN 0967-0661, E-ISSN 1873-6939, Vol. 75, p. 1-16. Article in journal (Refereed)
    Abstract [en]

    Unmanned Aerial Vehicles (UAVs) equipped with visual sensors are widely used in area coverage missions. Guaranteeing full coverage coupled with the camera footprint is one of the most challenging tasks. In the presented novel approach, a coverage path planner for the inspection of 2D areas is established, a 3 Degree of Freedom (DoF) camera movement is considered, and the shortest path from the take-off to the landing station is generated while covering the target area. The proposed scheme requires a priori information about the boundaries of the target area and generates the paths in an offline process. The efficacy and the overall performance of the proposed method have been experimentally evaluated in multiple indoor inspection experiments with convex and non-convex areas. Furthermore, the image streams collected during the coverage tasks were post-processed using image stitching for obtaining a single overview of the covered scene.
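
    To make the footprint coupling concrete, the sketch below computes the ground footprint of a nadir-looking camera from altitude and field of view and spaces the sweep lines of a simple back-and-forth path accordingly. This is a strong simplification of the paper's planner (rectangular area, fixed nadir camera, no 3-DoF camera motion, no shortest-path optimization); all parameters are illustrative.

```python
import numpy as np

def footprint(altitude, hfov_deg, vfov_deg):
    """Ground footprint (width, height) of a nadir-looking camera at a given altitude."""
    w = 2.0 * altitude * np.tan(np.radians(hfov_deg) / 2.0)
    h = 2.0 * altitude * np.tan(np.radians(vfov_deg) / 2.0)
    return w, h

def lawnmower(x_min, x_max, y_min, y_max, altitude, hfov_deg, vfov_deg, overlap=0.2):
    """Back-and-forth sweep over a rectangle with footprint-based line spacing."""
    w, _ = footprint(altitude, hfov_deg, vfov_deg)
    spacing = w * (1.0 - overlap)                      # lateral distance between sweep lines
    xs = np.arange(x_min + w / 2.0, x_max, spacing)
    path = []
    for i, x in enumerate(xs):                         # alternate sweep direction each line
        ys = (y_min, y_max) if i % 2 == 0 else (y_max, y_min)
        path += [(x, ys[0], altitude), (x, ys[1], altitude)]
    return path
```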

  • 17.
    Mansouri, Sina Sharif
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kominiak, Dariusz
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Deploying MAVs for autonomous navigation in dark underground mine environments (2020). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 126, article id 103472. Article in journal (Refereed)
    Abstract [en]

    Operating Micro Aerial Vehicles (MAVs) in subterranean environments is becoming more and more relevant in the field of aerial robotics. Despite the large spectrum of technological advances in the field, flying in such challenging environments is still an ongoing quest that requires the combination of multiple sensor modalities, like visual/thermal cameras as well as 3D and 2D lidars. Nevertheless, there exist cases in subterranean environments where the aim is to deploy fast and lightweight aerial robots for area reckoning purposes after an event (e.g. blasting in production areas). This work proposes a novel baseline approach for the navigation of resource-constrained robots, introducing the aerial underground scout, with the main goal to rapidly explore unknown areas and provide feedback to the operator. The proposed framework focuses on the navigation, control and vision capabilities of aerial platforms with low-cost sensor suites, contributing significantly towards real-life applications. The merit of the proposed control architecture is that it considers the flying platform as a floating object, composing a velocity controller on the x and y axes with altitude control to navigate along the tunnel. Two novel approaches make up the cornerstone of the proposed contributions for the task of navigation: (1) a vector geometry method based on 2D lidar, and (2) a Deep Learning (DL) method through a classification process based on an on-board image stream, where both methods correct the heading towards the center of the mine tunnel. Finally, the framework has been evaluated in multiple field trials in an underground mine in Sweden.
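
    As an illustration of approach (1) above, the sketch below compares the free space measured by a 2D lidar to the left and right of the current heading and issues a bounded yaw-rate command toward the more open side, which tends to steer the platform toward the tunnel centre line. This is a simplified stand-in for the paper's vector-geometry method; the angle window, gain and sign conventions are assumptions.

```python
import numpy as np

def tunnel_heading_correction(angles, ranges, k_yaw=1.0, window=np.radians(60)):
    """Yaw-rate command steering toward the more open side of the tunnel.

    angles : (N,) lidar beam angles [rad], 0 = straight ahead, positive = left
    ranges : (N,) measured distances [m]
    """
    left = ranges[(angles > 0) & (angles < window)]
    right = ranges[(angles < 0) & (angles > -window)]
    if left.size == 0 or right.size == 0:
        return 0.0
    imbalance = np.mean(left) - np.mean(right)       # > 0: more free space on the left
    return k_yaw * np.tanh(imbalance)                # bounded yaw-rate command [rad/s]
```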

  • 18.
    Mansouri, Sina Sharif
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Karvelis, Petros
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kominiak, Dariusz
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Vision-based MAV Navigation in Underground Mine Using Convolutional Neural Network (2019). In: IECON 2019: 45th Annual Conference of the IEEE Industrial Electronics Society, IEEE, 2019, p. 750-755. Conference paper (Refereed)
    Abstract [en]

    This article presents a Convolutional Neural Network (CNN) method to enable autonomous navigation of low-cost Micro Aerial Vehicle (MAV) platforms along dark underground mine environments. The proposed CNN component provides on-line heading rate commands for the MAV by utilising the image stream from the on-board camera, thus allowing the platform to follow a collision-free path along the tunnel axis. A novel part of the developed method is the generation of the data-set used for training the CNN. More specifically, inspired by single-image haze removal algorithms, various image data-sets collected from real tunnel environments have been processed offline to provide an estimate of the depth information of the scene, where ground truth is not available. The calculated depth map is used to extract the open space in the tunnel, expressed through the area centroid, which is finally provided for the training of the CNN. The method considers the MAV as a floating object, thus accurate pose estimation is not required. Finally, the capability of the proposed method has been successfully evaluated in experimental field trials in an underground mine in Sweden.
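
    The offline label generation described above can be approximated as follows: a dark-channel-style minimum filter gives a crude proxy for scene depth in an illuminated tunnel (farther regions appear darker), the darkest pixels are taken as the open space ahead, and their centroid becomes the training label for the CNN. The patch size, the percentile threshold and the dark-channel analogy itself are illustrative assumptions, not the paper's exact procedure.

```python
import cv2
import numpy as np

def open_space_label(bgr, patch=15, percentile=5):
    """Training label (pixel centroid of the presumed open space) from one tunnel image."""
    # Per-pixel channel minimum followed by a min-filter (erosion) ~ dark channel.
    dark = cv2.erode(bgr.min(axis=2), np.ones((patch, patch), np.uint8))
    # Treat the darkest percentile of pixels as the far end of the tunnel.
    thresh = np.percentile(dark, percentile)
    mask = (dark <= thresh).astype(np.uint8)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]   # (cx, cy) label for the CNN
```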

  • 19.
    Mansouri, Sina Sharif
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Karvelis, Petros
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Koval, Anton
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Visual Subterranean Junction Recognition for MAVs based on Convolutional Neural Networks (2019). In: IECON 2019: 45th Annual Conference of the IEEE Industrial Electronics Society, IEEE, 2019, p. 192-197. Conference paper (Other academic)
    Abstract [en]

    This article proposes a novel visual framework for detecting tunnel crossings/junctions in underground mine areas towards the autonomous navigation of Micro Aerial Vehicles (MAVs). Mine environments usually have complex geometries, including multiple crossings with different tunnels that challenge the autonomous planning of aerial robots. Towards the envisioned scenario of autonomous or semi-autonomous deployment of MAVs with limited Line-of-Sight in subterranean environments, the proposed module acknowledges the existence of junctions by providing crucial information to the autonomy and planning layers of the aerial vehicle. The capability of junction detection is necessary in the majority of mission scenarios, including unknown area exploration, known area inspection and robot homing missions. The proposed method feeds the image stream from the vehicle's on-board forward-facing camera into a Convolutional Neural Network (CNN) classification architecture, expressed in four categories: 1) left junction, 2) right junction, 3) left & right junction, and 4) no junction in the local vicinity of the vehicle. The core contribution stems from the incorporation of AlexNet in a transfer learning scheme for detecting multiple branches in a subterranean environment. The proposed method has been validated on multiple data-sets collected from real underground environments, demonstrating its performance and merits.
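
    A minimal transfer-learning sketch in the spirit of the abstract, using torchvision's AlexNet with its final layer replaced by the four junction classes. The feature-freezing policy, the optimizer and every hyperparameter are assumptions, and the data pipeline is omitted.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4   # left junction, right junction, left & right junction, no junction

def build_junction_classifier(freeze_features=True):
    """AlexNet with its last layer swapped for the four junction classes
    (transfer learning from ImageNet weights, torchvision >= 0.13 API)."""
    net = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
    if freeze_features:
        for p in net.features.parameters():
            p.requires_grad = False                   # reuse the ImageNet convolutional features
    net.classifier[6] = nn.Linear(net.classifier[6].in_features, NUM_CLASSES)
    return net

# Training-loop sketch (dataset and loader omitted):
#   model = build_junction_classifier()
#   optimizer = torch.optim.SGD((p for p in model.parameters() if p.requires_grad), lr=1e-3)
#   loss = nn.CrossEntropyLoss()(model(images), labels)
```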

  • 20.
    Wuthier, David
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kominiak, Dariusz
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Kanellakis, Christoforos
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Andrikopoulos, Georgios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Fumagalli, Matteo
    Aalborg University.
    Schipper, G.
    University of Twente.
    Nikolakopoulos, George
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    On the Design, Modeling and Control of a Novel Compact Aerial Manipulator (2016). In: 24th Mediterranean Conference on Control and Automation, MED 2016, Piscataway, NJ: IEEE Communications Society, 2016, p. 665-670, article id 7536029. Conference paper (Refereed)
    Abstract [en]

    The aim of this article is to present a novel four-degree-of-freedom aerial manipulator allowing a multirotor Unmanned Aerial Vehicle (UAV) to physically interact with the environment. The proposed design, named CARMA (Compact AeRial MAnipulator), is characterized by low disturbances on the UAV flight dynamics, an extended workspace (with regard to its retracted configuration) and fast dynamics (compared to the UAV dynamics). The dynamic model is formulated and a control structure consisting of an inverse kinematics algorithm and independent joint position controllers is presented. Furthermore, the design specifications of the prototype are analyzed in detail, while experimental evaluations are conducted for the extraction of the manipulator's workspace and the evaluation of the system's tracking capabilities over pick-and-place trajectories. Finally, it is shown that the selected joint position sensors, combined with the derived inverse dynamic algorithm, allow determining the wrenches exerted at the base due to swift motions of the arm.
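
    The control structure mentioned above pairs an inverse-kinematics layer with independent joint position controllers. The sketch below shows only the latter — a generic PID loop around a single joint of the kind commonly used for such manipulators; the gains and the PID form itself are illustrative assumptions, not CARMA's actual controllers.

```python
class JointPositionController:
    """Independent PID position controller for one manipulator joint.
    The inverse-kinematics layer supplies the reference angle q_ref for each joint;
    all gains below are placeholders."""

    def __init__(self, kp=8.0, ki=0.5, kd=0.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, q_ref, q_meas, dt):
        """Return a joint command from the reference and measured joint angles [rad]."""
        err = q_ref - q_meas
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```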
