Search results 1 - 41 of 41
  • 1.
    Blomquist, Mats
    et al.
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Range camera on conveyor belts: estimating size distribution and systematic errors due to occlusion (1999). In: Proceedings of SPIE, the International Society for Optical Engineering, ISSN 0277-786X, E-ISSN 1996-756X, Vol. 118, p. 118-126. Article in journal (Refereed)
    Abstract [en]

    When range cameras are used for analyzing irregular material on a conveyor belt there will be complications such as missing segments caused by occlusion. Also, a number of range discontinuities will be present. Within the framework of stochastic geometry, conditions are found for the cases when range discontinuities take place. The test objects are pellets for the steel industry. An illuminating laser plane will give range discontinuities at the edges of each individual object. These discontinuities are used to detect and measure the chord created by the intersection of the laser plane and the object. From the measured chords we derive the average diameter and its variance. An improved method is to use a pair of parallel illuminating light planes to extract two chords. The estimation error for this method is not larger than the natural shape fluctuations (the difference in diameter) for the pellets. The laser-camera optronics is sensitive enough both for material on a conveyor belt and for free-falling material leaving the conveyor.
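The two-chord idea in this abstract admits a compact geometric sketch. The following is illustrative only: it assumes an ideal spherical pellet, and the function name and variables are mine, not the paper's.

```python
import math

def diameter_from_two_chords(c1, c2, s):
    """Estimate a sphere's diameter from two parallel chords.

    A laser plane at height h above the sphere's centre cuts a chord c with
    (c / 2) ** 2 = R ** 2 - h ** 2.  A second plane a known distance s away
    gives a second equation; subtracting the two yields h, and then R.
    (Sketch for an ideal sphere, not the paper's estimator.)
    """
    h = ((c1 / 2) ** 2 - (c2 / 2) ** 2 - s ** 2) / (2 * s)
    return 2.0 * math.sqrt((c1 / 2) ** 2 + h ** 2)

# A 12 mm sphere cut at heights 1 mm and 3 mm (planes s = 2 mm apart):
d = diameter_from_two_chords(2 * math.sqrt(35), 2 * math.sqrt(27), 2.0)
```

Because only h squared enters the final expression, the estimate is insensitive to which side of the sphere's centre the planes fall on.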

  • 2.
    Blomquist, Mats
    et al.
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Tracking range discontinuities in dynamic scenes: a smart range camera (1994). In: Intelligent robots and computer vision XIII: algorithms and computer vision: 31 October - 2 November 1994, Boston, Massachusetts, SPIE - International Society for Optical Engineering, 1994, p. 249-260. Conference paper (Refereed)
    Abstract [en]

    This paper studies algorithms for fast and reliable extraction of range discontinuities in dynamic scenes. The application is to control the motion of a robot using a range scanning sensor. When estimating the pose of the objects in a scene, it is obvious that range discontinuities and flat surfaces have the largest information content. The concept studied consists of a smart camera chip together with a scanning illuminating laser. Feedback loops are closed between the chip and the scanning laser so as to follow along different types of range discontinuities in the scene. More explicitly: two types of feedback laws are outlined so as to track along range discontinuities both with and without occlusion; the laser can also track along a `generalized cylinder', say, a cable free in space or lying on an uneven surface; the tracking accuracy is estimated as the laser follows along the `curve of discontinuity'. These results are preliminary and are not included in this paper. In an earlier study, the Hough transform was found to be very robust in extracting the coordinates of planar surfaces. The edge parameters in this study are thus complementary to these surface parameters. Compared with complete range scanning of the entire scene, it seems possible to gain at least one order of magnitude in speed. This is important since these extracted range features are inside the feedback loop of the robot.

  • 3.
    Forsberg, J.
    et al.
    Luleå tekniska universitet.
    Larsson, Ulf
    Luleå tekniska universitet.
    Åhman, Per
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Navigation in cluttered rooms using a range measuring laser and the Hough transform (1993). In: Intelligent autonomous systems, IAS-3: proceedings of the International Conference, Pittsburgh, Pennsylvania, February 15 - 18, 1993 / [ed] F. C. A. Groen; Shigeo Hirose, Washington: IOS Press, 1993, p. 248-257. Conference paper (Refereed)
  • 4.
    Forsberg, J.
    et al.
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    An autonomous plastering robot for walls and ceilings (1995). In: Intelligent autonomous vehicles 1995: a postprint volume from the 2nd IFAC Conference, Helsinki University of Technology, Espoo, Finland, 12 - 14 June 1995 / [ed] Aarne Halme; Kari Koskinen, Oxford: Pergamon Press, 1995, p. 301-306. Conference paper (Refereed)
    Abstract [en]

    This paper presents a robot and algorithms for autonomous spray plastering of walls and ceilings during the construction of apartment houses. Successful experimental tests are described where the robot measures the size of the room and the location of doors and windows. The work sequence is then planned and executed autonomously. To sense the environment the robot uses a range measuring scanning laser. The range weighted Hough transform is used to observe the walls and the resulting observations are used to update estimates in an extended Kalman filter based map. Association of observations to estimates is performed using the estimated probability of a correct match, taking correlations between estimates into account when needed.

  • 5.
    Forsberg, Johan
    et al.
    Luleå tekniska universitet.
    Graff, Daniel
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    An autonomous plastering robot for walls and ceilings (1995). In: Elsevier IFAC Publications / IFAC Proceedings series, ISSN 1474-6670, Vol. 28, no 11, p. 301-306. Article in journal (Refereed)
    Abstract [en]

    This paper presents a robot and algorithms for autonomous spray plastering of walls and ceilings during the construction of apartment houses. Successful experimental tests are described where the robot measures the size of the room and the location of doors and windows. The work sequence is then planned and executed autonomously. To sense the environment the robot uses a range measuring scanning laser. The range weighted Hough transform is used to observe the walls and the resulting observations are used to update estimates in an extended Kalman filter based map. Association of observations to estimates is performed using the estimated probability of a correct match, taking correlations between estimates into account when needed.

  • 6.
    Forsberg, Johan
    et al.
    Luleå tekniska universitet.
    Högström, Thomas
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Semiautonomous navigation of mobile robots (1995). In: Mobile Robots IX: [Mobile Robots Conference], 2-4 November 1994, Boston, Massachusetts, Bellingham, Wash: SPIE - International Society for Optical Engineering, 1995, p. 128-138. Conference paper (Refereed)
    Abstract [en]

    The paper is on operations for semi-autonomous mobile robots. The robot is assumed to be remotely controlled by an operator. It is difficult for an operator to directly control the robot, especially over a communication link with low bandwidth and/or time delays. Therefore it is necessary to close the control loop in the robot with the operator giving high level commands. The operator is still needed as the fully autonomous robot does not exist today, except for limited scenarios. The scene around the robot is sensed using a scanning range measuring laser and a camera. The high level scene interpretation is done by the operator who also does the high level planning. Which operations the robot is to perform is indicated by pointing in the images or in a map created by the robot. Operations are functions that the robot can perform autonomously. They can be simple, like `Travel 2 m ahead', or more advanced, like `Follow the corridor and take the first door to the right'. Some typical operations are: (1) Lock the heading of the robot when driving on a `straight' line and preprogrammed 90 and 180 degree turns. (2) Automatically enter a camera defined line; the direction of the camera is used to drop a new coordinate frame at any time (using a knob on the keyboard). The robot will automatically enter this new line and also compensate for the overshoot. (3) Travelling along corridors. The operation is both robust and precise. The precision is about 1 cm at 1 m/s and the robot is not disturbed by people passing it in the corridor. (4) The command for passage through a door works within 1 cm and 0.5 degrees at a speed of 0.5 m/s. The range weighted Hough Transform on laser measurements extracts the walls in an indoor environment. This is used to create an internal map in the robot which is used for operations like corridor following or passing through doors. The map is also useful when presenting information to the operator.

  • 7.
    Forsberg, Johan
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Larsson, Ulf
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Mobile robot navigation using the range-weighted Hough transform (1995). In: IEEE robotics & automation magazine, ISSN 1070-9932, E-ISSN 1558-223X, Vol. 2, no 1, p. 18-26. Article in journal (Refereed)
    Abstract [en]

    Accurate navigation of a mobile robot in cluttered rooms using a range-measuring laser as a sensor has been achieved. To extract the directions and distances to the walls of the room the range-weighted Hough transform is used. The following experimental results are emphasized: The robot extracts the walls of the surrounding room from the range measurements. The distances between parallel walls are estimated with a standard deviation smaller than 1 cm. It is possible to navigate the robot along any preselected trajectory in the room. One special case is navigation through an open door detected by the laser. The accuracy of the passage is 1 cm at a speed of 0.5 m/s. The trajectory is perpendicular to the wall within 0.5 degrees in angle. When navigating through corridors, the accuracy is better than 1 cm at 0.8 m/s, the maximum speed of the robot. Odometric data and laser measurements are combined using the extended Kalman filter. The size of the cluttered rectangular room and the position and orientation (pose) of the robot are estimated during motion. The extraction and the resulting navigation are very robust against both spurious measurements in the laser measurements and disturbing objects.
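The wall extraction described here can be sketched roughly as follows. This is an illustrative toy version of a range-weighted Hough transform, under my own choices of bin sizes and names; it is not the authors' implementation.

```python
import numpy as np

def range_weighted_hough(ranges, bearings, d_max=10.0, n_alpha=180, n_d=100):
    """Vote for lines d = x*cos(a) + y*sin(a) in an (angle, distance) grid.

    Each laser return votes with weight equal to its range, compensating
    for the denser sampling of nearby surfaces (illustrative sketch).
    """
    xs = ranges * np.cos(bearings)
    ys = ranges * np.sin(bearings)
    alphas = np.linspace(0.0, np.pi, n_alpha, endpoint=False)
    acc = np.zeros((n_alpha, n_d))
    for x, y, r in zip(xs, ys, ranges):
        d = x * np.cos(alphas) + y * np.sin(alphas)  # line distance per angle
        ok = (d >= 0.0) & (d < d_max)
        bins = (d[ok] / d_max * n_d).astype(int)
        acc[np.nonzero(ok)[0], bins] += r            # range-weighted vote
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return alphas[i], (j + 0.5) * d_max / n_d        # peak (angle, distance)

# A wall at x = 2 m seen from the origin over a wide field of view:
bearings = np.linspace(-1.2, 1.2, 181)
ranges = 2.0 / np.cos(bearings)
angle, dist = range_weighted_hough(ranges, bearings)
```

The peak lands at angle close to 0 and distance close to 2 m; the weighting matters because the far ends of the wall contribute fewer returns per metre than the nearby section.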

  • 8.
    Forsberg, Johan
    et al.
    Luleå tekniska universitet.
    Larsson, Ulf
    Luleå tekniska universitet.
    Åhman, Per
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    The Hough transform inside the feedback loop of a mobile robot (1993). In: Proceedings, IEEE International Conference on Robotics and Automation: May 2 - 6, 1993, Atlanta, Georgia, Los Alamitos, Calif: IEEE Communications Society, 1993, Vol. 1, p. 791-798. Conference paper (Refereed)
    Abstract [en]

    Accurate navigation of a mobile robot in cluttered rooms using a range measuring laser as a sensor is considered. The range weighted Hough transform (RWHT) is used to extract the directions and distances to the walls of the room. Using the extracted RWHT peaks in a feedback loop, the mobile robot is capable of a large variety of navigation tasks. For navigating the robot through an open door detected by the laser, the accuracy during passage was found experimentally to be 1 cm at a speed of 0.5 m/s. For navigating through corridors, the accuracy was better than 1 cm at the robot's maximum speed of 0.8 m/s. Odometric data and laser measurements are combined using the extended Kalman filter. During motion, the size of the cluttered rectangular room is estimated as well as the position and the orientation (pose) of the robot. The navigation is very robust against both spurious laser measurements and disturbing objects.

  • 9.
    Fredriksson, Håkan
    et al.
    Rönnbäck, Sven
    Berglund, Tomas
    Wernersson, Åke
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    snowBOTS: a mobile robot on snow covered ice2007In: Proceedings of the 13th IASTED International Conference on Robotics and Applications: August 29 - 31, 2007, Würzburg, Germany / [ed] K. Schilling, Anaheim, Calif.: ACTA Press, 2007, p. 222-228Conference paper (Refereed)
    Abstract [en]

    We introduce snowBOTs as a generic name for robots working in snow. This paper is a study on using scanning range measuring lasers towards an autonomous snow cleaning robot, working in an environment consisting almost entirely of snow and ice. The problem addressed here is using lasers for detecting the edges generated by "the snow meeting the road". First the laser data were filtered using histogram/median to discriminate against falling snowflakes and small objects. Then the road surface was extracted using the range weighted Hough/Radon transform. Finally the left and right edges of the road were detected by thresholding. Tests have been made with a laser on top of a car driven in an automobile test range just south of the Arctic Circle. Moreover, in the campus area, the algorithms were tested in closed loop with the laser on board a robotized wheelchair.
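A median filter of the kind mentioned for rejecting snowflake returns can be sketched like this. The window length and names are my own illustrative choices, not the paper's parameters.

```python
import numpy as np

def despeckle_scan(ranges, window=5):
    """Median-filter a laser scan to suppress isolated short returns,
    such as a falling snowflake close to the scanner (illustrative)."""
    pad = window // 2
    padded = np.pad(ranges, pad, mode="edge")   # repeat edge values
    return np.array([np.median(padded[i:i + window])
                     for i in range(len(ranges))])

scan = np.full(20, 8.0)   # flat road surface at 8 m in every beam
scan[7] = 0.6             # one spurious short return (a snowflake)
clean = despeckle_scan(scan)
```

The isolated 0.6 m return is replaced by the surrounding 8 m road readings, while genuine extended structures (wider than half the window) would survive.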

  • 10.
    Hyyppä, Kalevi
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wernersson, Åke
    Andersson, Ulf
    Gustafsson, Thomas
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Navigeringsexperiment med autonomt fordon [Navigation experiments with an autonomous vehicle] (1991). In: Robotikdagar: robotteknik och verkstadsteknisk automation mot ökad autonomi; 30 - 31 maj 1991, Linköping; proceedings, Linköping, 1991. Conference paper (Refereed)
  • 11.
    Hyyppä, Kalevi
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wiklund, Urban
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Andersson, Ulf
    Gustafsson, Thomas
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Wernersson, Åke
    Hillerström, Gunnar
    Zell, Caj
    Navigational experiments with the autonomous mobile robot ltt using angle measurements to identical beacons (1994). In: International Conference on Machine Automation: mechatronics spells profitability. Proceedings of the ICMA'94, Tampere University of Technology, 1994. Conference paper (Refereed)
  • 12.
    Högström, Tomas
    et al.
    R2A2-lab, Robotics & Autonomous Mechanical Systems, IKP, Linköping University.
    Nygårds, Jonas
    Forsberg, Johan
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Telecommands for Remotely Operated Vehicles (1995). In: Elsevier IFAC Publications / IFAC Proceedings series, ISSN 1474-6670, Vol. 28, no 11, p. 301-306. Article in journal (Refereed)
    Abstract [en]

    The problem addressed is controlling a robot over a communication channel with low bandwidth, a non-negligible time delay and timing jitter. The external sensors on the robot are a time-of-flight range measuring laser and a video camera. Video images are used by the operator for interpreting the scene and are only sent at “fax rate”. The operator is thus interpreting the workspace around the vehicle, making plans for the individual operations in a composite task and is giving high level commands. The telecommands are to be executed autonomously by the robot. Inside the robot the control loop is closed with full bandwidth when each command is executed.

    This paper reports on work in progress and focuses on:

    1. Incremental map building, especially simplifications for large buildings and robustness when the vehicle returns to a previously mapped area. This is a very useful support function for the operator.

    2. Tests of telecommands on a mobile robot available for experiments over the Internet. Available sensors are inclinometers, odometry, CCD intensity images and range measuring laser data.

    Other needed telecommands studied previously are:

    Travelling along corridors, following walls, etc. The operation is based on the range weighted Hough transform and is both robust and precise. The repeatability is about 1 cm at 1 m/s. The robot is not disturbed by people passing in the corridor.

    Commands for passage through an open door work within 1 cm and 0.5 degrees at a speed of 0.5 m/s. Passing between irregular obstacles is less accurate.

    Using rate gyros, lock the heading of the robot when driving on a straight line; also preprogrammed 90- and 180-degree turns.

    Automatically enter a camera defined line. The direction of the camera can be used to drop a new coordinate frame at any time (using a button on the keyboard). The robot will automatically enter this new line and also compensate for the overshoot.

    To integrate these different commands into a complete system is a resource demanding task for the future.

  • 13.
    Högström, Tomas
    et al.
    Robotics/Autonomous Mechanical Systems, IKP, Linköping University.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On Segmentation, Shape Estimation and Navigation Using 3D Laser Range Measurements of Forest Scenes (1998). In: Elsevier IFAC Publications / IFAC Proceedings series, ISSN 1474-6670, Vol. 31, no 3, p. 423-428. Article in journal (Refereed)
    Abstract [en]

    The problem addressed is segmentation and shape estimation of three-dimensional laser range measurements of forest scenes. For each tree the position as well as the three dimensional shape of the trunk is estimated. The detection uses a two-dimensional histogram of the measurements, followed by segmentation by fitting cylinders around the measurements of each detected tree trunk. The shape estimation consists of fitting a 3D spline to each tree trunk's centre curve and computing the trunk width by extracting the range discontinuities at the edges. For the experiments a mobile robot carrying a scanning laser range finder was used. Navigation was done by matching the extracted tree trunk positions from subsequent range scans, determining the relative movements.

  • 14.
    Klöör, Per Ljunggren
    et al.
    Linköpings universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On motion estimation for a mobile robot navigating in natural environment: matching laser range measurements using the distance transform (1992). In: Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1992, p. 1421-1428. Conference paper (Refereed)
    Abstract [en]

    The goal behind this paper is to find a generic method for controlling the motion of a robot relative to an object of arbitrary shape. In this paper we study: modelling laser range measurements for different types of objects/surface properties, with outdoor scenes emphasized; and testing the distance transform on measurements for estimating the motion of a robot relative to an object of arbitrary shape. The emphasis in the present study is on gaining experience of the error mechanisms in the distance transform when tested on cluttered laser measurements. Both natural and man-made objects are used and compared in the tests.
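Matching a new scan against a reference shape via the distance transform can be sketched as below. This is a brute-force toy version under my own naming; the grid and scoring are assumptions for illustration, not the paper's method.

```python
import numpy as np

def distance_transform(grid):
    """Euclidean distance transform of a small occupancy grid.

    Every cell gets the distance to its nearest occupied cell.  Real
    systems use fast two-pass algorithms; this O(cells x points) loop
    is written for clarity only.
    """
    h, w = grid.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dt = np.full(grid.shape, np.inf)
    for oy, ox in np.argwhere(grid):
        dt = np.minimum(dt, np.hypot(ys - oy, xs - ox))
    return dt

def match_score(dt, points):
    """Mean distance-transform value at the (row, col) scan points;
    a lower score means the scan aligns better with the reference."""
    ys = np.clip(np.round(points[:, 0]).astype(int), 0, dt.shape[0] - 1)
    xs = np.clip(np.round(points[:, 1]).astype(int), 0, dt.shape[1] - 1)
    return float(dt[ys, xs].mean())

# Reference: a wall along row 5; a scan shifted by two rows scores worse.
ref = np.zeros((10, 10))
ref[5, :] = 1
dt = distance_transform(ref)
aligned = match_score(dt, np.array([[5.0, c] for c in range(10)]))
shifted = match_score(dt, np.array([[7.0, c] for c in range(10)]))
```

Motion estimation then amounts to searching over candidate robot displacements for the one that minimizes this score, which is where the error mechanisms on cluttered scans become visible.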

  • 15.
    Larsson, Ulf
    et al.
    Luleå tekniska universitet.
    Chenevier, F.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On rate gyros for improved navigation by reducing dead-reckoning errors (1998). In: Intelligent autonomous vehicles 1998: a proceedings volume from the 3rd IFAC Symposium, Madrid, Spain, 25 - 27 March 1998 / [ed] Miguel Á. Salichs; Aarne Halme, Oxford: Pergamon Press, 1998, p. 207-212. Conference paper (Refereed)
    Abstract [en]

    Consider a vehicle using a number of landmarks for navigation. Between the observation of the landmarks, the motion of the vehicle is estimated using internal sensors like odometers and rate gyros. The output from a dead reckoning algorithm is used in a Kalman filter and/or used to make association of observed landmarks. In such a system the accuracy of the dead reckoning algorithm is essential for the overall performance. In the paper `slip angles' at the wheel-surface contact point are included in the kinematic model. The slip angle is assumed to be essentially proportional to the side force. The topics considered in the paper are: dead reckoning model with a slip angle; improved dead reckoning when a rate gyro is used; and equations for comparing the error growth. Four different algorithms for dead reckoning are studied and compared. Two of them just model the slippage and two of them are based on more accurate measurements of rotation, using a rate gyro.
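The contrast between wheel-only and gyro-aided dead reckoning can be illustrated with a simplified differential-drive model. The names and the slip scenario are mine; the paper's slip-angle model is richer than this sketch.

```python
import math

def step_odometry(x, y, th, dl, dr, track):
    """One dead-reckoning step from left/right wheel increments alone;
    wheel slip corrupts the inferred heading change (dr - dl) / track."""
    ds, dth = 0.5 * (dl + dr), (dr - dl) / track
    return (x + ds * math.cos(th + 0.5 * dth),
            y + ds * math.sin(th + 0.5 * dth),
            th + dth)

def step_gyro(x, y, th, dl, dr, gyro_dth):
    """Same translation model, but the heading change comes from a rate
    gyro, which does not see wheel slip."""
    ds = 0.5 * (dl + dr)
    return (x + ds * math.cos(th + 0.5 * gyro_dth),
            y + ds * math.sin(th + 0.5 * gyro_dth),
            th + gyro_dth)

# Straight motion with a slipping right wheel: odometry hallucinates a
# turn, while the gyro-aided version keeps the true heading.
pose_odo = (0.0, 0.0, 0.0)
pose_gyro = (0.0, 0.0, 0.0)
for _ in range(50):
    pose_odo = step_odometry(*pose_odo, dl=0.10, dr=0.11, track=0.5)
    pose_gyro = step_gyro(*pose_gyro, dl=0.10, dr=0.11, gyro_dth=0.0)
```

Because heading errors integrate into position errors, even a small per-step heading bias dominates the error growth, which is the motivation for adding the gyro.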

  • 16.
    Larsson, Ulf
    et al.
    Luleå tekniska universitet.
    Chenevier, F.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On rate gyros for improved navigation by reducing dead-reckoning errors (1998). In: Elsevier IFAC Publications / IFAC Proceedings series, ISSN 1474-6670, Vol. 31, no 3, p. 207-212. Article in journal (Refereed)
    Abstract [en]

    Consider a vehicle using a number of landmarks for navigation. Between the observation of the landmarks, the motion of the vehicle is estimated using internal sensors like odometers and rate gyros. The output from a dead reckoning algorithm is used in a Kalman filter and/or used to make association of observed landmarks. In such a system the accuracy of the dead reckoning algorithm is essential for the overall performance. In the paper `slip angles' at the wheel-surface contact point are included in the kinematic model. The slip angle is assumed to be essentially proportional to the side force. The topics considered in the paper are: dead reckoning model with a slip angle; improved dead reckoning when a rate gyro is used; and equations for comparing the error growth. Four different algorithms for dead reckoning are studied and compared. Two of them just model the slippage and two of them are based on more accurate measurements of rotation, using a rate gyro.

  • 17.
    Larsson, Ulf
    et al.
    Luleå tekniska universitet.
    Forsberg, Johan
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Mobile robot localization: integrating measurements from a time-of-flight laser (1996). In: IEEE transactions on industrial electronics (1982. Print), ISSN 0278-0046, E-ISSN 1557-9948, Vol. 43, no 3, p. 422-431. Article in journal (Refereed)
    Abstract [en]

    This paper presents an algorithm for environment mapping by integrating scans from a time-of-flight laser and odometer readings from a mobile robot. The range weighted Hough transform (RWHT) is used as a robust method to extract lines from the range data. The resulting peaks in the RWHT are used as feature coordinates when these lines/walls are used as landmarks during navigation. The associations between observations over the time sequence are made in a systematic way using a decision directed classifier. Natural geometrical landmarks are described in the robot frame together with a covariance matrix representing the spatial uncertainty. The map is thus built up incrementally as the robot moves. If the map is given in advance, the robot can find its location and navigate relative to this a priori given map. Experimental results are presented for a mobile robot with a scanning range measuring laser having 2-cm resolution. The algorithm was also used for an autonomous plastering robot on a construction site. The sensor fusion algorithm makes few erroneous associations.
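The association step described here (matching an observed line to a map landmark with spatial uncertainty) is commonly gated on Mahalanobis distance. The following is a minimal sketch of such a gate, not the paper's decision directed classifier; the covariance values are made up for illustration.

```python
import numpy as np

def gate(z, z_pred, S, threshold=9.21):
    """Accept observation z for a landmark if its Mahalanobis distance to
    the prediction z_pred, under innovation covariance S, falls inside a
    chi-square gate (9.21 is roughly the 99% point for 2 dof)."""
    v = z - z_pred
    d2 = float(v @ np.linalg.solve(S, v))
    return d2 <= threshold

# A wall observed at (angle, distance) close to the prediction is
# associated; a far-off observation is rejected as spurious.
S = np.diag([0.01, 0.04])                               # assumed covariance
ok = gate(np.array([0.05, 2.02]), np.array([0.0, 2.0]), S)
bad = gate(np.array([1.00, 3.00]), np.array([0.0, 2.0]), S)
```

Using the full innovation covariance (rather than a fixed Euclidean radius) is what lets the gate stay tight when the robot is well localized and open up when odometry uncertainty has grown.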

  • 18.
    Larsson, Ulf
    et al.
    Luleå tekniska universitet.
    Forsberg, Johan
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On robot navigation using identical landmarks: integrating measurements from a time-of-flight laser (1994). In: 1994 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI '94: October 2 - 5, 1994, Las Vegas, Nevada, USA, Piscataway, NJ: IEEE Communications Society, 1994, p. 17-26. Conference paper (Refereed)
    Abstract [en]

    This paper presents an algorithm for fusing scans from a time-of-flight laser and odometer readings from the robot. The range weighted Hough transform is used as a robust method to extract lines from the range data. The resulting peaks are used as feature coordinates when these lines/walls are used as landmarks during navigation. The associations between observations over the time sequence are made in a systematic way using a decision directed classifier. Natural geometrical landmarks are described in the robot frame together with a covariance matrix representing the spatial uncertainty. The map is thus built incrementally as the robot moves. If the map is given in advance the robot can find its location and navigate relative to the map. Experimental results and simulations are presented for a mobile robot with a scanning range measuring laser with 2 cm resolution.

  • 19.
    Larsson, Ulf
    et al.
    Luleå tekniska universitet.
    Jannok, D.
    Luleå tekniska universitet.
    Sandberg, U.
    Luleå tekniska universitet.
    Åhman, P.
    Luleå tekniska universitet.
    Franzén, M.
    Luleå tekniska universitet.
    Nilsson, R.
    Luleå tekniska universitet.
    Wernersson, Åke
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Gustafsson, Thomas
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Calman: computerized articulated lawn mower with automatic navigation (1993). In: Proceedings: Robotikdagar 2-3 juni 1993: automatiserad tillverkning - från högteknologi till tillämpning: Robotics Workshop, Linköping 1993, Linköping: Linköping University Electronic Press, 1993. Conference paper (Refereed)
  • 20.
    Larsson, Ulf
    et al.
    Luleå tekniska universitet.
    Zell, Caj
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wernersson, Åke
    Navigating an articulated vehicle and reversing with a trailer (1994). In: Proceedings, 1994 IEEE International Conference on Robotics and Automation: May 8 - 13, 1994, San Diego, California, Los Alamitos, Calif: IEEE Computer Society Press, 1994, p. 2398-2404. Conference paper (Refereed)
    Abstract [en]

    This paper describes two related tests: navigating an articulated lawn mower and reversing a mobile robot with a trailer. In both cases the vehicles are to follow a prespecified trajectory. The navigation principle is based on measured directions to several identical beacons, consisting of strips of reflective tape. The angular sensor is a rotating laser for the illumination of the beacons and a highly sensitive electro-optical receiver for detecting the directions to the beacons. A Kalman filter is used to combine the measurements from the odometers with the detected angles to the known positions of the beacons. To measure the angle between the robot and the trailer the same laser was used. This was done by placing two reflective beacons on the trailer. The repeatability was within 2 centimetres at low speed. The navigation of these two different types of vehicles turns out to be, essentially, the same problem. The sensitivities are different. Emphasis is on robust state estimation.

  • 21.
    Nilsson, Bernt
    et al.
    Robotics and Autonomous Mechanical Systems, Department of Mechanical Engineering, University of Linköping.
    Nygårds, Jonas
    Luleå tekniska universitet.
    Larsson, Ulf
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Control of flexible mobile manipulators: positioning and vibration reduction using an eye-in-hand range camera (1999). In: Control Engineering Practice, ISSN 0967-0661, E-ISSN 1873-6939, Vol. 7, no 6, p. 741-751. Article in journal (Refereed)
    Abstract [en]

    In this paper, the positioning of a long flexible manipulator on a moving platform is investigated. The problem is to position the gripper at a requested relative distance in front of an object with unknown location. For this purpose, the gripper is equipped with a range camera giving the distance to surrounding objects within ∼1% and with a sampling rate above 1 kHz. The range measurements are used in combination with internal angle measurements from joint encoders to estimate both the flexibility in the mechanical construction and the relative distance from gripper to object. This is solved satisfactorily by an extended Kalman filter (EKF). For the motion control of the manipulator, a time-scaled feedback controller is suggested. A fast inner loop is used to damp out oscillations and reject disturbances, both from the platform and the manipulator. An outer control loop, with a lower closed-loop bandwidth, then steers the gripper, based on the range measurements, to the requested final position in front of the object. This loop assumes a stationary and rigid platform and a rigid manipulator. At this moment, only simulations of a flexible manipulator on a rigid platform have been studied. However, the results show that the flexibility can be estimated from indirect measurements of the range to the object and the joint angles. Also, good damping and disturbance rejection are achieved, as long as the bandwidth of the actuators is sufficiently high compared to the oscillation. The use of range measurements of the surrounding objects makes the positioning task very robust against an uncertain platform position.

  • 22.
    Nilsson, Bernt
    et al.
    Robotics and Autonomous Mechanical Systems, Dept. of Mechanical Engineering, Linköping University.
    Nygårds, Jonas
    Robotics and Autonomous Mechanical Systems, Dept. of Mechanical Engineering, Linköping University.
    Larsson, Ulf
    Luleå tekniska universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On Control of Flexible Mobile Manipulators: Positioning and Vibration Reduction Using an Eye-in-Hand Range Camera1998In: Elsevier IFAC Publications / IFAC Proceedings series, ISSN 1474-6670, Vol. 31, no 3, p. 619-625Article in journal (Refereed)
    Abstract [en]

    10.1016/S1474-6670(17)44154-1

  • 23.
    Nilsson, Bernt
    et al.
    Robotics and Autonomous Mechanical Systems, IKP, Linköping University.
    Nygårds, Jonas
    Robotics and Autonomous Mechanical Systems, IKP, Linköping University.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On range sensor feedback for mobile robot docking within prescribed posture tolerances1997In: Journal of Robotic Systems, ISSN 0741-2223, E-ISSN 1097-4563, Vol. 14, no 4, p. 297-312Article in journal (Refereed)
    Abstract [en]

    The problem addressed is feedback from noncontact sensing for guiding robots during docking and gripping. The sensor used is a "range camera" onboard a mobile robot (MRb). To specify the docking task completely, both the posture (position/orientation) and the required tolerances must be given. These tolerances are then used in the feedback control loop during docking. The algorithms are divided into three parts: the extraction of posture parameters from the "range camera"; dynamic filtering for finding association gates and protecting the system against spurious measurements; and finally a feedback controller. The feedback controller is separated into geometric control and tolerance control. The geometric control uses a range-varying LQG-designed feedback control law to generate the trajectories toward the object. The tolerance control adjusts the approach velocity so that the robot is given a sufficient number of observations and control cycles to meet the required tolerances. Thus, during the approach there is a conditional re-planning of the trajectory. For simplicity, only three kinematic state variables (x, y, θ) are used for the MRb. Gripping using an industrial robot (IRb) is an equivalent problem. Successful experiments were made with range resolution varying by more than a factor of 50. Thus, the resolution volume in the (x, y, θ)-space varied by several orders of magnitude during the tests. The final errors in range and orientation are essentially limited by the resolution of the "range camera". A persistent conclusion from the experiments is the importance of correct association between the range measurements and the corresponding parts of the object.

  • 24.
    Nilsson, Bernt
    et al.
    Linköpings universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Active uncertainty reduction during gripping using range cameras-dual control1995In: Human Robot Interaction and Cooperative Robots: Proceedings, 1995, p. 406-413Conference paper (Refereed)
    Abstract [en]

    This paper is on sensor-based control for guiding a robot to correct gripping of objects having a large position uncertainty. An eye-in-hand mounted range camera is considered. A probabilistic problem formulation based on the requested posture at gripping and the corresponding tolerances is presented. The problem is solved approximately using dynamic programming for a 1-degree-of-freedom manipulator. A five-step dual control law is studied in more detail. A typical case is that in the first part of the control sequence the robot steers towards the optimal sensing position, and in the last part the error with respect to the gripping posture is minimized. Since range camera sensing introduces both range-dependent noise and occlusion, there is a need for `exploratory moves'. This behavior is formalized and includes `dual control'.

  • 25.
    Nygards, Jonas
    et al.
    Linköpings universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Specular and transparent objects in moving range cameras: active reduction of ambiguities1994In: Intelligent Robots and Computer Vision XIII: 3D Vision, Product Inspection, and Active Vision / [ed] David P. Casasent, Bellingham, Wash, 1994, p. 453-469Conference paper (Refereed)
    Abstract [en]

    This paper is written to further understanding of the basic limitations of eye-in-hand range cameras for the handling of specular and transparent objects. The basic underlying assumption for a range camera is one diffuse reflex. Specular and transparent objects usually give multiple reflections, interpreted as different types of `ghosts' in the range images. These `ghosts' are likely to cause serious errors during gripping operations. As the robot moves, some of these `ghosts' move inconsistently with the true motion. In this paper we study, experimentally and theoretically, how the range measurements can be integrated in a consistent way during the motion of the robot. The paper is experimental, with emphasis on parts with `optical complications' including multiple scattering. Occlusion is not studied in this paper. Some of our findings include: (1) For scenes with one plane mirror there is a complete understanding of the `deambiguation' by motion. Also, the coordinates of the mirror can be estimated without a single observation of the mirror itself. The other objects in the scene are not `mirror like.' (2) For polished steel cylinders, the inclination and radius can be estimated from the curved ray-traces on plane matte surfaces.

  • 26.
    Nygårds, J.
    et al.
    Linköpings universitet.
    Högström, T.
    Linköpings universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Docking to pallets with feedback from a sheet-of-light range camera2000In: Intelligent Robots and Systems: (IROS 2000). Proceedings., Piscataway, NJ: IEEE Communications Society, 2000, p. 1853-1859Conference paper (Refereed)
    Abstract [en]

    The problem studied is feedback for docking a mobile robot, or AGV, to a pallet. The pallet is of known size but with an essentially unknown load. The pallet has an initial uncertainty in pose (position and orientation) of the order of ±15 cm and ±20 degrees. The docking error is required to be within ±1 cm and ±1 degree with a “very low” failure rate. For the docking, a combination of a range camera and a video camera is used; in the paper, the range camera is emphasized. Experimental results from this work in progress are presented. Successful docking has been achieved with typical errors of ±5 mm. Currently one weak part is the integration with the control system on board the robot. Our persistent experience from this and earlier tests is that the weak part when using non-contact sensing for feedback in robots is the association problem. It should be mentioned that the resolution of a range camera is strongly distance dependent. One finding in the paper is that this type of docking is feasible and can be made self-monitoring.

  • 27.
    Nygårds, Jonas
    et al.
    Linköpings universitet.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Specular objects in range cameras: reducing ambiguities by motion1994In: Multisensor Fusion and Integration for Intelligent Systems: MFI '94, Piscataway, NJ: IEEE Communications Society, 1994, p. 320-328Conference paper (Refereed)
    Abstract [en]

    Range cameras using structured light and triangulation are essentially based on the assumption of one diffuse reflection from the measured surfaces. Specular and transparent objects usually give multiple reflections, and direct triangulation can give different types of `ghosts' in the range images. These `ghosts' are likely to cause serious errors during gripping operations. As the robot moves, some of the `ghosts' move in an inconsistent way. In this paper, the authors study, experimentally and theoretically, how the range measurements can be integrated in a consistent way during the motion of the robot. Emphasis is on parts with `optical complications' including multiple scattering. For a scene with one planar mirror, the `ghosts' are shown to lie in a plane separated from the laser plane. In this case the orientation and position of the mirror can be estimated.

  • 28. Rönnbäck, Sven
    et al.
    Fredriksson, Håkan
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Rosendahl, David
    Hyyppä, Kalevi
    Wernersson, Åke
    An autonomous vehicle for a RobotDay2007Conference paper (Other academic)
    Abstract [en]

    This paper describes the autonomous car that is being built for the RobotDay competition, arranged by SICK GmbH. The basic rule is to let an autonomous vehicle drive a track as fast as possible, based on any sensor technology. For this race SICK donated one S300 professional proximity sensor to each team. Our vehicle will use the circle sector expansion (CSE) method to avoid obstacles and to find its way along the track. We show initial results where the CSE method was used to guide the vehicle through a test track defined by cones and other objects.

  • 29. Rönnbäck, Sven
    et al.
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wernersson, Åke
    On coordinating autonomous vehicles by tracking features using MATLAB2005Conference paper (Refereed)
  • 30.
    Rönnbäck, Sven
    et al.
    Luleå tekniska universitet.
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wernersson, Åke
    On passing a doorway with an autonomous Internet connected wheelchair using MATLAB2005In: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems: Edmonton, AB, Canada, 2 - 6 August 2005, Piscataway, NJ: IEEE Communications Society, 2005, p. 1532-1537Conference paper (Refereed)
    Abstract [en]

    If a wheelchair for the disabled is to be used for semi-autonomous navigation indoors, it must be able to navigate through doorways. A door and its doorway can be parameterized with five parameters. A divide-and-conquer implementation of the Hough transform is used to segment outlines from range scans. The client software remote-controls the wheelchair from the MATLAB environment. The software consists of several Java threads that run concurrently. Sensor data are polled by threads and put into databases to reduce the network lag. The databases are used by a controller and a Kalman filter. Since most of the implementation is coded in Java, it is possible to run it as a stand-alone program on any computer that has Java installed. From 10 runs, the trajectory offset was calculated to be 0.9 cm with a standard deviation of 1.4 cm. The standard deviation of the heading was 2.2 degrees. This performance is essentially independent of the initial starting pose.
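
    The Hough-transform extraction of wall outlines from range scans mentioned in the abstract above can be sketched as follows. This is a plain accumulator version, not the paper's divide-and-conquer implementation, and the grid resolutions are illustrative assumptions:

```python
import numpy as np

# Sketch of Hough-transform line extraction from 2-D range-scan points.
# A line is parameterized as rho = x*cos(theta) + y*sin(theta); a doorway
# would show up as a gap in the points supporting the detected wall line.
# n_theta, rho_res and rho_max are assumed values for illustration.
def hough_line(points, n_theta=180, rho_res=0.02, rho_max=5.0):
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    n_rho = int(2 * rho_max / rho_res)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)    # one rho per angle
        idx = ((rho + rho_max) / rho_res).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.nonzero(ok)[0], idx[ok]] += 1             # vote
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t], r * rho_res - rho_max              # best (theta, rho)
```

For example, scan points lying on a wall at x = 1 m produce an accumulator peak near theta = 0, rho = 1.0.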

  • 31. Rönnbäck, Sven
    et al.
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Wernersson, Åke
    Remote CAN operations in Matlab over the Internet2004In: Proceedings: 2004 Second International IEEE Conference "Intelligent Systems" : June 22 - 24, 2004, St. Constantine and Helena resort, Varna, Bulgaria, Piscataway, NJ: IEEE Communications Society, 2004, p. 123-128Conference paper (Refereed)
    Abstract [en]

    This paper describes the implementation of a CAN server that acts as a CAN tool for a client. It can be used to monitor, observe and send messages to a distant CAN network over IEEE 802.11b (Wave-LAN). The CAN server is controlled by one or several clients that connect to it over TCP/IP. Since the client software is written in Java, it is possible to send and receive CAN messages over the Internet from a MATLAB environment. The CAN server collects CAN messages and stores them in a ring buffer. The messages in the ring buffer are classified by their identifier and stored in a database. The CAN tool has been used in a demonstration application consisting of a remotely controlled wheelchair. In the example, the wheelchair was programmed to drive in a square. The positions obtained from odometric CAN messages are compared with the position from the navigation system onboard the wheelchair.

  • 32. Rönnbäck, Sven
    et al.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    On filtering of laser range data in snowfall2008In: 2008 4th International IEEE Conference Intelligent Systems: Varna, Bulgaria, 6 - 8 September 2008, Piscataway, NJ: IEEE Communications Society, 2008, Vol. 2, p. 33-39Conference paper (Refereed)
    Abstract [en]

    This paper is directed towards reducing the clutter from snowflakes for outdoor robots. There are three basic cases: 1. Normal operation of the laser, with objects within range for detection. 2. Close range, with objects closer to the laser than the pulse length. 3. Free space as background, i.e. all detects are false. The findings are summarized as follows: The two lasers used were a pulsed wide-beam laser and a modulated narrow-beam laser. The narrow-beam laser has much better penetration between the snowflakes. We did not use the snow-and-rain threshold setting for the wide-beam laser. Median filtering shows a substantial reduction in snowflake detects. The gamma distribution describes fairly well the distribution of detected snowflakes.

  • 33.
    Rönnbäck, Sven
    et al.
    Umeå universitet, Umeå.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Range statistics and suppressing snowflakes: detects for laser range finders in snowfall2010In: Intelligent Systems: From Theory to Practice, Berlin: Springer Science+Business Media B.V., 2010, p. 261-277Chapter in book (Refereed)
    Abstract [en]

    This paper presents statistics on registrations from laser range finders in snowfall. The sensors are standard laser range finders in robotics, the LMS200 and the URG-04LX. Three different working cases were identified for the pulsed laser range finder: 1) normal operation, with background objects present within the range of the sensor; 2) close-range objects, where ranges to objects are shorter than the pulse length; 3) free space in the background. The findings are summarized as follows: Two laser range finders have been used, one that sends out a pulsed wide beam and one with a modulated narrow laser beam. The narrow-beam laser has better penetration between the snowflakes. Median filtering shows a substantial reduction in snowflake detects. The gamma distribution describes fairly well the range distribution of detected snowflakes. In an intense snowfall, about 24% of the ranges detected snowflakes. A time-polar median filter showed good results in suppressing snowflakes in range data.
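
    The time-polar median filter mentioned above can be sketched as follows. This is a guess at the general shape of such a filter (per-bearing median over the most recent scans), with the window length as an assumption rather than the authors' value:

```python
import numpy as np

# Time-polar median filter sketch: scans is an (n_scans, n_bearings) array of
# ranges; for each bearing we take the median over a sliding window of the
# last k scans. Window length k = 5 is an illustrative assumption.
def time_polar_median(scans, k=5):
    scans = np.asarray(scans, dtype=float)
    out = np.empty_like(scans)
    for i in range(len(scans)):
        lo = max(0, i - k + 1)
        out[i] = np.median(scans[lo:i + 1], axis=0)  # per-bearing median
    return out
```

A snowflake return typically persists for only a single scan at a given bearing, so with a window of three or more scans the median at that bearing is still dominated by the stable background range.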

  • 34.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Lasers for robot control: navigation, telecommands and sensor needs for the future2000In: Conference digest : 2000 Conference on Lasers and Electro-Optics Europe: Nice Acropolis, Nice, France, 10 - 15 Septembre 2000, IEEE Communications Society, 2000Conference paper (Refereed)
    Abstract [en]

    Summary form only given. For designing robots, nature is a large source of inspiration. Since lasers do not exist in nature, they offer new and challenging possibilities for robot control. We outline two cases: laser-based beacon navigation used in industry, and current efforts towards fusing laser and vision so as to get reliable navigation in unstructured environments.

  • 35.
    Wernersson, Åke
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Telerobotics: Towards Extending Your Sensing and Hands into a Remote Reality2004In: Elsevier IFAC Publications / IFAC Proceedings series, ISSN 1474-6670, Vol. 37, no 7, p. 1-4Article in journal (Refereed)
    Abstract [en]

    The problem addressed is control of robots and/or sensing in workspaces at remote locations. Topics taken up in this plenary talk include laser sensing, scene interpretation, telecommands for local autonomy, and research issues, especially for multirobot systems. Applications include aerospace testing in northern Sweden.

    Telecommands are studied for high-level control of robots over a communication channel with a non-negligible time delay, time jitter and variable bandwidth. The sensors onboard the robot are a time-of-flight range-measuring laser and a video camera. A few images are used by the operator for interpreting the scene. From the interpretations of the workspace around the robot, the operator interactively specifies the sequence of individual operations during a composite task. Each telecommand is then executed autonomously by closing the feedback loop between the robot and the objects in the surrounding workspace.

  • 36.
    Wernersson, Åke
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Blomquist, Mats
    Luleå tekniska universitet.
    Nygårds, Jonas
    Linköpings universitet.
    Högström, Thomas
    Linköpings universitet.
    Telecommands for semiautonomous operations1995In: Telemanipulator and telepresence technologies: [Telemanipulator and Telepresence Technologies Conference of the SPIE International Symposium on Photonics for Industrial Applications], 31 October-1 November 1994, Boston, Massachusetts / [ed] Hari Das, Bellingham, Wash: SPIE - International Society for Optical Engineering, 1995, p. 2-12Conference paper (Refereed)
    Abstract [en]

    The problem addressed is controlling the relative position between the robot and an object using a sequence of operations specified at a high level. The goal is telecommands for robots operating over a communication net with low bandwidth, a non-negligible time delay and time jitter. Using an eye-in-hand range sensor, motion commands can be entered and executed relative to a range map generated from measurements. The ratio of task-relevant information to bandwidth is high for this sensor (it returns range in a plane between the fingers of the gripper). Conventional images are only sent to the operator at "fax rate".

  • 37. Wernersson, Åke
    et al.
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Åström, Karl
    Hedström, Arne
    Rahm, Jonas
    Laser guided vehicles, LGV's: industrial experiences and needs for the future2005In: Proceedings of the Third Swedish Workshop on Autonomous Robotics, Stockholm, 2005Conference paper (Other academic)
  • 38.
    Wernersson, Åke
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Nygårds, Jonas
    Linköpings universitet.
    Model based fusion of laser and camera: range discontinuities and motion consistency2000In: Information Fusion: FUSION 2000, 2000, p. 16-21Conference paper (Refereed)
    Abstract [en]

    Consider a robot that is to measure or operate on man-made objects randomly located in the workspace. The optronic sensing onboard the robot consists of a scanning time-of-flight range-measuring laser and a CCD camera. The plane surfaces are modeled and parameters extracted using the Radon/Hough transform. This extraction is very robust, and motion is also included in a natural way. This paper gives additional results for range discontinuities. A multiple-model framework for fusion of sensor information from laser and camera, using parametric models of planar and cylindrical surfaces, is suggested. An important issue is the mutual consistency between the motion, the range discontinuities, occlusion, and the properties of the sensor combination. Typical applications are: robust features for use during navigation in cluttered areas; verification and updating of CAD models when navigating inside buildings and industrial plants; and accumulating sensor readings into a map during operation of a telecommanded robot.

  • 39.
    Wernersson, Åke
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Nygårds, Jonas
    Linköpings universitet.
    On covariances for fusing laser rangers and vision with sensors onboard a moving robot1998In: Proceedings of the 1998 IEEE/RSJ International Conference on Intelligent Robots and Systems: Innovations in Theory, Practice and Applications, Piscataway, NJ: IEEE Communications Society, 1998, p. 1053-1059Conference paper (Refereed)
    Abstract [en]

    Consider a robot that is to measure or operate on man-made objects randomly located in the workspace. The optronic sensing onboard the robot consists of a scanning time-of-flight range-measuring laser and a CCD camera. The goal of the paper is to give explicit covariance matrices for the extracted geometric primitives in the surrounding workspace. Emphasis is on the correlation properties of the stochastic error models during motion. Topics studied include: (i) the covariance of Radon/Hough peaks for plane surfaces; (ii) covariances for the intersection of two planes; (iii) equations for combining vision features, plane surfaces and range discontinuities; and (iv) explicit equations for how the covariance matrices are transformed during the robot motion. Typical applications are: models for verification and updating of CAD models when navigating inside buildings and industrial plants, and accumulating sensor readings for a telecommanded robot.

  • 40.
    Wernersson, Åke
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Taylor, Geoffrey R
    Intelligent Robotics Research Centre, Monash University, Clayton.
    Kleeman, Lindsay
    Intelligent Robotics Research Centre, Monash University, Clayton.
    Robust colour range sensing for robotic applications using a stereoscopic light stripe scanner2002In: Proceedings IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE Communications Society, 2002, p. 86-91Conference paper (Refereed)
    Abstract [en]

    This paper presents an integrated, low-level approach to removing sensor noise, cross talk and spurious specular reflections, and to solving the association problem in a light stripe scanner. Most single-camera scanners rely on the laser brightness exceeding that of the entire image. Our system uses two cameras to measure the stripe and combines this with knowledge of the light plane orientation to produce useful validation properties. The key result is the development of a condition relating image plane measurements and camera intrinsic parameters, which allows validation/association to be performed independently of 3D reconstruction. The same equations are used to improve ranging accuracy compared to single-camera systems. We also show how the system may be self-calibrated using measurements of an arbitrary nonplanar target. As validation allows operation in ambient light, registered colour and range are captured by the same sensor. An experimental scanner demonstrates the effectiveness of the proposed techniques.

  • 41. Wernersson, Åke
    et al.
    Wiklund, Urban
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Andersson, Ulf
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Signals and Systems.
    Hyyppä, Kalevi
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Vehicle navigation using image information on association errors1989In: Intelligent autonomous systems: 2nd International conference : Papers / [ed] Takeo Kanade; F C A Groen; L O Hertzberger, Amsterdam: Stichting International Congress of Intelligent Autonomous Systems , 1989, p. 814-822Conference paper (Refereed)
    Abstract [en]

    Describes how image-like measurements can be used for accurate navigation of a vehicle. The `camera' is a rotating laser scanner and the `landmarks' are stripes of identical retroreflecting tape. This arrangement can be seen as a linear circular 360-degree camera scanning at a rate of 1 scan/second. The most likely objects the `camera' will see are these identical pieces of tape. Tests were carried out using an AGV (Autonomous Guided Vehicle) running on a floor. At a speed of 0.3 m/s the trajectory fluctuates less than ±2 mm. For this special case it is possible to give a fairly complete and rigorous mathematical model. The model includes the motion, the sensor, missing and erroneous measurements, etc.
