Nuclear Environments Inspection with Micro Aerial Vehicles: Algorithms and Experiments
In this work, we address the estimation, planning, control, and mapping problems required for a small quadrotor to autonomously inspect the interior of hazardous, damaged nuclear sites. All algorithms run onboard a computationally limited CPU. We investigate the effect of varying illumination on system performance. To the best of our knowledge, this is the first fully autonomous system of this size and scale applied to inspecting the interior of a full-scale mock-up of a Primary Containment Vessel (PCV). The proposed solution opens up new ways to inspect nuclear reactors and to support nuclear decommissioning, which is well known to be a dangerous, long, and tedious process. Experimental results under varying illumination conditions show the ability to navigate a full-scale mock-up PCV pedestal and create a map of the environment, while concurrently avoiding obstacles.
Keywords: Inspection, Aerial Robotics
1 Introduction
Nuclear site decommissioning is a complex and tedious process, which includes the clean-up of radioactive materials and the progressive dismantling of the site, as shown in Fig. 1. Because of the finite life of a reactor, decommissioning is an essential step in the life cycle of a nuclear power plant. This process is also critical in the event of an accident, as was the case at the Fukushima Daiichi Nuclear Power Plant in 2011 [2]. For planning the decontamination and decommissioning, surveys of the inside of the containment vessel are crucial. Fig. 2 shows a simplified cross-section of a multi-story containment for a boiling water reactor (BWR). The primary goal of this work is to develop a fully autonomous system capable of inspecting the inside of damaged sites such as the ones seen in Fig. 1. We consider the example problem scenario of visually inspecting the inside of the Primary Containment Vessel (PCV) of a damaged nuclear power plant unit.
Micro Aerial Vehicles (MAVs), equipped with onboard sensors, are ideal platforms for the autonomous inspection of cluttered, confined, and hazardous environments. The use of aerial platforms in nuclear settings poses several challenges, such as the need to navigate in damaged or unknown environments, without GPS, under low illumination, and without a communication link to a human pilot or user interface. Furthermore, a damaged nuclear site generally poses constraints on access (entry hatches can be as small as 0.1-0.3 m in diameter), and the operating conditions can be adversely affected by condensation and fog.
A survey of various autonomous systems deployed for the inspection of damaged areas is provided in [3, 4]. A combination of ground and aerial vehicles is used in [5] to inspect a building after an earthquake. Autonomous inspection of penstocks and tunnels using aerial vehicles is performed in [6].
Focusing on nuclear environments, wall-climbing robots have been popular for inspecting undamaged steam generators at nuclear sites [7]. The authors in [8] provide an overview of aerial robots for radiation mapping, while [9] develops a drone equipped with sensors for nuclear risk characterization. Inspection of outdoor environments is primarily done with the aid of GPS. In GPS-denied environments, remote teleoperation is used [10]. In [11], a robot is used in nuclear sites for damage assessment. Unmanned construction equipment and robots were used for surveillance work and for cleaning up rubble outside buildings [12]. Finally, [13] describes various robots that have been deployed in nuclear settings.
All the previous solutions were remotely teleoperated, without considering the autonomy required by complex, remote operations in a (possibly damaged) nuclear scenario. Moreover, the physical size of these robots ranges from 1 m to 10 m, with weights usually ranging from 2 kg to more than 1000 kg, which allows them to carry bulky sensors. Our problem of navigating inside a PCV is very different. Previous attempts to visually inspect reactor Unit 1 with ground robots were unable to complete their missions, as the robots got stuck in the confined environments of the damaged sites.
This work presents multiple contributions. First, we develop a fully autonomous system with state estimation, control, and mapping modules for the PCV inspection task, running concurrently onboard a custom-designed 0.16 m platform. Second, using the map created onboard during navigation, the vehicle is able to automatically replan its path to avoid obstacles. Finally, we conduct a set of preliminary studies and experiments under different conditions in a mock-up representing a PCV pedestal at Southwest Research Institute (SwRI). To the best of our knowledge, this is the first fully autonomous system of this size and scale applied to inspecting the interior of a full-scale mock-up PCV to support nuclear decommissioning. Although we are motivated by the specific example of inspecting inside a PCV, where the entry point is on the order of 0.1 to 0.2 m, this system can be used for the autonomous inspection of any damaged or confined region.
2 Technical Approach
The platform is a 0.16 m diameter, 236 g quadrotor (Fig. 3) powered by a 7.4 V 2-cell LiPo battery. The proposed solution is based on our previous work [15], where we focused on the ability to perform aggressive maneuvers while tracking a simple trajectory. The proposed scenario presents several additional challenges. The vehicle needs to navigate under different illumination conditions, concurrently creating a map of the environment and replanning its path to avoid obstacles, while reaching the final mission goal. In the following, we present a brief overview of the key onboard approaches for estimation, mapping, and planning.
2.1 State Estimation and Control
As shown in [15], the Visual Inertial Odometry (VIO) system localizes the rigid body with respect to the inertial frame by combining Inertial Measurement Unit (IMU) data with a downward-facing Video Graphics Array (VGA) resolution camera. The prediction step is based on IMU integration. The measurement update step is given by the standard perspective projection of 3D landmarks onto the image plane, leading to the Extended Kalman Filter (EKF) update. An additional Unscented Kalman Filter (UKF) estimates the state of the vehicle for control purposes at 500 Hz. From a control point of view, we use a nonlinear proportional-derivative controller for both the position and attitude loops [15, 16, 17]. The control inputs are chosen as
f = (-K_p e_p - K_v e_v + m g e_3 + m \ddot{r}_c)^\top R e_3, \quad M = -K_R e_R - K_\Omega e_\Omega,
with \ddot{r}_c the desired acceleration, m the vehicle mass, and K_p, K_v, K_R, K_\Omega positive definite gain terms. The subscript c denotes a commanded value. The quantities e_R, e_\Omega, e_p, e_v are the orientation, angular rate, translation, and velocity errors respectively, defined in [16, 17], and R, \Omega are the orientation and the angular rate in the body frame. The attitude loop runs on the Digital Signal Processor (DSP) available on the board at 1 kHz, whereas the position loop runs on the Advanced RISC Machines (ARM) processing unit concurrently with the estimation and planning pipelines.
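To make the structure of this control law concrete, the following Python snippet sketches one step of a geometric PD controller in the style of [16, 17]. It is an illustration only: the function name, gain values, and default mass are our assumptions, not the onboard DSP/ARM implementation.

```python
import numpy as np

def vee(S):
    """Map a 3x3 skew-symmetric matrix to its 3-vector."""
    return np.array([S[2, 1], S[0, 2], S[1, 0]])

def se3_control(p, v, R, omega, p_c, v_c, a_c, R_c, omega_c,
                m=0.236, g=9.81,
                Kp=np.diag([4.0, 4.0, 6.0]),
                Kv=np.diag([2.5, 2.5, 3.0]),
                KR=np.diag([0.8, 0.8, 0.3]),
                Kw=np.diag([0.1, 0.1, 0.05])):
    """One geometric PD control step (gains and mass are illustrative)."""
    e_p = p - p_c                       # translation error
    e_v = v - v_c                       # velocity error
    e3 = np.array([0.0, 0.0, 1.0])
    # Desired force in the world frame, projected onto the body z-axis
    F_des = -Kp @ e_p - Kv @ e_v + m * g * e3 + m * a_c
    f = F_des @ (R @ e3)                # collective thrust
    # Attitude errors on SO(3)
    e_R = 0.5 * vee(R_c.T @ R - R.T @ R_c)
    e_w = omega - R.T @ R_c @ omega_c
    M = -KR @ e_R - Kw @ e_w            # body moments
    return f, M
```

At hover equilibrium (all errors zero, level attitude), the sketch returns a thrust equal to the vehicle weight and zero moment, as expected.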
2.2 Mapping
The forward-facing stereo camera is employed to create a dense map of the environment (see Fig. 3). The stereo mapping algorithm uses two rectified images to determine the location of obstacles in the environment. To generate a high-frame-rate map for real-time obstacle avoidance, the rectification of the two images is split into two separate threads; both images are concurrently rectified within 3 ms on the ARM processor. A block matcher algorithm produces a disparity map by searching for matches along the epipolar line and provides distance information. The tradeoff between speed and accuracy can be explored by changing the block matcher filter size: larger filter windows generate less noise in the disparity maps at a higher computational cost. The disparity information is then used to generate a 3D point cloud of the environment using the pinhole camera projection model. We use 3D voxel grids to discretize the space, and the 3D points are used as votes to obtain a voxel grid occupancy map for planning.
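As a rough sketch of this pipeline (disparity to depth via the pinhole model, then voxel voting), consider the following Python fragment. The camera intrinsics, voxel resolution, and vote threshold are illustrative assumptions, not the onboard values.

```python
import numpy as np

def disparity_to_points(disp, fx, fy, cx, cy, baseline):
    """Back-project a disparity map to 3D points (pinhole model).
    disp: HxW array of disparities in pixels; entries <= 0 are invalid."""
    H, W = disp.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    valid = disp > 0
    z = fx * baseline / disp[valid]      # depth from disparity
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def voxel_occupancy(points, res=0.1, min_votes=3):
    """Vote 3D points into a voxel grid; a cell counts as occupied once
    it collects at least min_votes points (threshold is illustrative)."""
    idx = np.floor(points / res).astype(np.int64)
    cells, counts = np.unique(idx, axis=0, return_counts=True)
    return {tuple(c): n for c, n in zip(cells, counts) if n >= min_votes}
```

Requiring several votes per cell, as above, is one simple way to suppress spurious disparity matches before the occupancy map reaches the planner.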
2.3 Planning
Given a pre-generated global polynomial trajectory and a depth map of the environment, we use the work in [18] for online reactive replanning. The generated trajectory deviates from the global path according to the encountered obstacles. Quintic B-splines are used to ensure the required smoothness of the trajectory, as they are continuous up to the fourth derivative of position (snap). Trajectory changes are local with respect to the control points: a change in one control point affects only a few segments of the entire trajectory. Closed-form solutions are available for the position at a given time, for its time derivatives (velocity, acceleration, jerk, snap), and for integrals over squared time derivatives, thus allowing real-time performance.
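The closed-form evaluation and locality properties described above can be illustrated with SciPy's B-spline machinery; the knot layout and control points below are made up for the example and are not from the onboard planner.

```python
import numpy as np
from scipy.interpolate import BSpline

k = 5                                   # quintic: continuous up to snap
ctrl = np.array([[0, 0], [1, 0], [2, 1], [3, 1],
                 [4, 0], [5, 0], [6, 1], [7, 1]], dtype=float)
n = len(ctrl)
# Clamped uniform knot vector on [0, 1]
t = np.concatenate([np.zeros(k), np.linspace(0.0, 1.0, n - k + 1), np.ones(k)])
spline = BSpline(t, ctrl, k)

pos = spline(0.5)                       # closed-form position at a given time
vel = spline.derivative(1)(0.5)         # velocity
snap = spline.derivative(4)(0.5)        # snap, still well defined

# Locality: perturbing one control point changes only nearby segments.
ctrl2 = ctrl.copy()
ctrl2[0] += 10.0                        # move the first control point
spline2 = BSpline(t, ctrl2, k)
# Far from the start (e.g. at parameter 0.9) the trajectory is unchanged.
```

This locality is what makes B-spline replanning cheap: an obstacle encountered mid-flight only forces an update of the few control points whose basis functions cover that stretch of the path.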
3 Experimental Results
An important step prior to deployment is the evaluation of the developed technologies in mock-up facilities. In this section, we report on the experiments carried out in a mock-up of the PCV pedestal (5 m diameter) and CRD ramp shown in Fig. 2. Three different tests were conducted to validate the proposed setup and strategy. During all tests, the nominal velocity of the vehicle was limited to 0.5 m/s.
3.1 Obstacle avoidance course
For these experiments, the vehicle has to navigate along an obstacle course consisting of multiple vertical 102 mm PVC pipes, 25.4 mm bundled vertical tubes, and aluminum trusses. During the test, the UAS started from a predefined location on the floor and was given a single waypoint located at the far end of the obstacle field. As there were essentially no overhead obstacles, the altitude of the UAS was manually restricted to force it to travel through the obstacle field rather than along the lower-cost flight path above the obstacles. Fig. 4 shows the obstacle course and the map generated in one of the multiple obstacle avoidance trials.
3.2 Luminance tests
The current platform is not yet equipped with a lighting source. However, to compensate for variations in light, we can exploit the automatic camera exposure algorithm. To estimate the illuminance required for the vehicle to accurately extract visual features and maintain adequate localization, experiments with varying luminance were conducted. The PCV pedestal was fitted with multiple LED light strips whose luminous emittance was controlled directly by varying the drive voltage of the lights. Experiments were conducted at three different voltage levels (with all external sources turned off) to find the lowest illuminance at which the localization algorithm operates. The corresponding lux values were measured at 7 different locations at a constant height above the grates inside the pedestal. Data was collected with an LX1330B digital light meter. The average values corresponding to the voltage levels are 41.8 lx at 12 V, 9.5 lx at 8 V, and 0.5 lx at 7.5 V. Fig. 5 (left column) shows the trajectories followed by the UAS under different luminance levels, whereas Fig. 5 (right column) reports the control errors between the commanded and tracked positions. Fig. 6 (left column) shows the Root Mean Squared Errors (RMSE), whereas Fig. 6 (right column) presents the distribution of the tracking error across three runs. Finally, the reader should note that the 7.5 V experiment was performed only once, to avoid additional damage to the vehicle due to drift in the VIO and a subsequent crash.
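For completeness, the per-axis RMSE between commanded and tracked positions of the kind reported in Fig. 6 can be computed from logged data as in this minimal sketch (array names are illustrative):

```python
import numpy as np

def tracking_rmse(commanded, tracked):
    """Per-axis root-mean-square tracking error for one run.
    commanded, tracked: N x 3 arrays of positions in meters."""
    err = np.asarray(commanded) - np.asarray(tracked)
    return np.sqrt(np.mean(err ** 2, axis=0))
```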
3.3 PCV inspection
The third, and most complex, test scenario consisted of multiple stationary and suspended obstacles of varying sizes at unknown locations inside the pedestal. The vehicle starts from an external point, moves into the cluttered pedestal area, and performs the site inspection and environment mapping while avoiding obstacles (Fig. 7). Planning and obstacle avoidance operate in 3D space, and the vehicle safely maneuvers through the pedestal mock-up. Multiple runs with varying obstacle locations were conducted.
4 Main Experimental Insights
Several insights can be obtained from our experiments. First, we showed that it is possible to concurrently run the state estimation, control, mapping, and planning algorithms with obstacle avoidance onboard an aerial platform with limited computational capability, solving a complex nuclear inspection task. Second, we were able to successfully detect and avoid obstacles 0.25 m in diameter with a stereo camera, even with a relatively small stereo baseline. Finally, experimental evidence shows that an onboard LED payload will be necessary to allow navigation in very dark conditions, with illuminance lower than roughly 8 lx.
The replanning strategy can be improved to avoid local minima by utilizing a global planner. To achieve higher speeds, faster sensors and algorithms are needed to locate obstacles in a shorter amount of time. This is essential in operations like the one presented, where there are mission time constraints in addition to spatial ones. We believe these experimental insights will enable the development of smaller-scale platforms down to 0.1 m in diameter and will aid in onboard LED payload design, allowing the vehicle to perform autonomous navigation and obstacle avoidance in reduced lighting conditions.
Acknowledgments. We would like to acknowledge Richard Garcia and Monica Garcia from SwRI, who enabled us to conduct these experiments in the PCV mock-up at the San Antonio, TX facility. This work was supported by the TEPCO L99048MEC grant, Qualcomm Research, ARL grants W911NF-08-2-0004, W911NF-17-2-0181, ONR grants N00014-07-1-0829, N00014-14-1-0510, ARO grant W911NF-13-1-0350, NSF grants IIS-1426840, IIS-1138847, DARPA grants HR001151626, HR0011516850. This work was supported in part by C-BRIC, one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program sponsored by DARPA.
References
- (1) Tokyo Electric Power Company. Fukushima daiichi nuclear power plant photo collection. [Online]. Available: http://photo.tepco.co.jp/en/
- (2) World Nuclear Association. Decommissioning nuclear facilities. [Online]. Available: http://www.world-nuclear.org/information-library/nuclear-fuel-cycle/nuclear-wastes/decommissioning-nuclear-facilities.aspx
- (3) R. R. Murphy, “A decade of rescue robots,” in 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct 2012, pp. 5448–5449.
- (4) R. R. Murphy, S. Tadokoro, and A. Kleiner, Disaster Robotics. Cham: Springer International Publishing, 2016, pp. 1577–1604.
- (5) N. Michael, S. Shen, K. Mohta, V. Kumar, K. Nagatani, Y. Okada, S. Kiribayashi, K. Otake, K. Yoshida, K. Ohno, E. Takeuchi, and S. Tadokoro, “Collaborative Mapping of an Earthquake-Damaged Building via Ground and Aerial Robots,” Journal of Field Robotics, vol. 29, no. 5, pp. 832–841, 2012.
- (6) T. Ozaslan, G. Loianno, J. Keller, C. J. Taylor, V. Kumar, J. M. Wozencraft, and T. Hood, “Autonomous navigation and mapping for inspection of penstocks and tunnels with mavs,” IEEE Robotics and Automation Letters, vol. 2, no. 3, pp. 1740–1747, July 2017.
- (7) J. Li, X. Wu, T. Xu, H. Guo, J. Sun, and Q. Gao, “A novel inspection robot for nuclear station steam generator secondary side with self-localization,” Robotics and Biomimetics, vol. 4, no. 1, p. 26, Dec 2017.
- (8) D. Connor, P. G. Martin, and T. B. Scott, “Airborne radiation mapping: overview and application of current and future aerial systems,” International Journal of Remote Sensing, vol. 37, no. 24, pp. 5953–5987, 2016.
- (9) K. Boudergui, F. Carrel, T. Domenech, N. Guénard, J. P. Poli, A. Ravet, V. Schoepff, and R. Woo, “Development of a drone equipped with optimized sensors for nuclear and radiological risk characterization,” in 2011 2nd International Conference on Advancements in Nuclear Instrumentation, Measurement Methods and their Applications, June 2011, pp. 1–9.
- (10) J. Aleotti, G. Micconi, S. Caselli, G. Benassi, N. Zambelli, M. Bettelli, and A. Zappettini, “Detection of nuclear sources by uav teleoperation using a visuo-haptic augmented reality interface,” Sensors, vol. 17, no. 10, 2017. [Online]. Available: http://www.mdpi.com/1424-8220/17/10/2234
- (11) S. Kawatsuma, M. Fukushima, and T. Okada, “Emergency response by robots to Fukushima-Daiichi accident: summary and lessons learned,” Industrial Robot: the international journal of robotics research and application, vol. 39, no. 5, pp. 428–435, 2012.
- (12) K. Ohno, S. Kawatsuma, T. Okada, E. Takeuchi, K. Higashi, and S. Tadokoro, “Robotic control vehicle for measuring radiation in fukushima daiichi nuclear power plant,” in 2011 IEEE International Symposium on Safety, Security, and Rescue Robotics, Nov 2011, pp. 38–43.
- (13) Tokyo Electric Power Company. Application of robot technology. [Online]. Available: http://www.tepco.co.jp/en/decommision/principles/robot/index-e.html
- (14) Wikipedia. Boiling water reactor. [Online]. Available: https://en.wikipedia.org/wiki/Containment_building/
- (15) G. Loianno, C. Brunner, G. McGrath, and V. Kumar, “Estimation, control, and planning for aggressive flight with a small quadrotor with a single camera and imu,” IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 404–411, April 2017.
- (16) T. Lee, M. Leoky, and N. H. McClamroch, “Geometric tracking control of a quadrotor uav on se(3),” in 49th IEEE Conference on Decision and Control (CDC), Dec 2010, pp. 5420–5425.
- (17) D. Mellinger and V. Kumar, “Minimum Snap Trajectory Generation and Control for Quadrotors,” in IEEE International Conference on Robotics and Automation, Shanghai, China, 2011, pp. 2520–2525.
- (18) V. Usenko, L. von Stumberg, A. Pangercic, and D. Cremers, “Real-time trajectory replanning for mavs using uniform b-splines and a 3d circular buffer,” in IEEE/RSJ International Conference on Intelligent Robots and Systems, 09 2017, pp. 215–222.