Authors: Azhari, F; Kiely, S; Sennersten, C; Lindley, C; Matuszak, M; Hogwood, S



Abstract:
Unmanned aerial vehicles (UAVs) are being employed in a rapidly increasing number of mining applications, including the emerging area of mapping underground void spaces such as stopes, which are otherwise inaccessible to humans, automated ground vehicles and survey technologies. Void mapping can provide both visual rock-surface and 3D structural information about stopes, supporting more effective planning of ongoing blast designs. Underground stope mapping by UAVs, however, involves overcoming a number of engineering challenges to allow flights beyond operator line-of-sight where there is no global positioning system (GPS) coverage, no natural or artificial light, and no existing communications infrastructure. This paper describes the construction of a UAV sensor suite that uses sound navigation and ranging (SONAR) data to create a rough 3D model of the underground UAV operational environment in real time, giving operators high situational awareness for beyond line-of-sight operations. The system also acts as a backup when dust obscures visual sensors, maintaining situational awareness and a coarser, but still informative, 3D model of the underground space. Light detection and ranging (LIDAR) systems have typically superseded SONAR sensors in similar applications. LIDAR is much more accurate than SONAR, but has several disadvantages. SONAR data is sparse, and therefore much easier to process in real time on board the UAV than LIDAR data. SONAR sensor hardware is also lighter than current LIDAR systems, which matters given the constrained payload capacity of UAVs. The most important factor favouring SONAR in this application, however, is its ability to operate in dusty or smoke-filled environments. The UAV system was tested both above and below ground using a predefined path with checkpoint locations for the UAV to follow. In the absence of GPS, survey points combined with photogrammetry allowed the UAV's location to be estimated.
This allowed the system to be tested to determine how accurate the SONAR data is compared with 3D modelling via photogrammetry of images from a separate digital single-lens reflex camera. Comparing the shape of void surfaces determined by photogrammetry with that determined by SONAR provides a quantifiable measure of accuracy, with the photogrammetry models used as ground truth. Above-ground and underground pilot studies have determined that SONAR sensors provide acceptable accuracy compared with photogrammetric modelling, sufficient to provide effective situational awareness for human operation of the UAV beyond line-of-sight.
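The real-time rough 3D model built from sparse SONAR ranges, as described in the abstract, is commonly implemented as an occupancy-grid update: each range reading marks the cells traversed by the beam as more likely free and the cell at the measured range as more likely occupied. The sketch below illustrates this classic technique in 2D; the function name, grid resolution and log-odds increments are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def update_grid(grid, origin, angle, rng, cell=0.25, l_occ=0.9, l_free=-0.4):
    """Update a 2D log-odds occupancy grid with one SONAR range reading.

    grid   : 2D numpy array of log-odds values (0 = unknown)
    origin : (x, y) sensor position in metres
    angle  : beam direction in radians
    rng    : measured range in metres
    cell   : grid resolution in metres per cell
    """
    steps = int(rng / cell)
    for k in range(steps + 1):
        d = k * cell
        # Cell index at distance d along the beam
        i = int((origin[0] + d * np.cos(angle)) / cell)
        j = int((origin[1] + d * np.sin(angle)) / cell)
        if not (0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]):
            break
        # Endpoint cell becomes more likely occupied; cells before it, more likely free
        grid[i, j] += l_occ if k == steps else l_free
    return grid
```

Because each SONAR reading is a single range rather than a dense scan, updates like this are cheap enough to run on board the UAV, which is the processing advantage over LIDAR noted above.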

Keywords: SONAR, UAV, 3D mapping, underground, global navigation satellite system
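The accuracy comparison described in the abstract treats the dense photogrammetry model as ground truth for the sparse SONAR points. One simple way to quantify this, sketched below under the assumption that both models are available as point clouds, is the mean nearest-neighbour distance from each SONAR point to the ground-truth cloud; the function name and brute-force search are illustrative, not the authors' method.

```python
import numpy as np

def sonar_accuracy(sonar_pts, photo_pts):
    """Mean nearest-neighbour distance (metres) from each sparse SONAR
    point to a dense photogrammetry point cloud treated as ground truth."""
    errs = []
    for p in sonar_pts:
        # Distance from this SONAR point to every ground-truth point
        d = np.linalg.norm(photo_pts - p, axis=1)
        errs.append(d.min())
    return float(np.mean(errs))
```

For the cloud sizes involved in a real survey, a spatial index (e.g. a k-d tree) would replace the brute-force search, but the metric is the same.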

Citation:
Azhari, F, Kiely, S, Sennersten, C, Lindley, C, Matuszak, M & Hogwood, S 2017, 'A comparison of sensors for underground void mapping by unmanned aerial vehicles', in M Hudyma & Y Potvin (eds), Proceedings of the First International Conference on Underground Mining Technology, Australian Centre for Geomechanics, Perth, pp. 419-430.





© Copyright 2018, Australian Centre for Geomechanics (ACG), The University of Western Australia. All rights reserved.
Please direct any queries or error reports to repository-acg@uwa.edu.au