ABSTRACT:

Rock mass characterization involves field data collection, processing, and analysis of the measurement results. Software packages facilitate the creation of 3D models as well as the representation of rock mass parameters and resulting block arrays. The computer-generated imagery is traditionally presented as static perspective pseudo-3D screenshots, 2D maps, or numerical tables and diagrams.

Mixed reality greatly improves data visualization within an immersive and interactive 3D holographic environment, enabling interaction with scaled holograms at actual field dimensions. The holograms can be inspected individually or via remote collaboration within shared virtual spaces. The physical outcrop is portrayed together with quantitative aspects of the rock mass structure, enhancing communication and understanding of actual field attributes.

This paper presents case examples of mixed reality holographic models based on photogrammetry and on terrestrial and hand-held LiDAR devices. The benefits and potential applications of mixed reality in engineering geology and rock mass characterization are summarized.

INTRODUCTION

The use of LiDAR and unmanned aerial vehicles (UAVs) in the geosciences and engineering disciplines requires the transformation of vast amounts of 3D data into tangible information (Janeras et al. 2022). Although the collected data are 3D, georeferenced, and often of very high resolution, the derived models are traditionally decimated and represented in 2D (e.g. maps, plans, cross sections, outcrop drawings, structural plots, and graphs). With extended reality (XR) technology, geodata can now be represented and visualized in a truly 3D immersive environment. XR is a collective term encompassing virtual reality (VR), augmented reality (AR), and mixed reality (MR). AR and MR register virtual information directly to the physical 3D environment and offer real-time user interaction (Schmalstieg & Höllerer 2016). Unlike 2D representations of 3D data on computer monitors or other planar screens, head-mounted displays (HMDs) such as Microsoft's HoloLens 2 can project holographic imagery in true 3D space while maintaining reference to the physical environment through simultaneous spatial mapping (Wang et al. 2020). Virtual objects can be manipulated as if they were real objects via integrated hand tracking and gesture recognition. The HMD user can observe the model in perspective and at scale, with interactive and intuitive data query functions. The model and interpretive data are visualized in a common framework with layers and hypertext that provide rapid situational awareness at low cognitive load for the observer (Anderson 2020).
