

Caves In The Dome


Imagine going on a virtual voyage that starts with walking down a paved path past looming limestone walls. The brightly illuminated geological forms surrounding you gradually darken as you descend into a twilight zone where the sunlight wanes.


Gradually, the bright entrance disappears behind you. Next, you are further underground, in a cavern so vast that it’s difficult to tell how big it is. There is no atmospheric perspective to provide depth cues, so distant background features look just as sharp as objects in the foreground. The room is filled with organic-looking, mineralized structures rising from the floor and hanging from the ceiling, illuminated with LED bulbs. Because we are witnessing a live, interactive digital planetarium program and not a movie playback, we can traverse the scene following the flow of the presenter’s story, seeing features from different directions, with the camera moving to vantage points that would be impossible to reach in real life. Instead of the computer-generated “look” that we are familiar with in many astronomical programs, we are immersed in a visually rich landscape that looks photoreal. Overhead is not blank sky, nor a field of distant stars, but a ceiling reinforcing the illusion that we are in a real place: a cave, in this case Carlsbad Caverns, in southeastern New Mexico in the United States.


What I have described is now possible because the process for capturing real-world environments is getting easier, along with computer graphics hardware ever more capable of rendering 3D models built from millions of polygons in realtime. We can now immerse audiences in simulacra of locations that they are familiar with, without the scene looking fake or computer-generated. How do we go from astrovisualizations to experiencing a cave inside a dome? As it turns out, by using tools developed for visual effects, gaming, and virtual reality. To understand how this works, let’s look back at history.


A Common Heritage
Common roots exist between digital fulldome theaters and other immersive technologies. For example, one of the founding fathers of virtual reality (VR) was Ivan Sutherland, who created a head-mounted display in 1967, and who later went on to found a computer graphics company familiar to many planetarians, Evans & Sutherland. The first heyday of VR started in the late 1980s. Throughout the 1990s, university and corporate researchers did pioneering work on VR and produced early applications, including cubic CAVE displays and flight simulators that were the forerunners of multi-channel fulldome projection systems.
Although the technologies of the past were far less capable than the rendering engines of today, there was still an intense interest by the research community to experiment with VR in different contexts. Over the years, the research has shown that experiencing and interacting inside a simulated virtual environment can be useful for training, therapy, and tasks that require spatial awareness (Bowman & McMahan 2007). Cultural preservationists have also discovered that fragile or remote real-world sites can be recreated digitally to be explored by the public, without the risk of physical damage from tourists (Guttentag 2010).


Since their advent at the turn of the millennium, digital planetariums have also turned out to be natural venues for exploring places on Earth. Since 2008, Digital Earth presentations at the Denver Museum of Nature & Science (Yu, Raynolds, & Dechesne 2008; Yu 2009) have taken audiences on tours of our home planet using high resolution satellite imagery from above, and 360° spherical panoramic photography that inserts audiences into scenes on the ground. Fulldome theaters have been used to present an assortment of geoscience content, whether they are educational programs (Shipway 2023), visualization tools for research (Kwasnitschka 2008), or 3D corporate datasets (Neafus & Yu 2007). Highly realistic depictions of locales brimming with computer-generated life have also shown up in many pre-rendered films, such as those from the California Academy of Sciences (Wyatt 2019).


In late 2021, the Denver Museum of Nature & Science began discussions with the National Park Service to determine what types of educational events could be created to celebrate the 2021-2022 “International Year of Caves and Karst.” Knowing the capabilities of today’s capture technology, we decided to proceed with recreating Carlsbad Caverns in the dome. Although multiple technologies exist for digitizing real-world environments, one that has become more popular in recent years is photogrammetry. It starts with shooting overlapping digital photos of an object or scene from multiple angles, so that common features can be identified between them. Photogrammetry software matches these features across image pairs to create a virtual 3D model. Because the color and lighting information about the object already exists in the original images, the photogrammetrically derived model can be textured with the same pictures used to create the geometry of the model, resulting in a highly realistic scene. The 3D model files can be used in animation software and rendered for movies. They can also be simplified by reducing the number of polygons to the point where they are compact enough for use in real-time planetarium visualization software.
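The overlap requirement behind this capture process reduces to simple arithmetic. The sketch below is a back-of-the-envelope planning calculation with hypothetical numbers (not values from the Carlsbad shoot), assuming consecutive frames should share roughly 60-80% of their view so the software can match features between them:

```python
import math

def shots_per_orbit(fov_deg, overlap=0.7):
    """Photos needed to circle a subject once, stepping the camera so
    each frame shares `overlap` of its field of view with the last."""
    fresh = fov_deg * (1.0 - overlap)    # new angle covered per frame
    return math.ceil(360.0 / fresh)

# A 60-degree lens at 70% overlap advances 18 degrees per shot:
n = shots_per_orbit(60.0)   # -> 20 shots for one ring around the subject
```

Multiple such rings at different heights and distances multiply the count quickly, which is consistent with large sites needing photos in the hundreds or thousands.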


Carlsbad Captured
Originally documented by European Americans in 1898, the set of interconnected chambers that make up Carlsbad Caverns first became a United States National Monument in 1923, then a National Park in 1930, and a World Heritage Site in 1995. The cave system is located in the Guadalupe Mountains, which are the fossil remains of an ancient reef perched offshore in a prehistoric inland sea 260-270 million years ago (Palmer 2013). Over these millions of years, reef-building animals lived and died, with their remains compressed by the weight of newer reef layers above. The crushed, older reef combined with calcium carbonate in the water to create layers of limestone, which was subsequently surrounded by ocean sediment. After the sea dried out 20 million years ago, tectonic processes pushed the reef up, and the surrounding sediments were eroded away to expose the fossil reef as the Guadalupe Mountains today.


Elsewhere in the world, running water dissolves limestone to form caves. In the Guadalupe Mountains, hydrogen sulfide-rich brine reacted with oxygen in fresh water to form sulfuric acid, which was the main agent of dissolution. As the mountains continued to uplift, older caves were pushed up, and younger caves formed below them. If the groundwater table settled at a fixed elevation for a long period of time, the sulfuric acid would etch out a large cavity at that level. The chemical process of dissolution results in gypsum as a byproduct, many large blocks of which can still be found within the caves. In the last 800,000 years, rainwater trickled underground, picking up dissolved calcium carbonate on its way into the subterranean chambers. As the water flowed, drop by drop, it evaporated leaving behind a mineral deposit. Stalactites (formed from deposits descending from the ceiling) and stalagmites (growing from the ground up) are just two of a host of different mineral formations created by the evaporating water.
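In simplified form, the chemistry described above happens in stages: sulfuric acid forms from hydrogen sulfide, dissolves limestone (leaving gypsum behind), and much later dripping water redeposits calcium carbonate as speleothems. These are standard textbook reactions, not equations from the article:

```latex
% Sulfuric acid generation from hydrogen sulfide and oxygen:
\mathrm{H_2S + 2\,O_2 \rightarrow H_2SO_4}

% Limestone dissolved by sulfuric acid, leaving gypsum (the dihydrate):
\mathrm{CaCO_3 + H_2SO_4 + H_2O \rightarrow CaSO_4\cdot 2H_2O + CO_2}

% Later, evaporating drip water deposits calcite as stalactites/stalagmites:
\mathrm{Ca^{2+} + 2\,HCO_3^{-} \rightarrow CaCO_3\downarrow + H_2O + CO_2}
```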


Carlsbad Caverns is one of the most accessible examples of a large cave system with numerous spectacular rock formations, yet its remote location (about a 2-hour drive from the nearest large airport in El Paso, Texas) means that it is still difficult for many people to reach. The U.S. National Park Service (NPS), which operates the site, has carefully illuminated the cavern and built accessible trails to accommodate visitors. These same attributes make it easy to set up equipment for image capture, making Carlsbad Caverns an ideal candidate for a virtual recreation.


After considerable discussion with the NPS, DMNS contracted Eric Hanson of Blueplanet VR, who went to Carlsbad Caverns in March 2022 for two days of photogrammetric image capture, with permission for after-hours work with the assistance of a park ranger. The NPS has built 2.5 miles (4 km) of paved trails through the caverns that visitors can follow on unguided tours. Visitors cannot wander off these trails, which are bounded by guide rails on either side. The largest chamber is simply called the Big Room; it has a maximum ceiling height of 255 ft (78 m) and covers 8.2 acres (3.3 hectares), big enough to hold six football fields/pitches. Eric took images from the trail at several dozen sites. The largest models (e.g., at Caveman Junction and Rock of Ages; Fig. 1) required ~1500 pictures each, while the smaller sites needed only a few hundred. Although the caverns are lit to highlight spectacular rock formations, the lighting is kept intentionally dim, so a tripod was required to mount the camera for the bracketed HDR exposures. Photography during regular visiting hours was often delayed because Eric had to wait for other guests to move out of the long-exposure shots. After hours, when all other visitors had left the cavern, the work proceeded much faster.
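Bracketed HDR exposures are merged by weighting each frame's pixel values by how well exposed they are. Here is a minimal single-pixel sketch, assuming a linear camera response (production tools, following methods like Debevec & Malik's, also recover the nonlinear response curve); the pixel values and exposure times are hypothetical:

```python
def weight(z):
    """Hat-shaped weight: trust mid-range values, distrust pixels that
    are nearly black (noisy) or nearly white (clipped)."""
    return z if z <= 127 else 255 - z

def merge_hdr(samples):
    """samples: list of (pixel_value_0_255, exposure_time_s) pairs for
    the same pixel. Returns a relative radiance estimate."""
    num = sum(weight(z) * (z / t) for z, t in samples)
    den = sum(weight(z) for z, _ in samples)
    return num / den if den else 0.0

# One dim cave pixel shot at 1 s, 4 s, and 15 s: badly underexposed in
# the short frame, well exposed in the longer ones.
radiance = merge_hdr([(8, 1.0), (32, 4.0), (120, 15.0)])
print(radiance)   # 8.0 (all three frames agree on value/time = 8)
```

The longer exposures dominate the estimate because their pixel values fall in the trusted mid-range, which is exactly why dim cave lighting demands a tripod: the long frames cannot be handheld.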


Eric created the final textured 3D models using RealityCapture1, which is commonly used in the entertainment industry to create models for visual effects work in film and for video games. Like other photogrammetry software, it is based on the “Structure from Motion” procedure, which iteratively solves for both a scene’s 3D geometry and the positions of the cameras that acquired the imagery (Ullman 1979). Common features matched among the images taken at one position determine the camera pose for those images, while tie points shared between sets of images from multiple camera positions connect the pictures from those positions together.
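The geometric core of Structure from Motion can be sketched in a few lines: once two camera poses are known, a feature matched in both images pins down a 3D point where the two viewing rays meet. The cameras, focal length, and point below are all hypothetical, and real software handles hundreds of cameras and noisy matches:

```python
# Toy two-view triangulation for simple pinhole cameras looking down +z.
def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def scale(a, s): return [x * s for x in a]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))

def pixel_to_ray(px, py, f):
    """Back-project a pixel into a viewing-ray direction (focal length
    f in pixel units)."""
    return [px / f, py / f, 1.0]

def triangulate_midpoint(c1, d1, c2, d2):
    """Point midway between two rays c + t*d at closest approach."""
    r = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return scale(add(add(c1, scale(d1, t1)), add(c2, scale(d2, t2))), 0.5)

f = 1000.0                       # focal length in pixels
point = [0.4, 0.2, 5.0]          # ground-truth feature position (m)
c1, c2 = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]   # camera centers, 1 m apart

# Project the point into each image (perspective division) -- standing
# in for the same rock feature matched in two overlapping photographs.
px1 = [f * (point[0] - c1[0]) / point[2], f * point[1] / point[2]]
px2 = [f * (point[0] - c2[0]) / point[2], f * point[1] / point[2]]

recovered = triangulate_midpoint(c1, pixel_to_ray(px1[0], px1[1], f),
                                 c2, pixel_to_ray(px2[0], px2[1], f))
print(recovered)   # close to [0.4, 0.2, 5.0]
```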

A polygonal mesh reconstruction is built from the point data describing the object. Textures generated from the original camera images are projected onto the mesh in the final model (Fig. 2). Because the size of the output is not limited by the user’s computer RAM (Dhanda et al. 2019), RealityCapture can create models with billions of polygonal faces. A scene can be broken into smaller chunks that are worked on separately before being recombined into the broader model. The output model can be further simplified into one with fewer polygons using 3D modeling software like Autodesk Maya. The models that were loaded into the OpenSpace2 planetarium software typically had about 1 million polygons.
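Polygon reduction itself can be illustrated with the simplest decimation scheme, vertex clustering: snap vertices to a coarse grid, merge the duplicates, and discard faces that collapse. Tools like Maya use far more sophisticated edge-collapse methods; this toy version on hypothetical mesh data only shows why polygon counts can drop so drastically:

```python
def simplify(vertices, triangles, cell=1.0):
    """Vertex-clustering decimation: merge vertices sharing a grid cell
    of size `cell` and drop triangles that become degenerate."""
    cluster = {}      # grid cell -> new vertex index
    remap = []        # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in cluster:
            cluster[key] = len(new_vertices)
            new_vertices.append(tuple(k * cell for k in key))
        remap.append(cluster[key])
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if a != b and b != c and a != c:    # keep non-degenerate faces
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles

# A finely triangulated 6x6 patch standing in for a dense scanned mesh.
vertices = [(0.4 * i, 0.4 * j, 0.0) for j in range(6) for i in range(6)]
triangles = []
for j in range(5):
    for i in range(5):
        v = lambda a, b: b * 6 + a          # vertex index at column a, row b
        triangles.append((v(i, j), v(i + 1, j), v(i + 1, j + 1)))
        triangles.append((v(i, j), v(i + 1, j + 1), v(i, j + 1)))

nv, nt = simplify(vertices, triangles, cell=1.0)
print(len(vertices), len(triangles), "->", len(nv), len(nt))   # 36 50 -> 9 8
```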


The Carlsbad models were highly realistic when viewed with OpenSpace. Real-life lighting was baked into the textures that mapped the models, so that the 3D models appeared to the eye as they would if the viewer were seeing them in real life (Fig. 3). The illusion was further enhanced with image backdrops created from the on-site photography, so that the distant backgrounds were not blank spots but contained visual hints of a world beyond the immediate model.


I worked with Patricia Seiser, Director of Cave & Karst Management Science at the National Cave & Karst Research Institute, to develop a series of public visitor programs that she delivered at the Denver Museum of Nature & Science’s Gates Planetarium. In addition to the 3D photogrammetric models, we also showcased multiple 360° panoramic images taken by Ben Gondrez at locations that Eric had not captured. Diagrams explaining the geology of the region and historical photography documenting the exploration of the Cavern helped complete the story that was delivered in the public programs.
Although the photogrammetric capture was done from the visitor trails, the exploration of the digital models was unconstrained. The camera could be placed anywhere inside or around a model, so the audience could virtually travel to locations that would be inaccessible in real life. For instance, the Natural Entrance to Carlsbad Cavern is a 1.25 mile (2 km) long, steep trail that was the only way for the public to enter before elevators were built in the 1930s. It starts at a wide cave entrance at ground level near the Visitors Center. The trail descends via multiple switchbacks that are visible from above in daylight, before disappearing into the twilight zone of the cavern, where there is progressively less sunlight to see by the deeper you travel.


An overhanging rock shelf lies about 30 m above where the trail enters the twilight zone. In real life, one would need rope and climbing harnesses to access this wall. But we can take audiences on vertiginous flights to this and other areas of the Natural Entrance in OpenSpace. Among the details that Pat could see in the planetarium was a raptor nest, consisting of large twigs and sticks on the ledge of a small alcove. This would have been impossible to witness in real life from the trail, except with binoculars from afar. In the fulldome theater, we could fly to this locale and examine it up close.
Many smaller features in Carlsbad Cavern have also been modeled in photogrammetry. “Bacon drapery” looks like its culinary namesake because of impurities left behind by evaporating lines of water, building up over time into thin, delicate, rippling sheets. These are found off-trail and often located many meters overhead where they would be difficult for visitors to examine in detail. But using OpenSpace, the planetarium audience can be flown around larger-than-life versions of these features to inspect their three-dimensional structure while learning about how they were formed.

A View to the Future
Virtual recreations with photogrammetry and other capture processes are now common in the realm of cultural and natural heritage preservation (e.g., Kingsland 2020). Even the idea of a virtual version of a cave is not new, with the earliest examples at least 20 years old (Lutz & Weintke 1999). Eric started working with photogrammetry in visualization projects with Greg Downing more than a decade ago. His current pipeline was developed to create content for a VR app3.


The way that photogrammetric models are explored in VR with a head-mounted display is very different from that of a fulldome theater. The VR HMD is a single-person experience where the user, isolated from the outside world, can actively navigate around the model with the hand controls and shift their head and body to view the model from different perspectives. Although the headset gives a somewhat constrained view at any given moment (e.g., the Oculus Quest 2 has a field-of-view of 89°4), the user can turn and look at the virtual environment in all directions.


The fulldome theater, on the other hand, is simultaneously more immersive than VR (with a hemispherical display that shows an instantaneous 180° field-of-view) and less immersive (the audience can’t turn to see anything beyond the edge of the dome). The audience experience is a guided collective tour, as opposed to an individual exploring on their own. Although being able to have autonomous excursions is a selling point for VR, not all users prefer this mode of interaction. Interviews reveal that many people preferred passively sitting through live fulldome presentations showcasing high-resolution satellite imagery of Earth versus interacting on their own with similar types of visualizations from Google Earth (Yu 2009). These visitors explained that since they had little knowledge about the topics presented in the digital dome, they welcomed the guided experience. Even though Google Earth allowed them to travel wherever they wanted to go and to explore at their own pace, they felt they had little to no understanding about what they were seeing, which made that experience less valuable.


The positive visitor feedback we have received from our past Digital Earth and more recent Carlsbad programs shows the promise of fulldome theaters for virtual expeditions into cultural and natural landscapes. With powerful software like OpenSpace making it easier to visualize non-astronomical 3D content, and growing numbers of practitioners of photogrammetry in archaeology, architecture, and other preservation-minded fields, planetarians now have new opportunities for collaboration. There are now many commercial and open-source photogrammetry tools to choose from (Kingsland 2020; Rahaman & Champion 2019). Even free mobile phone apps are now available5, making it possible for more people to pick up these skills and increase the talent pool for creating content. This project also shows the promise of sharing visually rich, 3D content between the mediums of fulldome theaters and VR. With little or no modification, the same model files can be used for two very different audience experiences. Finally, the Carlsbad Caverns project is an example of how image capture and virtual reconstruction workflows from the visual effects and gaming worlds can be adopted and repurposed for fulldome cultural and educational programs. These tools have been refined by billion-dollar entertainment industries, and it would be a shame for us not to take advantage of them.

Acknowledgements
Thanks to Eric Hanson for his photogrammetry work; Pat Seiser for her superb knowledge about Carlsbad Caverns; Erin Lynch for supporting our imaging fieldwork; and Micah Acinapurna and the rest of the OpenSpace development team for their user support. OpenSpace is funded in part by NASA under award NNX16AB93A.

4 https://smartglasseshub.com/oculus-quest-2-fov/
5 https://www.unrealengine.com/en-US/realityscan

References

Bowman, D. A., & McMahan, R. P. 2007, “Virtual reality: how much immersion is enough?” Computer, 40(7), 36-43.
Dhanda, A., Reina Ortiz, M., Weigert, A., Paladini, A., Min, A., Gyi, M., Su, S., Fai, S., & Santana Quintero, M. 2019, “Recreating cultural heritage environments for VR using photogrammetry,” The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 42, 305-310.
Guttentag, D. A. 2010, “Virtual reality: Applications and implications for tourism,” Tourism Management, 31(5), 637-651.
Kingsland, K. 2020, “Comparative analysis of digital photogrammetry software for cultural heritage,” Digital Applications in Archaeology and Cultural Heritage, 18, e00157.
Kwasnitschka, T. 2008, “Geoscience under the Dome: A do-it-yourself approach to fulldome visualization,” The Planetarian, 37(1), 6-9.
Lutz, B., & Weintke, M. 1999, “Virtual Dunhuang art cave: A cave within a CAVE,” Eurographics ‘99, 18(3), pp. 257-264.
Neafus, D., & Yu, K. C. 2007, “Performing and visual arts, the sciences: visualization brings them together at the Gates Planetarium.” The Planetarian, 36(3), 6-17.
Palmer, A. N. 2013, “Sulfuric acid caves: Morphology and evolution,” in Shroder, J., & Frumkin, A. (eds.), Treatise on Geomorphology, San Diego, CA: Academic Press, pp. 241–257.
Rahaman, H., & Champion, E. 2019, “To 3D or not 3D: Choosing a photogrammetry workflow for cultural heritage groups,” Heritage, 2(3), 1835-1851.
Shipway 2023, “Seeking What Works: Reaching beyond the stars: Other disciplines in the planetarium,” The Planetarian, 52(3), 64-68.
Ullman, S. 1979, “The interpretation of structure from motion,” Proceedings of the Royal Society of London. Series B. Biological Sciences, 203(1153), 405-426.
Wyatt, R. 2019, “Academy Style: An Institutional Approach to Fulldome Storytelling,” The Planetarian, 48(2), 24-67.
Yu, K. C., Raynolds, R., & Dechesne, M. 2008, “Using immersive visualizations to improve decision making and enhancing public understanding of Earth resource and climate issues,” in AGU Fall Meeting Abstracts, vol. 2008, pp. GC33B-0764.
Yu, K. C. 2009, “Digital planetariums for geology and geography education: Earth visualizations at the Gates Planetarium,” The Planetarian, 36(3), 6-12.
32 Planetarian Vol. 52, No. 4 December 2023