SIGGRAPH '20: ACM SIGGRAPH 2020 Immersive Pavilion

SESSION: Immersive History

If this place could talk ... First World War tunnel warfare through haptic VR

Our transdisciplinary team performed a comprehensive site survey of the Hill of Vauquois, combining a variety of techniques to create a digital recreation of the above- and below-ground features of the craters, trenches, tunnels, and galleries, allowing us to see and explore the destroyed village of Vauquois as it has never been seen before. Vauquois was a small French village before it became critical high ground that was fiercely contested for four years by the French and Germans during World War I, with the Americans finally taking the position during the Meuse-Argonne offensive of 1918. A quiet agricultural village became a killing ground, with fighting starting in the streets, moving to trenches, and finally moving underground into a network of miles of tunnels used to set off over 500 mine explosions across four horrific years of continuous combat.

This immersive virtual reality informal learning exhibit allows visitors to experience and learn about life and conditions in the tunnels of Vauquois as never before. Visitors can not only walk through the virtual environment but also interact with physical representations of objects based on actual artifacts and material culture left behind 100 years ago. They can touch a carving within the tunnel, and pick up and carry a lantern and other objects as they move through the physical exhibit. They are also provided with learning challenge cards to guide their learning and exploration. Finally, handouts will be provided detailing the team's research and development trajectory (process and product), with links to educational online resources so visitors can continue to explore the tunnels of Vauquois at home or in classrooms.

Our project seeks to make the invisible visible and give visitors a sense of walking in the footsteps of others over time and space. We do so by immersing the visitor in experiences of life underground in the French and German World War I tunnels under the village of Vauquois. It is within these tunnels that soldiers slept, ate, took shelter, mined, and fought for over three years, experiencing first-hand the ravages of a war of attrition. Vauquois spans the spectrum of experience in the First World War, from street fighting, to trench warfare, to tunnel warfare. Unlike other examples of mining on the Western Front, the warring armies at Vauquois built underground cities where the combatants spent the majority of their time and, in some cases, were actually engaged in combat.

The team has laser scanned a large proportion of the site (Figure 1). The primary challenge in designing this exhibit is making visible what would otherwise be invisible and inaccessible, and conveying the feeling of being under the hill of Vauquois as authentically as possible, while using only a small fraction of the space the site covers. Multiple techniques are used to overcome these challenges, including the typical physical displays found in many museums, as well as the immersive affordances of passive haptics (augmenting high-fidelity visuals within the tunnel with touch) and redirected walking (exploring virtual tunnels that are larger and more complex than the physical exhibit space itself), creating an engaging and informative exhibit. Together these experiences are specifically designed to allow visitors to engage with and persist in a historical inquiry of World War I experiences.
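
As a rough illustration of the passive-haptics idea (a sketch only; the object and tracker names are hypothetical, not the exhibit's code), the virtual twin of a tracked physical prop is pinned to the prop's tracked pose, so that reaching for what you see lands on something you can touch:

```python
# Sketch of passive-haptics registration: draw the virtual object exactly
# at the tracked pose of its physical prop (e.g. the lantern), so the
# visual and tactile objects coincide. All names here are illustrative.

def update_virtual_twin(virtual_obj, tracker_pose, offset=(0.0, 0.0, 0.0)):
    """Pin the virtual object's pose to the tracked physical prop.

    offset: calibration offset between the tracker mount and the prop,
    an assumed per-prop constant.
    """
    x, y, z = tracker_pose.position
    dx, dy, dz = offset
    virtual_obj.set_position((x + dx, y + dy, z + dz))
    virtual_obj.set_rotation(tracker_pose.rotation)
```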

Not only will visitors be able to walk through the virtual environment (see the point-of-view video: https://youtu.be/99Dky2AKrYc), they will be able to interact with physical representations of objects that are based on actual artifacts and material culture left behind 100 years ago. For example, they will be able to reach out and touch a carving within the tunnel, and pick up and carry a lantern and other objects as they move through the virtual reality tunnel itself. They will also be provided with learning challenge cards to guide their learning and exploration. In addition to the virtual environment itself, the exhibit includes physical props of segments of the tunnels, artifacts from the site, audio and still photos, and a documentary video (https://youtu.be/RRJTf_iiFBk) on the history of the destroyed village of Vauquois and the process of creating the exhibit. Our exhibit is fully immersive, making the most of the technologies at our disposal with sight, sound, and touch to convey the life of the common soldier and miner at Vauquois and a sense of walking in their footsteps. Figure 2 and the following text describe what a visitor will see and experience at each point during the exhibit.

Mogao Caves: a VR experience

SESSION: Immersive Entertainment and Storytelling

Davigo: Epic VR vs. PC Battles

DAVIGO is an asymmetrical VR-and-PC multiplayer game in which VR and PC players interact within the same virtual space, one as a giant and the other as a small warrior. This form of gameplay created novel design challenges in balancing the game: equalizing player speeds across the two scales and ensuring that mechanics such as throwing felt natural. We addressed these challenges, respectively, by reducing the giant's speed relative to the VR player's movement and by implementing novel throwing mechanics such as physically simulating joints and adding aim-assist features.
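
A minimal sketch of the two balancing ideas, under assumed names and tuning constants (illustrative only, not DAVIGO's shipped code): the giant's effective speed is damped relative to raw hand motion, and a throw's direction is blended toward the target while preserving its speed:

```python
import math

GIANT_SCALE = 20.0  # assumed scale ratio between the giant and the warrior

def balanced_giant_speed(raw_hand_speed: float) -> float:
    """Damp the giant's effective speed so a casual arm swing does not
    become an unbeatable giant-scale strike (sub-linear damping, assumed)."""
    return raw_hand_speed * GIANT_SCALE / math.sqrt(GIANT_SCALE)

def _unit(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v) if m else v

def aim_assisted_throw(velocity, target_dir, assist=0.3):
    """Keep the throw's speed but bend its direction toward the target.
    assist in [0, 1]: 0 = raw physics, 1 = perfect homing."""
    speed = math.sqrt(sum(c * c for c in velocity))
    blended = _unit(tuple((1 - assist) * a + assist * b
                          for a, b in zip(_unit(velocity), _unit(target_dir))))
    return tuple(speed * c for c in blended)
```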

Crafting an Interactive Childhood Memory

It's such an exciting time to be an interactive storyteller. With every passing year, our toolbox expands with new ways to immerse audiences into our stories. With Free the Night, a VR 6DoF interactive film, our team sought to create an approachable immersive experience that encourages participation without intimidation.

The Book of Distance: Personal Storytelling in VR

The Book of Distance investigates the language of personal storytelling in virtual reality (VR), borrowing from interactive narrative practice of the past few decades, the language of performative installation, Japanese theatre and creative non-fiction filmmaking. Beyond a simple retelling of historic events—the internment of Japanese Canadians by their own government, a history shared with the U.S.—the work uses a linear storytelling model with local user agency and an embodied central protagonist who occasionally breaks the “fourth wall” to engage with the user. This paper introduces the creative foundation for the storytelling used in The Book of Distance, an approximately 25-minute single-user, room-scale interactive VR experience intended for general audiences, including those who have not experienced VR before.

SESSION: Exploring Reality

MAGES 3.0: Tying the knot of medical VR

In this work, we present MAGES 3.0, a novel Virtual Reality (VR)-based authoring SDK platform for accelerated surgical training and assessment. The MAGES Software Development Kit (SDK) allows code-free prototyping of any VR psychomotor simulation of a medical operation by medical professionals, who urgently need tools to address outdated medical training. Our platform encapsulates the following novel algorithmic techniques: a) a collaborative networking layer with a Geometric Algebra (GA) interpolation engine, b) a supervised machine learning analytics module for real-time recommendations and user profiling, c) a GA deformable cutting and tearing algorithm, and d) an on-the-go configurable soft-body simulation for deformable surfaces.
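
For orientation, here is where pose interpolation sits in such a networking layer. The authors' engine interpolates with GA multivectors, which is not reproduced here; the stand-in below uses plain linear blending of timestamped position samples, and all names are assumptions:

```python
# Stand-in for a networked pose interpolator: given timestamped position
# samples received from a remote user, blend the two samples that bracket
# the local render time. (MAGES uses a GA multivector interpolator; plain
# linear blending is shown here only to illustrate the role.)

def lerp(p0, p1, alpha):
    return tuple((1 - alpha) * a + alpha * b for a, b in zip(p0, p1))

def smoothed_position(samples, render_time):
    """samples: list of (timestamp, position) in increasing time order."""
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        if t0 <= render_time <= t1:
            return lerp(p0, p1, (render_time - t0) / (t1 - t0))
    return samples[-1][1]  # past the newest sample: hold the last pose
```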

SESSION: Mixing XR and Reality

Walk a Robot Dog in VR!

Realistic locomotion in a virtual environment (VE) can help maximize immersion and decrease simulator sickness. Redirected walking (RDW) allows a user to physically walk in VR by rotating the VE as a function of head rotation, so that the user walks in an arc that fits within the tracking area. However, this requires significant user rotation, and in commercial-scale tracking spaces it often requires a “distractor” to induce that rotation. Previous implementations suddenly spawned a distractor (e.g. a butterfly) when the user walked near the safety boundary, with limitations such as the user triggering the distractor accidentally by looking around, the distractor going unacknowledged, or the user getting “stuck” in a corner. We explore a persistent robot distractor, tethered to the user, that provides two-way haptic feedback and natural motion constraints. We design a dynamic robot AI that adapts to randomness in the user’s behavior, as well as to trajectory changes caused by tugging on its leash. The robot tries to imperceptibly keep the user safe by replicating a real dog’s behaviors, such as barking or sniffing at something. We hypothesize that the naturalness of the dog’s behavior, its responses to the user, and the haptic tethering will work together to allow the user to explore the entire city, ideally without noticing that the dog is a robot.
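
The core redirection step can be sketched as a per-frame rotation gain (the gain bounds below are illustrative assumptions; the actual controller also accounts for the leash and the dog's AI):

```python
# Sketch of rotation-gain redirected walking: while the user rotates their
# head (e.g. to watch the robot dog), inject a small extra world rotation,
# bending the physical walking path into an arc that fits the tracked area.

AMPLIFY_GAIN = 1.3  # assumed imperceptible upper bound on rotation gain
DAMPEN_GAIN = 0.85  # assumed lower bound

def redirect_step(world_yaw, head_yaw_delta, amplify: bool):
    """One frame of redirection.

    head_yaw_delta: physical head rotation this frame (radians).
    amplify: True to rotate the world more than the head (steering the
    user one way), False to rotate it less.
    Returns the updated virtual-world yaw offset.
    """
    gain = AMPLIFY_GAIN if amplify else DAMPEN_GAIN
    return world_yaw + (gain - 1.0) * head_yaw_delta
```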

Haptic-go-round: A Surrounding Platform for Encounter-type Haptic in Virtual Reality Experiences

We present Haptic-go-round, a surrounding platform that allows props and devices to be deployed to provide haptic feedback in any direction in virtual reality experiences. The key component of Haptic-go-round is a motorized turntable that rotates the correct haptic device to the right direction at the right time to match what users are about to touch. We implemented a working platform, including plug-and-play prop cartridges and a software interface, that allows experience designers to quickly add their haptic components and use the platform for their applications.
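
A sketch of the turntable scheduling idea (the slot layout, prop names, and interface are assumptions, not the authors' software):

```python
import math

SLOT_ANGLES = {"doorknob": 0.0, "lever": 90.0, "button": 180.0}  # assumed

def turntable_target_angle(reach_dir_xz, prop: str) -> float:
    """Turntable angle (degrees) that places `prop`'s cartridge in the
    direction the user's hand is reaching, before contact occurs."""
    reach_deg = math.degrees(math.atan2(reach_dir_xz[1], reach_dir_xz[0]))
    return (reach_deg - SLOT_ANGLES[prop]) % 360.0

# e.g. turntable_target_angle((1.0, 0.0), "lever") -> 270.0
```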

A Virtual Obstacle Course within Diverse Sensory Environments

We developed a novel assessment platform combining untethered virtual reality, 3D sound, and a pressure-sensing floor mat to help assess walking balance and obstacle negotiation under diverse sensory and/or cognitive loads. The platform provides a city-like scene with anticipated and unanticipated virtual obstacles. Participants negotiate the obstacles under perturbations of auditory load (spatial audio), cognitive load (a memory task), and visual flow (generated by avatar movements at varying amounts and speeds). A VR system tracks the position and orientation of the participant’s head and feet. A pressure-sensing walkway senses foot pressure and visualizes it as a heatmap. The system helps assess walking balance via per-foot pressure dynamics, obstacle-crossing success rate, and available response time, as well as head kinematics in response to obstacles and multitasking. Based on the assessment, a specific balance-training and fall-prevention program can be prescribed.
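
Two of the measures named above could be computed from logged trials roughly as follows (field names are assumptions about how trial data might be recorded):

```python
from dataclasses import dataclass

@dataclass
class Trial:
    spawn_t: float   # seconds: when the obstacle appeared
    reach_t: float   # seconds: when the participant reached the obstacle
    crossed: bool    # True if crossed without contact

def success_rate(trials):
    """Fraction of obstacles crossed cleanly."""
    return sum(t.crossed for t in trials) / len(trials)

def mean_available_response_time(trials):
    """Mean time between an obstacle appearing and being reached."""
    return sum(t.reach_t - t.spawn_t for t in trials) / len(trials)
```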

SESSION: Social Experiences in VR

The Outpost

Dr. Crumb's School for Disobedient Pets: VR's next step: Combining escape games, role playing, and immersive theater to create a new type of social entertainment - Live hosted adventures from the comfort of your home.

VR is a powerful and novel medium, yet we are still experimenting with what makes it uniquely magical. Having worked in the VR space for the past five years, as a founder of Oculus Story Studio, one of the first engineers on the Oculus Horizon project, and now a co-founder of Adventure Lab, I will talk about how live hosted VR adventures (like social escape games with a live game master) will be the next step forward in solving some of the toxicity and scaling issues of social VR.

Metamorphic: A Social VR Experience

Metamorphic is a social VR experience where participants shape their appearances and surroundings through movement and play in a series of majestically drawn worlds. This transformative encounter explores the ephemeral nature of the self, as bodies become sites of discovery and vehicles for change by offering the radical possibility of effortless transformation.

SESSION: Informative XR

Narupa iMD: A VR-Enabled Multiplayer Framework for Streaming Interactive Molecular Simulations

Here we present Narupa iMD, an open-source software package which enables multiple users to cohabit the same virtual reality space and interact with real-time molecular simulations. The framework utilizes a client-server architecture which links a flexible Python server to a VR client which handles the rendering and user interface. This design helps ensure utility for research communities, by enabling easy access to a wide range of molecular simulation engines for visualization and manipulation of simulated nanoscale dynamics at interactive speeds.
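
Schematically (this is a toy stand-in, not Narupa's actual transport or API), the division of labor amounts to a server stepping the simulation and streaming frames to rendering clients:

```python
import json
import socket

def serve_frames(sim_step, host="0.0.0.0", port=38801, n_frames=1000):
    """Toy single-client frame server. `sim_step() -> positions` stands in
    for a real molecular simulation engine; each frame is sent as one JSON
    line. A production framework would use a binary protocol, serve many
    clients, and receive user interaction forces on a back channel."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn:
            for _ in range(n_frames):
                positions = sim_step()  # physics is delegated to the engine
                conn.sendall((json.dumps({"positions": positions}) + "\n")
                             .encode())
```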

Pixeldust Studios Reptopia Magic Leap Experience

Battle for Survival: Reptopia is an immersive augmented reality experience for the Magic Leap One device, developed by Pixeldust Studios at Pixeldust Labs. Pixeldust Studios put together this pilot project for Mandai Park Development and the Singapore Zoo. The pilot ‘Battle for Survival: Reptopia’ experience was specifically designed to appeal to two key personas at the Zoo: the Indulgent Parent and the Educator. The experience deploys critical components of the Magic Leap platform and API, such as spatial 3D environments, spatial 3D audio, controller touch and trigger input, and head tracking. Because this is a public experience, we deployed only basic, dependable, and efficient features that would be user-friendly and quick to adopt, even for a novice user of the headset.

The experience comprises five stations with image-based triggers, each bringing to life a unique, immersive, interactive, photoreal 3D habitat world in the middle of the visitor's room or on a table, showcasing its species in action. The experience is narrated by a 3D host character, “Icon,” a panther chameleon. He guides the user through the AR exhibit, sometimes in person and at times as a background voice. At each station, he introduces one or two of the reptiles or amphibians with a brief, initially non-interactive storyline that users can watch and listen to, learning crucial information about the creatures, including their special skills and hunting mechanisms. Next, users are allowed to interact with some of these qualities for a couple of minutes using the Magic Leap controller, and finally, at some stations, they get to play a fun immersive game. Below is a brief breakdown of each station's AR activities. Each station takes roughly 3-7 minutes to complete.
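
A per-station flow like the one described might be driven by a simple phase sequence (names are illustrative, not Pixeldust's implementation):

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()      # waiting for the station's image-based trigger
    INTRO = auto()     # non-interactive narrated storyline from "Icon"
    INTERACT = auto()  # controller-driven interaction with the species
    GAME = auto()      # immersive mini-game, only at some stations

def next_phase(phase: Phase, has_game: bool) -> Phase:
    """Advance one station through its phases in order."""
    order = [Phase.IDLE, Phase.INTRO, Phase.INTERACT]
    if has_game:
        order.append(Phase.GAME)
    i = order.index(phase) if phase in order else len(order) - 1
    return order[min(i + 1, len(order) - 1)]
```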

SESSION: Advances in XR

DeepView Immersive Light Field Video

This Immersive Pavilion installation introduces our new system for capturing, reconstructing, compressing, and rendering light field video content. By leveraging DeepView, a recently introduced view synthesis algorithm, our system can reconstruct challenging scenes with view-dependent reflections, semi-transparent surfaces, and near-field objects as close as 34 cm to the surface of our 46-camera capture rig. Improving upon past light field video systems that required specialized storage and graphics hardware for playback, our compressed videos can be rendered in a web browser or on mobile VR headsets while being streamed over a gigabit network connection. This makes ours the first system to encode high-quality light field video at sufficiently low bandwidth for internet streaming.

ESPN VR Batting Cage

The ESPN VR Batting Cage uses an innovative combination of tracking data with dynamic graphics to create a unique, immersive experience to engage both sports fans and technologists. Although the game is primarily experienced individually through a virtual reality (VR) headset, features like Tournament Play invite users to come together and engage with sports content in a new, virtual space. Users can engage with pro players via tracked pitch data in a way that they are unlikely to in the physical world, and that is more tangible than watching a baseball game.

SESSION: Miscellaneous

Encounters 2.0: A Multiparticipant Audiovisual Art Experience with XR

“Encounters” provides a multiparticipant audiovisual art experience through a cross reality (XR) system consisting of HoloLens units, VIVE Tracker units, a SteamVR system, and other components. In the experience, participants fire virtual bullets or beams at physical objects, which then create a physical sound and a corresponding virtual visual effect (see Figure 1, center). This is done by placing a “Kuroko” unit, consisting of a VIVE Tracker unit, a Raspberry Pi unit, and a solenoid, beside a physical object (see Figure 1, left). We prepared eight Kuroko units, which participants can freely place anywhere in the physical space; participants can thus interact with physical objects in that space by making sounds through the XR system.

In version 2.0 of Encounters, we have added new audiovisual experiences with lights using a Philips Hue system. When the virtual bullets or beams hit a Light unit, which consists of a Hue light bulb and a VIVE Tracker unit (see Figure 1, right), the light turns off while generating a virtual crash sound and virtual debris particles of light. A virtual object resembling a ghost of the light (the Ghost object) is then generated and grows to the size of the light bulb. Participants can grab and move the Ghost object; if they move it to a light bulb that is off, the Ghost object is sucked into the bulb and the light turns on again, as if reborn. We believe the light experience expands Encounters’ XR experience, because the lights change the appearance of physical objects, and light is a physical phenomenon with an immaterial aspect. We hope this experience makes participants rethink their perspective in XR space.
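
The Light-unit interaction can be summarized in a small state sketch (class and method names are assumptions; the installation drives real Hue bulbs and tracked positions):

```python
class LightUnit:
    """One Hue bulb paired with a VIVE Tracker (sketch, assumed names)."""

    def __init__(self, bulb_id, position):
        self.bulb_id, self.position, self.on = bulb_id, position, True

    def on_hit(self, spawn_ghost, set_hue):
        """A virtual bullet/beam hit: bulb off, Ghost object appears."""
        if self.on:
            self.on = False
            set_hue(self.bulb_id, on=False)
            spawn_ghost(self.position)

    def absorb_ghost(self, ghost, set_hue):
        """Ghost brought to an off bulb: absorb it and relight the bulb."""
        if not self.on and ghost.is_near(self.position):
            self.on = True
            set_hue(self.bulb_id, on=True)
            ghost.destroy()
```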

Sophroneo: Fear not. A VR Horror Game with Thermal Feedback and Physiological Signal Loop.

We present “Sophroneo: Fear not”, a VR horror experience with a thermal feedback interface. To emphasize the supernatural side of the experience, we introduce several innovative approaches, such as long, intense cold feedback, liminal audio, and physiological feedback loops. We built a wearable thermal feedback setup with water-cooled Peltier (TEC) elements to deliver long cold sensations, improving immersion and increasing the sense of fear and unease. At several points in the experience, the player must close their eyes and interact with the VR environment relying on their other senses alone.
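
One way the physiological loop might map a signal to cold output (the linear mapping and thresholds are illustrative assumptions, not the authors' calibration):

```python
def peltier_cold_level(heart_rate_bpm, baseline_bpm=70.0,
                       max_excess_bpm=40.0):
    """Map heart rate above baseline to a cold intensity in [0, 1]:
    the more aroused the player, the colder the wearable Peltier runs."""
    excess = max(0.0, heart_rate_bpm - baseline_bpm)
    return min(1.0, excess / max_excess_bpm)
```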