During the pandemic, VR concerts grew in popularity, offering audiences a novel way of watching performances in which performers and online attendees typically exist in different spaces and times. However, as the pandemic gradually subsided, people returned to in-person live performances because of the diminished sense of co-presence in VR concerts. We present Actualities, a seamless live performance combining onsite and online forms with multiverse interaction to create an engaging experience for all participants. Using various sensors, we capture signals from the performance venue and digitalize all onsite elements into virtual scenes. The visual content is projected onto screens for the onsite audience while being simultaneously broadcast via live-streaming to the online audience.
Alice in Gravityland is a VR adventure exploring three gravity experiences with novel, around-the-head vibrotactile feedback using illusory tactile motion. Players are able to 1) change the direction of gravity, 2) navigate through zero gravity, and 3) defy gravity as they walk on walls. The haptic feedback enhances players’ sense of directionality, deepening immersion during gravity events. Inspired by Lewis Carroll’s Alice’s Adventures in Wonderland (1865), the game invites players to alter gravity to solve puzzles and experience gravity in a unique way through this multi-sensory VR adventure.
In collocated VR classes, instructors need to guide their students, while also remaining aware of the physical environment in order to ensure students’ safety. It is hard to do both simultaneously.
We present a system that utilizes hand-held devices for non-VR instructors, enabling them to explore VR content and interact with students who are fully immersed in VR. The instructor can observe the VR environment or switch between different students’ first-person views by using commonly available hand-held devices, such as smartphones and tablets. The instructor can also use hand-held devices to interact with the VR world itself.
The students can see the real-time video stream of the physical environment as well as a video stream of the instructor. The system enables seamless communication and collaboration, thereby helping to create a better and richer educational experience for VR classes.
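The view-switching idea described above can be sketched in a few lines; the class, method, and stream names here are illustrative assumptions, not the system's actual API.

```python
class InstructorView:
    """Lets a non-VR instructor's hand-held device switch between an
    overview of the VR scene and any student's first-person stream."""

    def __init__(self):
        # the shared scene camera is always available as an overview
        self.streams = {"overview": "scene-camera"}
        self.active = "overview"

    def register_student(self, name, stream_id):
        """Add a student's first-person video stream to the registry."""
        self.streams[name] = stream_id

    def switch_to(self, name):
        """Make the given view active and return its stream id."""
        if name not in self.streams:
            raise KeyError(f"no stream for {name!r}")
        self.active = name
        return self.streams[name]
```

In a full system, the returned stream id would select which live video feed the tablet or smartphone renders.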
To date, systems dealing with Kanji characters have mainly been produced for learning purposes, but it is important not only to learn the meaning and stroke order of Kanji but also to understand their cultural aspects. In regions where Kanji are used, many aspects of culture, including the characters themselves, have been shaped by nature. In this study, we therefore propose a calligraphy experience system that conveys the relationship between nature and Kanji. When a user follows the stroke order projected onto the tabletop and writes, with a brush, Kanji characters derived from natural shapes, the written characters transform into their original natural forms through projected imagery.
The Digital Dance Studio VR (DDS-VR) is an innovative user-focused immersive software application for choreographic composition, planning, teaching, learning, and rehearsal. It offers a simple and intuitive immersive interface for creation and manipulation of choreographic sequences in virtual space, permitting exploration of spatial and temporal patterning, musical accompaniment, environment and design aesthetics, and allowing users to change the position, number, rhythm and orientation of dancers. It provides a suite of modular tools for choreographers, dancers, and anyone interested in exploring movement in a digital context – including Film and TV applications in blocking/storyboarding of fight and crowd sequences. The DDS-VR application can empower users to create, visualize, and share digital choreography outside of the physical studio, saving considerable time and money on studio and personnel hire, and bypassing the more professionalized use of real-time graphics software such as Unity and Unreal.
Dimix is a system that integrates 2D images generated by Latent Diffusion Models (LDMs) [Rombach et al. 2022] into 3D, interactive Mixed Reality (MR) experiences. The system extracts regions of the MR scene from panoramic images, adds optional mask images, and applies LDM-based image inpainting to output a generated image (LDM image) that naturally extends the scene. By then performing depth estimation on the image and reconstructing it as a 3D scene, the LDM image can be treated in the same way as 3D objects. This enables both basic features such as occlusion and collision detection and highly interactive operations such as ink painting, achieving high immersion and realism. Additionally, the system incorporates real-time object detection to constrain the inpainting area, making the LDM image more convincing. All processing is performed in real time, so users can interact with the world in 3D without loading or other preparation, simply by uploading their preferred panorama image to the application.
To the best of our knowledge, Dimix is the first system to seamlessly integrate LDM images into interactive, three-dimensional MR experiences. Users can immerse themselves in an unprecedented space where the real world blends seamlessly with a world generated by a neural network, offering a glimpse into the future of MR experiences.
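The pipeline described above (inpaint a masked panorama region, estimate its depth, and lift the result into the 3D scene) can be sketched as follows; the function names, the pinhole back-projection, and the pluggable inpainting/depth models are illustrative assumptions, not Dimix's actual implementation.

```python
import numpy as np

def backproject_to_points(depth, fov_deg=90.0):
    """Back-project a depth map into a 3D point cloud (pinhole model),
    so generated imagery can occlude and collide like scene geometry."""
    h, w = depth.shape
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    x = (xs - w / 2) * depth / f
    y = (ys - h / 2) * depth / f
    return np.stack([x, y, depth], axis=-1)  # shape (h, w, 3)

def dimix_style_pipeline(region, mask, inpaint_fn, depth_fn):
    """Hypothetical end-to-end sketch: inpaint the masked region of a
    panorama, estimate per-pixel depth, and reconstruct 3D points."""
    inpainted = inpaint_fn(region, mask)  # e.g. an LDM inpainting model
    depth = depth_fn(inpainted)           # e.g. a monocular depth estimator
    return inpainted, backproject_to_points(depth)
```

In the real system the inpainting and depth models would run as neural networks; the sketch only shows how their outputs compose into scene geometry.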
Forager is a multisensory virtual reality experience that immerses participants in the captivating world of fungi, delivering a profound connection with nature from various perspectives. Engaging sight, sound, touch, and scent, participants witness the complete life cycle of mushrooms, from spores to mycelium, fruiting body, and the inevitable decay. This transformative journey cultivates an appreciation for the interconnectedness of lifeforms and the wonders of the natural world on its own unique time scale.
Since 2020 Ferryman Collective has been pushing the boundaries of what is possible in theater by using VR technology to give audiences the ability to become a part of the story in a fully immersive way. VR theatre offers the feeling of embodiment and immersion that closely mimics the feeling of going to a theater in person, with the added interactivity of gaming, and the cinematic qualities of film. Ferryman Collective productions have been well received at the most prestigious festivals in the world and garnered many awards along the way.
Gumball Dreams centers on the story of an ancient alien on the precipice of transitioning from this life to the next. Experience an excerpt of Ferryman Collective and Screaming Color's award-winning production: a one-on-one, 15-minute journey with an actor in VR through a portion of this immersive theater play.
We present Heightened Empathy, a multi-user interactive experience in bioresponsive virtual reality (VR) designed to visualize emotional states and stimulate different types of empathy between two players across various interactive modes. The experience immerses users in a VR representation of each other’s emotional state, while also reflecting these states to the audience through a table-top social robot and projection as the players interact. In competitive mode, the goal is to promote cognitive empathy: each user must understand the new emotional representation in VR to win. In communication mode, users act on one another verbally to promote emotional empathy. The experience aims to enhance empathy perception by placing the players in virtual environments that require them to understand, share, and act on each other’s perspectives and emotions.
Avatar technology has attracted attention, but research on non-humanoid avatars has mainly focused on evaluating the operability and embodiment of new body systems. Beyond these points, we believe that letting users design their own avatar structure and body motion mapping is also important for improving the user experience. We present Mechanical Brain Hacking, a VR cybernetics simulator that allows easy modification of both the mapping and the structure of the avatar. Participants take on the role of a damaged robot with no mobility except in the right hand. They use a surgical machine linked to their avatar’s hand movements to access their own brain and repair its circuits to regain control of their limbs. Haptic and electrical stimulation feedback, along with visual and auditory feedback through a VR headset, provide a fully immersive experience. Participants can even gain an extra limb by attaching additional limb parts. Through the experience of changing motion mapping and body structure in real time, this project offers a unique perspective on the concept of an "editable body": a body edited by its owner, in real time.
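The real-time remapping at the heart of the "editable body" concept can be illustrated with a small sketch; the class and channel names are hypothetical, not the project's actual implementation.

```python
class MotionMapping:
    """A runtime-editable table routing tracked input channels to
    avatar body parts."""

    def __init__(self):
        # initially only the right hand is functional, as in the demo
        self.mapping = {"right_hand": "right_hand"}

    def remap(self, tracked_part, avatar_part):
        """Reroute a tracked body part to drive a different avatar part
        (e.g. a repaired circuit or a newly attached extra limb)."""
        self.mapping[tracked_part] = avatar_part

    def apply(self, tracked_pose):
        """Return the avatar pose driven only by currently mapped parts."""
        return {self.mapping[p]: pose
                for p, pose in tracked_pose.items() if p in self.mapping}
```

Repairing a circuit corresponds to adding an entry to the table; attaching an extra limb corresponds to rerouting an existing channel to a new avatar part.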
Adapting a VR game to MR posed challenges in 360° gameplay, UI, safety, and thematic consistency. By prioritizing safety, readability, and engagement, we created an enjoyable experience. In this abstract, we explore the design challenges and the decisions made to address them.
We present Seated-Walking, a footstep locomotion technique designed for use while sitting on a chair. Seated-Walking is driven by the user's forefoot or rearfoot stepping, which is meant to embody the foot motions of real walking while mitigating potential fatigue. We demonstrate Seated-Walking in two applications. The first delivers a casual virtual showroom experience in which users sitting in a chair walk among artworks in the virtual environment. The second is an intense survival shooter in which users track and fight enemies while avoiding their attacks, all while seated-walking.
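A minimal sketch of mapping seated foot taps to virtual travel might look like the following, assuming a per-frame foot pitch signal (positive for forefoot down, negative for heel down) from a foot tracker; the threshold and per-step distance are illustrative values, not the authors' published parameters.

```python
STEP_THRESHOLD = 0.3   # radians of foot pitch that count as a "step"
STEP_DISTANCE = 0.5    # metres of virtual travel per detected step

class SeatedWalker:
    def __init__(self):
        self.last_state = {"left": False, "right": False}
        self.distance = 0.0  # total virtual distance walked

    def update(self, left_pitch, right_pitch):
        """Advance the avatar on each rising edge of a foot tap."""
        for foot, pitch in (("left", left_pitch), ("right", right_pitch)):
            tapping = abs(pitch) > STEP_THRESHOLD
            if tapping and not self.last_state[foot]:
                self.distance += STEP_DISTANCE  # rising edge = one step
            self.last_state[foot] = tapping
        return self.distance
```

Edge detection (rather than level detection) ensures that holding a foot down does not register as continuous walking.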
SinkInSync presents the design and prototype of a VR-based cross-person EEG neurofeedback platform. This generative VR platform uses one user's brainwave data to procedurally render 3D scenes and passively displays visual cues that are synchronized with the real-time brainwave frequency to another user. With the platform, we aim to explore the potential of VR as an avenue for augmenting cognitive and emotional social connectedness in remote interactions via externally-induced brainwave synchronization between pairs of individuals.
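The real-time mapping from one user's brainwave frequency content to a visual cue shown to the other user could be sketched as follows; the band edges, the alpha-ratio feature, and the output pulse-rate mapping are illustrative assumptions, not the platform's actual design.

```python
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Total power of an EEG signal in the [lo, hi] Hz band via the FFT."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

def alpha_to_pulse_rate(eeg, fs):
    """Map one user's relative alpha-band (8-12 Hz) power to the pulse
    rate of a visual cue rendered for the other user."""
    alpha = band_power(eeg, fs, 8.0, 12.0)
    total = band_power(eeg, fs, 1.0, 40.0)
    ratio = alpha / total if total > 0 else 0.0
    return 0.5 + 2.0 * ratio  # pulses per second, illustrative range
```

In the full platform such a feature would also drive the procedural 3D scene, rather than a single scalar cue.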
While many VR demos involve two users, or one user with a virtual agent, few contemplate social VR. Many live demos at industry conferences do not adequately address the social aspect of VR or explore its possibilities. Given that immersive experiences are moving increasingly toward social VR, it is crucial to discuss its development and implications across applications such as art, games, entertainment, health, and education.
This abstract provides an overview of the immersive and interactive experience titled "Stay Alive, My Son Chapter 1 & 2" [Bousis 2021] which is based on the memoirs of Pin Yathay [Pin 2000], a survivor of the Cambodian genocide. The abstract summarizes the narrative structure, technological aspects, and thematic significance of the project. Additionally, it highlights the transformative journey of the audience, the utilization of virtual human creation technology, and the director’s vision to evoke empathy and inspire change.
In the three-plus years since the COVID-19 pandemic shut down Broadway and regional theaters across the United States, many performing arts institutions have been slowly working their way back to pre-pandemic levels of annual revenue [Weinert 2023]. During this period, many theaters turned to sliding-scale ticket models [Mangi 2022] and hybrid in-person/streamed theater seasons to combat increased inflation-related production costs [Vincentelli 2021].
As we use the lessons learned from the pandemic shutdown to shape the future of American theater, there is a golden opportunity for musical theater producers to embrace virtual reality (VR) technology to reduce development and production related expenses ahead of staging their next musical for a live audience.
The Calling: A Musical VR Experience is an immersive historical fantasy inspired by the events of the infamous 1968 Memphis Sanitation Strike. Created in Unity for the Oculus Quest 1 and 2 and the Rift S, The Calling was developed as a “guided experience” serving as the prologue to a larger historical musical currently under development with the Tony Award-winning producing team at Apples & Oranges Studios.