A Walk Alone is a virtual reality experience that simulates what it feels like to walk alone at night as a woman. The concept is inspired by the kidnapping and murder of Sarah Everard, a 33-year-old woman who went missing during her walk home from a friend's apartment in March 2021. It is one thing to discuss this universal issue and highlight the precautions women take when walking alone, but it is another to experience it. A Walk Alone focuses on conveying the vulnerability of the user by engaging multiple senses in the virtual reality environment. The experience is centered around a linear story involving (1) a night-time city environment, (2) the user in a first-person point of view, (3) eerie sound design, and (4) dim street lighting.
A Story Incubated from Your Heartbeat.
Concept: You can share your own heartbeat with a robot and experience its growth together.
Main Character: An honest robot boy called “Maruboro”, who is not yet familiar with “life”.
He has been left quietly in an old factory. One day, the viewer comes along and gives his or her heart to him as life.
Viewers can project themselves into the characters and enjoy the story.
Logline: The viewer gives his or her heart to Maruboro and breathes life into him. Maruboro wants to become friends with another robot called Kakuboro, but Maruboro doesn't know how to interact with others, which makes Kakuboro angry.
However, Maruboro desperately wants to make friends and he starts to think from Kakuboro's point of view. Kakuboro finally opens up his heart to Maruboro.
Story: “Beat” is a story elaborated from your “Heart”.
Viewers experience the work with their hearts in their hands: the heart in the animation vibrates at the same pace as the viewer's heartbeat (a minimal sketch of this coupling appears after the story).
The setting of the story is an old factory. Viewers encounter a rusted robot called Maruboro, which is completely static.
He doesn't have a “heart” to make him move.
Viewers can grant Maruboro a new heart by putting theirs on the robot.
He then stands up and starts to move, expressing the joy of living with all his strength.
Maruboro looks a little lonely. When he finds that other robots open their hearts and connect with each other, he starts to search for friends.
However, when Maruboro meets new robots, he doesn't know how to communicate properly.
He tries to open up his heart to another robot called Kakuboro, but in the commotion Maruboro causes, Kakuboro drops an important component of his factory.
Kakuboro becomes angry and tries to get rid of Maruboro, and accidentally breaks Maruboro's heart.
Maruboro is deeply saddened, but he works with the viewer to search for the lost component.
Finally, Maruboro finds it and gives it back to Kakuboro, who uses it to run the factory.
The factory then starts to operate and sets off big fireworks.
The hearts of the two robots eventually come together. Maruboro's heart starts to beat again.
“Heart” becomes the key that moves the story forward.
The story aims to raise awareness of the “Heart” through the robot's growth.
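As a minimal illustration of the heartbeat coupling mentioned above (not the authors' implementation), a pulse-sensor reading in beats per minute could drive the pacing of the animated heart; the read_bpm and pulse_heart callables below are assumed placeholders for the actual sensor and animation hooks.

import time

def bpm_to_period(bpm: float) -> float:
    # Convert a heart rate in beats per minute to a beat period in seconds.
    return 60.0 / max(bpm, 1.0)   # guard against a zero or missing reading

def drive_heart(read_bpm, pulse_heart, duration_s: float = 60.0) -> None:
    # Pulse the animated heart once per detected beat of the viewer.
    # read_bpm: callable returning the latest heart-rate estimate (assumed sensor hook).
    # pulse_heart: callable triggering one vibration of the virtual heart (assumed).
    start = time.time()
    while time.time() - start < duration_s:
        pulse_heart()                             # one beat of the animated heart
        time.sleep(bpm_to_period(read_bpm()))     # wait until the next expected beat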
“Bodyreath” is an interactive virtual reality (VR) installation for understanding people with autism, divided into two parts, “external” and “internal”. The “external” part is an external projection: an artistic rendering of four virtual avatars based on the characteristics and behaviors of autistic people. The “internal” part consists of interactive VR scenarios that allow the audience to gain insight into the characteristics of autistic people represented by each feature: “Stereotyped Behavior”, “Children from the Stars”, “Emotional Instability”, and “Invisibility”. Our VR installation provides an interactive narrative through body interaction and role perception, allowing audience members to take on the identity of an autistic person and perceive the world from their perspective. It helps the audience appreciate the innocence of autistic people in a different way and understand them artistically.
Virtual reality (VR) provides new opportunities for the design of interactive music visualizations. Exploring this area, Cyberdream is a prototype VR application realized through the author's practice-led research, which provides a journey through audio-visual environments based on the aesthetics of 1990s rave music. The project provides three audio-visual 'sound toys', which allow the user to interactively 'paint with sound', thereby facilitating creative play. Through its structural form and audio-visual sound toys, Cyberdream indicates new approaches for the design of music visualizations that harness the spatial properties of VR.
Dementia is a global health crisis, and there is a need to understand patients' perception in order to improve their quality of life. We propose Dementia Eyes, a mobile AR experience that simulates common visual symptoms of senile dementia based on the known pathology and caregivers' actual experience with patients. Leveraging an iPhone and a Head-Mounted Display (HMD), we developed a real-time application that allows users to see the world from the perspective of a patient with Alzheimer's-type dementia (AD). The experience was validated by professional medical workers in Japan, and the results support the efficacy of the empathy the experience is intended to evoke.
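The abstract does not detail which visual symptoms are rendered or how; purely as a hedged sketch, a per-frame filter over the phone camera feed might combine blur, reduced contrast, and a narrowed visual field, as below. The specific effects and parameters are assumptions, not the authors' pipeline.

import cv2
import numpy as np

def simulate_symptoms(frame: np.ndarray) -> np.ndarray:
    # Apply illustrative dementia-like visual effects to one camera frame (BGR, uint8).
    h, w = frame.shape[:2]
    # Reduced visual acuity: heavy Gaussian blur.
    out = cv2.GaussianBlur(frame, (21, 21), 0)
    # Reduced contrast sensitivity: blend toward mid-grey.
    grey = np.full_like(out, 128)
    out = cv2.addWeighted(out, 0.6, grey, 0.4, 0)
    # Narrowed field of view: radial vignette toward black.
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.sqrt((xx - w / 2) ** 2 + (yy - h / 2) ** 2) / (0.5 * np.hypot(w, h))
    vignette = np.clip(1.2 - dist * 1.8, 0.0, 1.0)[..., None]
    return (out * vignette).astype(np.uint8)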
This work is a non-narrative VR experience with elements of interaction. It consists of two parts: the art installation and the VR work itself. The VR work addresses dislocation: the enforced departure of people from their homes, typically because of war, persecution, or natural disaster. The film looks at an absurd moment of disbelief and fear. It examines the internal processes that develop and offers a visual depiction of a moment in time of a person forced to fight for his life in a strange land. A moment of dislocation. The VR installation is a tent of the kind typically found in refugee camps (Figure 1). It occupies approximately 12 square meters of space. A VR headset is placed in the middle of the tent. To the right of the tent entrance, a mounted TV shows an animated loop that gives context to the visual treatment of the main character. Inside the tent, there is a cardboard piece with the words “where is justice, where is humanity” written on it.
We present Leopold's Maneuvers VR, a haptic-enabled virtual reality simulation in which a user determines the size, position, and weight of a fetus within a virtual patient by palpating the patient's abdomen. Users of the application receive corresponding haptic cues (force and vibration) as they touch the fetus through the patient's torso. The physical sensations generated by the SenseGlove, paired with the immersive visuals of the virtual environment, combine to create a palpation experience similar to the real Leopold's Maneuvers procedure. This application addresses a need in nursing education for an immersive virtual reality experience that integrates direct hand manipulation into the learning of assessment skills.
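As a hedged sketch of how such palpation cues could be computed (the actual SenseGlove integration is not described here), a simplified spherical fetus proxy under a layer of tissue might yield a force proportional to penetration depth and a vibration cue on contact; the stiffness value and geometry below are illustrative assumptions.

import numpy as np

def palpation_feedback(fingertip_pos, fetus_center, fetus_radius, tissue_depth=0.03):
    # Illustrative force/vibration cues for palpating a spherical fetus proxy.
    # fingertip_pos, fetus_center: 3D positions in metres (numpy arrays).
    # fetus_radius: radius of the simplified fetal surface; tissue_depth: assumed
    # thickness of abdominal tissue over the fetus. Returns (force in N, vibration 0..1).
    distance = np.linalg.norm(fingertip_pos - fetus_center)
    penetration = (fetus_radius + tissue_depth) - distance
    if penetration <= 0.0:
        return 0.0, 0.0                       # no contact through the tissue yet
    stiffness = 400.0                         # N/m, assumed tissue stiffness
    force = stiffness * penetration           # resistive force grows with depth
    vibration = min(penetration / tissue_depth, 1.0)
    return force, vibration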
The intangible cultural heritage of China contains many different forms of performing arts, of which oral performance is an important branch. "Hua'er" is the most popular folk performance sung in the Hui ethnic area of Ningxia. Based on virtual reality (VR) and gesture-recognition technology, our work proposes three design methods: interactive performance narrative, metaphorical elements, and embodied cognition, which we apply to the VR performance "Flower and the Youth". VR can provide the audience with a more immersive experience that contributes to the transmission and dissemination of intangible-cultural-heritage performing arts. Our work provides universal design approaches for the creation of content for future intangible cultural heritage performances.
GIBSON is a novel city walking system that enables distant users to walk together as if they are physically in the same city.
The advancement of virtual reality technology has opened up the possibility of traveling around the world virtually, beyond geographical limitations, but there is still room for improvement to make the experience as realistic as real travel. Unlike conventional virtual travel tools and prior multi-user collaborative XR studies, we designed our system to evoke both a sense of co-presence and a sense of being in the real space. For this purpose, we implemented two main functions: (1) a function that transfers real-time audio-visual information about the surroundings and (2) a function that transfers the users' body movements through avatars. We also combined a visual positioning system (VPS) with SLAM to align the user locations.
We conducted user testing to verify the experience of cross-AR/VR city walking using GIBSON. The results suggest that our system can make people feel as if they were walking together in the city even though they are physically distant.
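As a rough sketch of the alignment step described above (assuming 4x4 homogeneous poses and that the VPS and local SLAM report the device pose at the same instant), each user's local SLAM frame can be registered to the shared global frame established by the VPS, and remote poses re-expressed in the local frame so avatars appear in consistent places.

import numpy as np

def alignment(T_vps_device: np.ndarray, T_slam_device: np.ndarray) -> np.ndarray:
    # Return T_vps_slam, the transform from a user's local SLAM frame to the
    # shared global frame established by the visual positioning system (VPS).
    # Both inputs are 4x4 homogeneous poses of the same device at the same instant.
    return T_vps_device @ np.linalg.inv(T_slam_device)

def remote_pose_in_my_frame(T_slam_remote, align_remote, align_mine):
    # Express a remote user's SLAM pose in my local SLAM frame so that their
    # avatar can be rendered at a consistent position in the shared city.
    T_vps_remote = align_remote @ T_slam_remote          # remote local -> global
    return np.linalg.inv(align_mine) @ T_vps_remote      # global -> my local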
This work, "indefinitely", presents an epitome of the earth and of future urban pollution. In contemporary society, high walls have been built between people, and people living in cities have become near replicas of one another. We use VR technology to create a surreal world inside the VR headset, focusing on spiritual pollution and environmental pollution. In this ambiguous world, how can people find themselves in the bustling maze of nothingness? Through three scenes, a tomb forest, an acid-rain wasteland, and a hazy village, the participant uses the controller to learn about past human diseases, disasters, and pollution [Ypsilanti et al., 2018]. They also pick up the photo fragments of "Lucas", representing hope, and feel the fear of being lost in the spiritual maze. In this scene, they become one of the poor people who are lost and whose souls are polluted. At the end of the scene, the participant finds a door full of hope, gathers all the fragments of Lucas into the group photo of Lucas and his daughter taken before departure, and ends the journey.
Marco Polo Go Round is a comedic love story with a very surreal twist in which the user is invited to participate in a couple’s relationship as their world literally falls apart around them.
It is a narrative experience set in one location that lasts 14 minutes. Motion-captured actors drive detailed animated characters so that intimate and touching performances exist within a fully immersive 6DOF world.
VR technologies enable 3D visualization and 3D interaction modalities, allowing the general public to have a user-centered experience and explore an immersive, realistic cultural environment. The art and culture of the Dunhuang grotto murals are a world art treasure. Based on the traditional story of the "Deer King" and knowledge of the preservation and restoration of the murals, we designed "Meet the Deer King", a VR game. In this game, we explored and designed a "splash-ink" interaction, which derives from the traditional Chinese painting style, and applied this interaction in different scenes. Information about the game, including its creation background, research focus, game content, and implementation effects, is described mainly in the supplementary materials.
Nachtalb is an immersive interface that enables brain-to-brain interaction using multisensory feedback. With the help of the g.tec Unicorn Hybrid Black brain-computer interface (BCI), brain-activity data is measured and translated visually with the Oculus Quest 2, tactilely with the bHaptics TactSuit, and auditorily with 3D sound. This is intended to create a feedback loop that turns brain activity from data input into sensory output, which in turn directly influences the brain-activity data being measured.
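The abstract does not specify how the measured brain activity is mapped to the three output modalities; purely as an illustrative sketch, a band-power feature from one EEG channel could be normalized against a calibration baseline and used as a shared intensity for visuals, haptics, and audio. The band choice, scaling, and output hooks are assumptions.

import numpy as np

def bandpower(eeg_window: np.ndarray, fs: float, lo: float, hi: float) -> float:
    # Mean spectral power of one EEG channel within a band (e.g. alpha, 8-12 Hz).
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[mask].mean())

def feedback_intensity(power: float, baseline: float) -> float:
    # Map current band power to a 0..1 intensity relative to a calibration baseline.
    return float(np.clip(power / (2.0 * baseline), 0.0, 1.0))

# One intensity value could then drive all three modalities, e.g.
# set_visual_brightness(i), set_haptic_strength(i), set_audio_gain(i),
# where those setters are assumed hooks into the Quest 2 scene, the TactSuit,
# and the 3D-sound engine respectively.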
“Once” is an interactive virtual reality narrative film about the post-war experience, set after a prolonged war between two countries. Existing virtual reality works about warfare have mostly depicted the war itself, an approach that tends to fail to elicit social empathy from the audience and weakens the portrayal of anti-war consciousness. Moreover, such works narrate from a single point of view and are limited in expression. To address these issues, we created this VR narrative film, which uses three-identity switching, subtitled guidance, and stylized visualization, taking the lives of ordinary people after the war as its clues to express the trauma war brings.
Room Tilt Stick is an interactive head-mounted display (HMD) system that provides an illusory experience of tilting a room with a stick and of walking on an acutely inclined slope. In the proposed system, the tilting velocity of a three-dimensional modeled room in the HMD’s view is correlated with the degree of force transferred to the wall or ground with a specially designed stick. Here, a weight scale (i.e., a Wii Balance Board) monitors to what degree the participant’s weight rests against the wall (through a tiltable platform) or the ground. After the illusory tilt of the room is completed, participants are instructed to walk on a two-meter-long wooden slope. The degree of physical inclination is gradually increased according to the walking distance, and the walking slope corresponds to an illusory wall in the HMD’s view. The system was tested in our laboratory exhibition, and 87 out of 100 participants reported a strong sensation of tilting the room by pressing the wall or ground with the stick.
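A minimal sketch of the force-to-tilt coupling described above, assuming a simple linear mapping and a capped angular velocity (the actual transfer function and constants used in the system are not given here):

def room_tilt_velocity(board_weight_kg: float, body_weight_kg: float,
                       max_tilt_deg_per_s: float = 10.0) -> float:
    # Map the weight resting on the tiltable platform (read from the Wii Balance
    # Board) to the angular velocity of the virtual room in the HMD's view.
    # The harder the participant pushes the stick against the wall or ground,
    # the larger the measured weight and the faster the room appears to tilt.
    # The linear mapping and the 10 deg/s cap are illustrative assumptions.
    fraction = min(max(board_weight_kg / body_weight_kg, 0.0), 1.0)
    return fraction * max_tilt_deg_per_s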
‘The World of Hiroshige’ is an immersive virtual reality experience that allows participants not only to view the artworks of the Japanese artist Utagawa Hiroshige in three dimensions, but also to experience and interact with the Ukiyo, or “floating world”, that inspired the works.
In spatial navigation, adding haptic cues to visual information helps users understand spatial information better. Most haptic devices stimulate various body parts, while few target the head, which is sensitive to mechanical stimuli. This paper presents Virtual Whiskers, a spatial directional guidance technique using cheek haptics in virtual space. We created a cheek haptic stimulation device by attaching two tiny robot arms to a head-mounted display. The robot arms trace the cheek with proximity sensors to estimate the cheek surface. Target azimuthal and elevational directions are translated into a point on the cheek surface, and the robot arms touch that point to present directional cues. We demonstrate our technique in two applications.
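As a hedged sketch of the direction-to-cheek mapping (the paper's own surface estimation and mapping are only summarized above), assume the proximity-sensor traces have been preprocessed into a grid of 3D cheek points; azimuth and elevation can then be mapped linearly onto that grid, with the angular ranges below being illustrative.

import numpy as np

def direction_to_cheek_point(azimuth_deg: float, elevation_deg: float,
                             cheek_grid: np.ndarray,
                             az_range=(-90.0, 90.0), el_range=(-45.0, 45.0)) -> np.ndarray:
    # Map a target direction to a 3D contact point on the estimated cheek surface.
    # cheek_grid: (H, W, 3) array of points sampled on the cheek by the proximity
    # sensors (assumed preprocessing). Columns span left-right (azimuth),
    # rows span down-up (elevation). Ranges are illustrative assumptions.
    h, w, _ = cheek_grid.shape
    u = (azimuth_deg - az_range[0]) / (az_range[1] - az_range[0])     # 0..1 across width
    v = (elevation_deg - el_range[0]) / (el_range[1] - el_range[0])   # 0..1 across height
    col = int(round(float(np.clip(u, 0.0, 1.0)) * (w - 1)))
    row = int(round(float(np.clip(v, 0.0, 1.0)) * (h - 1)))
    return cheek_grid[row, col]   # 3D point for the robot arm tip to touch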
We demonstrate WizardOfVR, a personalized emotion-adaptive Virtual Reality (VR) game akin to a Harry Potter experience, which uses off-the-shelf physiological sensors to create a real-time biofeedback loop between a user’s emotional state and an adaptive VR environment (VRE). In our demo, the user initially trains the system during a calibration process using Electroencephalogram (EEG), Electrodermal Activity (EDA), and Heart Rate Variability (HRV) physiological signals. After calibration, the user explores a virtual forest whose environmental factors adapt based on a ’SanityMeter’ determined by the user’s real-time emotional state. The overall goal is to provide more balanced, immersive, and emotionally optimal virtual experiences.
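The demo's emotion model is not described in detail; as an illustrative sketch only, a 'SanityMeter' could be computed from calibration-normalized EEG, EDA, and HRV features and mapped to environmental factors of the forest. The weights, the sigmoid squashing, and factor names such as fog_density are assumptions.

import numpy as np

def sanity_meter(eeg_feat, eda_feat, hrv_feat, baseline, weights=(0.4, 0.3, 0.3)):
    # Illustrative 'SanityMeter' in 0..1 computed from normalized physiological features.
    # baseline: dict of (mean, std) per signal recorded during the calibration phase.
    feats = {"eeg": eeg_feat, "eda": eda_feat, "hrv": hrv_feat}
    score = 0.0
    for w, (name, value) in zip(weights, feats.items()):
        mean, std = baseline[name]
        score += w * (value - mean) / (std + 1e-6)       # calibration-relative arousal
    return float(1.0 / (1.0 + np.exp(score)))            # high arousal -> lower 'sanity'

def adapt_environment(sanity: float) -> dict:
    # Map the meter to illustrative environmental factors of the virtual forest.
    return {"fog_density": 1.0 - sanity, "ambient_light": 0.2 + 0.8 * sanity}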
In this demonstration, we present X-Wing, a force-feedback device that uses ducted fans attached to a Virtual Reality (VR) Head-Mounted Display (HMD). Applying haptic technologies to VR has become a standard way to enhance the immersive experience, and haptics is essential for virtual interactions where user experience and performance are crucial. This research focuses on a head-based, wearable device that applies forces to the VR user by means of electric thrusters with controllable strength and direction. Our system allows the user to experience forces based on their virtual momentum and velocity: different thrust powers are converted into translational and rotational forces for unique VR experiences such as virtual aerial simulation. We present the prototype to the attendees of SIGGRAPH Asia 2021 through a live demonstration.
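As a rough sketch of how thrust could be allocated across the fans (the actual fan layout and control law of X-Wing are not specified here), a least-squares allocation maps a desired head-frame force and torque to per-fan thrusts; the fan positions, directions, and thrust limit below are illustrative assumptions.

import numpy as np

# Illustrative fan layout on the HMD: position (m) and thrust direction (unit vector)
# in the head frame. The real mounting geometry and steerable-thrust control are not
# reproduced here; all fans are assumed to push along the head's +Z axis.
FAN_POS = np.array([[ 0.10,  0.08, 0.0], [-0.10,  0.08, 0.0],
                    [ 0.10, -0.08, 0.0], [-0.10, -0.08, 0.0]])
FAN_DIR = np.array([[0.0, 0.0, 1.0]] * 4)

def allocate_thrust(force: np.ndarray, torque: np.ndarray, max_thrust: float = 4.0):
    # Least-squares per-fan thrust approximating the requested net force and torque.
    # Each column i maps fan i's scalar thrust to its 6D wrench contribution.
    A = np.vstack([FAN_DIR.T,                          # force rows (3 x n)
                   np.cross(FAN_POS, FAN_DIR).T])      # torque rows (3 x n)
    wrench = np.concatenate([force, torque])
    thrusts, *_ = np.linalg.lstsq(A, wrench, rcond=None)
    return np.clip(thrusts, 0.0, max_thrust)           # fans cannot pull

Under this assumed layout, for example, allocate_thrust(np.array([0.0, 0.0, 2.0]), np.zeros(3)) spreads a 2 N push evenly across the four fans at 0.5 N each.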