A Clever Label is an interactive immersive documentary pilot experience that introduces a novel data visualization mechanic ("Grapho") for curating and presenting connected graph data as simply as a slide deck. Inside the XR experience the audience is guided by a presenter through volumetric video, voice-overs, haptics and subtitles to explore complex data.
Becoming is an operatic VR piece based on a Persian poem by Mowlana Rumi. The piece's central theme is the spiritual evolution of humans on Earth. Its artistic expression takes advantage of an advanced ray-tracing audio spatialization system (Space3D), which is capable of creating realistic spatial impressions within changing acoustic environments in real time. In this piece the user can interact with the environment and influence the progression of the music by touching various elements and by changing the spatialization paths and speeds of various layers of the music. Two audience members can be connected over the network and interact with each other via haptic effects.
Delirious Departures is a VR experience in the form of an installation/performance. It questions the nature of travelling and our relationship to others, exploring the desire for, and impossibility of, travelling and of leaving meaningfully in times of pandemic. The participant moves through impressive environments based on iconic Belgian railway stations. An actor serves as guide, antagonist and director of the journey. The mocapped actor mixes with canned animations, intelligent avatars and figures in fixed poses, in various degrees of abstraction. Who is real, what is live? The participant's experience is visualized in an experimental XR set-up which projects a mix of real-time video and computer graphics. The one-on-one performance builds on animation and crowd-simulation technology developed by Inria and Cubic Motion (Epic Games).
While the global COVID-19 pandemic did not catalyze the widespread adoption of virtual reality (VR) technologies across industries that some had anticipated, studies have demonstrated that the public continues to value VR most strongly for gaming, entertainment, and socializing [Hall et al. 2022].
As we look towards a future in which indoor gatherings with friends and family are once again safe and encouraged, there is an opportunity to position VR gaming as a go-to add-on to social gatherings by emphasizing ease of access for players of all experience levels, and by designing gameplay that encourages engagement, rather than isolation, in shared space.
Fruit Golf aims to use an asymmetric multiplayer format to offer an experience spanning collaborative and competitive play, allowing players to seamlessly interact across VR, mobile, and physical spaces in ways that most will never have seen before.
Hummingbird is a modern, innovative performance merging live theater and interactive virtual reality by bringing a group of active participants into a shared space for a live performance. The performance premiered in December 2021 as part of Chicago's Tony Award-winning Goodman Theatre's New Stages Festival, a showcase for innovative and ground-breaking theater works. The project bridges art, science and live theater through a collaborative research effort between computer science and design faculty and students at the University of Illinois Chicago (UIC) Electronic Visualization Laboratory (EVL) and Chicago theater directors, actors, videographers and producers. Hummingbird's story celebrates courage and coming of age through the eyes of a gutsy teen who must outsmart her mother's narcissistic boss and survive dangerous new technology in a live, immersive adventure. Hummingbird extends traditional live theater and makes virtual reality art accessible to a broader audience, demonstrating how virtual reality can transform theatrical storytelling.
Glasshouse is a dance-for-virtual-reality experience in three parts. It depicts a near-future biosphere where plant and insect life thrive in an ecosystem of integrated biotechnology. Maintained by intuitive glasshouse keepers who farm water and light, biotech agriculture and drone insects work in synergy with ancient flora and heirloom edibles.
Motion sickness, unintuitive navigation, and limited agency are critical issues in VR/XR impeding widespread adoption and enjoyable user experiences. To tackle these challenges, we present HyperJump, a novel VR interface merging the advantages of continuous locomotion and teleportation/dashing into one seamless, hands-free, and easily learnable user interface supporting both flying and ground-based navigation across multiple scales.
An aspirational goal for virtual reality (VR) is to bring in a rich diversity of real-world objects losslessly. Existing VR applications often convert objects into explicit 3D models with meshes or point clouds, which allow fast interactive rendering but also severely limit their quality and the types of supported objects, fundamentally upper-bounding the “realism” of VR. Inspired by the classic “billboards” technique in gaming, we develop Deep Billboards, which model 3D objects implicitly using neural networks: only a 2D image is rendered at a time, based on the user’s viewing direction. Our system, connecting a commercial VR headset with a server running neural rendering, allows real-time, high-resolution simulation of detailed rigid objects, hairy objects, actuated dynamic objects and more in an interactive VR world, drastically narrowing the existing real-to-simulation (real2sim) gap. Additionally, we augment Deep Billboards with physical interaction capability, adapting classic billboards from screen-based games to immersive VR. At our pavilion, visitors can use our off-the-shelf setup to quickly capture their favorite objects and, within minutes, experience them in an immersive and interactive VR world – with minimal loss of reality. Our project page: https://sites.google.com/view/deepbillboards/
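The abstract's starting point is the classic "billboards" technique, in which a flat textured quad is rotated every frame to face the camera; Deep Billboards replaces the fixed texture with a view-conditioned neural rendering. As an illustration of only the classic part, a minimal camera-facing rotation might look like the sketch below (the function name and NumPy formulation are our own, not the authors' code):

```python
import numpy as np

def billboard_rotation(object_pos, camera_pos, up=np.array([0.0, 1.0, 0.0])):
    """Rotation matrix orienting a textured quad toward the camera,
    as in classic screen-based billboards.

    Assumes the camera is not directly above/below the object
    (i.e. the view direction is not parallel to `up`).
    """
    forward = camera_pos - object_pos          # quad normal points at camera
    forward = forward / np.linalg.norm(forward)
    right = np.cross(up, forward)              # quad's local x-axis
    right = right / np.linalg.norm(right)
    true_up = np.cross(forward, right)         # quad's local y-axis
    # Columns are the quad's local axes expressed in world coordinates.
    return np.stack([right, true_up, forward], axis=1)
```

A view-dependent renderer (neural or otherwise) would then draw the appropriate 2D image onto the quad each frame; in Deep Billboards that image comes from a server-side neural network conditioned on the viewing direction.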
Whether in virtuality or reality, our decisions and actions pursue goals we acquire in the virtual or real world, and a decision or action a user makes typically affects only one environment, physical or virtual. Previous work has introduced the concept of substitutional reality, which utilizes the physical environment to enhance the immersive experience. However, events in virtuality have not had parallel effects in physical space. We present Journal of My Journey, an extended reality system with digital fabrication and sensory feedback for seamless interaction across virtuality and reality: the user’s behavior in virtuality does affect reality. To demonstrate this idea, we developed an immersive game that explores bringing sensory feedback from the real world into the game to help with puzzle-solving, and exporting the outcomes of the decisions players make in the virtual world into reality through digital fabrication.
“Living with Smell Dysfunction” is a multi-sensory short film that introduces scents into a Virtual Reality (VR) experience. Through this first-person immersive film in VR, the participant faces the daily adventure, confusion and danger experienced by a patient with smell dysfunction. The olfactory disorders simulated in this film are anosmia (absence of smell), hyposmia (diminished sensitivity of smell), and dysosmia (distortion of normal smell) [Schiffman 2007]. Olfactory dysfunction and disability tend to be overlooked and invisible to the vast majority of the population. This novel immersive experience could bring more discussion and attention to the treatment and daily lives of patients with smell dysfunction.
We have created a 45-minute Virtual Reality narrative with moments of interaction to engage the viewer. The narrative concerns a woman named Lola, and follows her memories of a summer spent with her estranged uncle in Madrid in 1930. Madrid Noir aims to explore the possibilities for narrative storytelling using the medium of Virtual Reality.
“Meta Flowers” is a multi-participant installation artwork using Cross Reality (XR). Wearing HoloLens 2 and a glove-type tactile device consisting of linear resonant actuators (LRAs) and a microcontroller with a Wi-Fi unit, participants experience XR through the act of arranging virtual flowers (VFs), which have ‘shadow,’ ‘rigid-body,’ and ‘sound’ properties (see Figure 1(a)). A VF blooms at the location of a VIVE Tracker installed at the tip of a metal rod inserted into a vase on the table (see Figure 1(b)(c)). Real shadows of the vases, rods, and artificial flowers, as well as the virtual shadows of the VFs, are projected onto the table by the light and images from a projector installed on the ceiling, creating a fusion that does not feel unnatural. Participants can move the VFs and arrange them in vases on the table. When a participant touches a VF, its petals fall and the participant feels tactile sensations through the haptic glove (see Figure 1(d)). VFs play sounds while they are in bloom, and their pitch changes depending on their positions. In other words, the sound and appearance of the VFs change depending on the relationships among the participants and the VFs. In addition, when real water is poured from a sensor-equipped jug into a vase in which a bare metal rod (without petals) is placed, the VF blooms again (see Figure 1(e)(f)).
Our project combines immersive VR, multitouch AR, real-time volumetric capture, motion capture, robotically-actuated tangible interfaces at multiple scales, and live coding, in service of a human-centric way of collaborating. Participants bring their unique talents and preferences to collaboratively tackle complex problems in a shared mixed reality world.
Nachtalb is an immersive interface that enables brain-to-brain interaction using multisensory feedback. With the help of the g.tec Unicorn Hybrid Black brain-computer interface (BCI), brain-activity data is measured and translated visually via the Oculus Quest 2, tactilely via the bHaptics TactSuit, and auditorily via 3D sound. This creates a feedback loop that turns brain activity from data input into sensory output, which in turn directly influences the brain-activity data input again.
The work introduces Project Hubble, Unity's internal multi-user XR collaboration tool.
Hubble provides a delightful interface to navigate and manipulate freeform and guided experiences. Tools and technology are provided on top for voice chat, avatars, annotations, and sharing.
Inspired by Carl Sagan, Star-Stuff: a way for the universe to know itself is an immersive experience created to remind immersants of their fundamental connection to humanity and the universe. This hybrid VR artwork brings two people together in a surreal experience that can be shared with a remote stranger or a co-present friend. Their bodies are transformed into constellations surrounded by a myriad of orbiting stars whose lifetimes unfold before their eyes. By reframing the body in a shared aesthetic this unique experience encourages immersants to see themselves and others in a common light, as “star-stuff” brought to life, free of superficial characteristics that divide us.
Walking a Turtle is a virtual reality experience where players go on a walk led by a tortoise. Part game, part quantified-self wellness tracker, Walking a Turtle is a farcical tool for resisting the attention economy.