SIGGRAPH '18: ACM SIGGRAPH 2018 Real-Time Live!


Deep learning-based photoreal avatars for online virtual worlds in iOS

A deep learning-based technology for generating photo-realistic 3D avatars with dynamic facial textures from a single input image is presented. We demonstrate real-time performance-driven animation and rendering on an iPhone X, and show how these avatars can be integrated into compelling virtual worlds and used for 3D chat.
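Performance-driven facial animation of this kind is typically built on a linear blendshape model, in which tracked expression coefficients weight a set of per-vertex offset shapes added to a neutral mesh. The Python sketch below illustrates that standard evaluation only; the array sizes, the channel count, and the function name are illustrative assumptions, not details of the presented system.

```python
# Minimal sketch of linear blendshape evaluation, the standard model behind
# performance-driven facial animation. Shapes and names are illustrative only.
import numpy as np

def evaluate_blendshapes(neutral, deltas, weights):
    """Deform a neutral face mesh by a weighted sum of blendshape offsets.

    neutral : (V, 3) rest-pose vertex positions
    deltas  : (K, V, 3) per-blendshape vertex offsets from the neutral pose
    weights : (K,) tracked expression coefficients, typically in [0, 1]
    """
    return neutral + np.tensordot(weights, deltas, axes=1)

# Example with illustrative sizes: 1,000 vertices and 52 expression channels
# (roughly the coefficient count iPhone X face tracking exposes).
rng = np.random.default_rng(0)
neutral = rng.normal(size=(1000, 3))
deltas = rng.normal(scale=0.01, size=(52, 1000, 3))
weights = np.zeros(52)
weights[3] = 0.8                      # e.g. a "jaw open" channel from tracking
deformed = evaluate_blendshapes(neutral, deltas, weights)
```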

Democratising mocap: real-time full-performance motion capture with an iPhone X, Xsens, and Maya

Kite & Lighting reveals how Xsens inertial mocap technology, used in tandem with an iPhone X, can be used for full body and facial performance capture - wirelessly and without the need for a mocap volume - with the results live-streamed to Autodesk Maya in real time.
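As an illustration of what the Maya end of such a live link can look like, the sketch below receives joint rotations over a non-blocking UDP socket and applies them with maya.cmds. The packet format, port number, and joint names are hypothetical assumptions; this is not Kite & Lighting's actual protocol.

```python
# Runs inside Maya's Python interpreter. The packet format, port, and joint
# names are hypothetical; this only shows the general shape of a live-link
# receiver that applies streamed rotations to a skeleton.
import json
import socket
import maya.cmds as cmds

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))          # port chosen arbitrarily for this sketch
sock.setblocking(False)

def apply_pose(pose):
    """pose: {"LeftForeArm": [rx, ry, rz], ...} with rotations in degrees."""
    for joint, (rx, ry, rz) in pose.items():
        if cmds.objExists(joint):
            cmds.setAttr(joint + ".rotate", rx, ry, rz, type="double3")

def poll_once():
    """Drain one packet; call repeatedly from a timer or scriptJob."""
    try:
        data, _ = sock.recvfrom(65536)
    except socket.error:
        return                        # no packet waiting this tick
    apply_pose(json.loads(data.decode("utf-8")))
```

Polling from a timer or scriptJob rather than blocking keeps Maya's main thread responsive while packets arrive.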

Gastro Ex: real-time interactive fluids and soft tissues on mobile and VR

Enter Gastro Ex on smartphones and in VR. The entire environment surrounding you is interactable and "squishy," featuring advanced soft-body physics and 3D interactive fluid dynamics. Grab anything. Cut anything. Inject anywhere. Unleash argon plasma. Enjoy emergent surgical gameplay, rendered with breathtaking real-time GI and subsurface scattering.
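Real-time soft-body behaviour of this kind is commonly approximated with particle-and-spring or position-based solvers stepped once per frame. The toy sketch below shows a semi-implicit Euler update for a unit-mass mass-spring system; the parameters and structure are illustrative assumptions, not Gastro Ex's actual solver.

```python
# Toy mass-spring soft-body step (semi-implicit Euler), illustrating the kind
# of per-frame update a real-time soft-tissue simulation performs.
import numpy as np

def step(positions, velocities, springs, rest_lengths, dt=1.0 / 60.0,
         stiffness=50.0, damping=0.98, gravity=(0.0, -9.81, 0.0)):
    """One step for a unit-mass particle system.

    positions, velocities : (N, 3) arrays
    springs               : (S, 2) integer index pairs
    rest_lengths          : (S,) rest length of each spring
    """
    forces = np.tile(np.asarray(gravity, dtype=float), (len(positions), 1))
    i, j = springs[:, 0], springs[:, 1]
    d = positions[j] - positions[i]
    length = np.linalg.norm(d, axis=1, keepdims=True)
    direction = d / np.maximum(length, 1e-9)
    f = stiffness * (length - rest_lengths[:, None]) * direction
    np.add.at(forces, i, f)           # stretched springs pull i toward j
    np.add.at(forces, j, -f)          # and pull j toward i
    velocities = damping * (velocities + dt * forces)
    positions = positions + dt * velocities
    return positions, velocities
```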

IKEA Immerse interior designer

IKEA Immerse is available in select IKEA stores in Germany. This application enables consumers to create, experience, and share their own configurations in a virtual living and kitchen room set. With seamless e-commerce integration, a high level of detail, and real-time interaction, the VR experience represents an engaging, valuable touch-point.

Mixed reality 360 live: live blending of virtual objects into 360° streamed video

An interactive mixed reality system using live streamed 360° panoramic videos is presented. A live demo for real-time image-based lighting, light detection, mixed reality rendering, and composition of 3D objects into a live-streamed 360° video of a real-world environment with dynamically changing real-world lights is shown.
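As a rough illustration of the image-based lighting step, the sketch below derives two simple quantities from an equirectangular frame: a solid-angle-weighted average colour usable as an ambient term, and the direction of the brightest pixel as a crude dominant-light estimate. The weighting and the direction convention are assumptions for illustration; this is not the presenters' algorithm.

```python
# Crude image-based lighting estimate from one equirectangular 360-degree frame.
import numpy as np

def estimate_lighting(frame):
    """frame: (H, W, 3) linear-RGB equirectangular panorama frame.

    Returns (ambient_rgb, light_direction): a solid-angle-weighted average
    colour and a unit vector toward the brightest pixel.
    """
    h, w, _ = frame.shape
    polar = (np.arange(h) + 0.5) / h * np.pi         # angle from the up pole
    row_weight = np.sin(polar)[:, None]              # per-row solid-angle weight
    ambient = (frame * row_weight[..., None]).sum(axis=(0, 1)) \
        / (row_weight.sum() * w)

    luminance = frame @ np.array([0.2126, 0.7152, 0.0722])
    iy, ix = np.unravel_index(np.argmax(luminance), luminance.shape)
    theta = (iy + 0.5) / h * np.pi                   # polar angle of brightest pixel
    phi = (ix + 0.5) / w * 2.0 * np.pi               # azimuth (convention varies)
    light_dir = np.array([np.sin(theta) * np.cos(phi),
                          np.cos(theta),
                          np.sin(theta) * np.sin(phi)])
    return ambient, light_dir
```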

Oats Studios VFX workflow for real-time production with photogrammetry, Alembic, and Unity

Come see how Oats Studios modified their traditional VFX pipeline to create the breakthrough real-time shorts ADAM Chapter 2 & 3 using photogrammetry, Alembic, and the Unity real-time engine.

The power of real-time collaborative filmmaking

PocketStudio is designed to let filmmakers easily create, play, and stream 3D animation sequences in real time through collaborative editing, a unified workflow, and other real-time technologies such as augmented reality.

The 'Reflections' ray-tracing demo presented in real time and captured live using virtual production techniques

Epic Games, Nvidia, and ILMxLAB would like to present their 2018 GDC demo, "Reflections," set in the "Star Wars" universe. In addition, we will record a character performance live using virtual production/virtual reality directly into Unreal Engine Sequencer, and then play the demo with real-time ray tracing live at 24 fps.

Virtual production in 'Book of the Dead': Technicolor's Genesis platform, powered by Unity

We demonstrate a Unity-powered virtual production platform that pushes the boundaries of real-time technologies to empower filmmakers with full multi-user collaboration and live manipulation of whole environments and characters. Special attention is dedicated to high-quality real-time graphics, as evidenced by Unity's "Book of the Dead."

Wonder Painter: turn anything into animation

Xiaoxiaoniu's unique patented Wonder Painter technology turns anything into a vivid cartoon animation with a single snap of your camera. First, draw something, make something (clay, origami, building blocks, etc.), or find something (a toy, a picture book, etc.). Then take a photo of it and see it come alive!
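The abstract does not describe the patented method, but the first step any such system needs is separating the drawn or photographed subject from its background before it can be rigged and animated. The toy Python sketch below does this with a simple luminance threshold against a light background; it is purely illustrative and not Xiaoxiaoniu's technique.

```python
# Toy cutout extraction: isolate a dark subject shot against a light background
# and return it as an RGBA image with a transparent background.
import numpy as np

def extract_cutout(photo, threshold=0.85):
    """photo: (H, W, 3) float RGB in [0, 1], shot against a light background."""
    luminance = photo @ np.array([0.2126, 0.7152, 0.0722])
    mask = luminance < threshold        # darker-than-background pixels = subject
    alpha = mask.astype(photo.dtype)
    return np.dstack([photo, alpha])    # (H, W, 4) RGBA cutout
```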