Building VR in VR with Unreal Engine 4

The Unreal Editor is up and running in VR, so you can build VR content in VR. Using the Oculus Touch and HTC Vive motion controllers, your movement in the real world is mapped one-to-one in VR; you can reach out, grab, and manipulate objects just as you would in real life. You already know how to use this tool, because it works like the world works.

These are the early days of the revolution in immersive VR content creation, but we’re so excited about what’s up and running that we couldn’t keep it a secret anymore!  VR movement and editing controls are functional, along with key parts of the Unreal Editor UI, including the Details Panel and the Content Browser.  We’ll be showing more and announcing the release date at GDC on Wednesday, March 16, 2016.  And when it’s released, it will be a built-in feature of the freely-downloadable Unreal Engine, with full source on GitHub.

Best of all, this isn’t a limited mode for VR preview and tweaking.  It is the Unreal Editor, now running in VR. The same Unreal Editor that’s used by everyone from indies and mod makers to triple-A development teams with $100,000,000 budgets. And it runs in VR!

A Box of Toys

You start out in the VR editor at human scale, and can directly manipulate objects by moving around in a room-scale VR setting.  But you can also use a smartphone-like pinching motion to zoom in and out. With one pinch, the world shrinks to the size of a Barbie doll house on your table. You can manipulate it precisely and ergonomically at that scale, and then zoom back to human scale.
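
To make the mechanic concrete, here is a minimal, self-contained sketch of how two-handed pinch scaling can be computed: the world scale changes in proportion to how far apart your hands have moved since the gesture began. This is an illustration only, not the actual Unreal Editor VR code; the Vec3 type, the ComputePinchScale helper, and the clamp limits are all hypothetical.

```cpp
// Illustrative sketch (not the actual Unreal Editor VR implementation):
// scale the world based on how far apart the hands have moved since the
// pinch gesture began.

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns the new world scale given the hand separation when the pinch
// started and the current separation. Spreading the hands apart grows the
// world; pinching them together shrinks it toward "dollhouse" size,
// clamped to a sane range.
static float ComputePinchScale(float StartSeparation, float CurrentSeparation,
                               float StartWorldScale)
{
    const float Ratio = CurrentSeparation / StartSeparation;
    float NewScale = StartWorldScale * Ratio;
    if (NewScale < 0.01f) NewScale = 0.01f;   // never collapse the world entirely
    if (NewScale > 100.f) NewScale = 100.f;   // or blow it up without bound
    return NewScale;
}

int main()
{
    // Hands start 0.6 m apart, then pinch together to 0.3 m apart.
    const Vec3 LeftStart  {0.0f, 1.2f, 0.0f}, RightStart {0.6f, 1.2f, 0.0f};
    const Vec3 LeftNow    {0.1f, 1.2f, 0.0f}, RightNow   {0.4f, 1.2f, 0.0f};

    const float NewScale = ComputePinchScale(Distance(LeftStart, RightStart),
                                             Distance(LeftNow, RightNow),
                                             /*StartWorldScale=*/1.0f);
    std::printf("New world scale: %.2f\n", NewScale); // 0.50: world at half size
    return 0;
}
```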

Besides directly manipulating objects, you also have a laser pointer. Point at a far-away object and you can move it around, or “reel it in” like a fishing rod. Or teleport to the laser pointer’s target location with a single button click, inspired by Bullet Train’s locomotion.
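
As a rough illustration of the teleport interaction, the sketch below casts a hypothetical laser ray and, on a button click, moves the tracking-space origin so the user ends up standing at the hit point. None of this is the actual Unreal Editor or Bullet Train code; TraceLaser, TeleportOrigin, and all the values are made up for the example.

```cpp
// Illustrative sketch of laser-pointer teleportation (not actual Unreal code):
// cast a ray from the controller, and on a button click move the player's
// tracking-space origin so they land at the hit point.

#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };

struct HitResult { Vec3 Location; };

// Hypothetical raycast: here we simply pretend the laser hit the floor 5 m
// ahead of the controller.
static std::optional<HitResult> TraceLaser(const Vec3& Origin, const Vec3& Direction)
{
    (void)Direction;
    return HitResult{ Vec3{ Origin.x, 0.0f, Origin.z + 5.0f } };
}

// Move the tracking-space origin so the user ends up standing at the target,
// preserving their position within the room-scale play area and their height.
static Vec3 TeleportOrigin(const Vec3& CurrentOrigin, const Vec3& HeadOffset,
                           const Vec3& Target)
{
    return Vec3{ Target.x - HeadOffset.x, CurrentOrigin.y, Target.z - HeadOffset.z };
}

int main()
{
    const Vec3 Origin {0.f, 0.f, 0.f};        // tracking-space origin in the world
    const Vec3 HeadOffset {0.2f, 1.7f, 0.1f}; // where the user stands in the room
    const Vec3 ControllerPos {0.3f, 1.3f, 0.2f};
    const Vec3 LaserDir {0.f, -0.3f, 1.f};

    if (const auto Hit = TraceLaser(ControllerPos, LaserDir))
    {
        const Vec3 NewOrigin = TeleportOrigin(Origin, HeadOffset, Hit->Location);
        std::printf("Teleported origin to (%.1f, %.1f, %.1f)\n",
                    NewOrigin.x, NewOrigin.y, NewOrigin.z);
    }
    return 0;
}
```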

The VR User Interface: iPad meets Minority Report

As a pro tool, the Unreal Editor features a rich 2D user interface, and it’s being rolled out naturally in VR: one button press places an iPad-like tablet in your hand, and you use the other hand to interact with the tablet.  Scroll, press buttons, tweak Object Details, interact with menus, drag objects out of the Content Browser and drop them directly into the world.

It’s an intuitive way to bring a 2D user interface into a VR world, building on everyone’s existing Unreal Editor experience, and the underlying Slate user-interface framework gives us a solid foundation as we work to roll out the entire Unreal Editor UI in VR.

Productivity

As game developers, we at Epic pride ourselves on creating high-productivity tools optimized for shipping products, and VR editing provides a great path forward.

With a mouse, several operations are often required to transform an object along multiple axes in 3D.  In VR, you can frequently accomplish the same result with a single, intuitive motion.  This should come as no surprise: a mouse tracks only two degrees of movement (X and Y), while in VR your head and two hands each track six degrees of freedom (X, Y, Z, and three rotational axes). Three tracked devices times six degrees of freedom is 18 axes of input versus the mouse’s 2: that’s 9 times the high-fidelity input bandwidth!

The Long Road to VR

Unreal Engine 1 was the first engine featuring real-time tools, bringing What-You-See-Is-What-You-Get editing to your brand-new, leading-edge 60 MHz Pentium computer. You could roam around the environment, place objects, and move lights, all in real time. We take this for granted now, but it was pretty revolutionary in 1998!

Over the past 20 years, Epic has grown into a world-class team of engine developers, and Unreal has evolved radically to take advantage of 100,000X performance gains as measured in FLOPS. Unreal Engine 4 introduced Physically-Based Rendering and a slew of new features that together achieve an unprecedented degree of photorealism.

However, the mouse-keyboard-and-monitor editing paradigm has remained the same, and a UE4 developer would still find themselves at home in the very first release of UE1.

But now…

Now it’s 2016, and the Unreal Engine is the first engine to support building VR content in VR. This breakthrough is the brainchild of Epic technical director Mike Fricker, who began prototyping the concept back in 2014 on Oculus DK1.

Back when Palmer Luckey connected with Brendan Iribe to build Oculus, Mark Rein saw an early DK1 prototype and advocated for immersive Unreal Editor support. I thought the idea was a little bit crazy, because the Unreal Editor demanded precise controls, while early VR hardware offered only low-fidelity input.

But when the HTC Vive and Oculus Touch motion controllers came along, the question of controls answered itself. You can manipulate 3D objects in 3D with the motion controllers just as you do in the real world with your hands. Mike Fricker’s pioneering implementation brought the same magical feeling as using an iPad for the first time: your brain already knows how to use this thing! For me, VR editing feels so real that I have dreams about it. Not dreams where I’m sitting in front of a computer, but dreams in which I’m there!

The Early Days

Today is the very start of a long-term effort that will revolutionize the way people create 3D content. Some parts of it are awesome right now, other parts have rough edges, and we’re rapidly advancing on all fronts. Come join us on this journey!
