
Unreal Engine 4 Games Featuring Ray-Tracing DXR Will Release In Late 2018

Ninja Theory’s Hellblade may or may not have been every gamer’s cup of tea, but one thing about the project was unequivocally noteworthy: the ability to translate cinematic-quality performance capture directly into the real-time runtime of Unreal Engine 4’s design and animation pipeline.

In simple terms, instead of running all of the voice and motion capture through a giant Hollywood-style performance capture studio and then sending the data to an animation team at a different studio to clean up and implement into the game, Ninja Theory handled all of its cinematic performance capture in-house and recorded it directly into Unreal Engine 4, without requiring any third-party clean-up, alteration or modification.

The significance of this step is that the work Ninja Theory did on building an injector for the Unreal Engine, in collaboration with 3Lateral and Cubic Motion, allowed the studio to build a game like Hellblade on a much smaller budget while delivering the same kind of cinematic-quality output usually found in AAA games with $100 million budgets, all at a fraction of the cost and with no loss in visual fidelity.

Well, Epic would have been remiss to pass up an opportunity to take and expand on this technology for the Unreal Engine 4. During this year’s Game Developers Conference, the company unveiled new partnerships with Tencent, Vicon, 3Lateral and Cubic Motion to improve the creative design flow within Unreal Engine 4, allowing developers to easily and conveniently capture performances in-engine in real time at 60 frames per second. This was demonstrated with Siren, a lifelike digital Asian woman rendered with CG-quality fidelity, running in real time within Unreal Engine 4’s runtime at an uninterrupted 60 frames per second. You can view the demonstration below, courtesy of Engadget.

As pointed out in the video, the facial rigging works in unison with the procedural animation pipeline to allow people who don’t look like the 3D model to act and perform in place of the person who was originally scanned for the project. Previously, the actor and the 3D-scanned model had to be the same person to avoid motion discrepancies between the rigged model and the real-life actor.

The new tech means that you could technically have one actor performing for multiple life-like characters without dimorphic incongruities appearing in the final animation pass. You can check out an uninterrupted version of the Siren demonstration below.

As explained by Epic Games, in order to continue the long and arduous journey out of the uncanny valley, a lot of work was put into the model itself for high-end rendering, with Cubic Motion and 3Lateral working alongside Epic to improve the way the model is rendered in the Unreal Engine. This includes improved subsurface backscattering so that thinner portions of skin become translucent under specific lighting conditions, dual specular reflections for a softer rendering of light gradients from hard to soft lights across a character’s face, and screen-space irradiance to help eliminate the dead-fish-eye effect.
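
As a rough illustration of the dual specular reflections mentioned above, the sketch below blends two GGX specular lobes with different roughness values, which is what gives a softer falloff than a single tight highlight. The roughness values and blend weight here are illustrative assumptions, not Epic’s actual skin-shading parameters.

```cpp
// Minimal sketch of the "dual specular lobe" idea: two GGX specular lobes
// with different roughness values are evaluated for the same surface point
// and blended, widening the highlight compared to a single tight lobe.
#include <cmath>
#include <cstdio>

// GGX / Trowbridge-Reitz normal distribution term for one specular lobe.
static float ggx_ndf(float n_dot_h, float roughness)
{
    const float a  = roughness * roughness;
    const float a2 = a * a;
    const float d  = n_dot_h * n_dot_h * (a2 - 1.0f) + 1.0f;
    return a2 / (3.14159265f * d * d);
}

// Blend a tight primary lobe with a broader secondary lobe.
static float dual_lobe_specular(float n_dot_h, float rough_primary,
                                float rough_secondary, float lobe_mix)
{
    const float lobe0 = ggx_ndf(n_dot_h, rough_primary);
    const float lobe1 = ggx_ndf(n_dot_h, rough_secondary);
    return lobe0 * (1.0f - lobe_mix) + lobe1 * lobe_mix;
}

int main()
{
    // Sweep the half-vector angle to show how the blended response softens
    // the hard-to-soft transition compared to the tight lobe alone.
    for (float n_dot_h = 1.0f; n_dot_h >= 0.90f; n_dot_h -= 0.02f) {
        std::printf("N.H=%.2f  single=%8.2f  dual=%8.2f\n",
                    n_dot_h,
                    ggx_ndf(n_dot_h, 0.25f),
                    dual_lobe_specular(n_dot_h, 0.25f, 0.55f, 0.35f));
    }
    return 0;
}
```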

Epic is making it possible for designers and artists to stream Vicon animation data from the performance capture directly into the Unreal Engine animation pipeline. So if you want to mix and match different real-life thespians with 3D actors, you can do so without the back-and-forth of taking the data to the animation crew, cleaning it up and then applying it to the engine pipeline after the studio work is done. You can see the results right away in the Unreal Engine, right there during the filming.
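
Conceptually, the workflow looks something like the sketch below: each captured frame of joint rotations is applied to the in-engine skeleton the moment it arrives, with no offline clean-up pass in between. The types and the receive_frame() source are hypothetical stand-ins, not Vicon’s or Epic’s actual APIs.

```cpp
// Purely illustrative sketch of "stream straight into the engine": captured
// joint rotations drive the runtime skeleton as soon as a frame arrives,
// rather than being exported, cleaned up offline, and re-imported.
#include <map>
#include <string>
#include <cstdio>

struct Quat { float x, y, z, w; };                   // joint rotation
using CaptureFrame = std::map<std::string, Quat>;    // joint name -> rotation

// Hypothetical stand-in for a live capture stream; a real integration would
// read frames from the mocap system over the network.
CaptureFrame receive_frame()
{
    return { {"head", {0, 0.05f, 0, 0.9987f}}, {"jaw", {0.02f, 0, 0, 0.9998f}} };
}

// Apply the streamed rotations directly to the runtime skeleton pose.
void apply_to_skeleton(const CaptureFrame& frame, std::map<std::string, Quat>& pose)
{
    for (const auto& [joint, rotation] : frame)
        pose[joint] = rotation;                      // no offline clean-up pass
}

int main()
{
    std::map<std::string, Quat> pose;                // engine-side skeleton pose
    apply_to_skeleton(receive_frame(), pose);        // one "live" frame
    std::printf("head.w after live update: %.4f\n", pose["head"].w);
    return 0;
}
```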

The advancements in these performance capture techniques were demonstrated by veteran Hollywood actor Andy Serkis performing a scene from Macbeth.

Using 3Lateral’s real-time compression and high-fidelity muscle contraction targets, animators have the ability to modify an actor’s performance using the real-time Unreal Engine 4 animation toolset.

It’s possible to use the animation data sets on any properly rigged model, even if the model itself bears no resemblance at all to the actor. This ties into what was mentioned before about the adaptive animation tools that Vicon and Tencent used for Siren, where the real-life actress looked nothing like the 3D-scanned model. Epic took the captured data from Andy Serkis’ Macbeth performance and applied it to a fictional alien character that was properly rigged to read from the data sets. The results are quite impressive.
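
The reason the same data can drive such different faces comes down to retargeting: as long as both rigs share a joint naming convention, the captured local rotations are copied onto the target skeleton while that character’s own proportions stay intact. The sketch below illustrates the idea under that assumption; the data layout is an illustration, not Epic’s or 3Lateral’s actual format.

```cpp
// Minimal retargeting sketch: the performance supplies local joint rotations,
// the target character keeps its own bone lengths and proportions, so the
// data works even on a character that looks nothing like the actor.
#include <map>
#include <string>
#include <cstdio>

struct Quat { float x, y, z, w; };

struct Joint {
    float bone_length;    // target character's own proportions (unchanged)
    Quat  local_rotation; // driven by the captured performance
};

using Pose     = std::map<std::string, Quat>;   // captured joint rotations
using Skeleton = std::map<std::string, Joint>;  // target character's rig

void retarget(const Pose& captured, Skeleton& target)
{
    for (auto& [name, joint] : target) {
        auto it = captured.find(name);          // match joints by name
        if (it != captured.end())
            joint.local_rotation = it->second;  // rotation from the actor,
                                                // bone_length stays the character's
    }
}

int main()
{
    Pose captured_frame = { {"brow_l", {0.10f, 0, 0, 0.9950f}} };  // from the actor
    Skeleton alien      = { {"brow_l", {4.2f, {0, 0, 0, 1}}} };    // different proportions
    retarget(captured_frame, alien);
    std::printf("alien brow rotation x=%.2f, bone length=%.1f\n",
                alien["brow_l"].local_rotation.x, alien["brow_l"].bone_length);
    return 0;
}
```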

Some of the Unreal Engine 4 updates containing many of these features will be available starting April 20th.

This also applies to Epic’s deep dive into ray-trace rendering.

After Hayssam Keilany managed to implement real-time path-tracing techniques in shaders for games like GTA IV half a decade ago, Epic has finally stepped up its game, achieving life-like real-time lighting through ray tracing for multi-pass light sourcing and shadows.

Epic demonstrated this technique with a short film featuring Star Wars characters.

Real-time ray tracing can be very taxing, so Epic partnered with Nvidia and Microsoft on DXR, Microsoft’s new DirectX ray-tracing API that talks directly to the GPU to perform complex lighting and shading tasks, leveraging Nvidia’s hardware-dependent RTX technology.
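
To give a sense of what that pipeline computes, the sketch below shows the kind of query a shadow ray answers: from a shaded point, is the path to the light blocked by any geometry? This is a CPU-side toy for illustration only; the real DXR path builds GPU acceleration structures and dispatches HLSL shaders through the DirectX 12 API, which is far more involved.

```cpp
// Conceptual shadow-ray test: cast a ray from a shaded point toward the
// light and check whether an occluder (here, a single sphere) blocks it
// before the ray reaches the light.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does a ray from `origin` along unit `dir` hit the sphere before `max_t`?
static bool ray_hits_sphere(Vec3 origin, Vec3 dir, Vec3 center, float radius, float max_t)
{
    const Vec3  oc   = sub(origin, center);
    const float b    = dot(oc, dir);
    const float c    = dot(oc, oc) - radius * radius;
    const float disc = b * b - c;
    if (disc < 0.0f) return false;                 // ray misses the sphere entirely
    const float t = -b - std::sqrt(disc);          // nearest intersection distance
    return t > 0.001f && t < max_t;                // occluder between point and light
}

int main()
{
    const Vec3  shaded_point = {0, 0, 0};
    const Vec3  to_light     = {0, 1, 0};          // unit direction toward the light
    const float light_dist   = 10.0f;
    const Vec3  occluder     = {0, 5, 0};          // sphere sitting between the two

    const bool in_shadow = ray_hits_sphere(shaded_point, to_light, occluder, 1.0f, light_dist);
    std::printf("point is %s\n", in_shadow ? "in shadow" : "lit");
    return 0;
}
```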

According to Epic Games, we will be seeing some games released later this year making use of Microsoft’s DXR technology and Epic Games’ new ray-tracing pipelines.
