
Neill Blomkamp’s ADAM: Episode 3 Pushes Unity Technology Forward With Real-Time Cinematics

Unity Technologies and accomplished director Neill Blomkamp have been in partnership for some time now. The two have been working to showcase not only the creative side of Unity’s capabilities, but also how powerful the engine’s tools are for creating life-like renders, cinematic-quality settings, and high-fidelity environments running in real time. The latest step in that partnership centers on the extended universe of ADAM, with ADAM: Episode 3.

After six months of work by a team of just 25 people at OATS Studios, Blomkamp and his crew have teamed with Unity to roll out the third episode in the ongoing sci-fi short film series. The short you’re about to see was originally demonstrated at SIGGRAPH Asia, but it’s now available for a global audience. Check it out below from the OATS Studios channel.

The short starts with a monk meeting with a survivor named Marianne, who explains that she’s attempting to find asylum at a technology-free farm.

We learn that the settlement where she stayed was no longer safe after it became contaminated. She fled after her brother Jacob was captured by the Consortium for attempting to steal drugs to help his family.

The short starts dark and just gets darker.

ADAM Episode 3 - Crushing Dreams

It also seems to introduce a far more complex element to the world of ADAM, one that no one would have guessed at based on the previous shorts. In fact, the end of the short is kind of mind-blowing once the twist lands: Marianne’s brother has been turned into a machine as punishment for trying to save his family. We learn that the savior, the one who forbade the way of the technocrats, is actually a robot himself, yet he has the ability to cure the sickness that ails the populace. It’s crazy.

I don’t know if ADAM could be a feature-length film that attracts a mass audience the way some of Blomkamp’s other works could. I still think Rakka and Firebase would make for awesome feature-length films.

Anyway, everything you saw in the video above was running in real time in the Unity engine; it was not pre-rendered or touched up in post-production. Chris Harvey, an accomplished visual effects artist and the VFX supervisor at OATS Studios, explained in the press release that he was quite impressed with Unity’s in-engine, cinematic-quality rendering, saying…

“I have had the privilege of leading VFX on blockbusters like ‘Fast and Furious 6’ as well as story-driven sagas such as ‘Zero Dark Thirty.’ In my years, I’ve never had a more powerful and instantly responsive toolkit than Unity,”

“Working in real-time in Unity 2017 gives the team the flexibility to make important creative changes without having to suffer the consequences of reshoots, saving us time and money.”

I suppose it puts filming in a completely different light when you realize that if you don’t like how a scene ends but you liked everything that came before, you don’t have to reshoot the entire scene. With real-time performance capture, you can keep the animation and performance data you like, rework and tweak the key-frames, and feed the new data back into the pipeline.
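As a rough illustration of that workflow, here is a minimal C# sketch that assumes the captured take already sits on a Unity Timeline driven by a PlayableDirector. The component name and trim values are hypothetical examples, not anything from the ADAM pipeline; the point is simply that captured clip data stays editable and replayable without recapturing it.

```csharp
// A minimal sketch, assuming the captured performance lives on a Timeline
// asset assigned to a PlayableDirector. "skipSeconds" and "keepSeconds"
// are hypothetical trim values, not values from the ADAM production.
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

public class TrimCapturedTake : MonoBehaviour
{
    public PlayableDirector director; // plays back the captured take
    public double skipSeconds = 0.5;  // cut this much off the front of each clip
    public double keepSeconds = 4.0;  // keep this much of the performance

    void Start()
    {
        var timeline = director.playableAsset as TimelineAsset;
        if (timeline == null) return;

        foreach (TrackAsset track in timeline.GetOutputTracks())
        {
            foreach (TimelineClip clip in track.GetClips())
            {
                clip.clipIn = skipSeconds;   // start partway into the source clip
                clip.duration = keepSeconds; // keep only the portion you liked
            }
        }

        director.RebuildGraph(); // re-evaluate the timeline with the trimmed clips
        director.Play();
    }
}
```

In practice this kind of trimming is done interactively in the Timeline editor window; the script just shows that the same capture data can be reused and adjusted after the fact instead of reshot.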

ADAM Episode 3 - The Prophet

It’s amazing: as much as we hear about production costs skyrocketing in video games, we also have tools like Unity and Unreal Engine 4 that can drastically cut those costs by capturing and rendering cinematic-quality visuals in real time. You no longer have to wait a week or two to check the mo-cap data, find out that the animation sets are incomplete, or discover, after days of waiting on a high-def CG render, that there are artifacts in the background, and so on.

Unity also has a new Alembic-based facial performance capture toolkit that allows for more accurate facial animation based on 30 reference renders streamed per second, creating a more faithful depiction of the performance. This contrasts with standard high-poly rigging or cached morph targets based on preset facial capture data.
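To give a rough sense of what sampling a streamed facial cache at a fixed 30 frames per second could look like in a Unity script, here is a small hedged sketch. FacialStreamPlayer and its SetTime method are hypothetical placeholders standing in for whatever component actually drives the imported Alembic cache; they are not the real importer API.

```csharp
// A hedged illustration only: "FacialStreamPlayer" is a hypothetical stand-in
// for a component that scrubs a streamed facial cache by time.
using UnityEngine;

public class FacialStreamPlayer : MonoBehaviour
{
    // Hypothetical: a real setup would advance the cached facial mesh here.
    public void SetTime(float seconds) { /* drive the cached facial data */ }
}

public class FacialCaptureSampler : MonoBehaviour
{
    public FacialStreamPlayer stream;    // hypothetical cache player
    public float samplesPerSecond = 30f; // 30 reference frames per second
    float elapsed;

    void Update()
    {
        elapsed += Time.deltaTime;

        // Quantize playback time to the 30 Hz cadence of the captured frames.
        float step = 1f / samplesPerSecond;
        float snapped = Mathf.Floor(elapsed / step) * step;
        stream.SetTime(snapped);
    }
}
```

The only real idea here is the quantization step: playback follows the 30-per-second cadence of streamed capture data rather than blending between preset morph targets.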

Essentially, all that mumbo jumbo above means you get 3D thespians rendered much closer to real life, without faces that read as stiff or tired.

ADAM Episode 3 - Marianne

This was combined with photogrammetry-based environments, using real-life photographs and 3D-scanned imagery to render the backgrounds without requiring an entire team of artists to compose the scenes. In the old days you couldn’t create a cinematic like that with a team of 20 people; it would have taken 20 modelers just to build the environments you see in the ADAM short film.

The team of just over two dozen people managed to create ADAM: Episode 3 without any additional or supplemental programming. They were able to do everything you saw above using the new Timeline feature to edit, orchestrate, and author each scene exactly the way they wanted, right from within the comfort of the Unity 3D toolkit.
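For a sense of what authoring a shot through Timeline looks like from code, here is a minimal sketch. Every name in it (the director, the character rig, the prop, the clip) is a hypothetical placeholder rather than an actual ADAM asset, and in practice this sort of authoring is done visually in the Timeline editor window rather than in a script.

```csharp
// A minimal sketch of orchestrating a shot with Unity's Timeline API:
// one animation track for a character, one activation track for a prop.
// All object names are hypothetical examples, not assets from ADAM.
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

public class SceneOrchestrationSketch : MonoBehaviour
{
    public PlayableDirector director;
    public Animator characterRig;         // hypothetical character binding
    public GameObject dustParticles;      // hypothetical prop to toggle
    public AnimationClip performanceClip; // hypothetical captured performance

    void Start()
    {
        // Build a timeline asset in code (normally assembled in the editor).
        var timeline = ScriptableObject.CreateInstance<TimelineAsset>();

        // Character performance on an animation track.
        var animTrack = timeline.CreateTrack<AnimationTrack>(null, "Character");
        animTrack.CreateClip(performanceClip).start = 0.0;

        // Toggle a prop on for part of the shot with an activation track.
        var activation = timeline.CreateTrack<ActivationTrack>(null, "Dust");
        TimelineClip window = activation.CreateDefaultClip();
        window.start = 1.0;
        window.duration = 3.0;

        // Bind the tracks to scene objects and run the shot in real time.
        director.playableAsset = timeline;
        director.SetGenericBinding(animTrack, characterRig);
        director.SetGenericBinding(activation, dustParticles);
        director.Play();
    }
}
```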

It’s as much about showcasing the creative capabilities of OATS Studios’ small team as it is about showcasing how cost-effective and efficient the Unity engine has become.

You can learn more by visiting the official Unity 3D website.
