With Epic Games having shown off the latest advance in real-time graphics at GDC, could we see the soon-to-be-announced PS5 finally render characters that aren’t stuck in the ‘uncanny valley’?

Raytracing had been used very sparingly in the pre-rendered CGI industry for years before its wide-scale adoption in CG films, as it was traditionally the single most time-consuming part of the rendering process. Toy Story had some limited raytracing to help add a realistic shine to Buzz’s visor, while Cars would use the effect extensively more than a decade later. It wasn’t until Monsters University that raytracing would carry the majority of a feature film’s lighting burden. Each of Monsters Uni‘s frames took 29 hours to render, and that was in 2013; the idea of doing this in real time such a short time later sounds like magic to me.
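To put that 29-hours-a-frame figure in perspective, here’s a quick back-of-the-envelope sum (my own rough numbers, assuming a 30 frames-per-second target and ignoring that Pixar was rendering at far higher quality across an entire render farm):

```python
# Rough, illustrative arithmetic only: offline film rendering vs a 30 fps game.
offline_seconds_per_frame = 29 * 60 * 60   # 29 hours per frame = 104,400 seconds
realtime_seconds_per_frame = 1 / 30        # a 30 fps game has ~0.033 seconds per frame

speedup = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"Required speed-up: roughly {speedup:,.0f}x")   # ~3,132,000x
```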

A sentiment which I bet has Arthur C. Clarke looking mighty smug in heaven right now. Real-time raytracing was shown off at GDC with a particular focus on creating realistic 3D digital humans, and Epic CEO Tim Sweeney was ecstatic at the result on screen, calling real-time ray tracing “a breakthrough we’ve been waiting for for over 20 years”.

Along with more accurate 3D figures, real-time raytracing has further benefits over standard rasterisation. According to GamesIndustry.biz: “With ray-tracing, it’s possible to create much more accurate reflections, giving objects a definition and dimension that isn’t possible with rasterised images. It also allows for a ‘more intuitive way of lighting’, one that closely resembles that used on real movie sets and actors.”
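To make the idea a bit more concrete, here’s a tiny, purely illustrative sketch (nothing to do with Epic’s or Nvidia’s actual renderers): it fires a single ray at a sphere and mirrors it off the surface. A real ray tracer repeats this kind of test millions of times per frame, which is exactly why doing it in real time is such a big deal.

```python
# Toy example: trace one ray against a sphere and bounce it as a mirror reflection.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is assumed to be normalised (a = 1)
    if disc < 0:
        return None                 # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def reflect(direction, normal):
    """Mirror a ray direction about the surface normal: r = d - 2(d.n)n."""
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return [d - 2.0 * d_dot_n * n for d, n in zip(direction, normal)]

# One camera ray fired straight down the z-axis at a sphere five units away.
origin, direction = [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]
center, radius = [0.0, 0.0, 5.0], 1.0

t = intersect_sphere(origin, direction, center, radius)
if t is not None:
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    bounce = reflect(direction, normal)
    print("hit at", hit, "reflected ray", bounce)
```

Rasterisation, by contrast, never follows the bounce; it simply projects triangles onto the screen, which is why accurate reflections are so hard to fake with it.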

The Generation Game

We’ve seen, from generation to generation, developers getting closer to crossing the uncanny valley.

In my opinion, LA Noire was the first game that really pushed the boundaries of what we can expect in terms of facial expressions. It was the first game to use MotionScan. What separated MotionScan from its competitors is that it could record an actor’s face at over 1,000 frames per second, using 32 cameras placed around the actor’s face to recreate a highly realistic moving human face.

Incredible facial animation for the Xbox 360/PS3, but it has since been surpassed by numerous games

While the facial expressions in LA Noire were way ahead of their time, one area where MotionScan couldn’t work its magic was body movement. Many players felt that “characters were dead from the waist down”.

PlayStation exclusives are fully optimised for the console and are generally of a higher visual standard than Xbox exclusives, with Forza Motorsport 7 being the solitary exception. Xbox has no equivalents to Horizon Zero Dawn, Uncharted 4, and God of War; all of these games push the base hardware to its limits without melting 2013’s finest.

In the space of 5 years and one console generation, the improvements are remarkable.

One game which has yet to be released, but will more than likely come closest to crossing the uncanny valley this generation, is The Last of Us Part II. It should come as no surprise to anyone that Naughty Dog is the studio climbing furthest up the other side of the valley on this generation of consoles.

Naughty Dog’s facial technology is the most advanced of all Sony’s in-house studios. Neil Druckmann even said that “We’ve never been able to have a close-up of eyes before because we couldn’t get the fidelity but now we can.”

It may take a couple of years for technologies like raytracing to be implemented in video games, but rest assured, this computationally expensive process is on the way, even with the “end of Moore’s Law” being a hot topic on the YouTubes.
