Innovation Monitor: Introducing MetaHumans…Epic’s virtual beings
Welcome to this week’s Innovation Monitor.
This newsletter has covered many of the different ways technology will drive the future of media production — from the future of gaming, to the rise of virtual influencers, to the transformation of virtual production techniques.
Last week, we caught an announcement from Epic Games, the creator of the Unreal Engine, about a new platform, MetaHuman Creator, that can generate virtual human avatars almost instantaneously. A process that previously demanded major resources and time can now be accessed easily and widely.
These aren’t quite ‘deepfakes’ in the way we’ve come to know the term. The MetaHuman Creator platform feels like a major step towards widespread avatar usage and virtual environments.
We anticipate this will have major implications for every (media) industry, especially gaming, social media, and movies. Technology that democratizes processes previously limited to big studios will undoubtedly unleash creativity, new experimental content, and more representation. However, tech this powerful and ubiquitous will raise questions and concerns around its societal consequences and oversight. And these are the conversations the NYC Media Lab is here to start!
This edition has a number of exciting videos and demos, so I hope you’ll enjoy taking them in over the weekend! As always, stay safe & thank you for reading, and if you were forwarded this email, you can easily sign up here!
Erica Matsumoto

The realistic digital characters we see in games can take a month or more to design. Epic wants to whittle that down from months to minutes with its new cloud-streamed MetaHuman Creator platform. If you haven’t seen the amazing promo, you can watch it here.
Faces are “notoriously hard to get right,” notes Fast Company, and time-consuming to produce. For example, triple-A game studios hire actors to spend long hours in mocap suits, and while the resulting models look realistic, they often end up with dead, creepy eyes. “Even if [faces are] a priority for a game developer, they’re rarely the top priority. That means faces may get less processing power than they need to look convincing.”
One studio notable for incredible renderings is Quantic Dream, which built its own engine to support realistic faces and facial expressions for cinematic games like Detroit: Become Human.
This attention to detail comes at a price: the studio “cannot create its games atop licensed foundational technologies… as so many other game development studios can. Because according to [studio lead David Cage], solutions like the Unreal Engine just haven’t offered the fidelity you need.”
Epic has said that MetaHumans will be employed in use cases beyond video games, for example, as virtual influencers and AR experiments. In this issue, we’ll explore how such a powerful tool can affect the gaming, social media, and movie industries.
Gaming & Virtual Influencers

The Avengers game came out last November, but a beta was available for play earlier in 2020. Some users experienced cognitive dissonance, as the character models looked nothing like their movie counterparts.
While the game doesn’t seem to support custom skins other than unlockables, we can imagine modders for other titles quickly creating MetaHuman character models akin to BabyZone’s. And game developers themselves would be able to cut down on character development times. The tool could help expand creator communities and free up developer time to work on other aspects of a game — two requirements for any early metaverse.
Virtual influencers can take hours to pose, clothe, and texture. Back in 2018, New York magazine needed only 48 hours to create a Miquela copycat, going “step by step from building a default character, aligning geometric features via references, to applying texture, clothing, and more.”
That’s an impressive schedule, but a designer with these skills would be able to create a realistic digital influencer in a fraction of the time with a tool like MetaHuman Creator. This opens up opportunities for more creators to enter the industry — which will have both positive and harmful consequences.
Dive into the world of digital influencers through a fun issue from last year covering Lil Miquela, virtual influencers, and some social media anthropology. Watch Trevor McFedries, the creator of Lil Miquela, chat with Aria Dean at our NYC Media Lab Summit last year here.

Movies with Virtual Humans

In 2007, NVIDIA released a very realistic (even by today’s standards) demo of a real-time rendering of Doug Jones’ head.
In 2013, Jensen Huang showed off impressive real-time animation of Digital Ira, adding some life to the realistic-head tech demo.
MetaHuman Creator is several steps above Digital Ira in terms of realism… to the point where, with some additional work, these characters could comfortably be inserted into live-action productions.
Coupled with virtual production techniques being employed by shows like The Mandalorian, post-production could become faster. Here’s a great explainer on volumetric video capture and production:
This Week in Business History

February 19th, 1878: Thomas Edison patents the phonograph
The invention was inspired by a mix of Edison’s work on telephones and telegraphs. Looking for a way to repeat the transmission of telegraph messages, he created a method to capture a passage of Morse code as a sequence of indentations on a spool of paper.
The first ever recorded message was “Mary Had a Little Lamb” (you can actually hear Edison’s voice below!).