Epic is Integrating its Hyperrealistic MetaHumans into Unreal Editor for Fortnite

Unreal Editor for Fortnite gets MetaHumans integration.

Unreal Editor for Fortnite. Credit: Epic Games

At last year's Game Developers Conference (GDC), Fortnite developer Epic Games launched Unreal Editor for Fortnite (UEFN), a creative tool for designing, developing, and publishing games and experiences directly into the popular battle royale game. This year, Epic announced some major features coming to the tool: the company's MetaHumans — hyperrealistic human characters that users can animate using only an iPhone — are available in UEFN starting on Wednesday.

Epic said that users can import any of the photorealistic digital characters created with the MetaHuman Creator tool and MetaHuman Animator into UEFN, where they can then be used as NPCs in player-created games. The company said it has fine-tuned the free online software for efficiency and quality, reducing the average size of a MetaHuman from almost 1GB in Unreal Engine 5 to 60MB in UEFN.

Epic also released a video from one of its internal teams showcasing what the tool is capable of. You can watch it here:

In the video, Epic emphasizes the speed with which MetaHuman Animator produces results while showing off some impressively subtle facial animation. “The animation is produced locally using GPU hardware, with the final animation available in minutes,” Epic's press release reads. This could help the company save money by making performance capture more efficient, and, according to the company, it could also allow teams to experiment and be more creative.

“Need an actor to give you more, dig into a different emotion, or simply explore a new direction?” the company's press release asks. “Have them do another take. You’ll be able to review the results in about the time it takes to make a cup of coffee.”

Epic explained that the system is smart enough to animate a character's tongue based on the performance's audio, and that the facial animation can be applied to a MetaHuman character "in just a few clicks".

Performance capture using iPhones has been possible in Unreal Engine since the launch of Epic's Live Link Face iOS app in 2020, but now the company's MetaHuman technology promises to combine that workflow with a much higher level of detail. According to Epic, MetaHuman Animator can also be used with "existing vertical stereo head-mounted camera [systems] to achieve greater fidelity".

The Blue Dot short film, produced by Epic Games' 3Lateral team and starring actor Radivoje Bukvić performing a monologue based on a poem by Mika Antić, should give some idea of what the animation tool is capable of, the company said.

If you want to find out more, documentation for the tool is available via the MetaHuman hub on the Epic Developer Community. Epic has also released an instructional video on how to use MetaHuman Animator in Unreal Engine.
