Creative Bloq
Technology
Ian Dean

MetaHuman just broke free from Unreal Engine 5 – why everyone can now create lifelike characters

MetaHuman Animator; a man in front of a film camera on set.

At State of Unreal 2025, Epic Games revealed that its impressive MetaHuman tech is no longer tethered exclusively to Unreal Engine 5. A new licensing update announced at Unreal Fest in Orlando opens the door to using MetaHumans in other 3D modelling software and creative apps, including Blender and Maya.

The FAB marketplace (read our report on FAB for details) is now packed with MetaHuman-compatible outfits, hair grooms and accessories ready for drag-and-drop use. You can also sell your MetaHuman content directly on FAB or third-party marketplaces.

(Image credit: Future / Epic Games)

For Maya artists, Epic Games has launched MetaHuman for Maya, a powerful plugin that unlocks direct mesh editing, high-end rigging controls and groom export tools, enabling artists to push past the limits of the default MetaHuman look while staying fully pipeline-compatible.

In short: MetaHumans are more flexible, accessible, and powerful than ever. With cross-platform animation, marketplace support, and pro-grade rigging in Maya, bringing digital characters to life just got a whole lot easier, no matter what your pipeline looks like.

(Image credit: Future / Epic Games)

Animation made easy

MetaHumans are stepping into the spotlight with Unreal Engine 5.6. Epic Games has refreshed MetaHuman Animator, enabling real-time facial animation from standard webcams, Android phones, and just about any mono camera that works with Live Link.

You no longer need expensive stereo HMC rigs, or even an iPhone, to get high-fidelity, on-the-fly animation. I saw this in action at Unreal Fest in a number of demos, which showed how you can simply stand in front of a camera or camera phone, start recording live, and save the animation to hand-tweak afterwards.

Whether you’re capturing live performances on set or just want instant visual feedback, your MetaHuman can now keep pace with the actor in real time. I got to take part and watch as the face-mapping tech in MetaHuman Animator matched my expressions and lip-synced perfectly.

It even works with audio alone, no camera required: you can now animate a MetaHuman using only a voice recording. Epic’s latest tools analyse vocal input in real time to generate lifelike facial motion, including emotion-aware performance and automatic head movement. You can even fine-tune the emotional tone manually, giving you full control to match your project’s mood or message.

Visit the Epic Games Unreal Engine website for more details. Read our guides to the best laptops for 3D modelling and best camera phones to gauge the hardware you may need.
