It’s a big day for MetaHuman! We’re really excited to announce that our groundbreaking digital human toolset is leaving Early Access, with the MetaHuman 5.6 release bringing new licensing options and integrations that dramatically extend MetaHumans' reach, as well as major new functionality across the board.
Now embedded in Unreal Engine, MetaHuman Creator also boasts enhanced fidelity, together with new authoring workflows that will vastly expand the options for faces, bodies, and clothing.
Meanwhile, updates to MetaHuman Animator include the ability to generate real-time animation from almost any camera or audio source.
MetaHuman Creator
Author MetaHumans in Unreal Engine
As we mentioned earlier, MetaHuman Creator is now embedded within Unreal Engine, bringing several highly requested advantages. First off, there's no longer any need to wait for a cloud instance or work within limited-time sessions, and the export process has been replaced by a quick assembly step, making it faster and easier to iterate.
Studios managing pipelines will benefit from the fact that MetaHuman Creator, MetaHuman Animator, and Mesh to MetaHuman are all now part of a single application, simplifying character creation workflows—not to mention the increased flexibility local asset management offers.
And finally, you’ll now have full access to the source code of the MetaHuman Creator tool, so you can extend and customize it to suit the needs of your character creation pipeline.
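To give a feel for where such extensions might live, here's a minimal sketch of a standard Unreal Engine editor module that could host pipeline customizations. IModuleInterface and IMPLEMENT_MODULE are standard UE APIs; the module name "MyMetaHumanTools" and log category are our own placeholders, and the actual extension points will depend on what you find in the MetaHuman Creator source.

```cpp
// A minimal Unreal Engine editor module that could host MetaHuman pipeline
// extensions. The module name "MyMetaHumanTools" is a placeholder.
#include "CoreMinimal.h"
#include "Modules/ModuleManager.h"

DEFINE_LOG_CATEGORY_STATIC(LogMyMetaHumanTools, Log, All);

class FMyMetaHumanToolsModule : public IModuleInterface
{
public:
	virtual void StartupModule() override
	{
		// Register menu entries, asset actions, or detail customizations
		// here, hooking into the MetaHuman Creator editor as needed.
		UE_LOG(LogMyMetaHumanTools, Log, TEXT("MetaHuman pipeline extensions loaded"));
	}

	virtual void ShutdownModule() override
	{
		// Unregister anything added in StartupModule.
	}
};

IMPLEMENT_MODULE(FMyMetaHumanToolsModule, MyMetaHumanTools)
```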
You will still need an internet connection, however. The local in-editor authoring workflow is enhanced by cloud services that deliver autorigging and texture synthesis.
New parametric body system
MetaHuman Creator's acclaimed intuitive face authoring tools have now been extended to bodies. The new parametric body system, built on extensive real-world scan data, enables you to sculpt and blend between a near-infinite range of plausible body shapes, just as you can for faces.
You can also adjust or constrain measurements for height, chest, waist, leg length, and many more parameters, making it possible to create MetaHumans with specific proportions for applications such as online fashion retail.
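To make the constraint idea concrete, here's a conceptual sketch in plain C++, entirely of our own invention rather than the MetaHuman Creator API, showing how a pinned measurement might clamp a sculpted value so fixed proportions survive free sculpting.

```cpp
// Conceptual sketch (not the MetaHuman Creator API): pinned measurement
// constraints clamp sculpted values so fixed proportions are preserved
// while everything else stays freely sculptable.
#include <algorithm>

struct MeasurementConstraint
{
    float MinCm; // lower bound for the measurement, in centimeters
    float MaxCm; // upper bound; MinCm == MaxCm pins the value exactly
};

// Clamp a sculpted measurement into its constrained range.
float ApplyConstraint(float SculptedCm, const MeasurementConstraint& C)
{
    return std::clamp(SculptedCm, C.MinCm, C.MaxCm);
}

int main()
{
    // Pin height to exactly 180 cm while letting the waist vary in a range.
    const MeasurementConstraint Height{180.0f, 180.0f};
    const MeasurementConstraint Waist{70.0f, 90.0f};

    float FinalHeight = ApplyConstraint(191.3f, Height); // clamped to 180.0
    float FinalWaist  = ApplyConstraint(84.2f, Waist);   // unchanged: 84.2
    return (FinalHeight > 0.0f && FinalWaist > 0.0f) ? 0 : 1;
}
```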
New UE Outfit Asset
Just like real humans, MetaHumans need clothes, and the more extensive the choices, the better. You can now author realistic mesh-based clothing in digital content creation (DCC) apps and use the new Unreal Engine Outfit Asset to generate entire wardrobes for your MetaHumans.
The feature enables you to combine multiple garments, including accessories, into a unified Outfit Asset. You can pre-author each garment's fit for a range of body sizes and shapes; these fits are then interpolated so the clothing automatically resizes to match any character's body.
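As an illustration of the fitting idea, and not the Outfit Asset API, here is a small sketch of linearly interpolating between two pre-authored garment fits based on a single body parameter.

```cpp
// Conceptual sketch (not the Outfit Asset API): interpolating between two
// pre-authored garment fits. Each fit stores per-vertex offsets authored
// for a specific body parameter; a blend derived from the target body
// produces a garment that resizes automatically.
#include <cstddef>
#include <vector>

struct FitTarget
{
    float BodyParam;                  // e.g. a normalized chest measurement
    std::vector<float> VertexOffsets; // offsets authored for this body size
};

// Linear blend of two pre-authored fits at the target body parameter.
// Assumes A.BodyParam < BodyParam < B.BodyParam and matching vertex counts.
std::vector<float> BlendFits(const FitTarget& A, const FitTarget& B, float BodyParam)
{
    const float T = (BodyParam - A.BodyParam) / (B.BodyParam - A.BodyParam);
    std::vector<float> Out(A.VertexOffsets.size());
    for (std::size_t i = 0; i < Out.size(); ++i)
    {
        Out[i] = A.VertexOffsets[i] + T * (B.VertexOffsets[i] - A.VertexOffsets[i]);
    }
    return Out;
}
```

A production system would blend across many such targets and full vertex buffers; the two-target case just shows the interpolation the feature description implies.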
Outfits can be packaged into the MetaHuman format and bought and sold on the Fab marketplace, as well as on third-party marketplaces, increasing the range of ready-made MetaHuman attire available to everyone.
Enhanced visual fidelity
We constantly strive to improve MetaHuman so you can achieve the highest-possible fidelity and realism.
To that end, the MetaHuman database of real-world scan data has been significantly expanded to offer more variation in character mesh shapes and textures. In addition, improvements to the capture and processing of scanned data now retain more authentic detail, such as skin blemishes.
The result is more lifelike facial textures and the ability to create a more varied range of characters with MetaHuman Creator. You’ll also enjoy a closer match to your imported mesh or captured footage with Mesh to MetaHuman.
MetaHuman Animator
High-fidelity real-time animation from webcams and smartphones
MetaHuman Animator now enables you to capture actors' performances on mono cameras (including most webcams and many smartphones) and use them to animate your MetaHumans in Unreal Engine in real time. Whereas capture was previously restricted to stereo head-mounted cameras (HMCs) and iPhones, any camera that works with Unreal Engine Live Link is now supported, including certain Android smartphones.
Did we say real time? You heard right! Whether it's driving a live performance or getting instant feedback in on-set workflows, your MetaHuman can keep right up with your actor. Offline processing remains available where it better suits your project.
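For those wiring this up in C++, the Live Link client interface shown below is a real Unreal Engine API you can use to confirm a capture subject is publishing frames. The subject name "FaceCapture" is an assumption standing in for whatever your camera source exposes, and binding that subject to a MetaHuman is handled by the MetaHuman assets themselves.

```cpp
// Reading a Live Link subject's latest frame in C++. The Live Link calls
// below are real Unreal Engine APIs; the subject name "FaceCapture" is an
// assumption, and the role your source uses may differ.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkBasicRole.h"

void CheckFaceSubject()
{
	IModularFeatures& Features = IModularFeatures::Get();
	if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
	{
		return; // The Live Link plugin isn't enabled in this project.
	}

	ILiveLinkClient& Client =
		Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

	const FLiveLinkSubjectName Subject(TEXT("FaceCapture")); // assumed name
	FLiveLinkSubjectFrameData FrameData;
	if (Client.EvaluateFrame_AnyThread(Subject, ULiveLinkBasicRole::StaticClass(), FrameData))
	{
		// FrameData now holds the subject's latest evaluated curve values.
	}
}
```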
Enhanced audio-driven animation
And the fun doesn’t stop there! You can also now generate real-time animation solely from audio, making live performances even more accessible.
In addition, MetaHuman Animator can now infer the speaker's emotion from the audio to provide more lifelike animation; you can also adjust this manually, enabling you to create more relatable MetaHuman performances tailored to particular contexts.
MetaHuman Animator also adds plausible head movement to the audio-driven performance, producing more believable results out of the box.
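To visualize the controls described above, here is a purely hypothetical sketch; every type and function name is invented for illustration and is not the MetaHuman Animator API. It shows the shape of a per-frame audio-driven solve with an emotion override.

```cpp
// Purely hypothetical sketch: every name here is invented to illustrate
// the control surface described above (audio in; inferred emotion with a
// manual override; plausible head motion out). Not the MetaHuman Animator API.
#include <vector>

struct AudioChunk
{
    std::vector<float> Samples; // mono PCM for one solve step
    int SampleRate = 48000;
};

struct EmotionControls
{
    bool  bInferFromAudio = true; // let the solver detect emotion itself
    float ManualIntensity = 0.0f; // 0..1 blend toward a manually chosen emotion
};

struct FaceFrame
{
    std::vector<float> Curves;              // facial animation curve values
    float HeadYaw = 0.0f, HeadPitch = 0.0f; // plausible head movement
};

// One step of a real-time audio-driven solve loop (stubbed body).
FaceFrame SolveNextFrame(const AudioChunk& Audio, const EmotionControls& Emotion)
{
    FaceFrame Frame;
    // A real solver would run its audio-to-animation model here,
    // conditioned on the inferred or manually set emotion.
    (void)Audio;
    (void)Emotion;
    return Frame;
}
```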
We hope you enjoy MetaHuman 5.6 and our brand-new website, and we look forward to seeing MetaHumans everywhere!