Animate Metahumans Using an Audio File in Unreal Engine 5.5

10,281 views

Pixel Prof

1 day ago

Comments: 67
@commontimeproductions 2 months ago
This is amazing! Thanks so much for sharing that this is now available. After the last "free" plugin that allowed audio lip sync stopped letting us use it for free and started charging obscene prices aimed only at large studios, many of us indie devs were completely derailed in using the MetaHuman lip sync functionality. Epic is amazing for finally making this part of the engine. No longer will our projects be stopped by a paywall. Hooray! I will be sharing your video :)
@tbowren 2 months ago
For sure, this was my favorite thing I saw and demoed at Unreal Fest. Thanks for making a video so quickly!
@PixelProf 2 months ago
Hah! Thanks Tony!
@Monoville 2 months ago
Great simple tutorial, so glad they've implemented this. Gave it a couple of tests and the results are very impressive, particularly when combined with the facial animations from the City Sample to replace the eye movements/blinking. Will still use the previous animator method for accurate facial animation, but this is a great quick method.
@PixelProf 2 months ago
Glad it helped!
@cwchris_ 14 days ago
Question for you: when I do this it kind of works, but the animations seem to overlap. Is there a fix so that you can use specific parts from both animations?
@Monoville 14 days ago
@cwchris_ As long as you only copy and paste the specific parts (e.g. eye movement) from one and delete those specific parts from the other, there shouldn't be any overlap.
@olavandreasterrasource8034 2 months ago
Thank you so much, I have been looking for this tutorial for weeks!
@pondeify 2 months ago
Oh man, this is awesome - thanks for sharing!
@JayceeeFilmVfx 21 days ago
Thanks for this, Nick! :)
@dddharmesh 2 months ago
Amazing. So quick and easy to follow... Thanks!
@aerospacenews 2 months ago
Appreciate the effort that went into this video, and that Epic is rolling out the capability @PixelProf
@duchmais7120 2 months ago
Awesome. I recall trying something like this way back using Blender phonemes/visemes (mouth shapes)... A, O, E, W/R, T/S, L/N, U/Q, M/B/P, F/V to aid in lip sync, where each phoneme was associated with specific lip and facial movements. Haven't tried it in years. Waiting to try this out once they release Unreal Engine 5.5. The eye movements (blinking) are what's puzzling me.
@nsyed3d 2 months ago
This is great, thank you.
@ValicsLehel 2 months ago
If it could be Blueprinted and fed a WAV to play back as a stream, it would be perfect.
@mazreds 2 months ago
I agree
@xaby996 2 months ago
Yep. But it's hard-wired inside the MetaHuman Performance asset. Chances of this going live are low.
@VolkanKucukemre 1 month ago
Thanks for the tutorial. So, at the moment this can't be used with streamed audio and we basically need to bake the animation?
@PixelProf 1 month ago
Yup. Currently you bake animation from a recorded WAV file.
@tmaintv 2 months ago
Very interesting, thanks. What can you do with the depth input?
@PixelProf 2 months ago
Depth input is to process data captured with the LiveLink Face iOS app when it’s set to “MetaHuman Animator” mode.
@lordnaps 2 months ago
Pretty cool, but the lack of any emotion on the rest of the face (since it's only tracking audio, not facial structure) makes me wonder what this could be used for practically.
@PixelProf 2 months ago
Since Unreal has an extremely capable (and continuously improving) layered animation system, the synchronized viseme animation from this process can be readily combined with other motion capture, hand-keyed, or procedural animation sources. For example, a LiveLink Face performance capture can be quite effective for overall facial expressions but lacks fidelity around the mouth, so something I'm hoping to experiment with soon is using this to process the audio from a LiveLink capture, and then art-direct the layering of results from both performance capture methods. Also, I imagine this is a step in the direction of realtime audio-to-animation, which could facilitate performance based on realtime voice generation systems (just speculation on my part for now).
@lordnaps 2 months ago
@PixelProf Would love to see that idea of mixing the LiveLink and facial capture, but I definitely see it going in that direction.
@xavierhatten9011 2 months ago
@PixelProf OK, I've done this with the LiveLink Face app. You can move the head in real time to make it more human-like, but the eyes need to be able to blink more. You can run the animation and do the head movement at the same time.
@Ronaldograxa 2 months ago
I wanted to use MetaHumans to create digital avatars, but I see tools like HeyGen coming up and I really wonder if MetaHuman is actually worth the time... AI is moving too quickly.
@ETT-b6q 2 months ago
Great tutorial. I'm trying to get additional animation on top of that audio-to-animation result. Normally I would bake mocap to the rig and add an additive layer, but with this facial mocap it breaks or reduces the lip sync to very small movements. How would you go about it?
@nandini8904 1 month ago
This is a great video! I have a question: I keep getting an error while trying to install the MetaHuman plugin, Error Code: MD-0011-0. Can't find any help online; was wondering if you know how to fix this. I heard that it's because of the Fab migration.
@RichardRiegel 2 months ago
Yep... then export it via FBX, import to 5.4, and smoke it... works perfectly for now :)
@SpinxSage 2 months ago
I haven't had any luck with this; any advice?
@RichardRiegel 2 months ago
@SpinxSage It's hard to help if you don't describe specifically where the problem is 😊
@7ribeh 1 month ago
Would you be able to help me with this process? :)
@lunabeige 1 month ago
Does it work for languages other than English?
@PixelProf 1 month ago
I haven't tried other languages, but my understanding is that yes, it generally works in other languages as well.
@Felix-iv2ns 2 months ago
Runtime?
@PixelProf 2 months ago
Not yet.
@abdullahalsaadi5991 1 month ago
Hi. Great tutorial. Do you know if it is possible to stream audio data into this instead of using a pre-existing audio file?
@PixelProf 1 month ago
So far it only processes recorded audio files.
@hanasprod 1 month ago
Do you know how to animate the eyes?
@PixelProf 1 month ago
Yup... add the face board control rig in layered mode, then use that to animate the eyes to taste. Will try to make time to record a tutorial video on this.
@GustavoTommaso-xk8vk 19 days ago
In the Sequencer, click the + on the Face layer, then Control Rig > select the [Layered] option!
@metternich05 2 months ago
Could you do a more comprehensive tutorial on MetaHuman lip sync? There's almost nothing out there.
@PixelProf 2 months ago
Sure... are you looking for something on editing animation that already came from performance or audio capture (like this video shows), or "starting from scratch" using only keys & curves on the control board for animating?
@metternich05 2 months ago
@PixelProf Starting from scratch is more accurate :) I'm somewhat familiar with UE; I do environments but haven't even tried MetaHumans. What I actually have in mind is creating an avatar or virtual character that would be the face of a YouTube channel and do all the talking. I'm thinking of something more realistic than the one above, with more complex facial expressions, head movement, and even gestures. I'm not sure how much effort this is. Though with all the AI rage out there, avatars will be on the rise, if they aren't already.
@dimension3plus 1 month ago
I can't open the 5.5 (preview) project after updating to the official 5.5. :(
@ke_sahn 2 months ago
Is it possible to combine this with a performance capture?
@PixelProf 2 months ago
Yup. Body motion can be applied independently and other facial performance can be layered on with Unreal’s Sequencer.
@andrewstrapp 1 month ago
@PixelProf Would LOVE a tutorial on this if you ever have a chance. Thank you! You're the best.
@terezaancheva1 1 month ago
I've installed 5.5 and the MetaHuman plugin is still missing from the content browser... Do you have any advice?
@PixelProf 1 month ago
The MetaHuman plugin is installed from the "Fab Library" section of your Epic Games Launcher (on the Unreal Engine > Library page). Be sure that you have added it to your Fab Library (it should be free or already added), search for "MetaHuman" in the Launcher, and it should appear as one of the options with an "Install to Engine" button. Click that button and you can select a compatible, installed engine to add it to. Hope this helps.
@haydenbushfield6351 1 month ago
For some reason, when I try this the animation does not play in the Sequencer. Could this just be a bug for me?
@PixelProf 1 month ago
Not sure... I didn't do any "behind the scenes" tricks to get this to work, just what you see in this video. (I think the only edits were to skip waiting times.)
@jonaltschuler2024 2 months ago
Hoping to try this for real-time streamed audio, but it looks like it's not there yet. If anyone has suggestions for that, please let me know 🙏
@PixelProf 2 months ago
Yup. Fingers crossed that's a near-future thing, but this function (for now) is recorded audio/post-process only.
@SpinxSage 2 months ago
When I export it and bring it into my level sequence, my MetaHuman does not move. But when checking the animation sequence, the head does move.
@manojkennedy26 2 months ago
The data type selection was not visible; what would be the solution?
@PixelProf 2 months ago
It should be there if you installed the 5.5 version of the plugin into Unreal 5.5. It's not available in earlier versions.
@olavandreasterrasource8034 2 months ago
Now how can I use the eyes? I can't figure it out.
@PixelProf 2 months ago
I'll work on a follow-up video that shows how to apply adjustments and otherwise animate the face and eyes in conjunction with the viseme results from this tool, but the basic idea is to use Sequencer to bake this result onto a face control board and layer on additional animation inputs.
@olavandreasterrasource8034 2 months ago
I am looking forward to seeing your next videos, thanks so much!
@GustavoTommaso-xk8vk 19 days ago
In the Sequencer, click the + on the Face layer, then Control Rig > select the [Layered] option!
@massinissa8697 2 months ago
It is already in the NVIDIA Omniverse app, with some facial expressions!
@archcast5550 6 days ago
This will make the MetaHuman SDK obsolete.
@virtualworldsbyloff 2 months ago
Animate = Lipsync = CLICKBAIT