This is amazing! Thanks so much for sharing that this is now available. After the last "free" audio-to-lip-sync plugin stopped letting us use it for free and started charging obscene prices aimed only at large studios, so many of us indie devs were completely derailed from using the MetaHuman lip sync functionality. Epic is amazing for finally making this part of the engine. No longer will our projects be stopped by a paywall. Hooray! I will be sharing your video :)
@tbowren · 2 months ago
For sure, this was my favorite thing I saw and demoed at Unreal Fest. Thanks for making a video so quickly!
@PixelProf · 2 months ago
Hah! Thanks Tony!
@Monoville · 2 months ago
Great, simple tutorial; so glad they've implemented this. Gave it a couple of tests and the results are very impressive, particularly when combined with the facial animations from the City Sample to replace the eye movements/blinking. I will still use the previous animator method for accurate facial work, but this is a great quick method.
@PixelProf · 2 months ago
Glad it helped!
@cwchris_ · 14 days ago
Question for you: when I do this it kind of works, but the animations seem to overlap. Is there a fix for that, so you can use specific parts from both animations?
@Monoville · 14 days ago
@cwchris_ As long as you only copy and paste the specific parts (e.g. eye movement) from one and delete those specific parts from the other, there shouldn't be any overlap.
@olavandreasterrasource8034 · 2 months ago
Thank you so much, I have been looking for this tutorial for weeks.
@pondeify · 2 months ago
Oh man, this is awesome - thanks for sharing!
@JayceeeFilmVfx · 21 days ago
Thanks for this Nick! :)
@dddharmesh · 2 months ago
Amazing. So quick and easy to follow.... Thanks
@aerospacenews · 2 months ago
Appreciate the effort that went into this video and that Epic is rolling out the capability @PixelProf
@duchmais7120 · 2 months ago
Awesome. I recall trying something like this way back using Blender phonemes/visemes (mouth shapes)... A, O, E, W/R, T/S, L/N, U/Q, M/B/P, F/V, to aid in lip sync, where each phoneme was associated with specific lip and facial movements. I haven't tried it in years. Waiting to try this out once they release Unreal Engine 5.5. The eye movements (blinking) are what is puzzling me.
@nsyed3d · 2 months ago
This is great, thank you.
@ValicsLehel · 2 months ago
If it could be Blueprinted and fed a WAV to play back as a stream, it would be perfect.
@mazreds · 2 months ago
I agree
@xaby996 · 2 months ago
Yep, but it's hard-wired inside the MetaHuman Performance asset. Chances of this working live are low...
@VolkanKucukemre · 1 month ago
Thanks for the tutorial. So, at the moment this can't be used with streamed audio, and we basically need to bake the animation?
@PixelProf · 1 month ago
Yup. Currently you bake animation from a recorded WAV file.
@tmaintv · 2 months ago
Very interesting, thanks. What can you do with the depth input?
@PixelProf · 2 months ago
Depth input is to process data captured with the LiveLink Face iOS app when it’s set to “MetaHuman Animator” mode.
@lordnaps · 2 months ago
Pretty cool, but the lack of any emotion on the rest of the face (since it's only tracking audio and not facial structure) makes me wonder what this could be used for practically.
@PixelProf · 2 months ago
Since Unreal has an extremely capable (and continuously improving) layered animation system, the synchronized viseme animation from this process can be readily combined with other motion capture, hand-keyed, or procedural animation sources. For example, a LiveLink Face performance capture can be quite effective for overall facial expressions but lacks fidelity around the mouth, so something I'm hoping to experiment with soon is using this to process the audio from a LiveLink capture, and then art-directing the layering of results from both performance capture methods. Also, I imagine this is a step in the direction of realtime audio-to-animation, which could facilitate performance based on realtime voice generation systems (just speculation on my part for now).
@lordnaps · 2 months ago
@PixelProf Would love to see that idea of mixing the LiveLink and facial capture, and I definitely see it going in that direction.
@xavierhatten9011 · 2 months ago
@PixelProf OK, I've done this with the LiveLink Face app. You can move the head in real time to make it more human-like, but the eyes need to be able to blink more. You can run the animation and do the head movement at the same time.
@Ronaldograxa · 2 months ago
I wanted to use MetaHumans to create digital avatars, but I see tools like HeyGen coming up and I really wonder if MetaHuman is actually worth the time... AI is moving too quickly.
@ETT-b6q · 2 months ago
Great tutorial. I'm trying to get additional animation on top of that audio-to-animation result. Normally I would bake mocap to the rig and add an additive layer, but with this facial mocap it breaks, or reduces the lip sync to very small movements. How would you go about it?
@nandini8904 · 1 month ago
This is a great video! I have a question: I keep getting an error while trying to install the MetaHuman plugin, Error Code: MD-0011-0. I can't find any help online and was wondering if you know how to fix this. I heard that it's because of the Fab migration.
@RichardRiegel · 2 months ago
Yep... then export it via FBX, import it into 5.4, and smoke it... works perfectly for now :)
@SpinxSage · 2 months ago
I haven't had any luck with this, any advice?
@RichardRiegel · 2 months ago
@SpinxSage It's hard to help if you don't specifically describe what the problem is 😊
@7ribeh · 1 month ago
Would you be able to help me with this process? :)
@lunabeige · 1 month ago
Does it work for languages other than English?
@PixelProf · 1 month ago
I haven't tried other languages, but my understanding is that yes, it generally works in other languages as well.
@Felix-iv2ns · 2 months ago
Runtime?
@PixelProf · 2 months ago
Not yet.
@abdullahalsaadi5991 · 1 month ago
Hi. Great tutorial. Do you know if it is possible to stream audio data into this instead of using a pre-existing audio file?
@PixelProf · 1 month ago
So far it just processes audio recording files.
@hanasprod · 1 month ago
Do you know how to animate the eyes?
@PixelProf · 1 month ago
Yup…. Add the face board control rig in layered mode, then use that to animate the eyes to taste. Will try to make time to record a tutorial video on this.
@GustavoTommaso-xk8vk · 19 days ago
In the Sequencer, click the + on the Face layer, then Control Rig > select the [Layered] option!
@metternich05 · 2 months ago
Could you do a more comprehensive tutorial on MetaHuman lip sync? There are almost none out there.
@PixelProf · 2 months ago
Sure… are you looking for something on editing animation that already came from performance or audio capture (like this video shows), or "starting from scratch" using only keys & curves on the control board for animating?
@metternich05 · 2 months ago
@PixelProf Starting from scratch is more accurate :) I'm somewhat familiar with UE; I do environments but haven't even tried MetaHumans. What I actually have in mind is creating an avatar or virtual character that would be the face of a YouTube channel and do all the talking. I'm thinking of something more realistic than the one above, with more complex facial expressions, head movement, and even gestures. I'm not sure how much effort this takes. Though with all the AI rage out there, avatars will be on the rise, if they aren't already.
@dimension3plus · 1 month ago
I can't open the 5.5 (preview) project after updating to the official 5.5. :(
@ke_sahn · 2 months ago
Is it possible to combine this with a performance capture?
@PixelProf · 2 months ago
Yup. Body motion can be applied independently and other facial performance can be layered on with Unreal’s Sequencer.
@andrewstrapp · 1 month ago
@PixelProf Would LOVE a tutorial on this if you ever have a chance. Thank you! You're the best.
@terezaancheva1 · 1 month ago
I've installed 5.5 and the MetaHuman plugin is still missing from the Content Browser... Do you have any advice?
@PixelProf · 1 month ago
The MetaHuman plugin is installed from the "Fab Library" section of your Epic Launcher (on the Unreal Engine -> Library page). Be sure that you have added it to your Fab Library (it should be free or already added), search for "MetaHuman" in the Launcher, and it should be found as one of the options with an "Install to Engine" button. Click that button and you can select a compatible installed engine to add it to. Hope this helps.
@haydenbushfield6351 · 1 month ago
For some reason, when I try this the animation does not play in the Sequencer. Could this just be a bug on my end?
@PixelProf · 1 month ago
Not sure.... I didn't do any "behind the scenes" tricks to get this to work, just what you see in this video. (I think the only edits were to skip waiting times)
@jonaltschuler2024 · 2 months ago
Hoping to try this for real time streamed audio, but looks like it’s not there yet. If anyone has suggestions for that, please let me know 🙏
@PixelProf · 2 months ago
Yup. Fingers crossed that's a near-future thing, but this function (for now) is recorded audio/post-process only.
@SpinxSage · 2 months ago
When I export it and bring it into my level sequence, my MetaHuman does not move, but when checking the animation sequence, the head does move.
@manojkennedy26 · 2 months ago
The data type selection was not visible; what would be the solution?
@PixelProf · 2 months ago
It should be there if you installed the 5.5 version of the plugin into Unreal 5.5. It's not available in earlier versions.
@olavandreasterrasource8034 · 2 months ago
Now how can I use the eyes? I can't figure it out.
@PixelProf · 2 months ago
I'll work on a follow-up video that shows how to apply adjustments and otherwise animate the face and eyes in conjunction with the viseme results from this tool, but the basic approach is to use Sequencer to bake this result onto a face control board and layer on additional animation inputs.
@olavandreasterrasource8034 · 2 months ago
I am looking forward to seeing your next videos, thanks so much.
@GustavoTommaso-xk8vk · 19 days ago
In the Sequencer, click the + on the Face layer, then Control Rig > select the [Layered] option!
@massinissa8697 · 2 months ago
It is already in the NVIDIA Omniverse app, with some facial expressions!