The sweetest update so far with Audio2Face to MetaHuman
@MiguePizar 1 year ago
This is very useful; I was tired of having to match the audio after doing facial mocap. Thank you, as always, for showing the latest of Unreal and related tools.
@richardaudette8084 10 months ago
Just a quick thanks - with this Audio2Face and your Audio2Gesture tutorials, I've been able to build a handful of video clips with NVIDIA's models and Unreal. I've had a blast putting them together. Thanks!
@hocestbellumchannel 1 year ago
Thanks for all your hard work, man! Can you make an extended version of this tutorial showing how to record the animation into a clip for Sequencer use?
@erosschiavon2987 1 year ago
Have you figured out how to do it?
@dethswurl117 1 year ago
If anyone's watching this video today, Audio2Face now supports Unreal Engine 5.3. When you go to copy the folder like in the video, the folder is now called "ACE", so copy that whole folder.
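For anyone who prefers to script the copy step described above, here is a minimal Python sketch. The install root, version number, and plugin subfolder names all change between Audio2Face releases (the "ACE" name and the `ACEUnrealPlugin-5.x` path come from comments in this thread), so treat every path as an assumption to verify against your own machine.

```python
# Sketch: copy the Audio2Face UE plugin folder into an Unreal project's
# Plugins directory. All paths below are examples -- check your own install.
import shutil
from pathlib import Path

# Assumed install root; replace "you" and the version with your own.
A2F_ROOT = Path(r"C:\Users\you\AppData\Local\ov\pkg\audio2face-2023.2.0")

# In recent builds the folder to copy is named "ACE" inside the ue-plugins tree.
PLUGIN_SRC = (A2F_ROOT / "ue-plugins" / "audio2face-ue-plugins"
              / "ACEUnrealPlugin-5.3" / "ACE")

def plugin_dest(project_dir: str) -> Path:
    """Target location inside the Unreal project's Plugins folder."""
    return Path(project_dir) / "Plugins" / PLUGIN_SRC.name

def install_plugin(project_dir: str) -> None:
    """Copy the plugin folder wholesale, as the video does by hand."""
    dest = plugin_dest(project_dir)
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copytree(PLUGIN_SRC, dest, dirs_exist_ok=True)
```

After copying, restart the Unreal editor so it picks up the new plugin.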
@shinjikei4649 1 year ago
I was waiting for this!!!!!!!!!!!!!!!!!!!!!!!! This is a game changer.
@tricks3594 1 year ago
BROOOOOOOOOOOO this is insane!! thank you for sharing!
@ryansalazaracosta2596 1 year ago
Yeah! Time to go back and visit A2F.
@rsunghun 1 year ago
I was planning to buy an iPhone 12 mini for facial mocap, but I may have to try this first XD
@Jsfilmz 1 year ago
you're like me, i like free
@gauravjain4249 1 year ago
Amazing, thanks a lot NVIDIA, and Jsfilmz explains it very well.
@노딩맨 10 months ago
For version 2023.2.0, the folder to copy should be "C:\Users\ooo\AppData\Local\ov\pkg\audio2face-2023.2.0\ue-plugins\audio2face-ue-plugins\ACEUnrealPlugin-5.2"
@zebius4157 1 year ago
I agree they should retarget directly to the MetaHuman rig, like MetaHuman Animator does (which also has 4D data), instead of through an ARKit-based proxy. As is, it's okay, I guess, but seeing the raw results from directly wrapping the mesh impresses me a lot more than ARKit, tbh.
@fractalsynapse 8 months ago
This is a serious unlock for indie production and cinematics. Need a video/workflow on the Omniverse side for live cam audio / track segmentation. They have some work to do on their track UI. Awesome video, TY!
@motionislive5621 1 year ago
How do you combine this with Audio2Gesture, please?
@supermattis108 1 year ago
Awesome! I installed it right away!
@davidwillno 1 year ago
It worked! So far this has been the ONLY face mocap solution for Android/PC (non-iPhone) users. Kudos! Is there a way of copying this animation (in UE 5.2) so I can apply it in UE 5.3 (where my project is)?
@victorflaviodeandradearauj7049 1 year ago
This is very, very awesome!
@colehiggins111 1 year ago
So amazing. What does recording look like? Do you basically just pull up the track editor and hit record? Does it create a separate timeline for the facial animation? Would love to see that process.
@christiandebney1989 1 year ago
Wow, I have been wondering if this was possible. I have some great audio by actors that I've been trying to work out how to get into my MetaHuman... thanks!
@unitednorthpole 1 year ago
Thanks for your tutorials!
@syedhannaan2974 5 months ago
I really need help integrating body animations with this, so that the character stays idle whenever the lips aren't moving and plays a talking animation whenever they are: an idle animation when the lips don't move, a talking animation when they do.
@dmingod999 1 year ago
this is 🔥
@saemranian 1 year ago
Perfect, thanks for sharing.
@DLVRYDRYVR 1 year ago
Thanks Professor
@foshizzlfizzl 1 year ago
Is there a way to exaggerate the facial movement? Because it always looks like somebody is holding the metahuman's mouth shut, which unfortunately makes it kind of boring to look at. But the simplicity of using this tool is ridiculously well done.
@Jsfilmz 1 year ago
oh yea you can change the sliders
@tingxu9398 1 year ago
I'm doing the exact same steps, but the metahuman's face doesn't change at all when I play the audio in Audio2Face... Does anyone know how to solve this?
@Atenkai 1 year ago
Thank you! Insane.
@PatrickTheDM 1 year ago
I JUST CAN'T KEEP UP! And I kinda like it.
@Jsfilmz 1 year ago
lol how i feel
@ai_and_chill 1 year ago
yeah, but how do you trigger the different animations without manually changing things in Omniverse? I feel like the original method of exporting to FBX once you generated your Omniverse animation was more suitable for UE5 sequences that aren't live. Even in the live case, how would you quickly change the source audio file? Is there a way to change the source audio file in Omniverse from UE5? Is there a way to record in UE5 while the animation from Omniverse Live is playing?
@Jsfilmz 1 year ago
Take Recorder
@ai_and_chill 1 year ago
@Jsfilmz lol, love the rabbit-trail response. I'll check it out!
@Stephen_Falken 1 year ago
Yes, this is the second time I've heard about Take Recorder. Can you tell us more? Do I record each take and then import it into the Sequencer for the final render? Any chance for a few more details, some step-by-step guide?
@TransformXRED 1 year ago
How would you do this: get the mouth expressions and movements "baked" into the MetaHuman after being played from Audio2Face, then animate the head afterwards while keeping the face animated? The previous method of exporting the animation data made more sense to me (since I'm just learning about it now), because once imported into the MetaHuman, the animation is attached to it. I'm trying to wrap my head around every step to go from a custom face (with textures from photos) to its animation, then export it for use in Blackmagic Fusion (or render it from UE5 as a video).
@ActionCopilot 1 year ago
Same here 🙋♂ I also prefer the method with the export of animated data, and I think it is still in this new version, but it is not optimized: the mouth does not close, and the eyes do not blink with the same intensity. Everything looks great in Audio2Face, but when you export the animated data (JSON or USD cache) and import it into a DCC like Blender or Unreal Engine, this quality is lost and it will not look like it does in Audio2Face 😢 NVIDIA has acknowledged that this problem has existed since August 2022, but there is still no solution. Have you experienced this same problem?
@petaravramovic7998 1 year ago
Thanks for this brilliant tutorial.
@LucidAnimationTV 11 months ago
"Double click this and copy this folder right here"... I'm struggling, but great tutorial.
@DJDaymos 1 year ago
I'm not sure if I'm brave enough to try this. I tried the last one and spent many hours on your tutorial (which was great) last year and failed, and I'm a pretty advanced animator, but Unreal I just find so hard to use. I produce better work transferring a scene with characters from iClone to Omniverse and rendering, with no major issues. Unreal is a mess: you try to import a scene from iClone and the characters lose their animation and revert to T-pose.
@Jsfilmz 1 year ago
LOL
@jbach 10 months ago
Thanks for sharing! If you are starting with text, what is the recommended workflow for converting the text to an audio waveform for use in Audio2Face?
@AIImpactPartners 11 months ago
Jay, have you taken the Audio2Face headless API, exported an animation, and then imported it via script or Blueprint into Unreal (to be used in a sequence)? Not in the editor UI, strictly programmatically?
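The video doesn't cover this, but one plausible route is to export the animation from A2F headless to FBX and then import it with the Unreal editor's Python API rather than by hand. `unreal.AssetImportTask` and `import_asset_tasks` are real editor APIs; the destination path and helper names below are made-up examples, and this is only a sketch of the idea.

```python
# Sketch: automated import of an exported animation FBX into an Unreal
# project. Must run inside the Unreal editor's Python environment (or via
# the editor's commandlet/script execution); the guarded import only lets
# the pure helpers be exercised outside the editor.
try:
    import unreal
except ImportError:
    unreal = None  # not running inside the Unreal editor

def build_task_settings(fbx_path: str, dest: str = "/Game/FaceAnims") -> dict:
    """Editor properties for a fully automated (dialog-free) import task."""
    return {
        "filename": fbx_path,
        "destination_path": dest,   # content-browser folder (example value)
        "automated": True,          # suppress import option dialogs
        "save": True,
        "replace_existing": True,
    }

def import_face_anim(fbx_path: str) -> None:
    """Create and run the import task; requires the editor (unreal module)."""
    task = unreal.AssetImportTask()
    for prop, value in build_task_settings(fbx_path).items():
        task.set_editor_property(prop, value)
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```

From there the imported animation asset can be bound to a track in a Level Sequence, also scriptable through the editor's sequencer APIs.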
@babasahebpinjar6290 3 months ago
I could achieve this in the viewport, but when I play the game the metahuman isn't speaking. Is there any setting or link I need to enable?
@serh007 1 year ago
I want to reproduce what's in the video, but I'm a complete beginner. I still need to understand the left part and the NVIDIA side, and everything is difficult with Unreal Engine 5. Please make an instruction on how to create a project from scratch up to this point.
@Nashfanfl13 1 year ago
Do you have a Live Link with an iPhone video tutorial? I'm trying to look through your catalog; lots of videos.
@gcharb2d 1 year ago
Great video, thanks. Which would you say is better, MetaHuman Animator or Audio2Face? 🤔
@ATomCzech 1 year ago
They should do the same for voice2gesture. I don't even understand why it's a completely separate app; why isn't it possible to apply both features in one place? :-(
@24pfilms 1 year ago
Would love to see this with Character Creator, then into UE5.
@christiandebney1989 1 year ago
I have it all working, but I can't see any WAV files when I try to load my own.
@ellismitchell6965 11 months ago
Could you send me a link to download the MetaHuman software you are using in your video? I'd greatly appreciate it, thanks.
@volpe768 10 months ago
Amazing tutorial!! I have a question: does anyone know if there is a way to use an audio source that I receive in real time in Unreal Engine via an API request?
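Audio2Face can also run headless and be driven over a local REST API, which is one way to approach this: write the audio received in UE to disk, then point A2F at it from any process. The port, player prim path, and endpoint names below are assumptions based on the stock headless scene, so verify them against the API docs bundled with your install; this is a sketch, not a definitive integration.

```python
# Sketch: telling a running Audio2Face headless instance to load a new WAV
# over its local REST API. Port 8011 and the /World/audio2face/Player prim
# path are assumptions from the default headless setup -- adjust to yours.
import json
from urllib import request

A2F_URL = "http://localhost:8011"
PLAYER = "/World/audio2face/Player"

def player_request(endpoint: str, **fields):
    """Build the URL and JSON body for an A2F Player endpoint call."""
    return f"{A2F_URL}/A2F/Player/{endpoint}", {"a2f_player": PLAYER, **fields}

def send(endpoint: str, **fields) -> None:
    """POST the request to the running A2F instance."""
    url, body = player_request(endpoint, **fields)
    req = request.Request(url, data=json.dumps(body).encode("utf-8"),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=10) as resp:
        resp.read()

# Example usage (hypothetical file names):
#   send("SetRootPath", dir_path="C:/audio")
#   send("SetTrack", file_name="reply_from_ue.wav")
```

With Live Link streaming back to Unreal, swapping the track this way changes what the MetaHuman speaks without touching the Omniverse UI.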
@StudioCosi-Brasil 1 year ago
Good, how do you record this?
@Jsfilmz 1 year ago
Take Recorder
@StudioCosi-Brasil 1 year ago
@Jsfilmz an idea for a new video! 😎
@clau77xp94 1 year ago
Thanks for sharing this :)
@Jsfilmz 1 year ago
ofc kakarot i got u
@CalvinWaynes 11 months ago
I only get this to work in PIE/viewport mode; how do you get it to work in standalone?
@zenparvez 1 year ago
It's a start, but the visemes really need some work. I hope they make it more expressive in the next update.
@Jsfilmz 1 year ago
Most definitely, but if you look at video games nowadays and how much they spend on facial animation, this right here comes pretty close, even to animated shows or movies. Give it a couple more papers and yeah, it's gonna be nuts, I'm sure.
@zenparvez 1 year ago
@Jsfilmz Agree 100%, and eagerly waiting for a more updated version.
@armaskdaufuzz2286 4 months ago
How do you use this with a custom, non-MetaHuman character?
@47humza 5 months ago
My Live Link plugin is not showing in Unreal Engine; what should I do?
@AIElectrosound 1 year ago
Is it free for commercial use?
@sahinerdem5496 1 year ago
Thanks for the tutorial, helpful. Plus, I discovered there is an auto-emotion feature. My PC and internet are good, yet there is a lag issue, highly noticeable ping lag. If there is a way to run this on 2 PCs, how would it work? I have 4 PCs.
@ffabiang 1 year ago
Hi JS, great video. In your experience, how does this compare to another lip-sync creation tool like the MetaHuman SDK? I'm planning to use it for a project that displays MetaHumans in real time.
@benblaumentalism6245 1 year ago
Do you happen to know if this can be driven by voice recordings in Sound Cues and/or MetaSound Sources? That would be amazing.
@KiteO2 6 months ago
Hi man, thanks for the video. Gotta ask: can you still build the project to an .exe and have Audio2Face working?
@24vencedores11 1 year ago
Hello Jsfilmz, please, I need to know if it's possible to use a Character Creator or DAZ model instead of MH.
@jeffreyhao1343 7 months ago
Audio2Face on iPhone, is it possible? Thanks.
@mysticpearl16 10 months ago
Will this work with AMD? 😭😭😭
@24vencedores11 1 year ago
Nice tut! But you guys love MetaHuman too much. I think you should mention custom characters as well. I know MH is the best, but I don't love it; it's too limited for my workflow. The question is: can this Audio2Face be used with iClone or DAZ characters?
@abhishekdubey3488 1 year ago
Hi, I did the same on UE 5.1 but got no result: it's connecting, but the animation isn't coming through. Can you help me?
@mcdk4716 1 year ago
Hey, what's up man. I got UE5 about 5 days ago and have been following tutorials on the Niagara system. For 5 days I've been watching only one tutorial and have been asking the creator and the comments for help with an issue, but nobody helped me out. I'm starting Niagara from scratch, and when I get to the part where I add color under Particle Update, everything turns blurry. Wtf, plz help me. I'm commenting only because of how responsive you were in the past in helping me out.
@sjebastinraja9090 1 year ago
Can we combine Live Link with full-body animations for a MetaHuman?
@InsigniaID 4 months ago
Have you found the answer? Can it be combined with Audio2Gesture?
@sjebastinraja9090 4 months ago
@InsigniaID No
@dazai4688 1 year ago
Why doesn't my metahuman BP show the Live Link session?
@babasahebpinjar6290 3 months ago
How do you package the game with Live Link enabled?
@ramzibelhadj5212 1 year ago
The biggest problem for me in every facial animation is that the eye contact still isn't very realistic. I hope they can find a solution for that.
@jeanctoledo 1 year ago
Is there a way to do it in real time? I'm thinking it would be cool to use the output of a TTS with it.
@rana3dstudio149 1 year ago
It is. In Omniverse you can connect to a TTS, which they use in their tutorial.
@sinaasadiyan 9 months ago
@rana3dstudio149 Hi, which tutorial? Is there any option to receive (stream) audio inside Omniverse from a TTS implemented in UE 5.3 and stream the blendshapes back to UE? We have implemented STT + ChatGPT + TTS inside UE 5.3 and want to add Audio2Face.
@rana3dstudio149 9 months ago
@sinaasadiyan Perhaps by now you've found it on their YouTube channel.
@InsigniaID 4 months ago
@sinaasadiyan Can it be combined with Audio2Gesture?
@mucod2605 1 year ago
Bro, I have seen characters in Unreal Engine with sweat and tears on the face, but when it comes to MetaHumans, is there any way to do it that can look really realistic?
@MarcusGrip-o1m 1 year ago
If Unreal doesn't find Audio2Face in Live Link, how do I export it instead? Or why might Unreal not find it, even though I activated it like in your video? Thanks!
@노딩맨 10 months ago
I tried version 2023.2.0 and had the same problem. I fixed it: the folder to copy should be C:\Users\ooo\AppData\Local\ov\pkg\audio2face-2023.2.0\ue-plugins\audio2face-ue-plugins\ACEUnrealPlugin-5.2
@artrush3603 4 months ago
What about the eyes?
@marcoshms 1 year ago
What about tongue and eye animation? This way we only have face-skin animation.
@Jsfilmz 1 year ago
it has tongue also i believe
@RongmeiEntertainment 1 year ago
Realtime audio?
@wildmillardz8934 1 year ago
Thanks for the tutorial. It worked for the most part, but I got the error "video memory has been exhausted" :( Is an RTX 3070 too low?
@kingsleyadu9289 1 year ago
Nice
@DanielPartzsch 1 year ago
Thanks. Do you know a way to export the baked-down ARKit blendshape animation as an FBX file (the mesh including the ARKit animation)? I'd like to retarget it to my character in Blender, but I couldn't find a way to do this yet (only the static head mesh). Thanks again.
@Jsfilmz 1 year ago
not Blender, I did a tut for Maya
@DanielPartzsch 1 year ago
I mean, I would like to know if and how it's possible to export the animated, ARKit-solved head mesh with all the animated blendshape data from Audio2Face. Or did you cover this in the Maya tutorial?
@ielohim2423 1 year ago
If only Omniverse had as easy a way to get this into iClone to be touched up; that workflow is still tedious. This is definitely the best workflow for audio-to-face animation. I'm surprised you haven't done one on the MetaHuman SDK plugin. It also generates facial animation from audio; it's comparable to AccuLips.
@Jsfilmz 1 year ago
I'd take it to Maya tbh, if you have it, then just export back
@IRONFRIDGE 1 year ago
@Jsfilmz Will Nanite really work now with 5.3 in VR?
@Jsfilmz 1 year ago
it's been working since 5.1
@IRONFRIDGE 1 year ago
@Jsfilmz With forward rendering? I only got the fallback mesh. But I'll give it another shot in the 5.3 beta.
@Jsfilmz 1 year ago
@IRONFRIDGE r.raytracing.nanite 1
@IRONFRIDGE 1 year ago
@Jsfilmz Yes, the thing is, I'm making a game and I want older GPUs to be able to run it fine too, for PCVR. So the question: if Nanite is active, does it create a higher base performance impact? Because I'm trying to squeeze out as much performance as possible for a large terrain.
@blerdstatic8187 1 year ago
Okay, this is my ticket in, because I can't afford an iPhone right now. Does it pronounce names fairly well?
@Aragao95 1 year ago
It worked, but I needed to turn real time off in the Audio2Face app because it destroyed the FPS, and I still needed to set Unreal to low to make it usable hahaha. I have an RTX 3070 and 64 GB RAM; might be the 8 GB VRAM...
@InsigniaID 4 months ago
Can it be combined with Audio2Gesture?
@fhmconsulting4982 1 year ago
This may be relevant to a lot of actors at the moment. If you scan yourself and get it rigged, you should be able to claim copyright, as it is a form of CAD. Instead of worrying about others using your image and likeness, you could then have a digital 'fingerprint' that you use for proving ownership of your body, face, and (I suspect) voice. And it doesn't stop with actors. Imagine having a LeBron or Messi NPC!
@ahlokecafe_articulate 1 year ago
Your thumbnail speech doesn't sync with your voice... a bit distracting, but overall good.
@vi4dofficial 1 year ago
Is it free, without any limit?
@ostamg1379 1 year ago
Lips aren't moving for me :/
@Jsfilmz 1 year ago
don't forget to click Activate
@ostamg1379 1 year ago
@Jsfilmz I did :/ It's like there are no lip-sync features
@ostamg1379 1 year ago
@Jsfilmz Could it somehow be because Audio2Face only displays a black screen and not the selected face?
@GermanWorld-c5q 1 year ago
Well... I can't see anything...
@HologramsLab 1 year ago
Let her taaaaaaaaaalk please!!!