Industry-Leading Face Capture for your Stylized 3D Characters - FREE & Easy with MetaHuman Animator

6,727 views

tiedtke.

1 day ago

Comments: 48
@tiedtkeio 2 months ago
As always, let me know if you have any questions about the video. 🙂
@Miditator 5 days ago
Hello, thank you for the tutorial, but I have one question - I already have blendshapes on my stylized character and imported them into CC4. So I have my character in CC4 with my blendshapes, but CC4 won't create a face rig for him :(. Help me please!
@axiomvicarious 2 months ago
Bro, this gives me both faith and confidence.
@tiedtkeio 1 month ago
Amazing! Hope you create something cool!!
@ompansare4516 2 months ago
bro no shit, I started learning UE5 recently and your videos were a big help ;p-;
@tiedtkeio 2 months ago
That's great! Thanks for being here and happy to help! 🥰
@ompansare4516 2 months ago
woooow just what I needed wtf... thank you man ❤❤❤❤
@HussinKhan 2 months ago
Nicely explained, thank you!
@tiedtkeio 2 months ago
Great to hear! Thanks for watching! 🤗
@itestthings5337 1 month ago
Great stuff! I've been looking for precisely this for a while. I have a question for you: in your experience, is MH Animator worth all the extra steps compared to, for example, using Live Face/AccuFace straight in iClone? (My experience with AccuFace is bad, but Live Face in iClone has decent results.) It seems like MH Animator really shines only with MetaHumans because of the topology and rig, but using it with a CC character seems a bit underwhelming; you don't get those amazing deformations. I know from a previous video that you are working with AccuFace on a film, so you are uniquely positioned to share good insight. Amazing channel, please continue to share!
@tiedtkeio 1 month ago
Thanks for your comprehensive comment! Yes and no is the short answer, haha. AccuFace is great and really the only alternative when you have a pre-recorded video of an actor - maybe they're not in the same part of the world as you, etc. MetaHuman Animator really takes facial capture to the next level because it uses the LiDAR depth sensor, achieving much higher fidelity. The CC control rig is at an early stage (a version 1 right now); it will be updated in the future to achieve a much closer resemblance to the MetaHuman performances. Both the AccuFace and MetaHuman workflows will continue to evolve with new features, so what it really comes down to is where your end project will end up. If it's Unreal Engine, then MHA is worth the hassle. If it's Blender, iClone, Unity, Maya, etc., then AccuFace is what I recommend.
@poet8236 1 month ago
Thanks a lot, this is truly helpful, as getting lost in these processes is so damn easy. :) I got my facial animation onto my sequence now and it works. The big "but" is that the quality is still quite mediocre. On the MetaHuman performance the matching is actually very astonishing, but in the process of getting it onto the character (I bought Amber from La Famila too, to fully match the tutorial :) ) most detail is lost. I suppose this all has to do with the blendshapes and the CC control rig being an early version. Still quite powerful already!
The only thing I am struggling with is the CC_Rig_BP. It gets removed whenever I restart Unreal Engine, even though I save the level and "Save All", which means my "LS_Amber" always has to be set up again. The Outliner shows the error "The world contains invalid actor files" and the log says "LogWorldPartition: Warning: Unknown actor base class `/Game/_Characters/Amber_001/Rigs/CC_Rig_BP.CC_Rig_BP_C`: Actor: 'CC_Rig_BP_C_UAID_C87F5400CDAF7A2802_1089411207' (guid 'B5DD4D1F49FB59339413B49B6978E51B') from package '/Game/__ExternalActors__/FirstPerson/Maps/FirstPersonMap/A/FS/3VHWPU13SMVILXSI0DHHN5'". I have done nothing but export from CC as you did, import into UE5, and drag the CC_Rig_BP into the scene. I tried it with 3 fresh projects in 5.4.4, going step by step as you did.
EDIT: Found my mistake - you HAVE to rename "CC_Rig_BP" to something else when creating the control rig; it's not a nice-to-have but a must. Anyway, what I really wanted to say: THANK YOU! :)
@tiedtkeio 24 days ago
Thank you so much for watching, and for your comprehensive comment! Yes, I should make a follow-up video to this one to show the difference you can get by adjusting the blendshapes. My quick tip: the morphs for the CC4 character don't quite match the MetaHuman morphs. I'd recommend going through all your blendshapes in CC4 and matching them more closely to the MH equivalent shapes. I also know Reallusion is working on better support for this specific workflow; I've talked with them. 🙂
@cukiris_ 22 days ago
One more time, thanks so much for the great tips
@tiedtkeio 15 days ago
Happy to help! Thanks for watching! :)
@albertusbodenstein1976 1 month ago
Thanks so much... never knew you could do this. I'd like to add a mocap workflow to this - do you know how to do that?
@tiedtkeio 1 month ago
Sure, I have another video on my channel which goes over how to add animation to your character - that could be a start. Then I would say it depends on what type of mocap solution you have: whether you stream it to Unreal or import FBX animation sequences, etc. But overall it's just a matter of combining two sequences in UE5.
@archichampin 2 months ago
Really nice video mate, thanks. I started watching it just out of curiosity, but the background music pulled me in and I kept watching til the end. I think it's a really nice workaround, but I can't really understand why someone would do all of this, plus fix all the issues that seem to happen, instead of setting the character up properly from the beginning from CC to Unreal so they can use the real-time facial performance. I'm working on a short at the moment and I'm looking for ways to speed up character animation into Unreal without having to buy expensive tools, and I can't really figure out how to benefit from this workflow. Can you tell me what I'm missing here? Cheers pal
@tiedtkeio 2 months ago
Thank you for your kind words and for watching the video! The main takeaway here, which I probably glossed over, is that MHA uses 3D LiDAR depth information with machine learning to achieve a much higher-fidelity facial capture than regular 2D video capture using tracking and AI can (like Live Link Face or iClone's AccuFace tool).
@nu-beings 2 months ago
Ok, very nice, but how would you add an updated animation for her face once you're done with this one? Do you have to start all the way over?
@tiedtkeio 1 month ago
Sorry for the late reply! You simply remove the control rig from Sequencer, change the animation in Sequencer (import the new one into your Unreal project first, as shown in this video), and then add the control rig back. It's difficult to describe in words, but very simple in practice.
@kashifhussain7388 4 days ago
thanks man
@danodesigndanomotion2068 2 months ago
thanks for this video bro
@tiedtkeio 2 months ago
That's fantastic! Thanks for watching it! 😄🙏
@Ariymore 1 month ago
So cool man, can we use a CC character with Live Link?
@sergiopaz3263 1 month ago
Hi @tiedtkeio, does this mean I can apply this to any character I want? For instance, if I download an Orc from the Marketplace without blendshapes, could I still add facial expressions to it? I recently learned about Metapipe, but it seems quite complicated... 😔 I have a small story in mind that features an Orc, and I'm unsure how to proceed. Does this only work with MetaHumans?
@tiedtkeio 1 month ago
Yes, it does. If you download an Orc from the Marketplace (whether that's the Unreal or Reallusion marketplace), you can still use it with this workflow. However, let's assume the worst: the Orc isn't rigged and doesn't have blendshapes. Then you'll first have to rig the Orc using Blender, Maya, Mixamo, or (my personal suggestion) AccuRig and CC4. Then you'll have to create the blendshapes for the face manually. This is a somewhat tedious task, but not hard at all. I have another tutorial on my channel which shows how to create blendshapes in Blender for your CC4 character. In summary, the steps/tutorials you'll need in order to get the Orc to work as shown in this video are: 1. Rig the character (you choose the software; I suggest AccuRig/CC4). 2. Create blendshapes for the character (I also suggest CC4 here).
@sergiopaz3263 1 month ago
@tiedtkeio Thank you for your response! I'm excited to move forward with this and have been learning so much from your content. I truly appreciate your help!
@rakeshmani8787 2 months ago
Can you make a video on how to create cinematic dark clouds in Unreal Engine?
@tiedtkeio 2 months ago
Oh, that's a great idea, will do! 🙂🙏
@michaellemon9183 8 hours ago
Found out the hard way today that Animator now only works with iPhone 12 or newer. Still using an 11, and had no plans to upgrade until now...
@shivangipriya4153 2 months ago
Thank you. What about the body? If we have an animation from Mixamo, how do we mix this in?
@ardagenc4674 2 months ago
For use in an anim blueprint, you can benefit from the Layered Blend Per Bone node: use the neck bone as the transition point and connect the body animation to pose 1 and the facial animation to pose 2. To create a combined animation sequence, I'm guessing Sequencer can work - drag the skeleton into Sequencer, enable its control rig, make it layered, and add both animations - but I'm not really sure whether that works smoothly. I would normally use the facial animation on a layered control rig and hand-animate the body.
@shivangipriya4153 2 months ago
Ok thank you so much
@amigoface 2 months ago
Does it work with an Android phone camera?
@tiedtkeio 2 months ago
Unfortunately not! It uses the iPhone's LiDAR sensor with 3D depth. The alternative to the iPhone is a stereo HMC, i.e. a dual-camera setup on a head-mounted camera, to achieve 3D depth, but that is more advanced than the iPhone.
@leizervieira1166 2 months ago
❤❤❤
@tiedtkeio 2 months ago
Hope it helped! Thanks for watching! 🫶
@t_claassen 2 months ago
Sub +1. Is it somehow possible to send this facial animation back to iClone?
@tiedtkeio 2 months ago
That's great, thanks! 🤗 It is! You can simply right-click the body and face components, select Bake Animation, then right-click the animation in the Content Browser and export it as .fbx. I could do a video on this too later on, if you'd like. 🙏
@t_claassen 2 months ago
@tiedtkeio Thanks for that if you do, and thanks for the swift reply. 😊 For some of my projects, this iClone > UE5 > iClone > Blender (Live Link) pipeline is my go-to and would save me a lot of time, although UE5 is not exactly my cup of tea at the moment, to be honest. And while AccuLips in iClone 8 does get the job done eventually, it's very time-consuming. That's why I'm watching this series. Thanks again. *#sweet* 😉
@t_claassen 2 months ago
@tiedtkeio 🙏
@davidvideostuff 2 months ago
@tiedtkeio Yes!! Please do make a video on the workflow to get the animation back into iClone!!!
@Faneva_Jorah 5 days ago
If anybody has a solution, tell me. I have a fully rigged character from Blender, using the AccuRig-to-CC pipeline, with Faceit for the face. It's already animated and I want to import it into UE5, but it failed with a bunch of errors that I don't even understand (I started learning UE5 recently). I like the control that I have in UE; rendering in Blender takes way too long, and I can't freely preview my work.
@ArthurBaum 2 months ago
That's cool and all, but don't we already have like hundreds of MHA tutorials? What is this about?
@tiedtkeio 2 months ago
This is mostly for those looking to record facial performance capture for a stylized or anime character from CC4 or Blender, rather than with a MetaHuman as the end target (see the end of the video). ☺️🙏