As always, let me know if you have any questions about the video. 🙂
@Miditator 5 days ago
Hello, thank you for the tutorial, but I have one question: I already had blendshapes on my stylized character and imported them into CC4. So now my character is in CC4 with my blendshapes, but CC4 won't create a face rig for him. :( Help me please!
@axiomvicarious 2 months ago
Bro, this gives me both faith and confidence.
@tiedtkeio 1 month ago
Amazing! Hope you create something cool!!
@ompansare4516 2 months ago
bro no shit, started learning UE5 recently and ur videos were a big help ;p-;
@tiedtkeio 2 months ago
That's great! Thanks for being here and happy to help! 🥰
@ompansare4516 2 months ago
woooow just what i needed wtf... thank you man ❤❤❤❤
@HussinKhan 2 months ago
Nicely explained, thank you!
@tiedtkeio 2 months ago
Great to hear! Thanks for watching! 🤗
@itestthings5337 1 month ago
Great stuff! I've been looking for precisely this for a while. I have a question for you: in your experience, is MH Animator worth all the extra steps compared to, for example, using LIVE FACE/AccuFace straight in iClone? (My experience with AccuFace is bad, but LIVE FACE in iClone gives decent results.) It seems like MH Animator really shines only with Metahumans because of the topology and rig, while using it with a CC character seems a bit underwhelming; you don't get those amazing deformations. I know from a previous video that you're working with AccuFace on a film, so you're uniquely positioned to share good insight. Amazing channel, please continue to share!
@tiedtkeio 1 month ago
Thanks for your comprehensive comment! "Yes and no" is the short answer, haha. AccuFace is great and really the only alternative when you have a pre-recorded video of an actor; maybe they're not in the same part of the world as you, etc. Metahuman Animator really takes facial capture to the next level because it uses the iPhone's LiDAR depth sensor, achieving much higher fidelity. The CC control rig is in its early stages, a version 1 right now; it will be updated in the future to get much closer to the Metahuman performances. Both the AccuFace and Metahuman workflows will continue to evolve with new features, so what it really comes down to is where your end project will end up. If it's Unreal Engine, then MHA is worth the hassle. If it's Blender, iClone, Unity, Maya, etc., then AccuFace is what I recommend.
@poet8236 1 month ago
Thanks a lot, this is truly helpful, as getting lost in these processes is so easy. :) I've got my facial animation on my sequence now and it works. The big "but" is that the quality is still quite mediocre: on the Metahuman Performance the matching is actually astonishing, but in the process of getting it onto the character (I bought Amber from La Famila too, to fully match the tutorial :) ) most detail is lost. I suppose this all has to do with the blendshapes and the CC control rig being an early version. Still quite powerful already!

The only thing I'm struggling with is the CC_Rig_BP. It gets removed whenever I restart Unreal Engine, even though I save the level and "Save All", which means my "LS_Amber" always has to be set up again. The Outliner shows the error "The world contains invalid actor files" and the log says "LogWorldPartition: Warning: Unknown actor base class `/Game/_Characters/Amber_001/Rigs/CC_Rig_BP.CC_Rig_BP_C`: Actor: 'CC_Rig_BP_C_UAID_C87F5400CDAF7A2802_1089411207' (guid 'B5DD4D1F49FB59339413B49B6978E51B') from package '/Game/__ExternalActors__/FirstPerson/Maps/FirstPersonMap/A/FS/3VHWPU13SMVILXSI0DHHN5'". I did nothing but export from CC as you did, import into UE5, and drag the CC_Rig_BP into the scene. I've tried it with 3 fresh projects in 5.4.4, going step by step as you did.

EDIT: Found my mistake. You HAVE to rename "CC_Rig_BP" to something else upon creating the control rig; it's not a nice-to-have but a must. Anyway, what I really wanted to say: THANK YOU! :)
@tiedtkeio 24 days ago
Thank you so much for watching, and for your comprehensive comment! Yes, I should make a follow-up video to this one to show the difference you can get by adjusting the blendshapes. My quick tip: the morphs for the CC4 character don't quite match the Metahuman morphs. I'd recommend going through all your blendshapes in CC4 and matching them more closely to the MH equivalent shapes. I also know Reallusion is working on better support for this specific workflow; I've talked with them. 🙂
@cukiris_ 22 days ago
One more time, thanks so much for the great tips
@tiedtkeio 15 days ago
Happy to help! Thanks for watching! :)
@albertusbodenstein1976 1 month ago
Thanks so much! I never knew you could do this. I'd like to add a mocap workflow on top of this; do you know how to do that?
@tiedtkeio 1 month ago
Sure, I have another video on my channel which covers how to add animation to your character; that could be a start. Beyond that, it depends on what type of mocap solution you have, whether you stream it to Unreal or import FBX animation sequences, etc. But overall it's just a matter of combining two sequences in UE5.
@archichampin 2 months ago
Really nice video mate, thanks. I started watching it just out of curiosity, but the background music pulled me in and I kept watching until the end. I think it's a really nice workaround, but I can't really understand why someone would do all of this, plus fix all the issues that seem to come up, instead of setting the character up properly from the beginning from CC to Unreal so they can use real-time facial performance. I'm working on a short at the moment and I'm looking for ways to speed up character animation in Unreal without having to buy expensive tools, and I can't really figure out how to benefit from this workflow. Can you tell me what I'm missing here? Cheers pal
@tiedtkeio 2 months ago
Thank you for your kind words and for watching the video! The main takeaway, which I probably glossed over, is that MHA uses 3D LiDAR depth information with machine learning to achieve much higher-fidelity facial capture than regular 2D video capture with tracking and AI can (like Live Link Face or iClone's AccuFace).
@nu-beings 2 months ago
Ok, very nice, but how would you add an updated animation for her face once you're done with this one? Do you have to start all the way back over?
@tiedtkeio 1 month ago
Sorry for the late reply! You simply remove the control rig from the sequencer, change the animation in the sequencer (import the new one into your Unreal project first, as shown in this video), and then add the control rig back. It's difficult to describe in words, but very simple in practice.
@kashifhussain7388 4 days ago
thanks man
@danodesigndanomotion2068 2 months ago
thanks for this video bro
@tiedtkeio 2 months ago
That's fantastic! Thanks for watching it! 😄🙏
@Ariymore 1 month ago
So cool man, can we use a CC character with Live Link?
@sergiopaz3263 1 month ago
Hi @tiedtkeio, does this mean I can apply this to any character I want? For instance, if I download an Orc from the Marketplace without blendshapes, could I still add facial expressions to it? I recently learned about Metapipe, but it seems quite complicated... 😔 I have a small story in mind that features an Orc, and I'm unsure how to proceed. Does this only work with Metahumans?
@tiedtkeio 1 month ago
Yes it does. If you download an Orc from the Marketplace (whether that's the Unreal or Reallusion marketplace), you can still use it with this workflow. However, let's assume the worst: the Orc isn't rigged and doesn't have blendshapes. In that case, you'll first have to rig the Orc using Blender, Maya, Mixamo, or (my personal suggestion) AccuRig and CC4. Then you'll have to manually create the blendshapes for the face yourself. That's a somewhat tedious task, but not hard at all; I have another tutorial on my channel which shows how to create blendshapes in Blender for your CC4 character (see the script sketch below for the same idea). In summary, the steps/tutorials you'll need in order to get the Orc to work as shown in this video are:
1. Rigging a character (your choice of software; I suggest AccuRig/CC4).
2. Creating blendshapes for the character (I also suggest CC4 here).
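For anyone curious what step 2 looks like in script form, here is a minimal sketch using Blender's Python API to create a blendshape (a "shape key" in Blender terms). It assumes a mesh object is selected; the 'JawOpen' name and the example vertex offsets are purely illustrative, not the actual CC4 morph list:

```python
# Minimal Blender Python sketch: add a blendshape (shape key) to the
# selected mesh. Run inside Blender; the shape name and the example
# deformation below are illustrative only.
import bpy
from mathutils import Vector

obj = bpy.context.active_object   # assumes a mesh object is selected
assert obj is not None and obj.type == 'MESH'

# Blender requires a Basis key before any custom shapes can be added.
if obj.data.shape_keys is None:
    obj.shape_key_add(name='Basis', from_mix=False)

# Add the new shape key, then offset vertices to sculpt the expression.
key = obj.shape_key_add(name='JawOpen', from_mix=False)
for i, v in enumerate(obj.data.vertices):
    if v.co.z < 0.0:              # example: lower part of the head
        key.data[i].co = v.co + Vector((0.0, 0.0, -0.05))
```

In practice you'd sculpt each shape by hand in Edit Mode; the script route mainly helps when batch-creating, renaming, or mirroring many shapes.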
@sergiopaz3263 1 month ago
@@tiedtkeio Thank you for your response! I'm excited to move forward with this and have been learning so much from your content. I truly appreciate your help!
@rakeshmani8787 2 months ago
Can you make a video on how to create cinematic dark clouds in Unreal Engine?
@tiedtkeio 2 months ago
Oh, that's a great idea, will do! 🙂🙏
@michaellemon9183 8 hours ago
Found out the hard way today that Animator now only works with iPhone 12 or newer. Still using an 11, with no plans to upgrade until now…
@shivangipriya4153 2 months ago
Thank you. What about the body? If we have an animation from Mixamo, how do we mix that in?
@ardagenc4674 2 months ago
For using it in an anim blueprint, you can use a Layered Blend Per Bone node: use the neck bone as the transition point, connect the body animation to pose 1 and the facial animation to pose 2 (a conceptual sketch of this follows below). To create a combined animation sequence, I'm guessing the sequencer can work: drag the skeleton into the sequencer, enable its control rig, make it layered, and add both animations. I'm not really sure whether that works smoothly, though; I would normally use the facial animation on a layered control rig and hand-animate the body.
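To make the per-bone blend concrete, here is a plain-Python sketch of what that node does conceptually. This is not Unreal's actual API; the bone hierarchy and the pose format are invented for illustration:

```python
# Plain-Python sketch of the "layered blend per bone" idea (NOT Unreal's
# API): bones in the branch rooted at the neck take the facial pose,
# everything else keeps the body pose.
parents = {'root': None, 'spine': 'root', 'neck': 'spine',
           'head': 'neck', 'jaw': 'head', 'arm_l': 'spine'}

def in_branch(bone, branch_root):
    """True if bone is branch_root or one of its descendants."""
    while bone is not None:
        if bone == branch_root:
            return True
        bone = parents[bone]
    return False

def layered_blend(body_pose, face_pose, branch_root='neck'):
    """Take face_pose for the neck branch, body_pose elsewhere.
    (The real node also blends smoothly over a 'blend depth'.)"""
    return {bone: (face_pose[bone] if in_branch(bone, branch_root)
                   else body_pose[bone])
            for bone in body_pose}

body = {bone: 'body_transform' for bone in parents}
face = {bone: 'face_transform' for bone in parents}
print(layered_blend(body, face))   # neck, head, jaw come from face pose
```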
@shivangipriya4153 2 months ago
Ok thank you so much
@amigoface 2 months ago
Does it work with an Android phone camera?
@tiedtkeio 2 months ago
Unfortunately not! It uses the iPhone's LiDAR sensor for 3D depth. The alternative to the iPhone is a stereo HMC, i.e. a dual-camera setup on a head-mounted camera, which also achieves 3D depth but is more advanced than the iPhone.
@leizervieira1166 2 months ago
❤❤❤
@tiedtkeio 2 months ago
Hope it helped! Thanks for watching! 🫶
@t_claassen 2 months ago
Sub +1. Is it somehow possible to send this facial animation back to iClone?
@tiedtkeio 2 months ago
That's great, thanks! 🤗 It is! You can simply right-click the body and face components, select Bake Animation, then right-click the animation in the Content Browser and export it as .fbx (a rough script version of the export step is below). I could do a video on this too later on if you'd like. 🙏
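The export step can also be scripted with Unreal's editor Python API. This is only a rough sketch, not a recipe from the video: the asset path and output path are hypothetical, and the export_morph_targets option name is my assumption for the setting that keeps the blendshape curves:

```python
# Rough sketch (Unreal editor Python) of exporting a baked animation
# sequence to .fbx for round-tripping into iClone. Paths are
# hypothetical; run from the editor's Python console.
import unreal

anim = unreal.load_asset('/Game/_Characters/Amber_001/Anims/Face_Baked')

options = unreal.FbxExportOption()
options.export_morph_targets = True   # keep facial blendshape curves (assumed option name)

task = unreal.AssetExportTask()
task.object = anim
task.filename = 'C:/Exports/Face_Baked.fbx'
task.exporter = unreal.AnimSequenceExporterFBX()
task.options = options
task.automated = True                 # suppress export dialogs

if unreal.Exporter.run_asset_export_task(task):
    unreal.log('Exported ' + task.filename)
```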
@t_claassen 2 months ago
@@tiedtkeio Thanks for that íf you do, and thanks for the swift reply. 😊 For some of my projects this iClone > UE5 > iClone > Blender (live-link) route is my go-to and it would save me a lót of time, although UE5 is not exactly my cup of tea atm, to be honest. And while AccuLips in iClone 8 dóes get the job done eventually, it's very time-consuming. That's why I'm watching this series. Thanks again. *#sweet* 😉
@t_claassen 2 months ago
@@tiedtkeio 🙏
@davidvideostuff 2 months ago
@@tiedtkeio Yes!! Please do make a video on the workflow for getting the animation back to iClone!!!
@Faneva_Jorah 5 days ago
If anybody has a solution, tell me. I have a fully rigged character from Blender, using the AccuRig-to-CC pipeline for the body and Faceit for the face. It's already animated, and I want to import it into UE5, but it failed with a bunch of errors that I don't even understand (I started learning UE5 recently). I like the control I have in UE; rendering in Blender takes way too long and I can't freely preview my work.
@ArthurBaum 2 months ago
That's cool and all. But don't we already have like hundreds of MHA tutorials? What is this about?
@tiedtkeio 2 months ago
This is mostly for those looking to record facial performance capture for a stylized or anime character from CC4 or Blender, rather than with a Metahuman as the end target (see the end of the video). ☺️🙏