I miss the updates, but I also believe you are going to make something incredible (even more than you already did). Keep up the good work!
@DragonPOPCh 1 year ago
Omagah, your tracking is just sooo good! But yeah, I wish I could adjust the mouth movement sensitivity 😊
@megaaziib 1 year ago
amazing update, love this software.
@ItsMallyce 1 year ago
That's just awesome, thank you for continuing to improve it!
@shivangipriya4153 1 year ago
Thank you, but could you explain the steps to do mocap the correct way, please?
@hi_im_gush 1 year ago
Any tips for getting lip sync to work with this? My mouth doesn't move that much on the model when I'm talking.
@AnimeThemeGadget 1 year ago
Unfortunately, XR Animator doesn't support lip sync (yet), unless you use it with other VTubing software that supports lip sync via the VMC protocol. I will try to improve the sensitivity of mouth tracking in the next version.
@hi_im_gush 1 year ago
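[Editor's note] For readers curious how blendshape values travel over the VMC protocol mentioned above: VMC is built on OSC-over-UDP, and blendshapes are sent with the `/VMC/Ext/Blend/Val` (name + value) and `/VMC/Ext/Blend/Apply` messages. Below is a minimal, stdlib-only sketch of building such packets; the port 39539 is the commonly used VMC default, and `jawOpen` is just an example ARKit blendshape name — check your receiving software's actual settings.

```python
import socket
import struct

def _osc_pad(b: bytes) -> bytes:
    """NUL-terminate/pad a byte string to a multiple of 4, as OSC requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build a minimal OSC message supporting string and float arguments."""
    msg = _osc_pad(address.encode("ascii"))
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, str):
            tags += "s"
            payload += _osc_pad(a.encode("utf-8"))
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # OSC floats are big-endian
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)}")
    return msg + _osc_pad(tags.encode("ascii")) + payload

def send_blendshape(sock: socket.socket, dest, name: str, value: float) -> None:
    """Send one blendshape value, then tell the receiver to apply it."""
    sock.sendto(osc_message("/VMC/Ext/Blend/Val", name, float(value)), dest)
    sock.sendto(osc_message("/VMC/Ext/Blend/Apply"), dest)

if __name__ == "__main__":
    # Example: open the mouth halfway on a local VMC receiver.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_blendshape(s, ("127.0.0.1", 39539), "jawOpen", 0.5)
```

This only sketches the wire format; real senders like XR Animator also stream bone transforms and timing messages alongside the blendshape values.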
@@AnimeThemeGadget Yes, that would be awesome!!! Right now, what I'm trying to do is use VMC to send XR Animator's tracking to VSeeFace. HOWEVER, I can't use lip sync + XR Animator together, because lip sync in VSeeFace just won't work unless VSeeFace is controlling the blendshapes, which means I can't use XR Animator. If you can somehow increase the sensitivity on the mouth, that would be awesome. Maybe it's just an issue with my model? The mouth just doesn't seem to move well when I'm talking, but it does the expressions in XR Animator great!
@Keizynhas2 1 year ago
Hello, with my current level of knowledge I can't change the first letter of each blendshape. Is this still necessary? Is there a place that teaches how to do this?
@AnimeThemeGadget 1 year ago
The initial casing of ARKit blendshapes shouldn't matter anymore. Whether they start with an upper- or lower-case letter, XR Animator should be able to read them.
@Keizynhas2 1 year ago
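[Editor's note] The case-insensitive matching described in this reply can be illustrated with a tiny sketch: normalize every blendshape name once, then look names up by their lower-cased form. The function names and the example clip names here are hypothetical, not XR Animator's actual internals.

```python
def build_blendshape_index(clip_names):
    """Map lower-cased blendshape names to their original spelling,
    so 'JawOpen' and 'jawOpen' resolve to the same clip."""
    return {name.lower(): name for name in clip_names}

def resolve(index, query):
    """Look up a blendshape regardless of the casing of its first letter."""
    return index.get(query.lower())

# Example: a model mixing upper- and lower-case initial letters.
names = ["JawOpen", "MouthSmileLeft", "eyeBlinkRight"]
index = build_blendshape_index(names)
print(resolve(index, "jawOpen"))        # → JawOpen
print(resolve(index, "EyeBlinkRight"))  # → eyeBlinkRight
```

Full lower-casing like this would also conflate names differing beyond the first letter; a stricter variant could normalize only the initial character.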
@@AnimeThemeGadget Thanks
@asdanimatezstuff848 1 year ago
Nice work! BTW, can the motion be transferred to other engines, for example Unity or Unreal, in real time?