This looks really promising. Would love to try this with a more in-depth ROM for training.
@poochyboi 1 year ago
Gonna need an in-depth tutorial on this, my guy. Especially for muscle deformations.
@sahinerdem5496 1 year ago
Thank you for showing it working. Do we need to run the process for each individual animation, or just for rotation angles, like Maya's interpolation? If you know.
@skimskim 1 year ago
Thanks for the video, but there's one thing I really don't understand. As far as I know, the training data generator in Maya creates a completely random joint pose on each frame, but in your video you set a keyframe animation raising the leg and then generated the poses, and it looks like nothing was generated; it looks the same as before. Is that the correct way? Why and how did the generator do nothing, or generate the same pose as the leg key animation?
@sintrano 1 year ago
It works similarly to other real-time ML deformers at the moment. You feed the trainer one file that is your skinned mesh going through the poses, then you feed it a target to train against (in most cases a simulated mesh) that follows the same poses. The trainer then compares vertex positions between the two on each pose and "trains" the geo to look more like your simmed mesh on those kinds of poses (based on your skeleton's position in each pose). It's essentially a corrective blendshape that, instead of being triggered by shapes you sculpt or keys you set up, gets triggered based on the pose.
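To illustrate the idea in that explanation (this is a minimal sketch of the concept, not Maya's ML Deformer API; all names, shapes, and the ridge-regression model are assumptions made for the example): learn per-vertex deltas between the simulated target and the skinned mesh as a function of the joint pose, then apply the predicted delta at runtime like an automatically triggered corrective blendshape.

```python
# Minimal conceptual sketch (hypothetical data, not Maya's ML Deformer API):
# learn per-vertex deltas (simulated target minus skinned mesh) from joint
# rotations, so new poses can trigger a corrective-blendshape-like offset.
import numpy as np

num_poses, num_joints, num_verts = 200, 30, 5000

# Inputs per pose: joint rotations flattened (e.g. Euler angles per joint).
joint_features = np.random.rand(num_poses, num_joints * 3)

# Targets per pose: vertex deltas = simulated positions - skinned positions.
skinned = np.random.rand(num_poses, num_verts * 3)
simulated = skinned + 0.01 * np.random.rand(num_poses, num_verts * 3)
deltas = simulated - skinned

# Ridge regression: weights mapping pose features -> per-vertex deltas.
X = np.hstack([joint_features, np.ones((num_poses, 1))])  # add bias column
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ deltas)

# Runtime: for a new pose, predict the delta and add it on top of the
# linear-skinned mesh, like a pose-driven corrective blendshape.
new_pose = np.hstack([np.random.rand(num_joints * 3), 1.0])
corrective_delta = new_pose @ W  # shape (num_verts * 3,)
```

In practice the real tool would use actual skinned and simulated vertex data captured over the same ROM poses, and likely a nonlinear model rather than this simple linear fit, but the training-against-a-target idea is the same.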