Free Training for a career in 3D Animation: ebook.digitalcreatorschool.com/animatorsjourneyfreetraining
@agentwhiteblack5869 · 2 years ago
Hello, I want to animate a character's face in an animation of mine. Does that work with Live Link as well? That is, can you record the facial expressions from the camera into the Sequencer?
@EagerSleeper · 3 years ago
You jumping on the ball with the new MetaHuman stuff is A-tier. Everybody out here is confused, and there are almost no resources that get to the point like yours. Nice work.
@AnimatorsJourney · 3 years ago
Ha, thanks! I know the feeling of being frustrated by a lack of clear instructions; that's what got me started creating courses. Cheers.
@ChasingLatitudes · 2 years ago
@@AnimatorsJourney You realize no one, and I mean no one, in your audience understands anything you're saying, not the programs, nothing. You need to do a step-by-step tutorial from zero, not start way in the middle.
@mithunkrishna3567 · 2 years ago
sooo true
@virtual_intel · 3 years ago
Wow, you didn't need the sample MetaHuman file, nor did you make any blueprint adjustments. I'm blown away by how simple you made this with that iPhone Live Link app connection. Amazing!
@3dhotshot · 3 years ago
Plot twist: the guy in the bottom right is a MetaHuman!
@pelado9293 · 3 years ago
Metahumanception
@davidkelly4210 · 3 years ago
I just discovered MetaHuman. I gotta try this.
@marcelogonzalez9565 · 3 years ago
I learned Maya thanks to you. It is good to see that you will help us with Unreal too. Amazing!
@AnimatorsJourney · 3 years ago
That makes me happy to hear I've helped :) Thanks for commenting!
@Art_911 · 3 years ago
Great intro. They still have issues with mouth closure, but there are workarounds out there. My question is: I'm not made of money and can't afford a new iPhone or even an iPad Pro, so I'm looking to buy a refurbished or used iPad, and I can't find a list of older models that will work with Live Link and do the face recognition. At first I assumed it was the LiDAR camera, but further research tells me no? Anyway, if anyone knows of a list or the specs I can use to determine which older models work, it would be greatly appreciated.
@codydillon7428 · 2 years ago
Bro. This is awesome! You just saved me probably a week of frustration. Cheers.
@AnimatorsJourney · 2 years ago
You're welcome!
@swiftdetecting · 3 years ago
What phone version are you using? And do you know if you can use Android instead?
@porororo9056 · 3 years ago
The same goes for Kinect: facial motion capture with an IR sensor has inherent problems. First, it can't handle fast pronunciation properly. Second, it's impossible to tell whether relaxed lips are closed. Third, the lips must be exaggerated for proper capture. OpenCV can complement all three, but OpenCV can't capture Z-depth.
@SofiaHerrero222 · 3 years ago
Hi! I tried with my iPhone 12 Pro and added my IP, but UE doesn't find my phone. Do you know how I can find my iPhone there? Thanks.
@virtual_intel · 3 years ago
Try loading the MetaHuman sample project, the one from the Unreal Marketplace, before adding your custom character. You can also try both IPs that show up in your network properties. Hope these tips help, as that's the only way I can get it to work.
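On Windows the address to type into the Live Link Face app is normally the IPv4 Address line from `ipconfig`. As a hedged alternative, this small Python sketch asks the OS which interface address it would route through (`8.8.8.8` is just a routable placeholder; no packets are actually sent):

```python
import socket

def local_ipv4() -> str:
    """Best-effort guess at this machine's LAN IPv4 address.

    Connecting a UDP socket does not transmit anything; it only makes
    the OS pick the interface it would route through, which we read
    back. Falls back to loopback if there is no route at all.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))  # placeholder target, nothing is sent
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"
    finally:
        s.close()

if __name__ == "__main__":
    print("Enter this address in the Live Link Face app:", local_ipv4())
```

If your machine has several adapters (Ethernet plus Wi-Fi, VPNs), this only returns one of them, so trying each listed IPv4 as the comment above suggests is still a reasonable fallback.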
@privet20005 · 3 years ago
You managed to get the best lip sync out of metahuman yet!
@Andimax11 · 2 years ago
Thank you SO MUCH! Finally an ACTUAL tutorial... the Reallusion "tutorials" just showed you how to download some of the software and then skipped ten steps. I got it up and running, but would you have any idea why my MetaHuman looks like half his face is numb?
@dabeerygoat · 2 years ago
Thank you for the tutorial! I ran into a problem with my MH, though: only half of my MH's face is being animated, and it looks odd (e.g. smiling produces a weird expression that doesn't match up). Does anyone know what could be causing this?
@ShadowsClub · A year ago
When I speak, the mouth doesn't open fully.
@alexusa-zo3fn · A year ago
Woah, this just blew me away!
@levitabusman · 2 years ago
When I export, it doesn't have the skin or textures.
@CaseyChristopher · 2 years ago
I assume you would then use the Take Recorder to capture a performance that you could then add as an animation track in the Sequencer?
@donrivas8074 · 2 years ago
Mine is just not linking at all. What am I doing wrong? This is the second video I've followed up to the linking step, then I just stop because it won't link with the iPhone.
@faddlewaddle2615 · 2 years ago
I can get this all to work, BUT what do I do with it after? How do I render a video from this? Every single time I hit the record button it crashes. I'm using UE 5.0, btw, and I've no clue if that's even the process for recording my animations from Live Link. I also managed to get something done from within the BP, but couldn't do a darn thing with that animation file either.
@Cangel06 · 2 years ago
Wonderful! But when I try it, the movements are very slow and sometimes stutter.
@9ayadis · 3 years ago
Keep it up, I'm exactly where you are. Can you make a beginner tutorial for Unreal, especially for cinematics?
@AnimatorsJourney · 3 years ago
Thanks for the suggestion, I'm working on something for that now, might be a few months. Stay tuned :)
@raxian_ · 3 years ago
When I set up the Live Link IP, it wouldn't show up or work at all for me. Any help?
@EdwardMilliganBouwls · 3 years ago
Thanks! Does this app also work on Android?
@tr-dg2iy · 3 years ago
Can Live Link connect to UE running on a MacBook Pro? I have Live Link running on an iPhone and added my Mac's IP, but UE doesn't recognize that Live Link is sending data to it (the iPhone's name doesn't show in UE). Lucas, thanks for making the video.
@saif0316 · 2 years ago
Hi, I'm having an issue where my Live Link head separates from the body. Any help would be appreciated.
@louis.blythe · 3 years ago
Thanks for creating this, super excited to jump in!
@Instant_Nerf · 2 years ago
How do you add an idle pose? I'm trying to make a video, but the body is just stiff. It doesn't look good, and I don't know how to add that animation.
@glatze_-.- · 2 years ago
I'd like to know which iPhones this is compatible with. Would an iPhone 6, 7, or 8 be enough to run this app?
@virtamay7311 · 2 years ago
Hi, great tutorial. I'm using a CC3 character in UE4 with Live Face, and my face mocap works. However, I'm trying to add a pre-made body animation to my character: I want the body animation to loop while I control the face mocap through Live Face. I'm bad with blueprints. Any tips? Thanks.
@shockerson · 3 years ago
Hey, great video. Which iPhone did you use? What do you think about the iPhone 12?
@KriGeta · 3 years ago
Amazing, sir. One question: is it possible to control the eyeball movement with other models? And how can we record and export the animation to Blender?
@joshuascott9598 · 3 years ago
Also, you should direct people to the setting that lets you download 8K, 4K, or 2K textures. It seems the MetaHumans download at 8K resolution by default... which is fine if we're assuming everyone has terabytes of storage.
@blcgamer875 · 3 years ago
Can the iPhone 12 Pro Max connect with Live Link?
@longsonfullmetal1856 · 3 years ago
Is there a way to do this with a webcam instead of an iPhone? I don't want to buy an iPhone just for this. Please help.
@monadrian.official · 3 years ago
Is it possible to connect the iPhone to the PC with a USB cable?
@tylerdurden-vevo · 2 years ago
Is a webcam used?
@Polytricity · 2 years ago
I don't understand the mouthClose blendshape; normally I use a bone for the jaw/mouth open. Is mouthClose designed to compensate for the jaw bone opening the mouth? And how are the eyes controlled? I usually have those skinned to bones too.
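For context: in the ARKit curve set that Live Link streams, jawOpen drops the jaw (which drags the lips apart), while mouthClose counteracts that by keeping the lips sealed as the jaw hangs open; the eyes are likewise driven by curves (eyeLookOutLeft and friends), not bones. A toy illustration of the compensation follows — this is not the actual MetaHuman rig math, which lives in the face AnimBP:

```python
def lip_opening(jaw_open: float, mouth_close: float) -> float:
    """Rough effective lip opening from ARKit-style coefficients (0..1).

    jawOpen pulls the lips apart as the jaw drops; mouthClose cancels
    that, so the jaw can hang open behind sealed lips.
    """
    return max(0.0, min(1.0, jaw_open - mouth_close))

# Jaw dropped but lips held together -> barely any lip opening
print(lip_opening(0.6, 0.55))
# Jaw dropped, lips relaxed -> wide open
print(lip_opening(0.8, 0.0))
```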
@Deano-M · 2 years ago
Every time I try to export from Bridge, it says Unreal isn't open, even though it is open.
@HassanAhmed-vs2fx · 2 years ago
I'm working on a PC and have tried to link my phone with it to activate Live Link, but it didn't work. How should I solve this? My PC has a wired connection, not a wireless one.
@gryphonsegg · 3 years ago
Mine will not connect. I even tried the trick of turning the sequence off and on and setting it to None. What am I doing wrong?
@gryphonsegg · 3 years ago
It will not connect.
@val_evs · 3 years ago
The head is static and it looks weird; there isn't enough head movement. How do I fix it?
@AnimatorsJourney · 3 years ago
There's a setting in the app to turn on head rotation tracking.
@val_evs · 3 years ago
@@AnimatorsJourney Thanks!
@NeerajIngle · 3 years ago
This is awesome, I'm surely gonna try it today. I was wondering, will my iPhone 8 Plus work?
@dushyantm9579 · 3 years ago
The iPhone X is probably the lowest you need, as it has the depth-sensing front-facing camera.
@Kanermi · 2 years ago
Can Live Link work on a dedicated server, or only in a packaged build?
@Babakkhoramdin · 2 years ago
What is your iPhone model, please?
@cesarmartinez1947 · 2 years ago
Hi, great tutorial. I have a question: when I try to record the sequence, my UE crashes. Does anyone have this same issue?
@ShortsforLife12 · 2 years ago
Yes, I tried it like 100 times. Did you find a solution?
@sikfreeze · 3 years ago
Wow, this is amazing. Can Live Link also track body movement?
@AnimatorsJourney · 3 years ago
Only head rotation for now (in addition to the face)
@sikfreeze · 3 years ago
@@AnimatorsJourney Thank you
@edgarprotsko1558 · 2 years ago
@@AnimatorsJourney How?
@portfolioyzadora · 3 years ago
Does it only work with an iPhone?
@shockerson · 3 years ago
Hi, I have a question: is there eyeball movement with the iPhone Live Link? Please answer here, and do a short demo if possible, many thanks. I'm going to buy an iPhone, and that's a lot of money for me, so I want to know about the eyeball movement first.
@tayam92 · 2 years ago
Hi, do you have a video on using Live Link with Paragon characters? Or can Paragon characters use Live Link at all?
@3d_4_dummies · 2 years ago
Nice tutorial!
@AnimatorsJourney · 2 years ago
Thank you! Cheers!
@edgarprotsko1558 · 2 years ago
How do you enable real-time head and neck rotation mocap? Is it possible at all?
@thedadwars · 10 months ago
Thanks for this!
@GrimGearheart · 3 years ago
I wonder if this can be used with a non-human? Or a highly stylized one?
@luckyenam6329 · 3 years ago
Hello, I followed your process and downloaded the character from MetaHumans, but I haven't been able to export the character from Quixel Bridge or connect my iPhone to Live Link through my IP address. Do you have a fix for this?
@robot_collective · 3 years ago
Awesome, that was very helpful, THANKS! I had a problem seeing the live tracking in the viewport. It works fantastically now.
@AnimatorsJourney · 3 years ago
Glad it helped! Thanks for watching :)
@GrimGearheart · 3 years ago
Can this only be used with an iPhone?
@SProj-px7wm · 2 years ago
Thanks! Do you know how to record neck/head movement as well?
@danielshamota · 2 years ago
Hey, do you know why the facial Live Link skeletal animation isn't updating in the editor in UE5? I've tried literally everything, including the hack shown in this video. Thanks.
@BryantRicart · 2 years ago
Can you explain how to add head rotation?
@derf0007 · 2 years ago
Is it possible to record in the Live Link app and then connect to Unreal Engine later, or does it have to be a live connection?
@V4NDLO · 2 years ago
I have the same question. Did you ever find out?
@derf0007 · 2 years ago
@@V4NDLO No one ever responded. However, I think if you record the video ahead of time, then once you eventually get Live Link set up, you can point your phone or camera at the pre-recorded video on a TV or computer screen and it'll pick up that face recording.
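For what it's worth, Live Link Face can also record takes locally on the phone (a video plus a CSV of per-frame blendshape values) that you can bring into your pipeline afterwards. The exact CSV columns vary by app version, so treat this parser as a sketch against a made-up two-curve sample:

```python
import csv
import io

def parse_take(csv_text: str):
    """Parse a Live Link Face-style take CSV into per-frame curve dicts.

    Assumes a header row naming the curves after a Timecode column;
    real takes carry dozens of ARKit curves, this sample uses just two.
    """
    frames = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        frames.append({k: float(v) for k, v in row.items() if k != "Timecode"})
    return frames

sample = (
    "Timecode,jawOpen,mouthClose\n"
    "00:00:00:01,0.50,0.10\n"
    "00:00:00:02,0.60,0.20\n"
)
frames = parse_take(sample)
print(len(frames), frames[0]["jawOpen"])
```

From there the curves could be retargeted or keyed onto a face rig in your DCC of choice rather than streamed live.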
@maevick2442 · 3 years ago
Sorry, but can this be done on Android?
@PunxTV123 · 3 years ago
Can I use it on an iPhone SE (2016), or do I need a newer iPhone?
@1amsaint581 · 3 years ago
I've followed the instructions, but for some reason my MetaHuman is missing all face textures besides the eyes, even after they've compiled. How do I fix this?
@InspectorXyto · 3 years ago
Thank you for sharing that. I managed to get it working. Any idea what I can use to track head movement?
@AnimatorsJourney · 3 years ago
This does it as well; it's just an option to turn on in the app settings.
@ViensVite · 2 years ago
@@omgee8968 You need a tracker on your head.
@Andimax11 · 2 years ago
Does anyone know why literally every MetaHuman skeletal mesh I use doesn't animate the right side of its face?
@dotapodtv9449 · 3 years ago
Nice video! BTW, why does my Live Link give a "30.0 fps" warning message? It's an iPhone X.
@Michaduo · 2 years ago
What iPhone do you have? An X?
@rudy552 · A year ago
What iPhone do you have?
@UncleTiaoTiao · 2 years ago
The face animation works, but at the last step it won't link to the animation.
@Silpheedx · 2 years ago
THANK YOU SO MUCH!
@ingeonsa · 3 years ago
How does it cope with prescription glasses?
@dandylion-evn7w2 · 3 years ago
How much is it?
@AS-jf2mf · 3 years ago
Why do I not get the subject even though I entered my correct IP?
@deadmikehun · 2 years ago
Will this work with a Ryzen 3700X, 32 GB DDR4, and a GTX 1070?
@brianroanhorse5274 · 3 years ago
How would you lip-sync animate to pre-recorded audio? Would you do that in Maya?
@AnimatorsJourney · 3 years ago
That's how I would do it personally, in Maya. I've got another video that shows how.
@jamalqutub_instrument · 3 years ago
Great tutorial. But for some reason the textures aren't coming in. Any ideas why?
@AnimatorsJourney · 3 years ago
They're not visible in UE? Is it just a matter of waiting for it to compile shaders?
@jamalqutub_instrument · 3 years ago
@@AnimatorsJourney The shaders eventually compiled. Success!
@narikragam · 3 years ago
Can we go live on TikTok or Facebook as a MetaHuman using this method?
@sp33dkill · 3 years ago
Excellent video! Just discovered this amazing software; still a total noob. One question about Live Link: is it compatible with Android devices as well as iPhone?
@AnimatorsJourney · 3 years ago
Only iPhone.
@mikey3d912 · 3 years ago
What format does the model have to be in if I want to build one totally from scratch for my project?
@georgefelner · 3 years ago
I want to be able to assign keyboard keys to force expressions that Live Link isn't seeing. Or is there a way to make UE4 more sensitive to Live Link, so as to deliberately exaggerate the expressions? Any help would be appreciated, and I'm happy to pay for your time.
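There's no keyboard override built into Live Link itself, but a common way to get more pronounced expressions is to scale the incoming curve values before they reach the face rig; inside UE this is typically done with a Modify Curve node in the face AnimBP. The remap itself is just a gain plus a clamp, sketched here in Python with illustrative ARKit curve names:

```python
def exaggerate(curves, gain=1.5, per_curve=None):
    """Scale incoming blendshape values, clamped back to the 0..1 range.

    per_curve lets you boost specific weak shapes (say, an inner-brow
    raise the sensor barely registers) harder than the global gain.
    """
    per_curve = per_curve or {}
    return {name: max(0.0, min(1.0, value * per_curve.get(name, gain)))
            for name, value in curves.items()}

boosted = exaggerate({"jawOpen": 0.4, "browInnerUp": 0.2},
                     per_curve={"browInnerUp": 3.0})
print(boosted)
```

The clamp matters: without it, boosted curves can push blendshapes past their sculpted extremes and break the face.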
@shakeelshahid3145 · 2 years ago
Sir, how can we create a Live Link face setup in Autodesk Maya?
@AnimatorsJourney · 2 years ago
Purchase a Rokoko face capture license; Live Link is specific to Unreal.
@georgefelner · 3 years ago
I'm having issues using UE with my iPhone on Live Link. I used the correct IP and UE sees it, but when I click Play it says "face tracking not supported on this device", even though I'm using an iPhone X. Can anyone help?
@Andimax11 · 2 years ago
Can anyone tell me why my iPhone might not be showing up in UE 4.27.2 under the Live Link Face subject (iPhone Black)? It worked fine in UE5, other than the mocap animations being janky on one side of the face (a commonly known issue with this MetaHuman process in UE5; I'm just waiting for someone to fix the bugs). Since the UE5 mocap was janky, I wanted to try it in UE 4.27.2, but now my phone won't show up no matter what I do. I have restarted the Live Link iPhone app several times, restarted Unreal Engine, restarted my computer, double- and triple-checked that I am using the correct IPv4 address, switched my network from public to private and vice versa, and I'm at a total loss as to what I should do. Does anyone have suggestions for either my UE5 issue or my UE4 one?
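When the phone never appears as a subject, the usual culprits are the Windows firewall blocking Unreal or the phone sitting on a different subnet. Live Link Face streams UDP to port 11111 by default (verify the target port in the app's settings); with Unreal closed so the port is free, a throwaway listener can confirm whether packets reach the PC at all. A hedged diagnostic sketch:

```python
import socket

def wait_for_packet(port: int = 11111, timeout: float = 10.0):
    """Listen briefly on the Live Link Face UDP port.

    Returns (sender_ip, byte_count) for the first datagram received,
    or None if nothing arrives before the timeout.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("0.0.0.0", port))
    try:
        data, addr = sock.recvfrom(4096)
        return addr[0], len(data)
    except socket.timeout:
        return None
    finally:
        sock.close()

if __name__ == "__main__":
    result = wait_for_packet()
    if result:
        print("Receiving from", result[0], "- the network path is fine")
    else:
        print("Nothing arrived - check firewall rules and that both devices share a subnet")
```

If packets arrive here but UE still shows no subject, the problem is inside the editor (Live Link source setup) rather than the network.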
@LordRubino · 3 years ago
Great tutorial, Luca. I'm just wondering why the quality of the capture in every tutorial I've seen isn't as good as the demo from Epic. I wonder what sort of device they use for mocap. I'm not convinced creators can use this tool successfully with only a mobile phone. Thanks for the tutorial anyway. :)
@gigxr · 3 years ago
Can you move the head/neck with it?
@PsychotronAyax · 3 years ago
Can it also be done with a MacBook Air?
@Austin1-0-8 · 3 years ago
Use this for VR and make it your avatar, so we can be ourselves in VR.
@gameswownow7213 · 3 years ago
And I've seen others with whole-body movement as well as the facial animations. Any idea how?
@AnimatorsJourney · 3 years ago
They have a mocap suit. Perception Neuron and Xsens are two companies that make them.
@joshuascott9598 · 3 years ago
One last thing: the Live Link app does not work on the iPhone 8 with the latest iOS 14.6. The error reads "Live Link Face requires a device with a TrueDepth camera for face tracking".
@aaronambrose1006 · 3 years ago
Yeah, the app uses ARKit, which in turn requires the right hardware. iPhone X and above works.
@mass23 · 2 years ago
Thank you so much!!
@bobatea4732 · 3 years ago
I don't know how to use Unreal or MetaHuman, but this is pretty cool.
@vertex-6714 · 3 years ago
Hello, I have a question: which iPhone model do I need to run Live Link? Are there any special requirements (a particular sensor)?
@ViensVite · 2 years ago
12 Pro or Pro Max, 13 Pro or Pro Max, same for the 14.
@henriquemodena6826 · 3 years ago
Please make the MetaHuman Live Link kit available; I can't buy it! I even managed to find some of the files, but I didn't find the rl_funciontion_lib file.
@AnimatorsJourney · 3 years ago
I don't know what you're referring to, I don't sell that.
@OmegaMouse · 3 years ago
The mouth not closing was a bug in the latest version of Faceshift before Apple bought them for 200 million dollars. Seems they don't know how to fix it. Pretty lame!
@5gradeproductions519 · 2 years ago
Is Live Link free?
@andrejbykov4917 · 2 years ago
Does anyone know why only the neck moves and the face doesn't react at all?
@CaracalArt · 2 years ago
Same problem :(
@freenomon2466 · 3 years ago
Thanks for sharing. I wonder if we have access to the ARKit blendshapes so we can tweak them inside Unreal.
@AnimatorsJourney · 3 years ago
This may be helpful: twitter.com/epicchris/status/1385422520606679042?s=20
@freenomon2466 · 3 years ago
@@AnimatorsJourney Awesome, following you on Twitter now. :) I saw your other recent video on MetaHuman to Maya, then to UE. This will let me use facial mocap as a first pass, manually finesse it in Maya, then export to Unreal to render, which eliminates the need for higher-quality Live Link into Unreal (the current Live Link into Unreal has pretty bad facial animation quality). Do you have any courses on the lip-sync/facial animation workflow for Maya?