Seems like what I've been waiting for... I'll have to look further into this feature.
@snakeeyesdeclassified · 2 years ago
thank you for this, very helpful.
@hbstudio7891 · 3 years ago
I am from India. I am your big fan.
@LordCritish · 3 years ago
Good to see Sean Connery is still alive.
@brianmercerjr2282 · 3 years ago
Awesome
@AnthonyCupo0012 · 3 years ago
Thank you for the video! Have you made a video on how to attach and use a cable to your PC?
@baroquedub · 3 years ago
+1
@doriansean7819 · 3 years ago
I’d love to see the setup too
@doriansean7819 · 3 years ago
My capture through the phone and internet isn't as good as his! He must be hooked up to the PC!
@chillsoft · 1 year ago
@@doriansean7819 How? With a normal USB to TB cable? Or whatever Apple calls their end of the cable :P
@ToysRUsKid_Critter · 3 years ago
Excellent!
@JimmeeAnimAll · 2 years ago
[QUESTION] Is there any difference between iPhone models for face performance capture? Would you recommend one model over another, or does the Face ID / TrueDepth dot sensor (whatever is responsible for mocap) work the same from the iPhone X through the iPhone 13 Pro? Thank you for the video.
@starwarz8479 · 3 years ago
How do you transfer this iClone face capture data to Houdini? Would love to see a workflow demo of transferring the entire character mocap + face capture data to Houdini.
@nathanbayne3576 · 2 years ago
It works by exporting an Alembic. That's what I've been doing and it completely works, but obviously the rig doesn't come in, if that's what you meant!
@doriansean7819 · 3 years ago
So is the mocap in this video over Wi-Fi, or cabled to your PC?
@mamadoudiallo9139 · 3 years ago
Bring this to Mac already
@arnoldo001 · 2 years ago
AWESOME video, thanks. Could you share the "stage" or "visual" setup of this scene? It really, REALLY looks great.
@andreasgeorgiou1901 · 2 years ago
Is it possible to store a strength multiplier preset? We have many animations, and so far we've had to adjust the strength to the same parameters on each animation.
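For readers wondering what a reusable strength preset amounts to in practice, here is a minimal Python sketch. This is purely illustrative and not iClone's API or file format: the preset structure, blendshape names, and function are all hypothetical stand-ins for "save the multipliers once, reapply them to every animation."

```python
import json

# Hypothetical preset: per-blendshape strength multipliers (NOT iClone's format).
PRESET = {"jawOpen": 1.2, "browInnerUp": 0.8, "mouthSmileLeft": 1.0}

def apply_strength_preset(frames, preset):
    """Scale each blendshape weight in every frame by its preset multiplier.

    Blendshapes missing from the preset default to a multiplier of 1.0.
    """
    return [
        {name: weight * preset.get(name, 1.0) for name, weight in frame.items()}
        for frame in frames
    ]

# One captured frame of weights (values are made up for the example).
frames = [{"jawOpen": 0.5, "browInnerUp": 0.5}]
scaled = apply_strength_preset(frames, PRESET)

# Serialising the preset once means it can be reloaded for every animation
# instead of re-entering the same parameters each time.
saved = json.dumps(PRESET)
```

The point of the sketch is only the workflow: store the multipliers in one place, then apply them uniformly, rather than hand-adjusting strength per clip.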
@prankfever8877 · 2 years ago
Is it possible to use the Live Face app with the Samsung Galaxy S20? It's only $499.
@RetzyWilliams · 3 years ago
People asking for Android: ARKit is Apple's proprietary face-tracking tech. Until Android itself comes out with similar tech, Android can't do it, even if RL wanted it to. It (1) needs a special camera like the iPhone X has, and (2) needs ARKit-like software. I hope it does one day.
@jdsguam · 3 years ago
"ARKit is Apple's proprietary face tracking tech." - That pretty much settles it, don't you think?
@JGooden762 · 3 years ago
It must have been really expensive to hire Michael Caine for the voiceover of this video...
@TXanders · 3 years ago
Has this just been reuploaded?
@cinemarks.3d · 3 years ago
Great update! I currently use an iPhone X. I am wondering if the iPhone 12 will improve the performance too? Does anyone have experience with that?
@reallusion · 3 years ago
The iPhone 12 is faster and will improve performance. Especially when tethering the connection.
@Alaz21 · 3 years ago
Unbelievable, it is cool.
@PalmaMultimedia · 3 years ago
Xlent
@marcelo9655 · 3 years ago
😨😨👍👍 amazing
@GTSongwriter · 9 months ago
I've invested in iClone 6 & 7 & CTA 5... I want the "iPhone Live Face Profile" but I can't afford it.
@sliderssli1769 · 1 year ago
Hello! I came across your lessons. I have one problem, and I haven't found a solution to it anywhere. I installed iClone 7 and created characters in it. Then I bought an iPhone and decided to record facial animation using the tutorial. I did everything according to the tutorial and connected the iPhone (the iPhone showed that everything is connected, and there is a mask on the face). I selected the iPhone in the parameters and entered the path to the phone. But when I click on record/preview, nothing happens; the character does not repeat my expressions. I checked the character's expressions, and everything moves in manual mode. I checked during the preview whether the coordinates of the points on the face are transmitted, and everything is transmitted fine. Yet the legs and face stay static. What is the problem? Is a plugin missing, or did I forget to tick a checkbox somewhere? I haven't found a solution to the problem.
@imdeby · 3 years ago
@Reallusion, is it possible to export just the facial motion into Blender without it being applied to a character?
@fathouyniat8797 · 3 years ago
Can I import my own character with a rig and facial setup, and control it in iClone?
@tsechee · 3 years ago
Can I use multiple iPhones to capture multiple characters?
@helenawilsena5821 · 3 years ago
I don't have an iPhone. Does it work with an Android phone or a DSLR camera?
@dabneeghmoob3D · 3 years ago
Can I use Android?
@fidel_soto · 3 years ago
Is the smoothing feature only for Live Face for iPhone, or does it work with Faceware? Edit: just saw the Smooth option on the clip directly.
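For context on what clip-level smoothing does to jittery capture data: conceptually it is a low-pass filter over each animation curve. The sketch below is a generic centered moving average in Python, purely for illustration; it is not how iClone implements its Smooth option.

```python
def smooth(values, window=5):
    """Centered moving average over a list of floats.

    window should be odd; near the ends the window is truncated so the
    output has the same length as the input.
    """
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# A deliberately jittery blendshape track: smoothing pulls each sample
# toward its neighbours, trading a little responsiveness for stability.
jittery = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
smoothed = smooth(jittery, window=3)
```

The same trade-off applies to any mocap smoothing control: a larger window removes more jitter but also softens fast, deliberate motion.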
@sebastianvignau4333 · 3 years ago
Great video. Question: can you do the mocap from a previously recorded video instead of using live tracking? Is that better in any way?
@enjoyenjoy5721 · 3 years ago
But HOW can there still be no iClone 7 for MacBooks? Hasn't anyone asked about this before?
@cezanneali · 3 years ago
iMacs are poor machines for this kind of work, or for rendering anything in general.
@wallstbets4865 · 3 years ago
Can I use my webcam with this application, or do I need to buy an iPhone? Also, does it record my voice while I talk?
@mrkshh · 3 years ago
Only iPhone.
@WerIstWieJesus · 3 years ago
Is AccuLips free for iClone/CC3 users? Can I record clips outdoors with a smartphone and apply them later in-house with iClone?
@Alex_Lebron_Animations · 3 years ago
Yes, it's free. It's part of the new update that was released 2 days ago. I'm not sure if I understood your second question correctly, but you can't use recorded clips with Live Face. It can only capture a live feed. Hope that answers your questions.
@WerIstWieJesus · 3 years ago
@@Alex_Lebron_Animations Thank you very much.
@dreagea · 3 years ago
Which iPhone works best? iPhone X, 11, or 12?
@metulski1234 · 3 years ago
I would say the iPhone 12 Pro and Max should work really well. The LiDAR scanner is awesome.
@dreagea · 3 years ago
@@metulski1234 Thanks
@flixels5520 · 3 years ago
Sort of ironic that the software is Windows-only.
@meglaarif · 3 years ago
What about Android?
@reallusion · 3 years ago
Android phones do not have a TrueDepth camera.
@marvinmartin6246 · 3 years ago
Why is this only for iPhone users?
@reallusion · 3 years ago
Because only iPhone has a TrueDepth camera built in. When Android has this, then we will also support it with this plugin.
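For readers curious what the TrueDepth camera actually delivers to apps: Apple's ARKit face tracking reports a set of named blendshape coefficients per frame, each a float in the 0..1 range (names like `jawOpen` and `eyeBlinkLeft` are from ARKit's blendshape list). The sketch below is illustrative Python showing the shape of that per-frame data; the dict format is hypothetical and is not Live Face's actual wire protocol.

```python
# One hypothetical frame of ARKit-style face data: blendshape name -> weight.
# The coefficient names are real ARKit blendshape locations; the dict layout
# is just for illustration, not the Live Face transfer format.
frame = {
    "jawOpen": 0.42,
    "eyeBlinkLeft": 0.05,
    "eyeBlinkRight": 0.04,
    "browInnerUp": 0.31,
    "mouthSmileLeft": 0.12,
}

def is_valid_frame(face_frame):
    """ARKit-style coefficients are floats clamped to the 0..1 range."""
    return all(
        isinstance(w, float) and 0.0 <= w <= 1.0
        for w in face_frame.values()
    )
```

This is why the plugin is iPhone-only: without a depth sensor and tracking software producing this kind of calibrated coefficient stream, there is nothing for the plugin to consume.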
@over-e1834 · 3 years ago
We need this for Android!
@over-e1834 · 3 years ago
@@pushingpandas6479 So instead of a phone, can I just buy a camera that has the same capabilities as the iPhone?
@MichaelZurcher · 3 years ago
Is the African American model you use in this video a preset character, or one that you have modified? I am not seeing it in Character Creator. Thanks!
@doriansean7819 · 3 years ago
Modified
@أرحعقلك-ح2ط · 3 years ago
We need this for Android.
@CarloMercadoJudgementProject · 3 years ago
For now that is impossible, because Android phones don't have camera hardware suitable for mocap like the 3D cameras of the iPhones; they need the hardware before it can be released for Android.
@Alaz21 · 3 years ago
But how can I create my own face in it?
@JeffersonDonald · 3 years ago
Via the Headshot plugin with Character Creator 3.
@Alaz21 · 3 years ago
@@JeffersonDonald Thanks for your reply. Can you link me a tutorial video, please?
@frankcabanski9409 · 3 years ago
Is this better than Face Cap Live? I have that, and the motion around the mouth is awful; it stretches in odd ways.
@MikeDKelley · 3 years ago
If you can't tell the quality after watching this video, there's not really much more to tell you -- it's pretty well laid out here how it works and what it can do. You can compare it yourself to the results you are getting otherwise to make your decision.
@frankcabanski9409 · 3 years ago
@@MikeDKelley I saw videos of Faceware Live (not Face Cap, my mistake) for iClone. It looked terrific. I bought it. It's awful: weird stretching on the mouth. Unusable.
@MikeDKelley · 3 years ago
@@frankcabanski9409 You saw how everything was constructed, how all the minute parts came together, like is shown in this video? I've never seen such a video for Faceware. Indeed, I saw just the opposite -- folks showing that Faceware was very difficult to get right. Now, I bought Faceware as well, but Live Face is so much better that I don't even have FW installed on my system anymore.
@frankcabanski9409 · 3 years ago
@@MikeDKelley See, that's what I'm looking for: word from users/buyers. It will be around $600 for this plus the iPhone. Also, I don't know if Face Mojo or Facemotion is better; I guess they worked together for a while, but then they split.
@MikeDKelley · 3 years ago
@@frankcabanski9409 If you have Motion Live (which you should, for Faceware), you only need the Live Face plugin, which shouldn't be $600 (the last I looked it was around $300, but I admit it was a while ago -- I bought it when first released at around $200). You can get a used unlocked iPhone (you don't need service) for less than $500 (perhaps even a lot less), but be sure it has Face ID (the 3D camera -- almost all of them from the last three years do). I guess if worse comes to worst you could turn around and sell the iPhone for about what you paid if you REALLY didn't like it, but it's hard for me to imagine that, not when it's this good (the ARKit stuff is something even the pros have been hankering after).
@fayhiba431 · 3 years ago
Guys, your website is down xD
@hyperface2050 · 3 years ago
If only this video were true! But it is not. I too am shocked by the quality of the mocap here, but even more so because I have an ultra-modern computer with an RTX 3070, etc., and an iPhone 11 Pro, and when I boot up the same software, the CC3+ character doesn't move AT ALL like this demo. She's jerky, freezes for ten seconds at a time, then in a rush speeds through all the bad mocap data from the past ten seconds, then freezes, then jerks. I couldn't imagine it being any more different from this video.

Why? Well, the folks from Reallusion haven't explained. Numerous comments below have confirmed the same problems. Many have asked if the Wi-Fi connection is an issue and whether the folks from Reallusion are hardwiring their connection (this is likely the case), or are they indeed cheating by recording the mocap beforehand and then playing it back as if it were a realtime preview. Either way, they need to provide all the specs and connections for a demo like this: the computer model and specs, the phone, and everything to do with the connection. If it is Wi-Fi, what are the Wi-Fi specs? If there is a cable, they need to show us how it can be done.

The folks from MocapX (not a solution I recommend) do provide for an easy USB connection, which eliminates the Wi-Fi problem, and at least that part works better with MocapX. Still, MocapX provides no calibration, no zeroing, no smoothing, etc. After owning MocapX for a month, I have YET to do convincing mocap. That is why I am now looking at iClone. But wow, the JITTERS and FREEZING are insane.
@mehdi.shiraziu7559 · 3 years ago
I also have this problem and could not get it working. Also, the character exported from CC3 completely loses its beauty in iClone.
@hyperface2050 · 3 years ago
@@mehdi.shiraziu7559 Yeah, CC3 characters do not export well, but nearly all the goodies are there in the export. It just takes a while with your NLE to learn how to adjust the settings to get something that not only looks as good as the CC3 images but even better. Better? How? Well, that's easy: CC3 doesn't do great work on subsurface scattering or displacement (two essential tactics for making characters look 100% human). Pick a render engine (I use Arnold), learn its ins and outs, and after a month of practice your characters can really come alive.
@hyperface2050 · 3 years ago
BTW, I did get my USB cable connected to get this mocap stuff working more smoothly. The key is to make sure you are connected as a 'hotspot' even though you are using a cable.