In UE4 the animation was playing 1/3 too fast, so I slowed the clips down in post; that's why the motion looks weird. I may have imported the FBX at the wrong frame rate. I'll update this comment if I figure out the issue.
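For anyone chasing a similar import problem: the playback error is just the ratio between the frame rate the engine assumes on import and the rate the clip was actually authored at. A minimal sketch of the arithmetic, with illustrative numbers only (the actual rates in this case aren't stated):

```python
# Illustrative only: the real source/import rates in this case are not stated.
def playback_speed(import_fps: float, authored_fps: float) -> float:
    """How fast a clip plays when the engine assumes `import_fps`
    for animation data that was actually authored at `authored_fps`."""
    return import_fps / authored_fps

print(playback_speed(40, 30))  # 1.33... -> plays about 1/3 too fast
print(playback_speed(30, 24))  # 1.25    -> plays 25% too fast
```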
@bondell18404 жыл бұрын
update it bro
@thearousedeunuch4 жыл бұрын
Did you find the issue?
@JQhardge4 жыл бұрын
You are killing it with these VP vlogs! It’s so wild to see how far you’ve come. I remember watching you on Twitch every day back at the start of this. It’s incredibly cool to see all of this progress.
@alex05894 жыл бұрын
You remind me of the very small team that made the game "Hellblade". They had a tiny mocap studio in their conference room, sort of developed their own ways of doing things, and made a great-looking game for almost nothing compared to other studios. (Their complete production vlogs are on YouTube for those who might not know!) You had this up and running in no time; that's impressive. Next up: carbon fiber monopod leg + hockey helmet + gaff tape = performance capture helmet.
@CinematographyDatabase4 жыл бұрын
Alex, I’m very familiar with their story and the eventual Microsoft acquisition 😂 thanks. I do need a phone helmet or mount; I believe Rokoko has one too. The other Technoprops ones are for bigger studios.
@LordJagd4 жыл бұрын
And that was back in 2016! How has the tech evolved since then?
@CHITUS3 жыл бұрын
Hey Matt, great work!
@reallusion4 жыл бұрын
Great work Matt !
@Subrat_Das3 жыл бұрын
Thank you for your 3D software.
@prankfever88772 жыл бұрын
I would like a complete motion capture suit with a headset like they used in the movie Planet of the Apes and in Ninja Turtles. My question is: how do I capture my facial expressions if the suit doesn't come with a headset? Do you have to buy the headset separately?
@JarrenRocks4 жыл бұрын
This is looking fantastic Matt. Something like integrating this into a twitch livestream would be awesome for a unique online experience!
@CinematographyDatabase4 жыл бұрын
yeah this is pretty close to how Vtubers work. You would just live stream it all into UE4 and then stream from a UE4 camera.
@JarrenRocks4 жыл бұрын
@@CinematographyDatabase That makes sense, I'm sure we'll see animated TV shows start adopting this technology in the next few years.
@CinematographyDatabase4 жыл бұрын
@@JarrenRocks they have been using this for a while now, but more and more will use it for sure.
@Sebbir4 жыл бұрын
So the 3D version of Adobe Character Animator? I love it!
@minimallyinvasiveventures4 жыл бұрын
Hey Matt - I noticed that you don't look at the video assist on your virtual camera and look at the monitor instead. The video assist looks like a pretty expensive part of the kit. Could you use an iPad Pro instead? Interested to hear the limitations and how you would mount it to the virtual camera rig. The iPad Pro would give you additional buttons and inputs - the same way using a Vive controller is better than the puck.
@hellohogo3 жыл бұрын
Could you use Vive trackers instead of the Rokoko suit in iClone for body mocap, even if it's lower quality?
@mrtobycook4 жыл бұрын
This is great, Matt! This is the workflow I'm using too and yeah, there's lots of hidden "gotchas" with the process if you want to make it all look perfect. :-))
@jedi77623 жыл бұрын
Letting you know I had a live demo with Francesco Turri from Rokoko - I gave you a shout out, as I told him you were the reason I was there... thx Matt
@CinematographyDatabase3 жыл бұрын
check out my new channel, it's more dedicated to MOCAP and digital character work, I'm back in the Rokoko suit - kzbin.info/door/2gH3UhyNvsyNZwubGTBhIg
@GospelMusicians4 жыл бұрын
I’m new to this, but it seems you have to go from Rokoko to iclone to Unreal Engine? Why are so many tools needed?
@AnimeBadBoi2 жыл бұрын
The normal process is typically longer and requires more tools than this.
@ErrorJordan1s4 жыл бұрын
are there any benefits to using a newer model iPhone?
@quicksolphoto3 жыл бұрын
Rokoko has its own face capture system. Does it not work with iClone?
@hiskishow4 жыл бұрын
This is insane!!!
@theo.jovitch4 жыл бұрын
Hey Matt, cool stuff! Are you planning on creating a keyframe system in Cine Tracer in order to be able to time the different actor performances so that a series of performances from a single actor, as well as several performances from multiple actors can realistically be stitched together? Also being able to keyframe "NPCs" to sync behavior with live performances would be amazing.
@CinematographyDatabase4 жыл бұрын
I'm working on something like that to help sync all the actions together. I want to avoid the traditional "timeline and clips" metaphor however.
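Purely as an illustration of a non-timeline approach (not how Cine Tracer does or will do it - that isn't public): one alternative metaphor is a cue sheet, where named cues start groups of recorded performances together, the way a stage manager calls a show. A minimal sketch with invented names:

```python
# Illustrative sketch only - NOT Cine Tracer's implementation. Named cues start
# groups of recorded takes together instead of placing clips on a timeline.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class Cue:
    name: str
    actions: List[Callable[[], None]] = field(default_factory=list)  # e.g. "play take 3 on actor A"
    next_cue: Optional[str] = None  # chained after this cue fires

class CueSheet:
    def __init__(self) -> None:
        self.cues: Dict[str, Cue] = {}

    def add(self, cue: Cue) -> None:
        self.cues[cue.name] = cue

    def fire(self, name: str) -> None:
        cue = self.cues[name]
        for action in cue.actions:   # start every performance in this cue together
            action()
        if cue.next_cue:             # chained immediately here; a real system would
            self.fire(cue.next_cue)  # wait for the takes to finish or for a trigger

sheet = CueSheet()
sheet.add(Cue("scene1_enter", [lambda: print("play: hero walk-in, take 2")], "scene1_dialogue"))
sheet.add(Cue("scene1_dialogue", [lambda: print("play: hero line, take 5"),
                                  lambda: print("play: NPC reaction, take 1")]))
sheet.fire("scene1_enter")
```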
@ntygamer203 жыл бұрын
What model is that vcam? I can't find it by searching.
@sondosa.90513 жыл бұрын
Does the Smartsuit Pro capture finger motion?
@justinkringstad41674 жыл бұрын
why don't the hands move?
@keithtam88594 жыл бұрын
If I have a PC, would the iPhone work with it? Or do they both need to be Apple products? Thanks
@Myshowbro4 жыл бұрын
My question for you is: are there specific limitations that keep Android from being supported? It would have helped if you went into more detail.
@CinematographyDatabase4 жыл бұрын
It's technically possible, but no developers have supported Android yet for facial MOCAP. Apple has a very robust depth-sensor-based facial anchor system, and the API to use it in third-party apps is pretty straightforward.
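For context on what that facial anchor data looks like downstream: ARKit reports a set of roughly 52 named blendshape coefficients per frame, each in the 0-1 range, and apps in this pipeline stream those values to the PC. A hedged sketch of a receiver - the JSON-over-UDP format below is invented for illustration and is not the actual Live Face or Live Link Face wire protocol:

```python
# Hypothetical receiver for per-frame ARKit-style blendshape coefficients sent
# as JSON over UDP. The wire format is invented; only the blendshape names and
# the 0..1 coefficient range come from ARKit itself.
import json
import socket

ARKIT_KEYS = {"jawOpen", "eyeBlinkLeft", "eyeBlinkRight",
              "browInnerUp", "mouthSmileLeft", "mouthSmileRight"}  # small subset of the ~52 shapes

def drive_character_morph_targets(weights: dict) -> None:
    print(weights)  # stand-in: a real app would forward these to iClone / UE4

def serve(port: int = 9000) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _addr = sock.recvfrom(65535)
        frame = json.loads(packet)                  # e.g. {"jawOpen": 0.42, ...}
        weights = {k: max(0.0, min(1.0, float(v)))  # clamp to the expected range
                   for k, v in frame.items() if k in ARKIT_KEYS}
        drive_character_morph_targets(weights)

if __name__ == "__main__":
    serve()
```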
@Myshowbro4 жыл бұрын
@@CinematographyDatabase thank you for the timely response it was greatly appreciated
@ZigUncut2 жыл бұрын
You could use a GoPro mounted to a helmet and use Faceware on a PC. The results are amazing - in particular the eyes, which is something the iPhone isn't so good at.
@griffon41633 жыл бұрын
Guys, look - I have an iPhone, a mocap suit, and a character. I combined the actor with the model and it came out great, but after all of that, can you also animate the character's face?
@patriotaRBC4 жыл бұрын
No hand motion capture?
@grapicsrz4564 жыл бұрын
Hi, thanks for the video. Do you think that with all this equipment it is easy to do fitness animations?
@mr.shohabbos90023 жыл бұрын
Where are you from?
@mr.shohabbos90023 жыл бұрын
Where did you study?
@sondosa.90514 жыл бұрын
Kindly, what do you recommend for hand/finger motion capture and facial motion? I need to capture motion to make an avatar for sign language. What software and hardware should I buy?
@scottownbey80413 жыл бұрын
I’m surprised you don’t have the Rokoko Mocap gloves- curious if you have any experience with these in the same pipeline?
@Ranger7Studios4 жыл бұрын
Amazing opening.
@SpiritscometogetherProductions4 жыл бұрын
Hey Matt, really love the work you do for us all 🎬📽😉 Question: is there a way we can import our own 3D characters from iClone 7 into Cine Tracer? That would be so awesome. Keep up the great work.
@CinematographyDatabase4 жыл бұрын
That’s not currently possible. There is a lot of work to import them and then clean them up in UE4.
@SpiritscometogetherProductions4 жыл бұрын
@@CinematographyDatabase Thanks for your reply. Can't wait for that feature to show up in cinetracer 😎 really love that program 😉
@raulmoreno24993 жыл бұрын
I wonder if virtual production can be done through a laptop?
@CinematographyDatabase3 жыл бұрын
A modern MSI laptop with a 30** GPU would be a good place to start, but a full PC works better.
@ToonaciousD4 жыл бұрын
What's the link to the virtual camera hardware you use? Thanks.
@ramiru32644 жыл бұрын
How much is the mocap suit?
@wittdesigns4 жыл бұрын
Hey Matt, great work! Is there an update to Cine Tracer that allows importing FBX scenes, custom characters, etc.? Keep going strong!
@iHeartDom4 жыл бұрын
Do you need the vcam to put your digital character into the real world like that or can it technically be any camera?
@FINALFORMSTUDIOS4 жыл бұрын
I see that you have the Ninja V on the end there for your virtual cam setup. Are you able to see the Unreal engine display on there or what is its purpose?
@Jsfilmz4 жыл бұрын
He records from UE4 straight to the Ninja.
@goodcitizen17th344 жыл бұрын
Awesome
@belbozs4 жыл бұрын
You should make a helmet rig with your iPhone
@woodrowjang4 жыл бұрын
How big are the file sizes for the animation data? Say, for something short like your intro speech?
@bushellmediainc37004 жыл бұрын
We rendered a 2 min animation from iClone to Unreal and it was 72 GB; we used HandBrake to compress it.
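That figure is plausible for an uncompressed or frame-sequence render rather than a delivery codec, which is why a HandBrake pass shrinks it so much. A quick back-of-the-envelope check, assuming a 24 fps render (the actual settings aren't stated):

```python
# Rough arithmetic only; the actual frame rate and codec used are not stated.
size_gb, seconds, fps = 72, 120, 24

frames       = seconds * fps                 # 2880 frames
mb_per_frame = size_gb * 1000 / frames       # ~25 MB per frame
mbit_per_sec = size_gb * 8 * 1000 / seconds  # ~4800 Mbit/s sustained data rate

print(frames, round(mb_per_frame, 1), round(mbit_per_sec))
```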
@AAa-tn5rz4 жыл бұрын
Could we do this with extra Vive trackers and Valve Index controllers to get full body with finger tracking, if we don't have Rokoko? And use iPhone 10 face tracking, to record full body, fingers, and face?
@CinematographyDatabase4 жыл бұрын
Yes, but you would need to write your own IK elbow solver and translate the hand/finger tracking yourself in Unreal Engine and also write/maintain the iPhone iOS ARkit app that runs on the phones. Unless you are asking if it's possible in Cine Tracer, then I would have to do all of those things lol.
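On the "IK elbow solver" part: the core of most elbow solvers is the classic two-bone, law-of-cosines step. A hedged sketch in 2D (UE4 ships its own two-bone IK node; a real 3D solver also needs a pole-vector/elbow hint, joint limits, and retargeting, none of which are shown here):

```python
# Two-bone IK via the law of cosines, in 2D for clarity. Angle conventions:
# the shoulder angle is absolute (world), the elbow angle is relative to the
# upper arm, with 0 meaning the arm is fully straight.
import math

def two_bone_ik(upper_len, lower_len, shoulder, target):
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    dist = math.hypot(dx, dy)
    # Clamp so a triangle exists (target out of reach or too close).
    dist = max(abs(upper_len - lower_len) + 1e-6,
               min(upper_len + lower_len - 1e-6, dist))
    # Law of cosines: angle at the shoulder between the upper arm and the
    # shoulder->target line, and the interior angle at the elbow.
    cos_shoulder = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    cos_elbow    = (upper_len**2 + lower_len**2 - dist**2) / (2 * upper_len * lower_len)
    shoulder_angle = math.atan2(dy, dx) + math.acos(max(-1.0, min(1.0, cos_shoulder)))
    elbow_angle    = math.acos(max(-1.0, min(1.0, cos_elbow))) - math.pi  # bend from straight
    return shoulder_angle, elbow_angle

# Example: a 30 cm upper arm and 28 cm forearm reaching a point ~41 cm away.
print(two_bone_ik(0.30, 0.28, (0.0, 0.0), (0.40, 0.10)))
```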
@PiPiCatStudio4 жыл бұрын
No finger capture - too sad.
@wongj32794 жыл бұрын
So the real-time live avatar streaming works, right? Do you have plans to do this?
@CinematographyDatabase4 жыл бұрын
Yeah, I'll show some real-time LiveLink into UE4.
@ranjithvenkat44104 жыл бұрын
I want Cine Tracer for Android.
@ВладЛеонтьев4 жыл бұрын
And where are the fingers?
@joelarvidsson4 жыл бұрын
Love this video. Seems to work without any large hiccups.
@noahstudio37174 жыл бұрын
You said Live Face is free - I just checked, and it is not... it is $399.
@marcuscisneros85284 жыл бұрын
Ashu Ka, the iPhone app is free, but it's pretty much useless without that $400 plugin for the desktop app.
@ramiru32644 жыл бұрын
@@marcuscisneros8528 So you have to pay for the plug-in and the software? Or just the software, and the plug-in is already included?
@VicVegaTW4 жыл бұрын
You are a one man army
@digimaks3 жыл бұрын
iPhone iPhone...... I USE ANDROID!! Why are there no alternatives!? This sucks.
@juanayala46784 жыл бұрын
I need an iphone
@theinfocircles74063 жыл бұрын
In the introduction the mocap character's mouth looks like he/she is saying "i" the whole time. Hardly any changes to the shape of the mouth when talking. Like a Kermit the Frog puppet.
@SharpDesign4 жыл бұрын
Face is cool but the flat hands are throwing me off a bit.
@CinematographyDatabase4 жыл бұрын
Sharp Design yeah I hope to have some hand tracking hardware in the future
@SharpDesign4 жыл бұрын
@@CinematographyDatabase awesome.
@oby12 жыл бұрын
Ouff... the facial capture ain't that great
@overdev19934 жыл бұрын
The facial animations look really crappy. Perfect body motion capture + crappy facial animations is a really bad combination.
@CinematographyDatabase4 жыл бұрын
There is still some tweaking on my end to get the facial animations better. The phone wasn't placed in an ideal position and you have to learn how to puppet it correctly. There are only so many blend shapes that it can match to.
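A common workaround for the "you have to over-act into the phone to puppet it" issue is to put a gain or curve on the raw coefficients before they drive the character's morph targets, so ordinary-sized expressions still read on screen. A hedged sketch - the gain values below are invented, and tools in this pipeline typically expose their own strength/calibration controls that do the same job:

```python
# Illustrative only: reshaping raw 0..1 blendshape coefficients so small, natural
# expressions are boosted. Gain values are made up for the example.
GAINS = {"jawOpen": 1.4, "mouthSmileLeft": 1.6, "mouthSmileRight": 1.6, "browInnerUp": 1.3}

def remap(weights: dict, gains: dict = GAINS, exponent: float = 0.8) -> dict:
    out = {}
    for name, w in weights.items():
        w = max(0.0, min(1.0, w)) ** exponent        # ease-out curve lifts small values
        out[name] = min(1.0, w * gains.get(name, 1.0))
    return out

print(remap({"jawOpen": 0.30, "mouthSmileLeft": 0.25, "eyeBlinkLeft": 0.9}))
```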
@alex05894 жыл бұрын
That's because you're used to games that have animators work for two years on cutscene facial animation ahah, this is just a proof of concept.
@KRGraphicsCG4 жыл бұрын
I keep noticing this with the iPhone... you have to exaggerate your facial motions to get everything looking right. It's why I started creating my own facial pipeline with custom hardware.
@KRGraphicsCG4 жыл бұрын
@@CinematographyDatabase It is very hard. It's why I prefer the precision of marker based facial capture... I can capture all kinds of nuances in the face with this method. I hope to learn more from you
@KRGraphicsCG4 жыл бұрын
Also, the frame rate in the iClone face capture is 30 FPS... you need at least 60 FPS to get VERY good facial animation.