Thanks for watching and sharing! Let me know what you guys thought about this as well, I'm curious to hear others' thoughts! If you want to learn compositing, check out our course below: www.compositingacademy.com/nuke-compositing-career-starter-bundle
@focusmedia2465 • 8 months ago
This is really exciting and gives me hope of getting into compositing footage. I love that you can see your scene in real time, what an absolute game changer. Thank you for all of the work you put into this video. The scene you created was so amazing and detailed as well, very impressive!
@TheMagnel777 • 24 days ago
The tutorial is very inspiring and the final product looks great! Thanks a lot.
@TheDavidTurk • 9 months ago
this may be one of the best VFX breakdowns I've ever seen!! So awesome to see how you made this using so many different techniques and tools! BRAVO!!!!
@JoshuaMKerr • 9 months ago
Great work on this breakdown, Alex. I'm so impressed by what you managed to achieve. It's an amazing system, isn't it?
@dakshpandya6559 • 8 months ago
I did the exact same setup last year: connected my iPhone with the Live Link camera and attached it on top of my DSLR, so the DSLR shot live action while the iPhone captured the Unreal footage. Combined together, they made an amazingly well-made product.
@snehalkm • 9 months ago
This is really great, Alex. Stuff we really don't learn even while working in VFX. Super proud and happy to know you from MPC and to be watching you here on YouTube. Keep making! Just watching you make this made me fall in love with movies again. Thank you so much, and good luck with the rest of the videos. Looking forward to them.
@lucywallace • 9 months ago
This is really amazing! Thanks so much as always Alex for sharing such interesting and exciting VFX techniques 😊 Would definitely love to try out this virtual production workflow.
@buddyfx7026 • 9 months ago
Yea this was great, thanks so much for the thoughtful explanation of the entire process. Instant sub!
@whypee07 • 9 months ago
Well, this video has motivated me once again after all the burnout inflicted by the writers' strike and the extra pressure of working and learning side by side. So good to see that with such a small team and so few assets we can create such a stunning shot.
@eliotmack • 9 months ago
This feeling of 'I can do this' is what I most wanted to make happen when building Jetset. It means a lot to me that it gave you that!
@FX_AnimationStudios • 9 months ago
Super awesome!! Really nice seeing the BTS.
@johnwoods9380 • 8 months ago
Really great advice for people who have a friend who owns an ABANDONED BARN. Lack of physical space is the toughest obstacle to my plans. Can't even use my garage, because it's full of someone else's stuff. I'll call you back when I find my barn
@CompositingAcademy • 8 months ago
Where there's a will there's a way! Worst-case scenario, hang up a greenscreen at night on a non-windy day and do it outside somewhere. There's always a workaround; the barn was lucky, but we didn't even plan to use it originally. It was also quite cold and batteries died a lot, so workarounds come with their own problems. My belief, though, is that constraints create creativity.
@pietromaximoff4365 • 9 months ago
Thank you, amazing video and tutorials!
@SeongGyu_ • 6 months ago
Hello, Alex! Can you tell me the name of the warm-colored light in the video? Or its color temperature in Kelvin? (The light on the right at around 1:44!)
@ChronicleMove • 9 months ago
Cool! It would be interesting to hear what difficulties and limitations your team encountered when using this pipeline.
@CompositingAcademy • 9 months ago
Detailed in a few of the other responses. The main one: if you want to refine one of the real-time tracks, there's a workflow they've developed, and it ended up working pretty well. That was probably the biggest thing we worked together to figure out, and they're building a gizmo that wraps that refinement workflow up in either Nuke or SynthEyes.

Another hurdle was figuring out the lighting. The app can load in a model but not lighting (they're adding this feature very soon though, if you have a workstation on set). Mainly I just screenshotted the Blender scene from the various angles I knew I would be shooting, and moved the real lights around accordingly. We were in a barn in cold temperatures so I wasn't going to bring a workstation there, but I can imagine this workflow will be insane once you have Unreal Engine live-linked into your viewfinder. They're also adding the ability to stream an Eevee viewport into the phone if you want to use Blender instead of Unreal. It gives you a great idea of the lighting and how to match it in the composite.

Another factor is that you need enough light for the iPhone to see features and stay stable, so I imagine this wouldn't work in pure darkness. Even this was shot back-lit though, so I think you just need to plan accordingly; I was already stretching it here and it worked.

They have some other features coming as well. Currently there's a feature called 'infinite greenscreen' which essentially garbage-mattes out the edges that aren't greenscreen. Right now it uses auto-detection, but on an uneven greenscreen it didn't work as well, so they're going to change the approach to snapping corners and then garbage-matting the outside away. Nice to have, but I still had no problem shooting the scene since probably 80% of it was in the greenscreen area.

For orienting the scene they also have a printed marker you can use. This is really useful for flipping the scene around etc. without having to position the origin by hand in the app: you just aim the phone at a piece of paper with a QR code, and it orients the scene to that marker.
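To make the marker idea concrete, here is a tiny illustrative sketch (my own numpy, with made-up function names, not Jetset's actual code): orienting the scene to the detected marker is one matrix multiply, and flipping the scene is just swapping in a rotated offset instead of dragging the origin by hand.

```python
import numpy as np

def make_pose(yaw_deg, position):
    """4x4 rigid transform: rotation about the up (Y) axis plus a translation."""
    a = np.radians(yaw_deg)
    pose = np.eye(4)
    pose[:3, :3] = [[np.cos(a), 0.0, np.sin(a)],
                    [0.0, 1.0, 0.0],
                    [-np.sin(a), 0.0, np.cos(a)]]
    pose[:3, 3] = position
    return pose

def scene_to_world(marker_pose_world, scene_offset_from_marker):
    """Chain world <- marker <- scene: the detected marker anchors the CG set."""
    return marker_pose_world @ scene_offset_from_marker

# Say the marker is detected 2m right / 3m forward of the tracking origin:
marker = make_pose(0.0, [2.0, 0.0, 3.0])
# 'Flip the scene around' = a 180-degree offset about the marker:
flipped = scene_to_world(marker, make_pose(180.0, [0.0, 0.0, 0.0]))
```

The positions and the 4x4 convention here are assumptions for illustration; the point is just that one detected pose re-anchors the whole scene.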
@VFXCentral • 9 months ago
Absolutely amazing!!
@VFXforfilm • 9 months ago
Looks cold in that barn.
@CompositingAcademy • 9 months ago
It was terrible, batteries kept dying, haha.
@GabrielMendezC • 9 months ago
For an aspiring VFX artist such as myself, this is really awesome content to learn from. Thanks Alex! 🙌
@malcolmfrench • 9 months ago
Game is changing! GREAT job Alex
@moisesdelcastillo6703 • 9 months ago
Amazing, thanks! Looking forward to more.
@JuJu-he8ck • 9 months ago
The shot works great. Boom and bam.
@otisfuse • 9 months ago
WOW, the revolution has begun.
@NirmalVfxArtist • 9 months ago
Fantastic! Manipulating light to our advantage in terms of saving cost and time is something rare these days!
@JasonKey3D • 9 months ago
Nice EmberGen info @ 4:00 👍
@BradfordRex • 9 months ago
Great video! The final shots look awesome! Love that the toxic goo was practical. How solid was the tracking data out of JetSet? Did you have to clean up or re-track or was it sufficiently accurate?
@CompositingAcademy • 9 months ago
The tracks are pretty good. For the hips-up, out-of-focus shot I ended up just using the real-time track out of the box. For the one where the feet are prominently featured, I wanted sub-pixel accuracy, so I refined it. I worked with them to figure out a workflow that essentially "snaps" (orients/scales) any post track you do to the real-time camera. When you do a Nuke track normally, it's not at real-world scale and it's not oriented to your CG set at all, so the "refined" track workflow is: do your track in post, then hit a Python-script button to "snap" that camera to the real-time camera, where we know the scale and orientation are good in world space. They're working on a Nuke (or SynthEyes) gizmo to wrap that workflow up, but it worked really well. Orienting one camera is one thing, but once you start having sequences this is a big time saver. Additionally, you'll probably have some shots where the real-time track works as-is, so you can literally just start rendering/compositing.
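For anyone curious, that "snap" is essentially a similarity transform (one scale, one rotation, one translation) estimated against the real-time camera and then applied to every frame of the post track. Here is a rough illustrative numpy sketch of the idea; this is my own reconstruction under stated assumptions, not their actual script, and the function name is made up.

```python
import numpy as np

def snap_post_track(post_Rs, post_ts, rt_R0, rt_t0, rt_t1):
    """Snap a post-solved camera track (arbitrary scale/orientation) onto the
    real-time track's world space, using frame 0 for orientation/position and
    the frame 0 -> frame 1 camera baseline for scale."""
    post_Rs = [np.asarray(R, float) for R in post_Rs]
    post_ts = [np.asarray(t, float) for t in post_ts]
    # Scale: ratio of the distance the camera travels between the two frames.
    scale = np.linalg.norm(rt_t1 - rt_t0) / np.linalg.norm(post_ts[1] - post_ts[0])
    # Rotation aligning the post camera's frame-0 orientation to the real-time one.
    R_align = rt_R0 @ post_Rs[0].T
    # Translation making the frame-0 positions coincide after scaling/rotating.
    t_align = rt_t0 - scale * (R_align @ post_ts[0])
    snapped_Rs = [R_align @ R for R in post_Rs]
    snapped_ts = [scale * (R_align @ t) + t_align for t in post_ts]
    return snapped_Rs, snapped_ts
```

Once the whole track is in the real-time camera's world space, every frame inherits the correct scale and orientation, which is why it saves so much time on sequences.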
@BradfordRex • 9 months ago
@CompositingAcademy Thanks for the insight into this! That's great to hear that you could use the real-time tracking for several shots. I was curious about the idea of somehow using the real-time track to refine or orient the post track. When you mentioned scanning the set to have a model of it in post, my mind went to the SynthEyes tool that uses photogrammetry to make a model to improve tracking. Sounds like this workflow is something similar. Very cool! I can't wait to use this app and workflow myself. Hoping to shoot a project with it sometime this year.
@CompositingAcademy • 9 months ago
Very similar to that! It's especially useful if you have more takes. Imagine you had 10 camera angles pointing at a CG scene from different positions: aligning and scaling all of those would normally be a painstaking process. Interestingly, they also export some extra data for post workflows, like an AI matte that you can use to garbage out the post tracker if you want to (basically to roto out the moving person, etc.). What's also interesting is that this can be used for CG filmmaking, or for CG objects placed into real scenes. I didn't go into it on this project, but those are also possible here.
@eliotmack • 9 months ago
@BradfordRex That's exactly the method we're setting up for Nuke and SynthEyes. For Nuke we can re-orient a solved track to match the original scale and location of the Jetset track, and solve the usual alignment and scale problems encountered with single-camera post tracking. SynthEyes will be extremely interesting, as we can 'seed' the track and then tell SynthEyes to refine it with a variety of tools (soft and hard axis locks, survey points to 3D geo, etc.). The Jetset live tracks are good enough that we want to use them as the base for a final subpixel solve when the shot demands it.
@BradfordRex • 9 months ago
@eliotmack That sounds like the best workflow to me: take your live track and refine it, instead of having to start all over. All that data is invaluable; even if it isn't sub-pixel, it has to be helpful when refining to get sub-pixel accuracy. I'll keep an eye out for that SynthEyes update. I really want to get out and play with Jetset myself!
@DEADIKATED • 8 months ago
Awesome! I'm actually trying to figure out a solution for a complicated green screen shot at the moment. Very inspiring. I subbed.
@ninjanolan6328 • 9 months ago
Ian Hubert has been doing things like this for over a decade
@AlejandroGarciaMontionargonaut • 8 months ago
Great workflow! Quick question: why are green tracking points used on this green screen? Isn't it better to use a different colour, or is it because of this workflow?
@CompositingAcademy • 8 months ago
In this case the green markers are used because if the character passes in front of them, they can still easily be keyed out, while there's still enough contrast to track the pattern. I believe the phone app also tracks better when there are features, but mainly I put them there in case I wanted to re-track one of the shots in Nuke afterwards with a more refined track.

In darker lighting conditions pink tape is sometimes used, because it's very bright and creates a lot of contrast against the green. However, it isn't keyable, so if an actor walks past it you'll have to paint or rotoscope the marker out.
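To see why darker green tape still keys out while pink tape doesn't, here is the classic green-difference screen matte that most keyers start from (illustrative numpy, not a production keyer and not the specific tool used on this project):

```python
import numpy as np

def green_difference_matte(rgb):
    """Classic screen matte: a pixel reads as 'screen' only where green
    dominates both red and blue. rgb is (..., 3), values in 0-1."""
    rgb = np.asarray(rgb, float)
    g = rgb[..., 1]
    rb_max = np.maximum(rgb[..., 0], rgb[..., 2])
    return np.clip(g - rb_max, 0.0, 1.0)

screen = green_difference_matte([0.10, 0.70, 0.10])  # lit greenscreen
marker = green_difference_matte([0.05, 0.45, 0.05])  # darker green tape marker
pink   = green_difference_matte([0.90, 0.30, 0.50])  # pink tape
# The screen and the green marker both give a positive matte (they key out),
# while pink tape clamps to zero: it stays in the foreground and would have
# to be painted or rotoscoped out.
```

The sample colors are made-up examples; the point is that any marker whose green channel still dominates will fall on the "screen" side of the key.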
@ocdvfx • 9 months ago
Leveling up!
@SHVWNCOOPER • 9 months ago
This is exactly what I need. Now if you have Unreal Engine tutorials, I'm subbing lol
@CompositingAcademy • 9 months ago
Later this year Unreal will come into the picture.
@2artb • 8 months ago
Nice work and vid, thanks!
@Ricoxemani • 9 months ago
This is really cool. Only thing holding me back from being able to do this is not having a huge barn to shoot in.
@CompositingAcademy • 9 months ago
Outdoors is a good workaround too! Hang up a greenscreen on the back of a garage or any wall, and shoot at night if you need to control the lighting.
@Gireshahid33 • 9 months ago
Amazing
@marktech2378 • 9 months ago
Nice work 👌👌
@HadjFilmz • 26 days ago
Hey, quick question: in your keying course do you talk about keying with After Effects and bringing it into Blender?
@CompositingAcademy • 26 days ago
At the moment it's only for Nuke. Keying in After Effects isn't the best workflow, especially for difficult keys where you need to combine a lot of techniques.
@HadjFilmz • 26 days ago
@CompositingAcademy Cool. And is there a way I can key in Nuke and export the PNG into Blender?
@HadjFilmz • 26 days ago
@CompositingAcademy I ask because I've always been fascinated with keying, but in all the other software I use, the key has fringing or flickering around the hair and messes everything up. So is it possible to key in Nuke and export the PNG into Blender?
@LFPAnimations • 9 months ago
How accurate is the track output of Jetset? Would you have to do a normal 3D track in post still or can you use the app's track for final pixel?
@CompositingAcademy • 9 months ago
The tracks are pretty good. For the hips-up, out-of-focus shot I ended up just using the real-time track out of the box. For the one where the feet are prominently featured, I wanted sub-pixel accuracy, so I refined it. I worked with them to figure out a workflow that essentially "snaps" (orients/scales) any post track you do to the real-time camera. When you do a Nuke track normally, it's not at real-world scale and it's not oriented to your CG set at all, so the "refined" track workflow is: do your track in post, then hit a Python-script button to "snap" that camera to the real-time camera, where we know the scale and orientation are good in world space. They're working on a Nuke (or SynthEyes) gizmo to wrap that workflow up, but it worked really well. Orienting one camera is one thing, but once you start having sequences this is a big time saver. Additionally, you'll probably have some shots where the real-time track works as-is, so you can literally just start rendering/compositing.
@SeongGyu_ • 6 months ago
Can you tell me the product name of the monitor you use? And can you upload a video about the monitor?
@jimmahbee • 9 months ago
Awesome
@ryanansen • 9 months ago
How accurate/usable is the track that you get from this workflow? Is it something just good enough for look dev or do you find it good enough/comparable to a track you might be able to solve out of something like Syntheyes?
@CompositingAcademy • 9 months ago
From the tests I did, sometimes it was good enough for the final track; in other cases I wanted to refine it, especially when the feet were very prominent. They have a refined-tracking workflow if you want to use SynthEyes or Nuke: it snaps the post track to the real-time track, which saves you time on orienting, scaling, and positioning the new camera in world space.
@WastedTalentStudios • 6 months ago
So do you create the world first and then see it on your iPhone? How does that work?
@CompositingAcademy • 6 months ago
Yes, exactly. You can buy a kitbash set from something like Big Medium Small and place those objects in a 3D scene. Then you load that into the app and it appears on top of the real world.
@themightyflog • 7 months ago
Which Accsoon SeeMo can I use?
@FabianPadillaFotografía • 9 days ago
I saw you on bigscreen.
@SeanAriez • 9 months ago
Epic
@WhereInTheWorldIsGinaVee • 9 months ago
It looks like you were using a gimbal with the camera... did that cause any problems with Lightcraft Jetset? Could you use it with a DJI Ronin?
@CompositingAcademy • 9 months ago
Nope, no problem, this combination was awesome! I balanced the gimbal with the iPhone and attachment on top; this setup works great with the gimbal. This is with the Ronin RS3. You basically check the iPhone to see your CG, but your final image comes out of the Ninja V. Obviously it's two different cameras so the view is slightly different, but Jetset gives you a super clear idea of where you are framed up against your CG with your actor.
@unspecialist • 8 months ago
VFX supervisor here, nice video and nice concepts. But in no way is this called virtual production (an outdated term for volume production); the whole point is to avoid the green screen, as you can see by those two unjustified specular lights on the white bucket the actor is carrying. This is just traditional green screen work with tracked set data. You need the LED panels to create your light environment and volume correctly.
@CompositingAcademy • 8 months ago
I would disagree that this isn't virtual production. Traditional greenscreen doesn't let you see what you're filming; personally I think it helps clients understand what we're doing (shooting, and seeing the result). This app/company also markets itself with that term for that reason. I don't think LED volume companies can claim the term virtual production, although they would like to and have poured millions into doing so.

Also, sure, there might be a spec highlight, but these shots are impossible to do on an LED stage without compositing: there are foreground elements, a replaced floor, rack focus, a virtual camera-move extension, etc. This is why I chose this environment; it plays to the strengths of fully CG scenes in a contained space, while costing 100x less with arguably a better result than an LED volume.

Personally I think LED stages are a crossover technology to something much better, most likely virtual production seen through VR headsets on set, with the greenscreen (if you'll even need one) replaced live or in pre-vis. Smaller, more cost-effective panels would definitely be interesting for reflections; I think that's cool. But from a first-principles and even physics standpoint, there are a lot of limitations of LED virtual production that aren't often honestly discussed.

Also, you can get a lot of realistic lighting without LED panels, which has been done for years; the only time you need LED panels is when you have many obviously reflective objects. That includes greenscreen outside, since you can't get realistic direct sunlight from LED panels. The best mix is actually greenscreen projected on an LED stage with a virtual environment around the edges, but that still makes your cost ridiculously high for an arguably diminishing return, unless you're filming chrome characters.
@monarchfilmspx0955 • 3 months ago
Hard agree with the actual VFX supervisor here... This is just green screen work. You really think no one could see what they shot on green screen before Unreal? 😂😂😂😂
@DannyshRosli • 9 months ago
Better than LED screens, I must say.
@CompositingAcademy • 9 months ago
It all depends on the lighting; you can have really bad results on greenscreen if it's lit wrong too. I think LED stages could be good for very reflective scenes or characters (like the Mandalorian, he's chrome), but they're limiting (and very expensive) in a number of ways that aren't often discussed.
@tomcattermole1844 • 9 months ago
I swear this tech didn't exist a couple of months ago; I was trawling the internet for solutions and couldn't find anything. Ended up having to stitch together 3 After Effects camera tracks across one 2-minute clip :I
@CompositingAcademy • 9 months ago
Oh yeah, for long camera tracks this is going to be really interesting; I didn't even think about that. It was also interesting that I could use the real-time track on an out-of-focus background shot. Those shots are usually much harder to track.
@eliotmack • 9 months ago
You're correct! We only introduced this in February at the British Society of Cinematographers' show, so it's very new. We like long camera tracks!
@tomcattermole1844 • 9 months ago
@eliotmack Kudos. I was scratching my head trying to figure out what piece of the puzzle was missing to make something like this possible, because it felt like all the hardware you'd need can be found in a modern phone anyway. As someone who wants to push concepts as far as possible with smaller crews/budgets, this is going to be nothing short of a lifesaver.
@AstroMelodyTV • 9 months ago
@tomcattermole1844 I'm actually working on a grad project right now where I'm going to have to do a lot of camera tracking. Do you have any pointers on how I could use this method with just the iPhone? Or should I try to get a cine camera as well? 😅
@JungNguyen09 • 9 months ago
Are courses N101-104 currently discounted? I really want to buy this course 😌
@violentpixelation5486 • 8 months ago
❤👌💯🔥 please more #UnrealEngine #VirtualProduction
@rossdanielart • 9 months ago
Is the ProRes RAW any good? Does it keep more data that is useful for VFX?
@CompositingAcademy • 9 months ago
Super useful for keying, and it also gives more flexibility with grading. You basically transcode the ProRes RAW into ProRes 4444 or directly into EXRs; it really helps. Also it's just less compressed overall, so everything has really crisp detail.
@StudioWerkz • 9 months ago
Which Accsoon SeeMo can be used? I see a regular, a Pro, and a 4K version.
@CompositingAcademy • 9 months ago
All of them. The standard SeeMo is HDMI and the Pro is both HDMI and SDI, but they work the same.
@trizvfx • 8 months ago
Fuck yes Alex! Great work.
@themightyflog • 9 months ago
Any tutorials on Jetset?
@CompositingAcademy • 9 months ago
Possibly! If more people ask for it, it's something I might do.
@themightyflog • 9 months ago
@CompositingAcademy I would love to see your lighting tutorials. For real, everyone but you seems like they're still stuck on green screen.
@CompositingAcademy • 9 months ago
Good idea. I'll probably talk more about that in some upcoming tutorials on these shots. Lighting and compositing are 100% the reason people aren't getting the results they want. It's also the same reason a lot of CG environments look super video-gamey: mainly, people don't know how to control contrast and light.
@orbytal1758 • 9 months ago
@CompositingAcademy A tutorial on that would be amazing. It's the one setback I have when doing anything virtual. Even in UE with Megascans it still looks like a video game, plus my compositing skills need some work.
@SortOfEggish • 9 months ago
This is all impressive until the client says the barn limits their idea and they want to shoot something in Times Square lol
@CompositingAcademy • 9 months ago
Yeah, you would need a bigger stage. This barn space is equivalent to a smaller greenscreen stage; it's about the same size as a few you can rent. This workflow would still work on a wrap-around stage, though, if you need to pan more.
@roxanehamel1753 • 9 months ago
I have an FX3 too, but I use it with a PortKeys LH5P II and a Zhiyun Crane 4; will it work? Do I absolutely need ProRes RAW to do virtual production like that? And can I do it with Unreal Engine and DaVinci Resolve?
@CompositingAcademy • 9 months ago
You don't necessarily need ProRes RAW. It does help with getting super clean keys and small details, but if you use one of the better codecs for internal recording on the FX3, those will still work pretty well. Just make sure to shoot 10-bit. You can use Unreal as well: right now you can import any geometry, but they're also making a live link to see your Unreal scene live if you hook up wired to a workstation. Either way you choose to work, you can get the camera data plus plate data into Unreal afterwards and it will align with your scene.
@roxanehamel1753 • 9 months ago
@CompositingAcademy Thanks for the info. And you're using an iPhone, but can a Google Pixel 8 work too?
@eliotmack • 9 months ago
@@roxanehamel1753 Right now Jetset uses iOS devices, but even a used iPhone 12/13/14 Pro will work great. The onboard LiDAR improves the 3D tracking, and makes 3D scene scanning possible.
@aidenzacharywessley3808 • 8 months ago
@CompositingAcademy How can I become a VFX artist and a compositing artist? Can you make a video on that?
@CompositingAcademy • 8 months ago
Hey Aiden, the best way is to learn the fundamentals and build a demo reel to prove to employers that you have the skills. If you're interested in the beginner series, we've built a really good path for beginners who want to go professional here. There are a bunch of projects included, and the footage can be used for reels as well: www.compositingacademy.com/nuke-compositing-career-starter-bundle All the best!
@santhirabalasinthujan9170 • 8 months ago
Is the Accsoon SeeMo or the Accsoon SeeMo Pro good to use?
@humangameryt4729 • 4 months ago
Any app for Android? 😢
@jeshwanthreddy4121 • 2 days ago
Prisma 3D and the Node Video editor are something on Android.
@lFunGuyl • 7 months ago
You iPhone people are so annoying, but you sure get all the cool toys these days 😅
@kanohane • 9 months ago
iPhone is trash... is there any Android software? And I wish UE would add MP4 support... 😢
@travelstories2529 • 9 months ago
How much are you earning with this setup?
@jinchoung • 8 months ago
Meh. I mean, cool results, but I don't see the value-add of Jetset.
@CompositingAcademy • 8 months ago
It helps a lot when you're filming; framing up to things that don't exist is pretty unique. Personally, I used to do a lot of photography and liked moving around and finding interesting angles. You can't do that traditionally, which is why I think a lot of greenscreen stuff in the past is only backgrounds / distant stuff.
@keithtam8859 • 9 months ago
Don't know. It is subscription-based... and I am cheap LOL
@HTOP1982 • 9 months ago
Dude shoots GS and thinks this is VP. It's well done, but not correct...
@billwarner4641 • 8 months ago
The term "virtual production" came well before LED walls, but now LED walls have "taken over" the term. We're not ready to give up that fight. We think the Lightcraft Jetset approach is going to be so accessible to a much wider audience that, in the fullness of time, "virtual production" will go back to its original meaning: any way of combining live action with a synthetic background.