12:25 I can't add the Live Link camera controller in the camera role... it says "no controllers were found for this role", please help :) (In the transform role everything works fine.)
@GregCorson 2 years ago
Not clear exactly what the problem is — is the Vive not showing up as a Live Link subject, or can you not add the Live Link component controller? Are you using UE 4.27 or 5? If you're on 5, you might want to check my other setup walkthrough here: kzbin.info/www/bejne/m5PYfaShbZqJirM
@greatideas218 1 year ago
Hi Greg, at 23:56 — I am doing this in Unreal 5.1 with Vive trackers and it keeps crashing. Any ideas how to resolve this issue?
@GregCorson 1 year ago
I am not really sure; I have used the Vive in 5.1 and not had any problems with crashing. I don't think there is any significant difference in the setup between 4.27 and 5.1 as far as the calibrator goes. Does it crash randomly, or right after you press a button (if so, which button)?
@greatideas218 1 year ago
@@GregCorson It's right after I hit the Add to Nodal Offset Calibration button. I do about 4 clicks on the tracker's green light. I actually got it to work once, but the tracker was completely offset from the virtual tracker; it crashes the majority of the time, and the crash report does mention OpenCV. On a side note, would you happen to be familiar with the OpenCV detect-ArUco-marker blueprint? A main objective I want to achieve is mixed reality capture, where I can align virtual objects to either the Vive tracker or a printed ArUco marker sheet. I fed the In Lens Distortion Parameters input and used an active render target for the In Render Target, but I can never get that blueprint to recognize the ArUco marker. There is such a dearth of information on certain blueprints, which makes your tutorials such a breath of fresh air.
@GregCorson 1 year ago
There is really no documentation on Unreal's OpenCV blueprint; I haven't got it to work yet. It is possible to use the detection in the lens calibrator to line up shots with ArUcos in the studio — I use it here kzbin.info/www/bejne/rGLUfKqIqM6fY7M to line up camera shots without using a tracker. Also, when using the Vive it may take more than 4 clicks to get a reliable result; I usually do at least 10. And make sure to move the tracker a fair amount forward, back, left and right. If you do all your samples with the tracker at the same distance from the camera, it will frequently give crazy wrong results.
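The advice above — many clicks, spread in all three axes — can be sketched as a pre-solve sanity check. The calibration solves a PnP-style problem, and if every sample sits at the same distance from the camera, depth is unconstrained and the solve can return wildly wrong offsets. This is illustrative plain Python, not part of Unreal; the helper names, the 30 cm threshold, and the 10-sample minimum are made up:

```python
# Hypothetical sanity check (not an Unreal API): before solving for the
# nodal offset, verify that the clicked tracker positions have enough
# spread along every axis, not just side-to-side.

def axis_spread(samples):
    """Min-to-max range of the sample positions along each axis (x, y, z)."""
    return [max(p[i] for p in samples) - min(p[i] for p in samples)
            for i in range(3)]

def spread_ok(samples, min_range_cm=30.0, min_count=10):
    """True if there are enough samples and they cover enough of each axis."""
    if len(samples) < min_count:
        return False
    return all(r >= min_range_cm for r in axis_spread(samples))
```

If `spread_ok` would return False for your set of clicks, that roughly matches the situations Greg describes as producing "crazy wrong results".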
@greatideas218 1 year ago
@@GregCorson Thanks for your response! Yeah, I will redo the calibration and hopefully get past the error. I appreciate all the information you provide!
@terryriegel 2 years ago
At 5:00 you set your sensor dimensions for the cinecam. When I connected my lens file, it reset the sensor dimensions on my cinecam. So I double-clicked on the lens file and set the sensor dimensions there. Now when I connect the lens with the Live Link component controller, it has the correct sensor dimensions.
@GregCorson 2 years ago
Yes, I noticed that some actions were causing the dimensions in the cinecam to be reset; it seems to depend on what order you do things in. Basically I make sure the dimensions are set right every place Unreal asks for them, then go back and double-check the cinecam before I start shooting images for calibration to make sure it is still right. I believe I reported this to Epic; not sure if there is a fix in UE5.
@ShaunFoster 3 years ago
Thanks for doing this video Greg! Looking forward to trying out the workflow!
@ShaunFoster 3 years ago
Also - sharing with colleagues!
@3Dmotionblur 3 years ago
@@ShaunFoster @Greg Corson I'm one of the colleagues — thank you so much for this video! It is an incredible resource!! I'm looking at that one thumbs-down, and how anyone could thumbs-down this video is beyond me! Keep up the great work!!
@GregCorson 3 years ago
Thanks! Sometimes people will thumbs-down a video because they thought it was about something else, they didn't understand it, or just by accident. Hard to know why unless they post a comment.
@ShaunFoster 3 years ago
@@GregCorson It's happened to me before... or people try a workflow, miss a step, and don't blame the person between the keyboard and the chair.
@duchmais7120 2 years ago
Thanks Greg for sharing the informative videos... The virtual production workflow seems so daunting, time-consuming and tedious... I ain't giving up though. Will be patient with all this... Maybe learn something too... God bless.
@GregCorson 2 years ago
It is gradually getting better as the available tools improve. Please keep at it; things get better as time passes.
@onethingeverything 3 years ago
Great job on this Greg! Very helpful!
@GregCorson 3 years ago
Glad to hear it!
@mariusandreelgs 2 years ago
Hi Greg! Regarding the Raw FIZ Input Live Link actor: in UE5, those values are not updated inside the lens file. The subject is present in Live Link, and the Live Link controller under the camera has the camera role, the subject, and the lens file chosen. The variables in the actor are public.
@GregCorson 2 years ago
I'm not sure I understand your question. In this setup the FIZ data is entirely manual; you would have to set it in the virtual subject you create during the setup, and change it if you change the camera. If you actually have some kind of lens encoder that reads the lens data in real time, you would have to hook that up according to the encoder's instructions. I don't have a lens encoder myself, so I was not able to demonstrate that here. If I've missed the point of your question, please feel free to ask again.
@mariusandreelgs 2 years ago
@@GregCorson Hi Greg! Yes, the virtual object has the FIZ data set manually on its variables; I currently don't have any live lens data. I've followed your setup exactly, and the manual data entered in the virtual object should be present in the lens file, but it's not — it says 0,0,0.
@mariusandreelgs 2 years ago
@@GregCorson I will redo the process in UE4, as I believe it's bugged in UE5.
@mariusandreelgs 2 years ago
@@GregCorson Hi Greg! I've now tried making the VirtualProductionFiz object as you explained in the video, in both UE5 and UE4, and assigned it as the camera subject under the Live Link component. I've set iris 22, zoom 50 and focus 120. In the lens file, the Raw FIZ Input says 0,0,0 and is not updating to the values from the VirtualProductionFiz. I can't seem to figure out what the issue is — can you help?
@GregCorson 2 years ago
It sounds like you might not have the lens file hooked up to the camera correctly. Make sure the virtual FIZ is in your Live Link window, make sure you have added a LiveLinkComponent to your camera, set the subject to the virtual FIZ, and gone down and added your lens file to the lens file picker. All this is covered in the UE5 tutorial, but it's in two different sections — the setup of the lens file picker is in the section about making a lens file, so you might have missed it: kzbin.info/www/bejne/m5PYfaShbZqJirM. Please let me know if you are still having problems after checking this stuff.
@maxtemi60 3 years ago
Very nice video! So can we imagine this procedure replacing the setup based on your autorig and your recent checklist on GitHub?
@GregCorson 3 years ago
Yes, this should basically replace my autorig and other stuff. BTW, that checklist is more for me and is just in the repo to keep track of it, though I'll try to clean it up by the time I put out release 9. I'm also working on another tutorial to show how to set the whole thing up with a calibrated camera. I need to do a bit of experimenting to see if I should keep the autorig stuff or build a simpler rig to allow manual tweaking of the camera offset, in case the measurement from the calibrator is a little off. I'll also try to keep the old system around for people who don't have a Vive.
@resetmatrix 3 years ago
Awesome work Greg! Thanks for explaining those complicated concepts and for sharing your knowledge with us, you're a god man!
@GregCorson 3 years ago
Glad it was helpful!
@judgeworks3687 3 years ago
Thanks for covering all this.
@GregCorson 3 years ago
Any time! More updates coming.
@prasithsay4741 2 years ago
Hi Greg, thank you for taking the time to make such a useful tutorial for us. Do you have other solutions for calibrating the nodal point offset with one tracker? I've tried with two trackers from different systems, but it doesn't work.
@GregCorson 2 years ago
I have posted a few other videos with ways of finding the nodal point. The best one is probably this one I posted last night. kzbin.info/www/bejne/lYunkJyDodSqfNU This only requires a mount that lets you slide the camera forward and back, if you don't have one you can buy something that will work for as little as $10
@mn04147 3 years ago
Great and kind tutorial, thanks! If this calibrates my nodal point, can I move my camera freely afterwards without the camera rig setup?
@GregCorson 3 years ago
Using this new lens calibration system does everything my old camera rig did. It takes a little time but gets all the measurements right! I am working on updating my VPStudio sample to show how to use it better.
@mn04147 3 years ago
@@GregCorson Thank you!!
@JoseChemmassery 10 days ago
Thank you so much
@torsten9983 3 years ago
22:30 you select the point method. Do I get any advantage from using ArUco markers — the strange QR-code-style ones? PS: Thank you a lot for making this video 👍🏻
@GregCorson 3 years ago
I'm not sure about this, the Unreal docs didn't have any information about how to use Aruco markers and I haven't had time to figure it out yet.
@youssefnazih6447 3 years ago
Great job 👌 How can I follow this tutorial with the Antilatency tracking system?
@GregCorson 3 years ago
Sorry, I don't have an antilatency system so I don't know. This requires a tracker on the camera and a second tracker in the scene. If you have two antilatency trackers you can make it work but you have to do a custom object for the "tracker in the scene"
@youssefnazih6447 3 years ago
@@GregCorson thanks
@volkereffinger7144 2 years ago
Hi Greg, thank you so much for doing this tutorial! It demystified a lot of things for me! One more question: once I've done this whole calibration process, can I move the camera (like handheld or dolly)? Or does it have to stay in one place to keep things in sync?
@GregCorson 2 years ago
Assuming you have a VIVE tracker on the camera, this calibration process gives you all the information needed to move the camera and keep things in sync. You should be able to use a camera calibrated this way for pretty much any Virtual Production setup. I haven't had the time to build it into my sample project yet but it should work with a little tweaking.
@volkereffinger7144 2 years ago
@@GregCorson Yes, I have a Vive tracker on the camera. Thank you very much for answering!
@mathieutimba8996 2 years ago
Hi, great video as always, thank you. I'm French, sorry for my bad English. Do you know about an optical center correction that has to be applied for every lens? These values are measured with a crosshair in the center of the screen that you align to a real-world point at wide zoom and tele zoom. Where do you put these values in the UE4 lens file? Do you have to enter them before the lens calibration process, or does the UE4 lens file calibration take care of that? Thanks. Mat.
@GregCorson 2 years ago
Sorry, I'm not sure I understand your question completely. The optical center and other calibrations will be different for every lens, and can even be different at different zoom and focus points. As far as where to put the data: the calibration process in Unreal will calculate all of that (optical center, entrance pupil, etc.), so this type of calibration should take care of it. Before this calibration process was available, a lot of this information might have been put in other places — the Blackmagic and AJA card plugins also had places to add the distortion, optical center, etc. When using this new calibration process you need to make sure that you don't put calibration information anywhere else; the new lens calibration should take care of it all.
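For context on what the solver is estimating: the optical center and distortion coefficients belong to a Brown–Conrady-style model, the same family of parameters OpenCV-based calibrators solve for. A minimal plain-Python sketch of the radial term only (illustrative — parameter values below are made up, and Unreal's calibrator handles all of this internally):

```python
# Sketch of the radial part of a Brown-Conrady distortion model.
# Coordinates are "normalized": pixel position relative to the optical
# center (cx, cy), divided by the focal length in pixels (fx, fy).

def distort(xn, yn, k1, k2):
    """Apply radial distortion to a normalized, centered point."""
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xn * scale, yn * scale

def to_pixels(xn, yn, fx, fy, cx, cy):
    """Project a normalized point back to pixel coordinates."""
    return fx * xn + cx, fy * yn + cy
```

Note that a point exactly at the optical center is unchanged by distortion, which is why the center has to be solved for rather than assumed to be the middle of the frame.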
@Lazniak 2 years ago
Thank you, Greg
@sinanarts 2 years ago
Greg, you are priceless. Thank you! Can you please tell me — I have 1 Vive controller and 1 tracker; can this same process work for me too, or are 2 Vive trackers a must?
@GregCorson 2 years ago
It should work if you use the controller as the camera tracker. The setup built into Unreal uses the green LED on the tracker as a measuring point, so that one needs to be in view of the camera. If you want to do some messing around, you could build a tracking object that would let you use the hand controller as the one in the scene, but you would need an exact measurement of the distance from the LED on the controller to its tracking center to do that.
@jdvroum 3 years ago
Thanks a lot Greg for this video. I only have one Vive tracker. Can I calibrate the lens and keep the camera rig that we made before?
@GregCorson 3 years ago
With only one vive tracker you can still do lens distortion calibrations. I'm not sure of the best way to blend the camera rigs from my VPstudio with the camera calibration/distortion info from 4.27 but I expect to be working on that over the holidays.
@gregorplacht7841 3 years ago
Fantastic tutorial, thanks a lot! By any chance, would you make a more in-depth one that also shows the additional steps to take when using lens encoders as mentioned around minute 27?
@GregCorson 3 years ago
Towards the beginning of the video, where I first talk about lens encoders, there is a link to a video that shows how to use Glassmark/Indiemark encoders; other encoders will be similar. I meant to embed that link in several places, but YouTube doesn't let you put in the same link twice! It's here kzbin.info/www/bejne/b5_EZ4GHpb1_g9U and I'll go back and add it to the description of the video too so it will be easier to find.
@gregorplacht7841 3 years ago
@@GregCorson amazing, thanks a lot!
@terryriegel 2 years ago
At 26:50 you state that you can undistort the real camera. Can you explain how this can be done?
@GregCorson 2 years ago
I don't usually do it that way, so I'm not sure of the exact process for doing it with these new lens calibration resources. In the past you would get calibration parameters for your lens that could be entered in the Blackmagic or Aja plugins to undistort the camera. There were other ways to do this too, I just haven't made much use of them. Before this lens calibrator was available I entered the lens parameters in my Aja plugin.
@terryriegel 2 years ago
@@GregCorson I found that option after asking my question but it seems to increase the distortion when I plug in the values from the lens calibration. I’ll keep fiddling. Thanks for the excellent resources.
@Eroktic 3 years ago
Does this work for Valve Index trackers?
@GregCorson 3 years ago
Valve gear and Vive gear are the same, so it should. I don't remember seeing any calibration models for the controllers though; I will have to check. It may only work with the V1/V2/V3 tracking pucks.
@Eroktic 3 years ago
@@GregCorson Thank you for the fast response and great tips. Your knowledge is much appreciated. Thank you!
@GregCorson 3 years ago
I found out that if you want to use a Vive or Valve controller, you will need to make the calibration model yourself. If you look at the blueprint for the tracking puck model, there is a reference point component in it that represents the position of the LED on the puck. To make your own calibration model you need to measure the offset from the tracking center of the controller to the LED and create a blueprint similar to the one for the puck. The 3D model of the controller doesn't have to be an exact match as long as the position of the LED component is right. I haven't tried this, but a guy I talked to at Epic says it will work.
@DigitalStaff 1 year ago
Hey Greg, thank you so much for putting this together! Would you please help me on this issue? I'm wondering how I can set up the camera calibration without having LiveLink trackers. Essentially I just want to put the FIZ data into the Lens File so I can use a single stationary camera and match it up perfectly with the virtual world. I tried doing that but I always see "No Focus/Iris/Zoom Input" in the Raw FIZ input section in the Lens File setup. Is it possible to do that without using Vive trackers? Thank you!!
@GregCorson 1 year ago
What you want to do is look at this video of mine which shows how to align a camera with a set using an ARUCO tag and no tracker, works great and is very fast kzbin.info/www/bejne/rGLUfKqIqM6fY7M
@smillerfx 2 years ago
Thanks Greg, it's brilliant, very useful. I have 2 Vive trackers with 2 base stations set up on my camera and connected to UE via Live Link, and it's working fine. I'm going to buy a BM DeckLink Duo 2 card to try out lens calibration with your method. One thing I'd like to know: with this calibration method, can my physical camera's zoom affect the virtual one, or should I recalibrate whenever I change the physical camera's zoom? Sorry if I misunderstood, I'm new to this technology :)
@GregCorson 2 years ago
The actual zoom and focus on the camera lens can affect the virtual camera too, but you need some way to bring in the settings of the camera lens in real time. This is usually done with a "lens encoder" that attaches to the lens. Right now this can be kind of pricey to buy. If you have one of these you can calibrate the lens at a number of different zoom/focus settings and then the virtual camera will follow the lens settings sent by the encoder. If you don't have a lens encoder you can still calibrate the lens at different settings, but you would have to manually enter the lens setting you are using in unreal. So you could use different lens settings but you would not be able to do realtime zooms and focus pulls. To calibrate a zoom lens at different positions you use the same process as in this video, but you repeat it a bunch of times for different zoom and focus settings. You need to have at least two calibrations (far and near) for this to work but more is better as the properties of zoom lenses are not entirely linear.
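The idea of calibrating at several zoom points and letting the engine fill in between them can be sketched as piecewise-linear interpolation over the calibrated samples. This is illustrative plain Python only — the sample values below are invented, and Unreal's lens file builds its own curves — but it shows why more calibration points give a better fit on a non-linear zoom lens:

```python
import bisect

# Made-up (encoder_position, focal_length_mm) calibration samples for a
# hypothetical 8.8-176mm zoom. A real lens is not linear between these
# points, so a denser set of samples reduces the interpolation error.
SAMPLES = [(0.0, 8.8), (0.33, 25.0), (0.66, 80.0), (1.0, 176.0)]

def focal_length(encoder_pos):
    """Piecewise-linear interpolation between the calibrated samples."""
    xs = [s[0] for s in SAMPLES]
    if encoder_pos <= xs[0]:
        return SAMPLES[0][1]
    if encoder_pos >= xs[-1]:
        return SAMPLES[-1][1]
    i = bisect.bisect_right(xs, encoder_pos)
    (x0, y0), (x1, y1) = SAMPLES[i - 1], SAMPLES[i]
    t = (encoder_pos - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

With a lens encoder feeding `encoder_pos` in real time, an evaluation like this is what lets the virtual camera follow a zoom between calibrated points; without an encoder you would pin the position to whichever calibrated setting you are using.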
@smillerfx 2 years ago
@@GregCorson Thanks Greg for the reply. Sorry for my comment above — I didn't see that you had mentioned Loled encoders in one of the comments here. Well, I'm planning to buy a lens encoder, but for now I'll follow your tutorial. If you buy a lens encoder, maybe later, please make a detailed tutorial — kzbin.info/www/bejne/b5_EZ4GHpb1_g9U is a little bit fast and IMO a little hard to follow. Thank you again Greg. Have a good time!
@DKindenssFilms 3 years ago
Hi, is there a way to do this without a Vive? I'm using the REtracker system.
@dpredie 3 years ago
It has an option to use AprilTags/Aruco markers during the nodal offset part
@GregCorson 3 years ago
The distortion calibrator will work without the Vive; to do the nodal/entrance pupil part you would have to have a Vive. Unreal's calibrator does have an option to use ArUco markers, but I'm not really sure how that works because there was no documentation on it.
@dpredie 3 years ago
@@GregCorson Check out Vanishing Point Vector's YouTube channel. They have a tutorial for doing the 4.27 Epic ArUco nodal calibration, but it integrates with their product and needs a corresponding ArUco blueprint actor in the scene.
@DKindenssFilms 3 years ago
@@dpredie Yeah, true, I've seen their video and I'm not too sure how to get it running. REtracker has its own ArUco nodal calibration system, but it only runs in Aximmetry unfortunately.
@GregCorson 3 years ago
Unreal has an option to use Aruco markers but they haven't documented how to use it yet. I'm told it's coming soon. It would be possible to write your own code for the type of tracker retracker uses, I've thought about it but so far have not had the time to do it myself. I have suggested it to Epic though.
@bischofftep 3 years ago
This is amazingly helpful, thank you! I wonder if you have any thoughts about why my tracking in UE4.27 appears very shaky, even though the camera is perfectly still? I can't find any references to best practices or issues that would cause the tracker to "shake" like this...? It's small, but very noticeable.
@GregCorson 3 years ago
A small amount of shake/jitter is something the VIVE always has. As near as I can tell it's because the vive is really tuned for fast moving gaming. You will probably notice that the jitter is only obvious when the camera is standing still on a tripod. There is no general purpose way to get rid of it. For stationary cameras you can turn tracking off once positions are set. For cameras that will move and stop a lot, you need some kind of filtering but we haven't found one that is general purpose yet. The problem is that the jitter is very close to the magnitude of a "normal" slow camera move so finding a filter that removes jitter but doesn't cause camera motion to lag when movement starts and stops can be tricky.
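For anyone experimenting with their own filtering, one well-known approach to exactly this tradeoff is an adaptive low-pass in the style of the "One Euro" filter: the cutoff frequency rises with speed, so a stationary tracker is smoothed hard while real camera moves pass through with little lag. A plain-Python sketch — the tuning constants here are guesses and would need adjusting against real tracker data:

```python
import math

class OneEuroFilter:
    """Adaptive low-pass: heavy smoothing when still, light when moving."""

    def __init__(self, freq=60.0, min_cutoff=1.0, beta=0.05, d_cutoff=1.0):
        self.freq = freq              # sample rate in Hz
        self.min_cutoff = min_cutoff  # smoothing when nearly stationary
        self.beta = beta              # how fast the cutoff rises with speed
        self.d_cutoff = d_cutoff      # cutoff for the derivative estimate
        self.x_prev = None
        self.dx_prev = 0.0

    def _alpha(self, cutoff):
        # Exponential-smoothing factor for a first-order low-pass.
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau * self.freq)

    def __call__(self, x):
        if self.x_prev is None:
            self.x_prev = x
            return x
        # Estimate speed, itself smoothed by a fixed low-pass.
        dx = (x - self.x_prev) * self.freq
        a_d = self._alpha(self.d_cutoff)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # Raise the cutoff with speed: still -> smooth, moving -> responsive.
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```

In practice you would run one filter per translation axis (and something quaternion-aware for rotation); it reduces stationary jitter a lot, but as Greg notes, no filter fully removes the lag-versus-smoothness tradeoff for very slow camera moves.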
@P4UL3RE 2 years ago
Hi Greg, great tutorial. I do have one question though: while creating the lens file, does the tracker have to be at the nodal position of the real camera for accurate data, or does the camera just need to be tracked? I was wondering, if my virtual camera were not aligned relative to my real camera, would that give me bad data?
@dilithium72 2 years ago
My understanding is that's what the nodal offset calibration is for. The values it generates are the XYZ distances (and rotations) from the tracker mounted on your camera to the camera's sensor. Obviously, having the virtual camera and the physical camera not line up would be bad (without knowing the nodal offset). You can get away with quite a bit, though, if the perspective isn't too bad and you're not moving the physical camera around much. It really starts to show when you start trucking and pivoting the physical camera a lot; things start to misalign pretty quickly.
@GregCorson 2 years ago
The tracker just has to be attached rigidly to the camera so the two move together; it can be attached anywhere. The nodal part of the calibration process will handle getting the offset from the tracker to the camera's nodal point, and if the calibration is correct, the tracking should be fine wherever you mount the tracker. When you are doing the part where you click on the green LED on the tracker, be sure to move the tracker in all 3 dimensions — if you don't move the tracker forward and back as well as side-to-side, you may get incorrect results. It is helpful to use one of the other nodal-point-finding techniques in my other videos to get an estimate of where the nodal point is, so you will have some idea whether the data from the calibration is reasonable. Usually when it comes out wrong, it is wrong by a lot.
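The reason mounting position doesn't matter is that the nodal offset is a constant rigid transform from the tracker to the camera's optical center, re-applied every frame on top of the live tracker pose. A simplified illustration in plain Python — yaw-only rotation and made-up numbers; Unreal applies the full 6-DoF version internally:

```python
import math

# camera pose = tracker pose composed with a constant calibrated offset.
# The offset is expressed in the tracker's local frame, so it must be
# rotated into world space before being added to the tracker position.

def yaw_matrix(deg):
    """3x3 rotation about the vertical (z) axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def camera_pose(tracker_pos, tracker_yaw_deg, offset_local):
    """Rotate the calibrated offset into world space, add the tracker position."""
    world_offset = apply(yaw_matrix(tracker_yaw_deg), offset_local)
    return [tracker_pos[i] + world_offset[i] for i in range(3)]
```

Because `offset_local` is constant, a correct calibration keeps the virtual camera glued to the real one no matter how the rig moves; a wrong offset shows up as the "wacky" swinging people report, growing with rotation.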
@nocnestudio 1 year ago
Do you have any tutorial for a compact camera? The sensor and focal length won't match...
@GregCorson 1 year ago
If you put in the right variables, it will work fine with any size camera. In the unreal cinecamera you can set up the sensor size which will give you a match to any camera, even one in a phone. Same thing for the lens calibrator, you just need to give it the sensor size information and it will work out everything else for you. There is no difference in the procedure for calibration, just put in the correct sensor size and resolution before you start
@nocnestudio 1 year ago
@@GregCorson My camera sensor is 13.2x8.8 with FL 7.6 (lens file), and when I put these values into the Unreal cinecamera it's wrong. But when I put in the equivalent lens values, 20.5x13.6 FL 20.5, then it matches. Now it's good, but I don't understand why.
@GregCorson 1 year ago
This is confusing — you should be able to enter the sensor size and FL in the Unreal cinecamera and it should work. What is happening when you try to calibrate? Is the FL value you get way off? It is normal for it to be off by a little, as the published focal lengths of lenses are not perfectly accurate. You should be able to calibrate the lens with the lens calibrator and then apply the lens file to the cinecamera to get the right settings.
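Mismatches like the one described above are often crop-factor confusion: what has to agree between the real and virtual camera is the field of view, not the focal-length number, and the same FOV can be written with different sensor-width/focal-length pairs as long as their ratio matches. A quick check of the standard pinhole FOV formula:

```python
import math

# hfov = 2 * atan(sensor_width / (2 * focal_length)). Scale the sensor
# width and the focal length by the same factor and the angle is
# unchanged, which is why an "equivalent" focal length on a different
# sensor size can appear to match while the raw numbers look different.

def hfov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field of view of an ideal pinhole camera, in degrees."""
    return math.degrees(
        2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))
```

So if the picture only matches when a scaled-up sensor/FL pair is entered, it usually means one of the real numbers (the actual focal length, or the portion of the sensor the camera is really using in that video mode) isn't what the spec sheet says.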
@nocnestudio 1 year ago
@@GregCorson Thank you for your help. After a good calibration of the lens distortion, I was able to calibrate the parameters of the camera, and now it fits. It's harder getting the nodal point offset to be good — I think it's because of the large distortion of the small lens of the compact camera... Are you using UE 5.1? When I'm calibrating the lens, the media player with the live camera lags with every click until it finally locks up and I have to reset the media player. In OBS and Resolume everything plays smoothly without lag. Any idea?
@GregCorson 1 year ago
Sorry for taking so long to reply. To get a good nodal point offset you have to have a good lens distortion calibration first. This can be hard on a compact camera because distortion is usually worse. Also because you can't repeatably set the camera's zoom to anything but all the way in or out, and the distortion will be different for each zoom level. As far as lockups and such, I'm really not sure, I have not seen this.
@marcelhofman7655 1 year ago
Hi, I'm going to test a Valve Index to track the camera. Is it possible to calibrate the entrance pupil and tracker offset like you showed with the Vive?
@GregCorson 1 year ago
Yes, this video is all about how to do it with Vive trackers. If you want to do it with an Index controller instead of a tracker puck, Unreal isn't set up for that. It could be done, but the setup isn't part of UE 5.1.
@JMY1000 8 months ago
Hi Greg, I seem to be running into a weird issue where the values for "Raw FIZ Input" are all "No ___ Input" — even though the FIZ controller still totally works and updates the camera's view inside the lens editor. Have you seen this before? Is that a problem? Also, the Camera Feed Info seems to max out at 1920x1080, even once the Camera Info and the camera's Sensor Back are set to a higher resolution. Is that just an artifact of Unreal using fixed-resolution distortion maps or something? Or does that need to be configured somewhere else?
@GregCorson 7 months ago
Hi, I'm not really sure about this, I haven't been doing virtual production for a bit (I'm starting back up now) and there have been a number of incremental changes Epic made in Unreal 5.0-5.4 that have changed small things about the way things work. I'll try to follow up on this as I update my tutorials for 5.4
@JMY1000 7 months ago
Cool, thanks! Love the tutorials, looking forward to it!
@MDLabStudios1 7 months ago
@@JMY1000 Hi, I saw this issue as well. I noticed there was a message under "Lens Component" talking about how there wasn't a lens component assigned to feed those values. I did some poking around and added a lens component to this CineCameraActor that we're working with and that seemed to solve it. I selected the "CineCameraActor" in the Outliner, hit "Add" at the top of the "Details" panel, typed "Lens", selected the custom "Lens", and then assigned the "MyLens" file (that we built in this tutorial) under the "Lens File" setting. Hope this helps you!
@JMY1000 6 months ago
@@MDLabStudios1 Thanks! I'm not working on this project at the moment, but if I do get back to working on it I'll give it a try and see if that fixes things.
@GregJones-xw9sg 1 year ago
Thank you so much for this tutorial. I'm having an issue with my version 3 tracker. I follow the directions and the camera ends up moving in all sorts of wacky directions after I do the nodal offset with the second tracker. Also I'm using a Blackmagic card for output and the camera output coming out of the Blackmagic card doesn't seem to match the camera in the lens profile when I do the calibration. Any ideas?
@GregCorson 1 year ago
If you look at the more recent tutorials on the channel, you can see the setup with UE 5 and higher. One thing that changed in 5.1 is that you need to add a "lens" component to your camera and set it to look at your lens calibration file. Also you need to go into the composure CG layer and check the box that says "apply distortion" and connect it to the lens file or the CG layer won't match the distortion of your lens.
@GregJones-xw9sg 1 year ago
@@GregCorson Thank you! That worked perfect! I knew it had to be something simple I missed. The next issue I'm having is that I'm getting 'jitter'. When I pan, the composited image seems to jitter. Do you think this is a result of the Vive Trackers not having perfectly smooth tracking? I've seen other people having issues with 'jitter' using vive trackers. We are using the version 3 vive trackers with 2 base stations.
@GregCorson 1 year ago
Vive trackers almost always have some annoying jitter when they are stationary, however you usually don't see it when they are moving like a handheld or panning camera. A couple of things can make jitter worse, the biggest is vibration. Setup your system with everything stationary and try stamping your foot while watching the monitor. If it shakes, you have a vibration problem, if you have a very flexible floor just walking around can cause shake. The other common thing that causes problems is reflections. The base stations are scanning the room with IR lasers, if the beam reflects off something it can cause glitches either small or very large. Common sources of reflection are windows, picture frames, glass tables and mirrors. Sometimes a highly polished floor can also cause problems. The best way to troubleshoot this is to try temporarily covering up things you think might be a problem, if the problem goes away then that's an object that's causing trouble. Of course, accidentally blocking the tracker's view of the base stations while moving the camera can also cause problems. I have had better luck putting the base stations up high, looking down at a 45 degree angle. You also get better coverage putting them in the center of walls instead of the corners, the field of view is 120 degrees.
@brettcameratraveler 3 years ago
Vive trackers have the advantage of being smaller than vive controllers but it seems that a Vive controller is more versatile as you can map any of its buttons to trigger various things like virtual position, etc. Should most just use Vive controllers on top of their cameras for positional tracking or is there another reason why Vive trackers are more often used?
@GregCorson 3 years ago
I agree that the buttons on the controllers are pretty handy. The main issue with using the controller is that it is hard to mount securely to a camera rig. If it slips or moves even a little it can spoil the shot, so people prefer the trackers because they have a 1/4-20 tripod screw to securely mount them. I've seen a number of methods including using a microphone clamp or a 3d printed holder for the controller, but because of their curvy shape it's just hard to get them secure. Some people have some luck using a clamp on the leading edge of the ring but it's still tricky to get it to hold securely and applying too much pressure could crack the plastic. If you are using clamps the best approach might be to use two of them, one near the front and one at the back. I think this is the main reason, also some people have bought base stations and trackers only (no headset or controllers) to keep the cost down.
@brettcameratraveler 3 years ago
@@GregCorson I appreciate it. Just to be clear, can you buy controllers and base stations only and still track the controllers within Unreal/SteamVR? I know you can go into the code and quickly change some values to get Steam to accept controllers without a headset. Also, normally the controllers Bluetooth-pair to the headsets while the trackers pair to their USB dongles. Perhaps the Tundra dongle can read both the Vive tracker 2.0/3.0 and the controllers without the headset present?
@GregCorson 3 years ago
To use the controllers you need to have a headset. I have heard the dongles that come with the trackers can be used with controllers, but it may take some hacking, and they are not available separately as far as I know. None of the Vive controllers/trackers actually use Bluetooth; they have custom radios. Not sure exactly what the Tundra trackers can do.
@Jaogurich 3 years ago
Awesome tutorial Greg, thanks. I have a question though. I don't really have deep knowledge of how the calibration works mathematically, but I did exactly what you did and I read all the UE documents regarding lens calibration. My zoom lens has a focal length range of 8.8mm to 176mm. I chose 4 zoom points across that range and did a calibration at each one; at every zoom point I also chose at least 4 points across the focus range. In short, I did calibration with lots of combinations of zoom and focus inputs. The problem is that every time I change the focus of the lens, the virtual camera in Unreal Engine zooms in and out like crazy. I thought the zoom input should change the focal length of the virtual camera, but focus is changing the focal length while changing the zoom input does nothing. Why is this happening?
@GregCorson 3 years ago
Not really sure why it would do this. Some lenses stay in focus throughout their zoom range while others need to be refocused when you zoom. Similarly, refocusing a lens sometimes causes a slight change in focal length too (called "breathing"). Usually higher-end lenses and video lenses don't have these characteristics. What kind of lens are you using? Maybe UE's calibrator is tuned for cine-style lenses? Not sure exactly how to debug this; if you calibrated your lens for several focus points at each level of zoom, you should be able to look at the curves to see if changing the focus actually affected the focal length of the lens. I don't have a lens encoder setup so I haven't tried this. Something definitely sounds odd: adjusting zoom could change both focus and focal length, and adjusting focus could also change both. But normally focus should have only a small effect on focal length, and changing zoom should have only a small effect on focus. If zooming isn't zooming the virtual camera and focusing is, something must have gone wrong during calibration.
@dpredie 3 years ago
Thanks!
@ViensVite 2 years ago
Hello mate. I don't have Vive trackers, only a ZED 2i. At the end of the process, where you use your Vive trackers, Unreal's official tutorial uses point calibration with a CG checkerboard. I made one in Blender and so on, but there's a step where I'm a bit lost. He puts his 3D checkerboard in front of the checkerboard you can find in Unreal by using Live Link on it, whereas when I do it my 3D checkerboard goes off the map. On the same subject, when he first opens Live Link in the tutorial, he has his ARRI tracking feed but also a "checkerboard" subject that I don't have. Any idea how I can finish the process, since I have neither a Vive tracker nor anything else besides a printed checkerboard and a ZED 2i? I can't measure my rig (FX6, ZED and 24-105) accurately enough to recreate it at the moment. I did the full tutorial once and it worked like a charm, but now I'm stuck.
@GregCorson 2 years ago
I'm not sure which tutorial you are talking about. Right now you can do distortion calibration with just a checkerboard, but to get the exact camera-to-tracker offset and some other stuff, you really need the VIVE. It should be possible to do this with just an ARUCO marker (another kind of checkerboard) and some other tracker, but I don't know how difficult it would be to set up. There are some tutorials on the Unreal site about setting up LED walls using ARUCOs that might also work with green screens and printed markers, but I'm not sure; I haven't had time to try them yet. Rassi engineering has something called Retracker Bliss that works this way.
@ViensVite 2 years ago
@@GregCorson If I get a Vive tracker (just a Vive tracker) with my ZED 2i camera, would I be able to do it?
@GregCorson 2 years ago
Unfortunately, you can't use just a Vive tracker; you need a tracker plus one or two base stations at minimum. There would also probably be some issues trying to do this with both VIVE and ZED, since they are two different systems. The best bet is probably trying to understand the Unreal system for setting things up using ARUCO markers instead of Vive trackers. I haven't had the chance to research this yet though, so I can't give you much guidance. The best references are probably the two below; searching YouTube for Unreal Aruco may turn up some other tutorials that would help. docs.unrealengine.com/5.0/en-US/camera-lens-calibration-overview/ docs.unrealengine.com/4.27/en-US/WorkingWithMedia/IntegratingMedia/InCameraVFX/AligningYourLEDWall/
@GregCorson 2 years ago
Just to be clear, what I'm saying is that you might be able to substitute an ARUCO marker (printed on paper) for the VIVE tracker; I'm just not sure how to do it yet.
@ok3renutube 3 years ago
How can I use the calibration parameters in Aximmetry?
@GregCorson 3 years ago
Sorry for not replying sooner. If you look at the bottom of the calibrator screen they show the nodal point offset and the lens distortion parameters. I suppose you could just copy those over to Aximmetry. If you are using a lens encoder and have a lot of curves and stuff, that will be a lot harder to do.
@ok3renutube 3 years ago
@@GregCorson It seems that every time I capture the camera calibration image, the lens parameters come out a little different. Also, how do I translate the nodal point offset into Aximmetry's delta head transform?
@FrogForge 3 years ago
Hi Greg, I have a similar setup. I also use the Live Link FIZ file for lens data input. My problem is that as soon as I start the project over Switchboard, the FIZ data is hardcoded to the settings in the Live Link variables; any change to them does not affect the running project. That's a bit annoying, since you have to use at least the focus on your camera, otherwise tracking is useless. Do you know how to dynamically change the FIZ data when running a project over Switchboard?
@GregCorson 3 years ago
Unfortunately I have not tried using Switchboard yet, so I can't be of much help here.
@realdonaodtrump1297 3 years ago
Hi Greg, do I have to have two Vive trackers to do this calibration? I only have one, so I used the same tracker for both lens distortion and nodal offset, but in the nodal offset panel, no matter how many times I put the tracker in different positions and orientations, the Point 3D stayed the same. I think that's because the nodal offset needs the tracker that was used for lens distortion to stay fixed as a reference point, and the Point 3D data comes from it, right? If so, is there any way to complete calibration with one tracker?
@GregCorson 3 years ago
Yes, you do need two trackers to do the offset. It is trying to measure the nodal point/entrance pupil offset from the position of the tracker on the camera, so it needs that tracker's position as a reference. The only way you could do it with one would be to have it remember the position of the tracker on the camera, then take the tracker off; but if the camera moved even a little while you were doing that, the result would be wrong. The issue is that to measure the offset you need two reference points in the same coordinate system.
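To make the two-reference-points idea concrete, here is a rough sketch of the geometry involved (plain Python with made-up numbers; this is not Unreal's actual implementation, just an illustration of why the camera tracker's pose is needed as an anchor):

```python
# Illustration only: a measured world-space point is only useful as an
# "offset" once it is expressed in the camera tracker's local frame,
# which requires knowing that tracker's pose. All values are hypothetical.

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

# Pose of the tracker mounted on the camera, as reported by the
# tracking system in world space (rotation + position).
R_cam_tracker = [[0, -1, 0],
                 [1,  0, 0],
                 [0,  0, 1]]          # 90-degree yaw
p_cam_tracker = [1.0, 2.0, 0.5]

# World position of the second (reference) tracker held in front of the lens.
p_reference = [1.0, 3.0, 0.5]

# The quantity we want: the reference point expressed in the camera
# tracker's local frame. Without the camera tracker's pose, this
# vector cannot be recovered from p_reference alone.
delta = [p_reference[i] - p_cam_tracker[i] for i in range(3)]
offset_local = mat_vec(transpose(R_cam_tracker), delta)
print(offset_local)  # [1.0, 0.0, 0.0] -> 1 m along the tracker's local x-axis
```

This is also why "remember the position, then move the tracker" fails: if the camera shifts between the two measurements, `p_cam_tracker` no longer matches the frame the reference point was measured in.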
@MrYoungAndrew 3 years ago
@Greg Corson Thanks so much! Does this work if your cine camera is within a blueprint? I'm using a cine camera inside a blueprint so that I can use it in Composure, but it doesn't seem to work because the camera is within the blueprint. Any tips, or does anyone know a workaround? THANKS!!!
@GregCorson 3 years ago
Should be possible to make it work that way but I haven't had a chance to rework my sample to use these calibrated cameras yet. I'm thinking a lot of stuff in my current setup, like the autorig, can just be removed because this calibration takes care of it.
@MrYoungAndrew 3 years ago
@@GregCorson Thanks Greg! Really appreciate your research and troubleshooting and sharing of all your great knowledge!! Keep up the great work! We all appreciate it very much!
@JorgeRodiles a year ago
What happens if you only have captured video of the checkerboards, but no Live Link connection to Unreal? Can you still calibrate the Unreal camera to match your previously recorded material?
@GregCorson a year ago
I have not tried this. You should be able to capture frames of the checkerboards from your video and use them for calibration, though I'm not sure exactly how you would do it. You would not be able to calibrate the nodal point without a Live Link connection.
@honghaixie8383 7 months ago
Did you use an FX6? I use one, and its sensor is 35.7 × 18.8; to match 16:9, should I change it to 35.7 × 20.08?
@GregCorson 7 months ago
I use an A7R4. When you are putting in the sensor size/resolution, you want the size of the part of the sensor that is active. If the sensor gets cropped to create a 16:9 image, then you want that cropped size. Most full-frame 35mm cameras just remove some from the top and bottom and use the full width of the sensor, so you can usually get the right numbers by dividing the sensor width by the aspect ratio you are shooting in. Not all cameras are the same though, so check your manuals to see how much of the sensor is active in the shooting mode you are using.
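The crop arithmetic above can be sketched in a few lines. Note this is purely illustrative: which dimension actually gets cropped depends on the camera's shooting mode, so treat the manual as the authority, not this calculation. The helper below just fits the largest 16:9 rectangle inside a given sensor size:

```python
def active_area(width_mm, height_mm, aspect_w=16, aspect_h=9):
    """Largest 16:9 (or other aspect) rectangle that fits in the sensor.
    Illustrative only; real cameras may use a different crop."""
    target = aspect_w / aspect_h
    if width_mm / height_mm > target:
        # Sensor is wider than the target aspect: the sides get cropped.
        return height_mm * target, height_mm
    # Sensor is taller than the target aspect: top and bottom get cropped.
    return width_mm, width_mm / target

print(active_area(35.7, 18.8))  # FX6 numbers from the question: ~33.42 x 18.8
print(active_area(35.7, 23.8))  # a typical full-frame sensor: 35.7 x ~20.08
```

Interestingly, for the FX6 numbers in the question the sensor is already wider than 16:9, so a full-width 16:9 crop (which would need 20.08 mm of height) doesn't fit; 20.08 only comes out for a taller, full-frame-style sensor.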
@JAMs6504 2 years ago
How do you export with the distortion?
@GregCorson 2 years ago
What do you want to export it to? I believe when you are using the tool, the various export coefficients and measurements are all at the bottom of the screen if you want to copy them to some other program. In unreal they are in the lens object that the calibration process creates, so you can just move/migrate that into a new unreal project if you need to.
@JAMs6504 2 years ago
@@GregCorson First, thanks for the reply! I want to export it using Movie Render Queue when rendering out higher-quality backplates. It works everywhere else, just not when I render it out.
@mckachun 3 years ago
If I don't have a Vive tracker but I have an iPhone, is it possible to do this type of calibration?
@GregCorson 3 years ago
The distortion calibration should work with any camera. For the nodal point/entrance pupil stuff to work with something other than VIVE you need a custom driver and a way to have a tracker on the camera and in the scene. If you have some kind of setup where the iPhone provides tracking and video, please let me know. In that case the entrance pupil and related calculations are not that critical because the lens is so small.
@lennardpinto129 2 years ago
Database of camera sensor sizes based on resolution, often used in camera tracking for VFX: vfxcamdb.com/
@GregCorson 2 years ago
Thanks, looks handy!
@torsten9983 3 years ago
Ah, I know why the nodal setup doesn't work every time. You need to make sure you take some samples with the tracker moved forward, so the points are not all on one plane.
@GregCorson 3 years ago
Interesting, I will have to try and confirm this, have you actually run tests or is this just a guess?
@torsten9983 3 years ago
@@GregCorson Yes, it worked with a Blackmagic URSA and a Pocket.
@torsten9983 3 years ago
@@GregCorson When moving the tracker on a "plane" like in the tutorial, the camera came out upside down and looking in the wrong direction. With an "extra" dimension it worked for me.
@GregCorson 3 years ago
Thanks for the tip, just uploaded a new tutorial suggesting people do this and a few other things that seem to help.
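Following up on this tip, here is a purely illustrative way to sanity-check that a set of sample positions is not coplanar (hypothetical tracker positions in meters, plain Python; Unreal's calibrator does nothing like this, it just visualizes why the extra dimension matters):

```python
# A scalar-triple-product test: four points are coplanar exactly when the
# volume spanned by the three difference vectors from the first point is zero.

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def spans_depth(points, tol=1e-6):
    """True if the first four points are NOT coplanar."""
    o = points[0]
    vs = [[p[i] - o[i] for i in range(3)] for p in points[1:4]]
    return abs(dot(vs[0], cross(vs[1], vs[2]))) > tol

flat = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]]        # all at one depth
good = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0.5, 0.5, 0.7]]  # one moved forward
print(spans_depth(flat), spans_depth(good))  # False True
```

All-coplanar samples leave the solver an ambiguous geometry (consistent with the upside-down/wrong-direction result reported above); moving the tracker forward and back breaks that ambiguity.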
@francescocaterino 3 years ago
Amazing
@MeinVideoStudio 3 years ago
You make fabulous content. I just wish you would put some work into your thumbnails. On this one it was very hard to read the text and tell what the video is about. I've watched your other tutorials and know you make good stuff, but people who have never discovered you won't be so forgiving and may skip the video. If you just put a grid, a lens and the UE logo on the thumbnail and nothing else, this video would kill the internet. I wanted to let you know because I want you to get more views; this is the stuff I needed to get things working for me.
@GregCorson 3 years ago
Thanks, you are probably right. The color choice on this thumbnail was not so good, I will have to see about fixing it.
@MeinVideoStudio 3 years ago
@@GregCorson I hope you have a good New Year's Day. Happy New Year!
@CryptoSensei 3 years ago
wow
@GregCorson 3 years ago
I kind of went wow too when I saw that they were calibrating the entrance pupil! Although it was more like WOW, why didn't they do this sooner? People have been having problems calibrating the entrance pupil for years now; it really should have been the first thing they did, because it is so essential for getting a good result.