Hi. Thank you for the info in the description. I am interested in your projection method from fisheye. I found the Hugin software, but it is for pictures, not for videos. I film on my Visinse in non-VR (fisheye) mode, then I add 280 pixels on top and bottom. Then I can watch it with Skybox on the Quest 3. But at the edges the image is deformed. Do you have a method that makes it better? What do you use? P.S. Using the VR mode in the camera messes up the image, and it is bad! Thank you.
@exolonx5 · 2 months ago
I use Hugin, as I said, and you're right, it's for images. But what is a video if not a series of images? In short: I use ffmpeg to convert the video to a numbered series of JPEG images padded with 280 pixels on top and bottom, and then have Hugin run on each image. It has a batch mode that simplifies that. The only slightly complicated thing is that the batch mode requires a project file for each image. That's not too hard to solve: you've got two lenses and a series of images for each, and since the lenses don't vary within each eye's series, the only thing you need to change per project file is the filename. That can be scripted (rough sketch below).

I've thought about using ffmpeg's fisheye correction before, but unfortunately it uses a different mathematical approach to model the lenses, so the variables you get from Hugin's optimizer are not usable with ffmpeg. Some people have tried to figure the correct values out by hand, but that's too much fiddling for my taste; guessing a fourth-order polynomial doesn't sound like something that can be done in a few minutes. However, if that sounds like the better approach to you, you'll find a few pages on Google about it. It involves taking a picture of a checkerboard-like surface and then tweaking each parameter until the edges of the board appear straight. From there you would project to equirectangular.

Unfortunately that's not all of it. What annoyed me the most about the Visinse is that the lenses are poorly aligned to the sensors. An ideal camera would be built so that the images align pixel-perfectly at the horizon, because our roughly 6 cm eye-to-eye distance is irrelevant for things that are very far away. At least on my Visinse that is not true: the lenses are misaligned not only horizontally but also vertically. That's nothing my eyes can compensate for by going slightly cross-eyed; I'm not a chameleon and I cannot steer my eyes independently. Dual fisheye with misaligned lenses has sickening effects, even if one were willing to ignore that the lenses do not produce a mathematically perfect fisheye view. The latter is why you get distortions in Skybox: it assumes a perfect fisheye and re-projects accordingly. To get a good result you need to align the lenses and apply lens correction. There is no way around it.
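For anyone who wants to script the per-frame workflow described above, here is a minimal sketch. It assumes one eye has already been cropped out into its own video, that you saved a Hugin project as template.pto after optimising the lens on one padded frame, and that the image referenced inside that template is named template_frame.jpg; all of these names are placeholders, not something from the thread. nona is Hugin's command-line remapper, and its exact flags and output naming can differ between versions, so treat this as a starting point rather than a drop-in solution.

```python
import subprocess
from pathlib import Path

src = "left_eye.mp4"              # hypothetical: one eye's video, already cropped out
frames = Path("frames_left")      # padded source frames go here
out = Path("equirect_left")       # remapped frames go here
frames.mkdir(exist_ok=True)
out.mkdir(exist_ok=True)

# 1) Dump the video to numbered JPEGs, padded 280 px on top and bottom.
subprocess.run([
    "ffmpeg", "-i", src,
    "-vf", "pad=iw:ih+560:0:280:black",
    "-qscale:v", "2",
    str(frames / "%06d.jpg"),
], check=True)

# 2) For every frame, write a copy of the Hugin template with the image path
#    swapped in, then let nona render that project to equirectangular.
template = Path("template.pto").read_text()
placeholder = "template_frame.jpg"        # image name used inside template.pto

for jpg in sorted(frames.glob("*.jpg")):
    pto = frames / (jpg.stem + ".pto")
    pto.write_text(template.replace(placeholder, jpg.resolve().as_posix()))
    subprocess.run(
        ["nona", "-m", "JPEG", "-o", str(out / jpg.stem), str(pto)],
        check=True)

# 3) Reassemble with something like:
#    ffmpeg -framerate 30 -i equirect_left/%06d.jpg -c:v libx264 left_equirect.mp4
#    (match the framerate to the source, check whether your nona version appends an
#    extra index to the output names, and repeat the whole thing for the right eye)
```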
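This is not what the commenter does, but if the checkerboard route appeals to you, you don't have to guess the polynomial by hand: OpenCV's fisheye module can estimate distortion coefficients from a handful of checkerboard photos. A minimal sketch follows, assuming a 9x6 inner-corner board and files named checkerboard_*.jpg (both assumptions). Note that OpenCV uses yet another lens parameterisation, so the resulting k1..k4 still aren't directly transferable to Hugin or ffmpeg; the point is only that the measurement step itself can be automated.

```python
import glob
import cv2
import numpy as np

CB = (9, 6)  # inner corners of the checkerboard (assumption: a 9x6 board)

# Checkerboard corner positions in board coordinates (z = 0 plane).
objp = np.zeros((1, CB[0] * CB[1], 3), np.float32)
objp[0, :, :2] = np.mgrid[0:CB[0], 0:CB[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
img_size = None

for path in sorted(glob.glob("checkerboard_*.jpg")):   # hypothetical filenames
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, CB)
    if not found:
        continue
    img_size = gray.shape[::-1]
    obj_points.append(objp)
    img_points.append(corners.reshape(1, -1, 2))

K = np.zeros((3, 3))   # camera matrix, filled in by the calibration
D = np.zeros((4, 1))   # fisheye distortion coefficients k1..k4

rms, K, D, _, _ = cv2.fisheye.calibrate(
    obj_points, img_points, img_size, K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW)

print("RMS reprojection error:", rms)
print("camera matrix K:\n", K)
print("distortion k1..k4:", D.ravel())
```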
@alexlevkin630 · 2 months ago
@exolonx5 Wow! Thank you for this detailed comment! It is very interesting to check and to try, but I film a lot, and I can't spend time making a couple of seconds of footage frame by frame. The info about the CCD not being placed precisely bothers me. So even with good software, it's not certain we will get good quality. I hope my camera is not bad in terms of sensor centering; I don't get a headache when I watch my videos from the Visinse. The camera needs to be fixed in place or on a gimbal; handheld is not acceptable for me, I have problems watching that. Contrast is bad, and the only fix I found is to tilt the camera slightly down, so it will not be so dark. Next week I will film a lot with it. On DeoVR I am alexvr. I will put some videos there soon. Thanks again for your help. I will check my lenses, just for info.
@exolonx5 · 2 months ago
@alexlevkin630 The lens alignment issue can be tackled in an easier way. I noticed the misalignment because I overlaid the left- and right-eye images of a still picture to see the differences. I expected to see big differences for close objects, since that's where the eye-to-eye distance plays a role, but to my surprise things about a hundred meters away were still shifted, and that should not be the case. I corrected the image by moving each eye's fisheye a bit so that the objects at the horizon aligned perfectly. That alone improved the result a lot; if one can live with the slight distortions from the imperfect fisheye lenses, that might be enough for most viewers. The process would be to split your video in two, one half for each eye, then shift each video by some pixels and put them back side by side (see the sketch below). In my case I thought I might as well go the whole way and also correct the fisheye distortion, because I have an old server from work lying around at home that I can leave running overnight, doing the work image by image.

Some interesting side facts: YouTube does not require your input video to be equirectangular; you can also provide dual fisheye input if you tag it correctly with a VR metadata injection tool. YouTube will then project the fisheye to their needs, but their algorithms result in a less sharp image; I have no idea what they're doing there. They even have features to correct fisheye lenses after upload on their side. One involves specifying lens correction parameters, like for Hugin or ffmpeg; the other involves uploading distorted vector meshes that fix the distortions when the video is projected onto them. I haven't tried either of those yet, because again I don't have the parameters of the lenses in the form those functions require. If you're interested in that, "google spatial media" are your search terms, but beware, this is at a rather scientific level.
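If you want to try just that alignment fix without the whole Hugin pipeline, here is a rough sketch of the split/shift/re-stack step, driving ffmpeg from Python. The input filename and the shift values (4 px right, 2 px down applied to the right eye) are made up; you'd get the real numbers by overlaying stills from both eyes. As written it only handles positive shifts (swap the pad/crop offsets for the other directions), and it leaves thin black strips at the shifted edges.

```python
import subprocess

# Hypothetical values: how far the right eye must move so distant objects align.
dx, dy = 4, 2
src = "dual_fisheye_sbs.mp4"          # assumed side-by-side dual-fisheye input
dst = "dual_fisheye_aligned.mp4"

filt = (
    "[0:v]crop=iw/2:ih:0:0[left];"                       # left eye = left half
    "[0:v]crop=iw/2:ih:iw/2:0[right];"                   # right eye = right half
    # shift the right eye right/down: pad adds black at left/top, crop trims back
    f"[right]pad=iw+{dx}:ih+{dy}:{dx}:{dy},crop=iw-{dx}:ih-{dy}:0:0[rshift];"
    "[left][rshift]hstack=inputs=2[out]"                 # back to side-by-side
)

subprocess.run([
    "ffmpeg", "-i", src,
    "-filter_complex", filt,
    "-map", "[out]", "-map", "0:a?",    # keep audio if the source has any
    "-c:v", "libx264", "-crf", "18", "-c:a", "copy",
    dst,
], check=True)
```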
@alexlevkin630 · 2 months ago
@exolonx5 Hi again, and thank you for a long and interesting comment. I did some test filming and checked the distortions. I also used Qoocam Ego Studio, where I could use a grid and adjust the image with pitch, etc. My camera looks quite good to me, with almost no visible distortion. Maybe I am lucky. For comparison, on my Ego I add +3 to every piece of footage I take with it; after that the distance between the two lenses is right and I can watch a video for a long time without my eyes being forced to adapt. The Visinse looks good in this respect. I have two other YouTube accounts where I put videos, but I don't get many views, so I am not uploading anymore. When I film, nobody knows about 3D. They ask, but they will not start doing it themselves. 360 videos are more popular, but personally I only see an advantage when you cut them down into a 2D video. So we are stuck with a hobby that is rare and will not be developed much in the future. I hope I am wrong...