Two algorithms: head-to-head (libsurvive april 2018)

6,276 views

CNLohr


Comments: 22
@moncef0147 · 6 years ago
guess you could say, lib survive ... survived.
@ra_benton · 6 years ago
Awesome to see more progress on this. I might have to see about getting a vive set now that they've come down in price so that I can pitch in.
@CNLohr · 6 years ago
that would be really cool.
@CharlesVanNoland · 6 years ago
A few months ago I was excited to set my sights on acquiring a Vive and start working on getting solid tracking into LibSurvive, and also OpenHMD... but then I got a Rift, and now I just want to make cool games for the first time in years. Hooray for VR.

As far as my thoughts on the lighthouse tracking: I thought it was rather telling of the production implementation that it updates the position with each sweep, not after a pair of (pseudo-)orthogonal sweeps to deduce an actual position, indicating to me that it just fudges around a last best-known position along a perpendicular plane, and is otherwise clearly relying very heavily on coasting on IMU data. The external tracking system is invariably going to suffer from all kinds of noise and glitchiness, and if you have something as solid and fluid as IMU data to go on (after filtering, granted) it'd be a shame not to extract every single bit of value from it. When I was looking at all this back during the holidays and hanging out in the Discord back in January, I was dead set on milking that IMU for all it was worth.

Nowadays I've resigned myself to just waiting for OpenXR (don't hate me), especially if all the major HMD vendors are willing to write their driver implementations to support it properly. As far as I can tell, the proprietary SDK/runtime nonsense is just a growing pain of XR, and will be a thing of the past just like 90-degree FOVs and the screen-door effect. We're in the days of VR that are like the Atari days of console gaming. There's a lot that's going to change just over the next decade, and it's super exciting.
@KeithYipKW · 6 years ago
You may want to test your view projections using more realistic scenes, such as a house or a dense city. Scenes are important for the 3D effect; floating transparent objects in infinite space produce a very weak one. An unfamiliar scene also makes it difficult to tell whether a projection is good. If the glitch is intrinsic to the hardware, it may be solvable with existing glitch-resistant filters. I feel like this has been a common problem in the past and people have already solved it.
@willrandship · 6 years ago
I would recommend making a tracked model of the boundaries of the room. That way, settings can be tweaked until the view inside the headset matches the view with it taken off. Nothing too fancy, just a basic wireframe of the floor and wall edges would be enough.
@rhoen8075 · 6 years ago
Perhaps the implementation of a Kalman filter would help with the position estimate of the controllers?
@ThereminHero · 9 months ago
Any update on this? It's been 5 years but I noticed the github is still active.
@dorbie · 4 years ago
The IMU should be much lower latency than the lighthouse scanner, so even with a robust lighthouse approach you want to perform low-latency correction using the IMU.
@CNLohr · 4 years ago
That was a core tenet of my charlesrefine driver.
@jacobdavidcunningham1440 · 1 year ago
0:55 my god that acronym haha great
@CNLohr · 1 year ago
EPnP or SBA?
@jacobdavidcunningham1440 · 1 year ago
@@CNLohr EPnP, mouthful ha
@drink__more__water · 6 years ago
Man, I really need to put my big boy pants on and get better at C...
@dorbie · 4 years ago
If you are asking about convergence then you are conceptualizing the display geometry wrong. You need to put the pixels where they belong for each eye independently; the convergence is an emergent property. You do not rotate the view in for convergence.

You draw the frustum to match the display intrinsics for each eye (a.k.a. field of view, but supporting asymmetric frusta) and you position the camera extrinsics (a.k.a. relative viewing matrix, or eye-space inverse model matrix for the display positions) relative to the tracking origin correctly for each display. Extrinsics can include rotation, but it is not a fudge to force convergence artificially; on a display like the Vive you are likely to have parallel viewing vectors and perhaps asymmetric frusta. This process produces a correct display geometry, and the information is in the public domain.

In addition you need to warp the rendered image to compensate for the display optics (using render-to-texture and a transfer-to-screen warp); this requires the warp center and radial distortion warp function for each eye. All this information is available for various headsets and you can probably estimate it if you know what you're doing. Valve will give you this information if you just ask the right questions, and it exists in libraries like OpenVR.

Finally, during this final warp you can compensate for the elapsed render time since you measured position and time-warp the rendered image to apply the latest estimated camera pose to each eye, minimizing latency. Lots of tricks have been tried in this last stage, like handling hand tracking independently from head tracking and compositing in the warp, or accounting for depth and doing a parallax-friendly warp that handles changes in head translation, not just rotation.
@CNLohr · 4 years ago
LibSurvive is still being developed by others, but I had to stop developing it for conflict-of-interest reasons. Many of your insights are accurate and describe an excellent path forward. You may want to join the libsurvive Discord.
@troylee4171 · 6 years ago
Awesome man
@DerSolinski · 6 years ago
Waaaaait, that's the reason why no sensor fusion has been done yet? I've followed pretty much from the beginning and it struck me as a bit odd that nobody had used the IMU yet, but since my RL sucks and keeps me occupied I never really read all the discussions on the GitHub and in Discord -_-. I always thought you didn't want to use the IMU because you like the challenge... I knew from the beginning that the main tracking in the Vive is done via the IMU, since it has a 1000 Hz polling rate... The lighthouses are for drift correction. That's how most absolute-position systems work (even the fancy smartphone AR stuff), with a few exceptions.
@ineedtodrive · 6 years ago
Reflection: is there any difference when you're close to the wall versus far from it?
@seanocansey2956 · 6 years ago
You're a brain 😍
@geogeo3644 · 6 years ago
"disord in the description"
@L1Q · 6 years ago
DISORD invite link!