📌 The demo Unity project featured today is available via Patreon: www.patreon.com/dilmerv 👉 For additional documentation about MRUK take a look at: developer.oculus.com/documentation/unity/unity-mr-utility-kit-overview (this was a great resource to get an overall idea of what’s available)
@tariksavas 24 days ago
You made everything so simple and clear despite the SDK's complex documentation. Your explanations and simplifications were a huge help, truly made a big difference.
@dilmerv 24 days ago
Thanks a lot, I really appreciate that! I am so glad it helped.
@wfleming801 5 months ago
Thank you for sharing/making this video! I’ve been missing these features! So glad they are available now. Can’t wait to try it out!
@dilmerv 5 months ago
Thanks for your feedback and if you have any questions as you integrate it let me know 😉 have fun!
@jeffg4686 4 months ago
@0:53 - Surface Projected Passthrough - NICE - allows them to be mostly immersed, but also see the real world when needed.
@dilmerv 4 months ago
That’s a great feature I agree! Thanks for your feedback.
@jeffg4686 4 months ago
@@dilmerv - I was wondering about this just yesterday. If developing a VR app for a restaurant, for instance - everyone plays together in a VR room, but they still want to be able to see the world around them - I was thinking of a switch view (switch to passthrough to view the world, then back to the game). So this is much nicer - you still get to play the game and view the real world at the same time.
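For anyone wanting to try that switch-view idea, here is a minimal sketch using Meta’s surface-projected passthrough. It assumes an OVRPassthroughLayer (Meta XR Core SDK) in the scene with its projection surface set to user defined; the calls below are from memory and may differ slightly between SDK versions.

```csharp
using UnityEngine;

// Minimal sketch (not from the video): toggling a surface-projected passthrough
// "window" so players stay mostly immersed but can see the real world on a
// chosen surface. Assumes an OVRPassthroughLayer configured with a user-defined
// projection surface; exact setup may vary by Meta XR SDK version.
public class PassthroughWindow : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer; // assign in the Inspector
    [SerializeField] private GameObject windowQuad;                // a quad placed where the real world should show through

    private bool added;

    // Hook this up to a button or controller input to switch the window on/off.
    public void ToggleWindow()
    {
        if (!added)
        {
            // Projects the passthrough feed onto the quad's geometry only;
            // the second argument keeps the surface transform in sync.
            passthroughLayer.AddSurfaceGeometry(windowQuad, true);
        }
        else
        {
            passthroughLayer.RemoveSurfaceGeometry(windowQuad);
        }
        added = !added;
    }
}
```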
@thomptechdotnet 5 months ago
Awesome video Dilmer!
@dilmerv 5 months ago
Thank you Jeff, I appreciate your feedback man!
@jamesprise4252 5 months ago
thanks, Dilmer! Anything for spatial mapping or image tracking?
@dilmerv 5 months ago
Thank you for your feedback. Meta currently only supports spatial mapping through the room capture you do prior to entering the app, but image tracking is definitely not supported yet. Are you also considering other XR devices?
@jamesprise4252 5 months ago
Right now I’m just trying to map an experience to a specific room. Maybe there's a way to export the room scans without having to build a 3D model of the space and "fitting it" to a prefab? Thanks, Dilmer!
@propipos1086 2 months ago
Good day! I have a question: Is there a tool similar to Vuforia for creating image targets, so that depending on which image is detected, a 3D object appears?
@dilmerv 2 months ago
Hey thanks for your great question! Currently, Meta doesn’t provide image tracking support or access to the cameras, I believe it is coming next year based on what they announced during Meta Connect 2024.
@정주호-f5x 4 months ago
Hello, Dilmer. I'm from Korea, and I really enjoy watching your videos. Thank you for sharing so much valuable information. I have a question for you. Is it possible to do image tracking with Meta Quest 3’s passthrough, like how you can do image tracking with AR Foundation?
@dilmerv 4 months ago
That’s a great question, and thank you for all your support! As far as image tracking goes, Quest 3 currently doesn’t support it. You are correct that AR Foundation offers it, but the underlying Meta OS doesn’t expose it, at least not yet. Thank you, and let me know if you have further questions.
@cruzmoragael983 3 months ago
Hello, thanks for this video. I was wondering if I can save that scene model and develop interactions on top of it. I am working on VR and XR training at my job: I scan part of my workplace, save it, and in editor mode I add features and many other things. Then, when I export my application to the Quest 3, I want the headset to recognize this environment so that all the interactions sit within it; in other words, as I walk around, everything stays placed exactly as I scanned it. But I cannot find a solution for this.
@dilmerv 2 months ago
What type of interactions are you trying to develop?
@cruzmoragael983 2 months ago
Simple interactions like grabbing and snapping objects, and I also want an interaction as if I were smearing something onto a surface. For now, as a workaround, I use particles to simulate it, but it isn’t completely correct. Thank you very much :)
@erenboran3918 5 months ago
Thank you for the great videos, but I have a question: I'm trying to figure out how to do room scanning on Quest 3 (like in the demo app "First Encounter", not the basic plane detection that sets up walls from your room setup). How can I make a "Scan room" button? Do you know anything about that?
@dilmerv 5 months ago
My understanding is that each app checks for a scene model at startup; you can’t launch the capture yourself (with a button) since that is all handled at the OS level. Also, if you need a mesh (the global mesh), there is a label you can turn on within the MRUK prefab that will enable the captured mesh, and then you have more flexibility when it comes to raycasting.
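To illustrate the raycasting part: once the scene (and the global mesh, if you enabled that label plus colliders on the MRUK prefab) has loaded, you can raycast against it like any other Unity collider. A minimal sketch, with the caveat that the MRUK callback name below is from memory and may differ between MRUK versions:

```csharp
using Meta.XR.MRUtilityKit; // MRUK namespace (Mixed Reality Utility Kit)
using UnityEngine;

// Minimal sketch: raycast from a controller against the captured scene model
// once MRUK reports the scene has loaded. Assumes the MRUK prefab is in the
// scene and an EffectMesh component generates colliders for the mesh you want
// to hit (e.g. the global mesh); RegisterSceneLoadedCallback may be named
// differently in your MRUK version.
public class SceneMeshRaycaster : MonoBehaviour
{
    [SerializeField] private Transform rayOrigin;     // e.g. the right controller anchor
    [SerializeField] private float maxDistance = 10f;

    private bool sceneReady;

    private void Start()
    {
        MRUK.Instance.RegisterSceneLoadedCallback(() => sceneReady = true);
    }

    private void Update()
    {
        if (!sceneReady || rayOrigin == null) return;

        var ray = new Ray(rayOrigin.position, rayOrigin.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            // hit.point / hit.normal let you place content on the real-world
            // surface represented by the scene mesh collider.
            Debug.Log($"Hit scene geometry at {hit.point} on {hit.collider.name}");
        }
    }
}
```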
@ar.manimaran 2 months ago
Can we integrate Luma AI into Unity? I want to scan an object (cars) on Quest 3, convert it to a 3D object, and then control the 3D cars like remote-controlled cars. Is it possible? Everything should happen on the Quest 3.