📢 Everyone, be sure to check out this link bit.ly/3CKf1lG for additional info on Meta's SDK and Multiplayer Building Blocks.
@MetaverseAdventures • 3 days ago
I'd love to see a follow-up video on getting the new Meta Avatars working with locomotion and interactions using the XR Origin Rig and XR Interactions. The Unity Netcode and Fusion Meta Building Blocks shown here only work with OVR rigs and interactions. As near as I can tell, no dev has gotten this combo working with Meta Avatars 2.0, and many need it. I got close, but the non-host couldn't see the host's avatar because the LOD never enabled, even though the transforms were being received. The host can see the non-host avatar moving around with no problem. Maybe you can crack it?
@marvinbai15 • 1 month ago
Thanks for sharing this great video! I'm developing with Photon, Meta Avatars, and Ready Player Me. For the Meta developer dashboard, when getting an App ID, can an individual get verified, or does it have to be a company?
@dilmerv • 1 month ago
Hey, thank you for your question. If you're planning to upload an app, then yes, that's required. You can find more info about the verification process here: developers.meta.com/horizon/resources/publish-organization-verification/
@cfactorygames • 13 days ago
Failed to poll initial positon, no head position found
UnityEngine.Debug:LogError (object)
Oculus.Avatar2.MecanimLegsAnimationController:PollInitialPosition () (at Assets/Samples/Meta Avatars SDK/35.0.0-pre.1/Sample Scenes/Scripts/LegsNetworkLoopback/MecanimLegsAnimationController.cs:626)
@AnthonyReyes-b1e • 5 days ago
I came across this too; I ended up using the previous version, Meta Avatars SDK 31.0.0.
@cfactorygames • 4 days ago
@AnthonyReyes-b1e I'm using it now too! But when I set HandSkeletonVersion to OpenXR, the avatar's hand animation tracking doesn't work, and if I use OVRHandSkeleton instead, the poke interactions stop working.
@esko7503 • 1 month ago
Do the Colocation features work for PCVR builds? I read on Meta's GitHub that there is some level of support.
@dilmerv • 1 month ago
As far as I understand, Mixed Reality & Colocation only work with Quest standalone builds. PCVR would need to run natively, and I don't believe the required info is available to the PC running the app.
@tobi6758 • 1 month ago
Hi Dilmer, would you recommend the Quest 3S for XR development? I'm currently trying to decide between the Quest 3 and 3S, and I wonder if there is any disadvantage in passthrough quality or in the mixed reality depth sensors that would make the Quest 3 the better choice.
@dilmerv • 1 month ago
Personally, I love my Quest 3. The Quest 3S is very good as well, and the price is great, but I feel the image is much better with the Quest 3's pancake lenses.
@tobi6758 • 1 month ago
@dilmerv Got it, then I'll probably opt for the Quest 3. Thank you :)
@jameswilliams7725 • 29 days ago
Not sure why this doesn't completely work for me. The avatars spawn, but they don't move; they're just locked in place.
@WeJump_Boris • 27 days ago
same
@AnthonyReyes-b1e • 14 days ago
This is the error message I get:
[ovrAvatar2] OvrAvatarBodyTrackingMode.Standalone is deprecated, please use OvrAvatarBodyTrackingMode.None instead and set the InputTrackingProvider, InputControlProvider, and HandTrackingProvider properties of the OvrAvatarInputManagerBehavior MonoBehavior on the OvrAvatarEntity prefab/instance.
UnityEngine.Debug:LogError (object,UnityEngine.Object)
@endgamedevs • 1 month ago
Nice tutorial! Thanks for putting this out. I had some issues with Unity 6 and Meta building blocks; it kept throwing errors. But I'll take another look using this video. Thanks!
@dilmerv • 1 month ago
Glad it helped and let me know how it goes, thank you!
@irtazaarshad3997 • 1 month ago
Avatar rotation using the OVR controller prefab gets funny: when I rotate my head (camera rig), the local avatar drifts away from the controller. I've tried a couple of things, but the minute I add the OVR controller prefab and try to move using the controllers, the avatar goes away.
@dilmerv • 1 month ago
I don't believe I've tried adding the OVR Controller Prefab. Just curious: why did you add a controller when using the Avatar, given that the system automatically adds the controllers and avatar hands for you?
@irtazaarshad3997 • 1 month ago
@ I'm trying to add movement via the Quest controllers, and for that, if I'm not wrong, we need the controller prefab, so I used the OVR controller prefab. But if you could create another updated video on movement with a networked avatar, that would be super helpful.
@irtazaarshad3997 • 1 month ago
@ For now, I've figured out how to keep the avatar in place when moving in VR with the controllers.
@user-bggxdfrzaxdfsfs • 1 month ago
Hey, does it also support the web, if we export using a web build in Unity?
@dilmerv • 1 month ago
Hey, thanks for your comment, I appreciate it! Currently, this is specific to the native Meta development tools, but Netcode and Photon support other devices, so I recommend looking more into those if you'd like to integrate networking into an XR application.
@sharadpoudel7116 • 1 month ago
Hi Dilmer, just watched your latest upload, amazing stuff as always! 🎉 Here's the first comment from a long-time subscriber 🙌

I have a request for your next video! Lately, I've noticed that many tutorials focus on no-code solutions, especially with Meta's updated workflow and the seamless integration with building blocks, Unity Netcode, UGS, and similar tools.

Here's what I'd love to learn: let's say you're setting up a competitive 1v1 minigame (like a dart game) inside the multiplayer setup shown in this video. How would you handle recording player IDs, player performance, and scores in the backend? At a deeper level, I want to be able to distinguish in code between players who join the networked room. I'm currently using Photon for networking but suspect it could also be done with the Unity Networking package.

Could you dive into how to set up such a system? I strongly believe this would let your viewers build more than just quick interactions in a multiplayer setup. Looking forward to hearing your insights! 🚀
@dilmerv • 1 month ago
Thanks a lot for your feedback, and you're correct! I've been changing the content a bit and focusing more on no-code solutions, but that will change very soon. I think going forward I'll get back into coding; this video was one of the last ones I needed to make to cover most of the available building blocks. I also like your question very much, and handling backend data vs. networking data is a great topic. I'd say try to keep those two separate: think about how you can leverage web services for your player information rather than relying on your internal network stack to do so. I'll put something together to help the community. Again, thanks a lot for your feedback.
@nickdev • 1 month ago
I saw the title on the homepage in my language (PT-BR) and came in to check how it is, and it's surprisingly good. Still a little robotic, but it now has a bit more expression... Maybe I'll continue watching in English to learn more of the language and pronunciation, but it's helpful for understanding everything easily.
@dilmerv • 1 month ago
I am glad you were able to listen to the video in your native language 😉! Hopefully it gets better and better with time. Thanks for your comment!
@jakoblacour • 1 month ago
Love this, but I got lost at 4:37.
@dilmerv • 1 month ago
Hey, thanks for your feedback. That part mainly uses "Test Users", which you can create by going to the Horizon Portal. Then, to have each Unity instance log in with your test users, set the Oculus Platform login to use your test user info.
@dilmerv • 1 month ago
The file is under Assets > Resources > OculusPlatformSettings
@jeffg4686 • 1 month ago
We want to see what's on tap for using an AI interface to put it all together. I know these building blocks make it lightning simple, but it would be really nice to just talk to an AI interface...
@dilmerv • 1 month ago
Thanks a lot for your feedback, I will look into doing videos about AI as well since I had such a great time doing so before.