I am wondering how this is coded in Unity... UI for desktop is straightforward using a screen-space Canvas, but for VR experiences I'm not so sure: I suppose screen-space overlays aren't really an option, and a different approach like a pop-up wrist display is needed? Does this mean we have to make separate builds for each target platform, or write a script that detects the platform we are running on? What would be the best practice?
@ninjarobotstudiollc 1 year ago
In-world interfaces are best for VR. There are many ways you could go about deciding which UI and interaction/input methods to use in your XR experiences. As for creating builds for different platforms, I don't know the development best practices since I'm not a developer. If you're just developing for multiple VR headsets, you could use OpenXR. However, if you're creating a 2D app and a VR experience, for example, those have to be separate builds. How to best handle the project development is something developers can give more insight into.
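To illustrate the "single build, detect at runtime" idea from the question, here is a minimal sketch in Unity C#. It assumes a Canvas and a `wristAnchor` transform (e.g. a child of a controller) assigned in the Inspector; the class name `AdaptiveUISetup` and those field names are made up for this example. It checks whether an XR headset is active and either keeps the Canvas as a screen-space overlay (desktop) or reparents it to the wrist as a world-space panel (VR):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch only: one scene, one build, UI mode chosen at runtime.
// "mainCanvas" and "wristAnchor" are assumed to be assigned in the Inspector.
public class AdaptiveUISetup : MonoBehaviour
{
    [SerializeField] private Canvas mainCanvas;      // shared UI canvas
    [SerializeField] private Transform wristAnchor;  // controller/hand attach point (VR only)

    private void Start()
    {
        // XRSettings.isDeviceActive is true when an HMD is initialized and running.
        if (XRSettings.isDeviceActive)
        {
            // VR: put the canvas in world space and parent it to the wrist anchor
            // so it behaves like a pop-up wrist display.
            mainCanvas.renderMode = RenderMode.WorldSpace;
            mainCanvas.transform.SetParent(wristAnchor, worldPositionStays: false);
            mainCanvas.transform.localPosition = Vector3.zero;
            mainCanvas.transform.localScale = Vector3.one * 0.001f; // scale pixels down to meters
        }
        else
        {
            // Desktop: a regular screen-space overlay works fine.
            mainCanvas.renderMode = RenderMode.ScreenSpaceOverlay;
        }
    }
}
```

This is a runtime approach; if the desktop and VR versions diverge a lot, build-time separation (platform defines or separate projects) may be cleaner, which is the judgment call the reply above leaves to developers.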