Create a Mixed Reality Game FAST - BUT JUST With The Unity Asset Store!

2,684 views

Dilmer Valecillos

A day ago

Comments: 7
@dilmerv · 3 months ago
📣 All of the assets used in the video are currently priced at 50% off. There are many other assets available here that I also recommend year after year for XR or game development in general: assetstore.unity.com/dilmer-valecillos?aid=1101l7LXo (affiliate link)
@jeffg4686 · 3 months ago
Dilmer, I wanted to recommend something in this field: a guide of sorts, which you could perhaps sell. The guide would cover different use cases for Mixed Reality and the approach you would take for each, not the details, just the high-level approach. Here are a few example use cases:

1) A restaurant: you blow out the ceiling and replace it with a 3D scene that you're interacting with (a game that renders a scene to a plane placed just below the surface of the ceiling in the restaurant). Render Textures could presumably handle this, though I haven't gotten into the details yet. A similar thing would be essentially blowing out the ocean completely with a plane that has some game rendered onto it, which you could play anywhere along the beach.

2) A putt-putt golf course: you want to add some characters to the course, and perhaps they could be involved in the game in one way or another. What would be your approach for getting those characters to display properly sitting or standing on top of parts of the miniature golf course? All the tracking and everything, what approach would you take?

3) A tunnel: you render onto the tunnel's surface. Say, for instance, the fake tunnel surrounds a real-world street, but you're not rendering a static tunnel; you're rendering a 3D scene onto it, of fish swimming above you or something like that.

So I'd like to know your approach for each of these scenarios now if you're able to, but in general, I think it would help to have a guide that shows developers what types of gigs they can go out and do and what high-level approach is likely to work out best. It could cover 30 of these different types of games and the approach you would take for each.
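A minimal sketch of the render-texture idea from use case 1, assuming a hypothetical CeilingPortal script with a secondary portalCamera that renders the virtual scene and a ceilingQuad that has already been placed just below the real ceiling (scene/plane detection is not shown here):

```csharp
using UnityEngine;

// Hypothetical sketch: render a separate "sky world" from a secondary camera
// into a RenderTexture, then display that texture on a quad aligned with the
// real ceiling so the ceiling appears to open into the virtual scene.
public class CeilingPortal : MonoBehaviour
{
    [SerializeField] Camera portalCamera;   // renders only the virtual-scene layer
    [SerializeField] Renderer ceilingQuad;  // quad placed just below the detected ceiling

    RenderTexture portalTexture;

    void Start()
    {
        // Off-screen target the portal camera draws into every frame.
        portalTexture = new RenderTexture(1024, 1024, 24);
        portalCamera.targetTexture = portalTexture;

        // The quad's material samples the live render texture.
        ceilingQuad.material.mainTexture = portalTexture;
    }

    void OnDestroy()
    {
        portalCamera.targetTexture = null;
        portalTexture.Release();
    }
}
```

The same pattern would apply to the ocean and tunnel cases: whatever the portal camera sees ends up on whichever surface samples the render texture.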
@bennguyen1313 · 3 months ago
Is it possible to develop a mixed-reality game that superimposes an image onto the phone screen (without the use of a headset)? I'm not sure of the difficulty involved, but I'd like to make an Android app that can find components on a circuit board. For example, if you have a sheet of paper with a grid of squares, say 50x50: if "A4" is said or entered, could AR use fiducials on the paper to calculate precisely how far into the grid it needs to place an appropriately scaled marker (scaling it based on the angle and how close the camera is to the sheet)? Can an image be overlaid onto the camera view with this level of precision? Would such an application be realistic for a beginning Unity programmer?
@dilmerv · 3 months ago
Currently you won't be able to do it with a headset, mainly because there isn't any kind of image tracking or object tracking available on Meta Quest devices yet. Vision Pro is pretty similar unless you use their enterprise solution. With phones (Android), yes, that could be possible: you could use image tracking for your reference grid of squares together with some type of marker, and then, based on the dimensions, positions, and rotations, do your calculations. Also, take a look at something like MediaPipe from Google: github.com/google-ai-edge/mediapipe/blob/master/docs/solutions/objectron.md Best, and great question!
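A minimal sketch of that phone-based approach, assuming AR Foundation's ARTrackedImageManager (the trackedImagesChanged event used here is the AR Foundation 4/5 name) and a hypothetical GridMarkerPlacer script; the reference image name, cell size, and PlaceAt helper are illustrative assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: once the printed fiducial (reference image) is tracked,
// place a marker at a requested grid cell such as "A4". Assumes the fiducial
// sits at the grid origin and cellSize is the printed square size in meters.
public class GridMarkerPlacer : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;
    [SerializeField] GameObject markerPrefab;
    [SerializeField] float cellSize = 0.01f; // 1 cm squares, adjust to the printout

    ARTrackedImage gridImage;
    GameObject marker;

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            if (image.referenceImage.name == "grid-fiducial") gridImage = image;
    }

    // Call with e.g. PlaceAt("A4") after the user says or types a cell.
    public void PlaceAt(string cell)
    {
        if (gridImage == null) return;

        int column = cell[0] - 'A';                 // "A" -> 0, "B" -> 1, ...
        int row = int.Parse(cell.Substring(1)) - 1; // "4" -> row index 3

        // Offset in the tracked image's local space (x across, z up the sheet),
        // then convert to world space; scale and tilt come from the image pose.
        Vector3 local = new Vector3((column + 0.5f) * cellSize, 0f, (row + 0.5f) * cellSize);
        Vector3 world = gridImage.transform.TransformPoint(local);

        if (marker == null) marker = Instantiate(markerPrefab);
        marker.transform.SetPositionAndRotation(world, gridImage.transform.rotation);
    }
}
```

Because the marker is placed in the tracked image's local space, perspective and distance are handled by the pose AR Foundation reports, which is what gives the scaling behavior asked about above.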
@justinkasowski1688 · 3 months ago
@@dilmerv I think he was asking the opposite (which is what I was wondering too). It looks like the Meta SDK is an OpenXR extension, so shouldn't it be possible to build an Android/iPhone app that uses the Meta SDK's scene understanding? ChatGPT tells me it's possible, and in theory it feels like it should be, but when I try to set it up I either get OpenXR runtime errors or just don't see how it can be done, since the Meta SDK uses the HeadsetManager class (I forget what it's actually called, the one where you pick which Quest devices to support). Would love to get your opinion since you seem familiar with the Meta SDK. For @bennguyen1313, I think looking into Unity Sentis and training a single neural network to recognize circuit boards would be the easiest option. There are lots of Colab guides on how to train a network to segment something with computer vision using something like TensorFlow; the network can then be converted to ONNX, which I believe is what Sentis uses (at least Barracuda did when that was Unity's AI option). Then you could overlay the grid onto the segmented board like you're describing.
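A minimal sketch of the Sentis route described above, assuming the Sentis 1.x API (ModelLoader, WorkerFactory, TextureConverter; names changed in later releases) and a hypothetical BoardSegmenter component; the model, input size, and output layout are illustrative assumptions:

```csharp
using UnityEngine;
using Unity.Sentis;

// Hypothetical sketch: run an ONNX segmentation network (imported as a
// ModelAsset) on a camera frame and write the resulting mask to a texture
// that can be composited over the camera feed.
public class BoardSegmenter : MonoBehaviour
{
    [SerializeField] ModelAsset segmentationModel;
    [SerializeField] RenderTexture maskTexture; // overlay this on the camera image

    IWorker worker;

    void Start()
    {
        Model model = ModelLoader.Load(segmentationModel);
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
    }

    // Feed one camera frame through the network; input size is an assumption.
    public void Segment(Texture cameraFrame)
    {
        using TensorFloat input = TextureConverter.ToTensor(cameraFrame, 256, 256, 3);
        worker.Execute(input);

        TensorFloat mask = worker.PeekOutput() as TensorFloat;
        TextureConverter.RenderToTexture(mask, maskTexture);
    }

    void OnDestroy() => worker?.Dispose();
}
```

The mask texture could then drive where the grid overlay is drawn relative to the segmented board, as suggested in the comment above.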
@LStargate · 3 months ago
Also Vision Pro?
@dilmerv · 3 months ago
I have been making many prototypes for Quest 3, which I plan to also cover with Vision Pro. Thanks for your suggestion!