To try everything Brilliant has to offer, free, for a full 30 days, visit brilliant.org/AlbertBozesan/. The first 200 of you will get 20% off Brilliant’s annual premium subscription.
@trinsic6652 1 year ago
I could see this working really well with environments and scenes based on a novel, with the audiobook accompanying the user. Basically, listening to the audiobook while in VR, in a visual environment based on the novel.
@albertbozesan 1 year ago
Totally!
@alecubudulecu 1 year ago
Awesome vid, can’t wait to try it. Quick note: I suggest doing the inpainting before upscaling. It works much faster.
@albertbozesan 1 year ago
I suppose, but then you’ll lose resolution in the details and have to paint those back in anyway. Would you say the benefit outweighs that?
@sarpsomer 1 year ago
Clean tutorial without cutting or skipping the important parts. Thanks.
@albertbozesan 1 year ago
Thanks! I’m glad I managed to keep that balance.
@ArtfulRascal8 1 year ago
Mind absolutely blown. Back in August of last year, I had no idea this would be coming true so fast. Thanks for the tutorial!
@taserface8147 6 months ago
Just a suggestion to make things better: skip the depth map creation initially. Upscale the image and make the final tweaks, then create a smaller version of it for the depth map. This way you have more room for tweaking.
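For anyone following that suggestion, here is a minimal sketch of the downscale step using Pillow; the file names and the 4x factor are placeholder assumptions, not values from the video.

```python
# Sketch: make a smaller copy of the upscaled panorama just for depth-map generation.
from PIL import Image

pano = Image.open("panorama_upscaled.png")  # hypothetical path to the tweaked, upscaled image
w, h = pano.size
small = pano.resize((w // 4, h // 4), Image.LANCZOS)  # depth estimation doesn't need full resolution
small.save("panorama_for_depth.png")
```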
@Iskarioto 1 year ago
I started learning Blender two weeks ago just so I can use your Environmake addon. Nice work!
@albertbozesan 1 year ago
Wow! Hope you enjoy it 😄
@Iskarioto 1 year ago
@Albert Bozesan Much appreciated 😅
@taavetmalkov3295 3 months ago
Hello, good tutorial! I was wondering if I can take the generated space image and somehow base the next image on the previous one so that the details remain the same... for example, if I want the same room but viewed from the side of the couch.
@albertbozesan 3 months ago
Not precisely, but you can check out my latest tutorial to get more control in rooms. You can also research “IPAdapter” to get similar images.
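For anyone researching that route, here is a minimal IPAdapter sketch using the Hugging Face diffusers library; the model IDs, weight file name, prompt, and image paths are assumptions for illustration, not the exact setup from the video.

```python
import torch
from diffusers import StableDiffusionPipeline
from diffusers.utils import load_image

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# IP-Adapter conditions generation on a reference image, which helps keep the
# look of the first room consistent across new viewpoints.
pipe.load_ip_adapter("h94/IP-Adapter", subfolder="models", weight_name="ip-adapter_sd15.bin")
pipe.set_ip_adapter_scale(0.6)  # lower = more prompt freedom, higher = closer to the reference

reference = load_image("first_room.png")  # the previously generated room
image = pipe(
    prompt="the same living room, view from the side of the couch",
    ip_adapter_image=reference,
    num_inference_steps=30,
).images[0]
image.save("room_side_view.png")
```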
@Raulikien 1 year ago
2023 is wild, so much new amazing tech
@wedgeewoo 1 year ago
this is a great workflow
@albertbozesan 1 year ago
Thanks!
@unheilbargut 1 year ago
A very cool method again. Thank you Albert, this is another perfect, cool, unique tutorial! You rock! ✊
@albertbozesan 1 year ago
Thank you!! I try to keep my ideas non-obvious :)
@TrentSterling 1 year ago
Wow; I've been generating skyboxes for a little VR indie game using blockade-labs and I had no idea there was a 360 Lora available for SD! This will make fine-tuning so much easier. 😃
@Nanotopia 1 year ago
Great tutorial! Thank you for sharing 🥳 I wonder if you might know of a plug-in for VIVE/Index rather than Quest? Meanwhile, I am Googling that now 😁
@albertbozesan 1 year ago
Thank you! Not that I know of, I only have the Quest 2. But I'm sure there must be an option out there :)
@fifasglitchesita8614 1 year ago
Thank you, really, for sharing this. You gained a new subscriber.
@albertbozesan 1 year ago
Thank you! I'm glad it was helpful.
@erank3 1 year ago
Great tutorial, you have great design skills 🙏
@albertbozesan 1 year ago
Thank you!!
@yorkewu9946 4 months ago
Awesome video! Can I achieve the same using Unreal Engine? Any ideas? Thanks a lot.
@albertbozesan 4 months ago
I’m sure that’s easily done, but I’m no expert in that engine.
@yorkewu9946 4 months ago
@@albertbozesan Thank you~😇
@elisabethfischer873 1 year ago
Hey, thanks again for the great tutorial! How would I send this file to someone so they can try it out on their Quest without having to build the environment themselves? Is that possible?
@MrMahoark 11 months ago
LoRA means additional parameters that come from finetuning on top of a model, so they can’t be used with just any model, right? Only the one they were finetuned on top of.
@albertbozesan 11 months ago
LoRAs can be used with any checkpoint as long as the base model is the same (SD 1.5 vs SDXL, for example). But it’s correct that the checkpoint a LoRA was trained on is the one it works best with. This is why older LoRAs generally work worse and worse on newer checkpoints.
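To illustrate that base-model constraint, here is a minimal diffusers sketch; the checkpoint ID, LoRA file path, and prompt are placeholder assumptions.

```python
import torch
from diffusers import StableDiffusionPipeline

# An SD 1.5 checkpoint: the base architecture a hypothetical SD 1.5 LoRA was trained against.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Loading an SD 1.5 LoRA works; an SDXL LoRA would fail here because its
# low-rank weights don't match this UNet's layer names and shapes.
pipe.load_lora_weights("path/to/360_panorama_sd15_lora.safetensors")  # hypothetical file

image = pipe(
    "equirectangular 360 photo of a cozy cabin interior",
    num_inference_steps=30,
    cross_attention_kwargs={"scale": 0.8},  # LoRA strength
).images[0]
image.save("panorama.png")
```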
@ReinBijlsma 1 year ago
Excellent video! For some reason my panorama viewer doesn't allow me to rotate the screen. It just shows the generated photo, but I'm not able to pan the screen. Any clue what I did wrong? Everything else works fine.
@jasonjmp 1 year ago
Thanks for this, so cool! Any chance we can see examples anywhere that can be reached from the Oculus browser? That's probably another video. :)
@albertbozesan 1 year ago
Thanks! What exactly do you mean? What do you want to see in the browser? 😄
@kiksu1 1 year ago
Very cool! (I see what you did there with the word FUNCTIONAL 😂) I don't use autoupdate, thank god.
@alexxa345 1 year ago
Great video! Maybe you should try Skybox Labs as a 360 generation method. I feel it may simplify the process even further.
@albertbozesan 1 year ago
BlockadeLabs is excellent, I just feel like figuring out and teaching people how to do it independently on their own computer is more valuable in the long run 😄
@alexi7941 6 months ago
Great video! Quick question: is the end result seamless in the headset?
@albertbozesan 5 months ago
Horizontally, yes. The top and bottom aren’t very pretty.
@craceyu 1 year ago
Great tutorial and so helpful! Would you like to publish your home to MoonVRHome where people can install this custom home in VR?
@smartin-g2u 1 year ago
Great video! Can you recommend the steps it might take to convert this environment into one you can use in the Unreal engine so that a player could navigate within it?
@albertbozesan 1 year ago
The .gltf I export towards the end should import perfectly into any game engine :)
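For those wiring this into Unreal or Unity, here is a minimal sketch of the Blender-side export, runnable from Blender's Scripting tab; the object name and output path are assumptions.

```python
import bpy

# Select only the environment mesh so nothing else ends up in the export.
bpy.ops.object.select_all(action='DESELECT')
bpy.data.objects["Environment"].select_set(True)  # hypothetical object name

# Equivalent to File > Export > glTF 2.0 in the UI.
bpy.ops.export_scene.gltf(
    filepath="//ai_environment.glb",  # "//" = relative to the .blend file
    export_format='GLB',              # single binary file, easiest to drag into a game engine
    use_selection=True,
)
```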
@alyx918 11 months ago
When trying to generate on the Environmake tab I get the error bpy_prop_collection[key]: key "specular" not found. Any known fixes?
@LacyPerryArt 10 months ago
I am getting the same error
@MarcoSilvaJesus 1 year ago
Thank you very much for this class. I have a question, what kind of minimum equipment is needed to run this application locally?
@albertbozesan 1 year ago
For okay performance you will need a Windows PC with at least 8GB RAM, an NVIDIA GPU with at least 8 GB VRAM, and as much hard drive space as you can (models are quite large).
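As a quick way to check those requirements on your own machine, here is a small PyTorch sketch (the 8 GB threshold mirrors the numbers above):

```python
import torch

if not torch.cuda.is_available():
    print("No usable NVIDIA GPU found; Stable Diffusion will be very slow or won't run.")
else:
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / 1024**3
    print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")
    if vram_gb < 8:
        print("Under 8 GB VRAM: expect to need low-VRAM settings or smaller resolutions.")
```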
@나나-l9l7p 1 year ago
Hello! I really like your video and tried to follow the process, and I have one question. When creating a skybox image as scenery, the tree is too large and appears to shrink in the upper part of the image. Is there any way to fix this problem? I used Asymmetric Tiling + the 360 LoRA. Thank you :)
@albertbozesan 1 year ago
Thanks, glad you enjoyed the video! The way to fix that at the moment would be manual editing in Photoshop, using generative fill and classic image editing techniques. There’s no shortcut quite yet: the top and bottom of the 360 image are just bad for now.
@나나-l9l7p 1 year ago
@@albertbozesan I see! Thanks:)
@cosmicbrewery 6 months ago
Is there a way to add an environment sound?
@albertbozesan 5 months ago
You’ll have to ask the people who made the converter in their Discord.
@cosmicbrewery 6 months ago
I need help! I think I must have a setting wrong. All my images look like a bad oil painting. I have the LoRA and the model you suggested, and the sampling method DPM++ 2M Karras. What am I doing wrong? I want ultra-realistic-looking interiors.
@albertbozesan 5 months ago
Sounds like a more general Stable Diffusion problem unrelated to this specific workflow. Maybe something with a VAE, or using an SDXL model with an SD 1.5 resolution. You could post a screenshot on Reddit? It’s easier to help there.
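On the SDXL-vs-SD 1.5 resolution point, here is a rough rule-of-thumb sketch; the 512/1024 figures are the approximate native training resolutions, and the tolerance band is an arbitrary assumption.

```python
# Running an SDXL checkpoint at SD 1.5 sizes (or vice versa) is a common cause of
# muddy, "oil painting" style outputs.
NATIVE_SIDE = {"sd15": 512, "sdxl": 1024}  # rough training resolution per base model

def check_size(base_model: str, width: int, height: int) -> None:
    native = NATIVE_SIDE[base_model]
    short_side = min(width, height)  # panoramas are usually 2:1, so judge by the short side
    if not (0.75 * native <= short_side <= 1.5 * native):
        print(f"Warning: {width}x{height} is far from {base_model}'s native ~{native}px; "
              "expect blurry or distorted results.")
    else:
        print(f"{width}x{height} looks reasonable for {base_model}.")

check_size("sd15", 1024, 512)  # typical 2:1 panorama size for an SD 1.5 checkpoint
check_size("sdxl", 1024, 512)  # too small for SDXL; the short side should be closer to 1024
```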
@visualdrip.official 3 months ago
The Blender startup file is no longer available :(
@albertbozesan 3 months ago
Ugh, that’s a bummer. I have an updated version of this vid on my list but it could be a while. Working on a longer course right now.
@cr4723 1 year ago
I have installed the depthmap-script extension and restarted the UI, but I have no Depth tab. What did I do wrong? Any solution?
@albertbozesan 1 year ago
Did you make sure it’s active in the extensions > installed tab?
@cr4723 1 year ago
@@albertbozesan Yes, I think it is a bug. Sometimes it is there sometimes not. Thanks for the video!
@thisguyoverhere8595 1 year ago
I have a brilliant idea. How about creating an AI program where you type in a paragraph describing the environment you want, and the AI creates a VR environment, just like the AI art generator in the Wombo Dream app.
@albertbozesan 1 year ago
I mean, that’s kind of the ideal scenario of this. With an idea like that, it’s all about the execution and being early 😄
@sisyvusvr 1 year ago
Moon VR Home on Quest Applab does that afaik.
@Hambxne 1 year ago
Welcome to the singularity y'all, is it everything you hoped it would be?
@alecubudulecu 1 year ago
And more!
@elisabethfischer873 1 year ago
Thanks for this tutorial! I have one problem: when I click Build and Install Environment, it says the APK build finished, but it doesn't automatically launch on my Quest. I can find the app under Unknown Sources, but nothing happens when I click on it either. Any idea how to proceed?
@albertbozesan 1 year ago
It’s not an app - it can be found where your other home environments are. Settings > Personalization and then it’s probably at the very bottom :)
@elisabethfischer873 1 year ago
@@albertbozesan Perfect! It works now :D Also, I tried to use the SD upscale script in img2img because it usually does a much better job than the ones in the Extras tab, but it created a really weird image with a bunch of connected tiles. I guess it's not meant to be used for equirectangular images. ESRGAN didn't really seem to fix the resolution much. Any idea about alternative upscaling methods?
@60tiantian 1 year ago
Can this not be used on an AMD GPU?
@albertbozesan 1 year ago
Stable Diffusion only runs well on NVIDIA GPUs at this time.
@60tiantian 1 year ago
@@albertbozesan Thank you for the reply! Your tutorials help me a lot, great work!
@giraffeguitars6021 1 year ago
So it won’t run properly on any Mac?
@albertbozesan 1 year ago
@@giraffeguitars6021 No, they don't have the required GPU. It will run very slowly in apps like Diffusion Bee.
@FieldMajor76 1 year ago
Any tips on how to create a video like this kzbin.info/www/bejne/mpPTaaeajM91eJo so we can experience multiple AI-created rooms using YouTube VR on a Meta headset? Thank you for posting these informative videos!
@albertbozesan 1 year ago
Easier than the process I show here, actually. You don’t even need all the depth steps. Edit your 360/180 images one after another in any video editing program, and make sure you set the YouTube settings to 360. Any 360 video tutorial should help.
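For the slideshow part, here is a minimal sketch that calls ffmpeg from Python (ffmpeg must be on your PATH); file names and timing are placeholders. Before uploading, the file also needs 360 metadata injected (for example with Google's Spatial Media Metadata Injector) so YouTube plays it as a 360 video.

```python
import subprocess

# Turn numbered equirectangular stills (pano_01.png, pano_02.png, ...) into a slideshow.
subprocess.run([
    "ffmpeg",
    "-framerate", "1/10",    # show each panorama for 10 seconds
    "-i", "pano_%02d.png",
    "-c:v", "libx264",
    "-r", "30",              # output frame rate
    "-pix_fmt", "yuv420p",   # broad player compatibility
    "rooms_360.mp4",
], check=True)
```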
@TransitionedToAShark 1 year ago
Only cringe kids use VR. Ever seen The Matrix? It starts here, kids.
@albertbozesan 1 year ago
I never thought I’d read “cringe” and “the matrix” in one breath.
@TransitionedToAShark 1 year ago
@@albertbozesan The Matrix is cringe
@regal360com 1 year ago
Seriously @@albertbozesan, it's hard to believe someone can be so out of touch that they are hating on VR and the Matrix while on a VR tutorial. Who are these lost souls with nothing better to do but spread negativity?