What a sweet, lovely-sounding voice and great pronunciation. Many YouTubers could take this as an example, such as those who cut out every pause in speech with an audio editor until it sounds unnatural and annoying; after a few sentences you are already sick of it. You have a great speech melody and rhythm, so it is easy to follow your words and listen attentively with pleasure. Thank you also for your great contribution.
@clementcardonnel3219 8 months ago
The quality of this tutorial is insane. Truly underrated looking at the number of views. Keep it up, we're learning so much!
@m0ose0909 9 months ago
Thank you for the detailed and slow-paced tutorial - way more thorough and easy to follow than others I've seen.
@CreatixAi 8 months ago
Glad it was helpful!
@MariaMac-h8x 9 months ago
You make The Best tutorials on Stable Diffusion! Thank you!
@CreatixAi 8 months ago
Wow, thanks!🙏🏻🥰
@楊明諺-i1q 4 months ago
Very detailed tutorial. Really helpful for beginners of Stable Diffusion. Great work! And please make more videos for us!
@alecubudulecu 11 months ago
Appreciate the in depth guide on specific aspects of SD. Good job with this
@CreatixAi 11 months ago
Thanks! 🥰
@the_trevoir 9 months ago
I thought your voice was AI for about 5 seconds. You should take it as a compliment: your voice is perfect, the tone is perfect, and you enunciate like a mofo. What you *don't* do, which generated voices almost always do, is screw up basic pronunciation. I loved this video; I got more out of it than the last 10 SD videos I watched put together. I've subscribed because I'm hoping you have a video which will help me with the bane of my AI existence: controlling colour.
@CreatixAi 8 months ago
Your comment made my day 🥰 Thanks so much! I am hoping to go through all of the ControlNet models at some point, I’m sure there’s something very useful for color too ;) What exactly do you have trouble with? The general color? Or color of certain parts? (If parts, have you tried fixing it with inpainting?)
@the_trevoir 8 months ago
@CreatixAi So many things to learn! I have a handle on basic text-to-image, image-to-image, basic inpainting, ControlNet posing with pose, canny, depth, and ip-adapter full face. I still need to learn ControlNet inpainting. I haven't tried inpainting for colour much (eyes a couple of times) because I always feel like there is some method which is evading me: a combination of prompting and syntax (breaks, punctuation etc.) which would give me the results I want. Maybe there just isn't yet. The big problem I have is colour bleed. Me: SD, please give me blue eyes. SD: Okay, but guess what, I'm going to also make the jacket, chair, curtains, coffee table, and the cat blue.
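One commonly suggested trick for the colour-bleed problem described above is A1111's uppercase BREAK keyword, which splits the prompt into separately encoded chunks so that colour words sit in their own chunk, away from the other objects. A hedged sketch (the scene content here is invented for illustration; BREAK reduces bleed in many cases but does not eliminate it):

```text
portrait of a woman, blue eyes, detailed face
BREAK
brown leather jacket, red armchair, beige curtains, wooden coffee table, ginger cat
```

Because each chunk is encoded on its own, "blue" is less likely to attach to the jacket or the cat. For hard guarantees, inpainting the region with a colour-specific prompt is still the more reliable route.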
@OptimalToast 11 months ago
Appreciate the guide, and the tip about playing with compositions. I hadn't got around to trying that, but going by the examples you displayed, I definitely should; some very interesting results. 😎
@CreatixAi 11 months ago
Ah, it’s my favourite! Hope you have fun with it ☺️
@chakhmanmohamed9436 a month ago
Please, any use case where a foreground image is blended with a background? Adding illumination and shadows. Thanks a lot
@ricogfx4212 10 months ago
Can you make a video on upscaling our art to extremely high quality?
@LewGiDi 11 months ago
Thanks a lot for this tutorial. I only sometimes use ControlNet because I am not too familiar with it and am afraid of messing things up. This video tutorial was very useful. Do you have any plan to make a tutorial about upscaling?
@history3042 6 months ago
Quick question. If I have an existing photo with an existing segmentation, depth and even normal map, is there a way to use them all as inputs (instead of estimations), all simultaneously in the calculation, rather than separately (depth only, mask only, etc.)?
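On the multi-map question above: the A1111 ControlNet extension supports several units at once, each with its module (preprocessor) set to "none" so an existing map is used directly instead of estimated. A minimal sketch of a txt2img API payload under that assumption; the model names are standard ControlNet 1.1 checkpoints and the placeholder strings stand in for real base64-encoded images, so adjust everything to your install:

```python
import base64

def load_b64(path):
    """Read an image file and return it base64-encoded, as the API expects."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

def controlnet_unit(image_b64, model, weight=1.0):
    # module "none" skips estimation: the uploaded image IS the control map
    return {"input_image": image_b64, "module": "none",
            "model": model, "weight": weight}

def build_payload(prompt, units):
    return {
        "prompt": prompt,
        "steps": 25,
        "alwayson_scripts": {"controlnet": {"args": units}},
    }

# Placeholder strings below stand in for real base64-encoded map images.
units = [
    controlnet_unit("<base64 depth map>", "control_v11f1p_sd15_depth"),
    controlnet_unit("<base64 seg map>", "control_v11p_sd15_seg", weight=0.8),
    controlnet_unit("<base64 normal map>", "control_v11p_sd15_normalbae"),
]
payload = build_payload("a cozy living room, photorealistic", units)
# POST this payload as JSON to http://127.0.0.1:7860/sdapi/v1/txt2img
```

Make sure the extension's "Multi-ControlNet" unit count in the WebUI settings is at least as large as the number of units you send.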
@coulterjb22 10 months ago
Nicely done! Your explanation was perfect. I'm working on depth maps to pull as much detail as possible for laser engraving, but I have yet to find something that doesn't require a lot of work in Photoshop. LeReS++ helps, as do increased sampling steps, but not by much. Anything else I could try?
@joshualloyd6694 11 months ago
Could you do any workflow builds and talking about this in ComfyUI? Great video!
@TDMIdaho 11 months ago
Excellent explanation, thank you!
@keithtam8859 8 months ago
Sorry, maybe you have answered this in the video... can I use a 3D package like Blender to create a depth map, and use that depth map to generate an image with ControlNet? Thanks
@LostDarkCloud 5 months ago
Ok, so I was playing around with this. In case you didn't know how, this is how I did it:
1. Check "Allow Preview".
2. Click the "Edit P" option; the "P" is the PhotoPea icon.
3. Copy YOUR depth map image that you want to use and paste it into the PhotoPea editor. You may have to adjust the canvas size to fit it properly. The important thing is to make sure your layer name matches the name it is looking for; I couldn't get it to work otherwise. So your custom depth map layer in the list at the right needs the same name as the unit you are using. Example: "unit-0" if you are using ControlNet Unit 0.
4. There's a button at the top of the PhotoPea UI which says "Send to ControlNet". Hit it.
5. It should load your depth map into ControlNet and use it.
If this worked for me, it should work for you.
@Nick-tl7ts 6 months ago
This is great! What if I have a generated depth map as a pass from 3D software and I don't need estimation? Should I just put it into ControlNet and that's it?
@LostDarkCloud 5 months ago
Ok, so I was playing around with this. In case you didn't know how, this is how I did it:
1. Check "Allow Preview".
2. Click the "Edit P" option; the "P" is the PhotoPea icon.
3. Copy YOUR depth map image that you want to use and paste it into the PhotoPea editor. You may have to adjust the canvas size to fit it properly. The important thing is to make sure your layer name matches the name it is looking for; I couldn't get it to work otherwise. So your custom depth map layer in the list at the right needs the same name as the unit you are using. Example: "unit-0" if you are using ControlNet Unit 0.
4. There's a button at the top of the PhotoPea UI which says "Send to ControlNet". Hit it.
5. It should load your depth map into ControlNet and use it.
If this worked for me, it should work for you. Now if you didn't need to know this and you know a better way, feel free to share.
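For anyone who wants to skip the PhotoPea round-trip: a render engine's raw depth pass usually stores distance (larger = farther away), while ControlNet's depth model expects an 8-bit grayscale image where white = near, matching the preprocessor output. A minimal NumPy sketch of that conversion; the function name is made up for illustration, and real EXR passes may need extra handling (e.g. infinite background depth):

```python
import numpy as np

def depth_pass_to_controlnet_map(depth):
    """Turn a raw float depth pass (larger = farther away) into the
    8-bit white-is-near grayscale map ControlNet's depth model expects."""
    d = np.asarray(depth, dtype=np.float32)
    span = float(d.max() - d.min())
    # Normalize to [0, 1]; guard against a constant-depth render.
    d = (d - d.min()) / span if span > 0 else np.zeros_like(d)
    d = 1.0 - d                        # invert: near objects become white
    return np.rint(d * 255).astype(np.uint8)

# Tiny fake depth pass: top-left is nearest, bottom-right is farthest.
raw = [[1.0, 2.0], [3.0, 4.0]]
cn_map = depth_pass_to_controlnet_map(raw)  # save as PNG with any image lib
```

When you upload the result to ControlNet, set the preprocessor to "none" so the map is used as-is instead of being re-estimated.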
@oseaniic 2 months ago
THANK YOU, SO MUCH!!!
@CGFUN829 4 months ago
Amazing stuff, thanks a lot
@VGHOST008 9 months ago
Weird. I do exactly as you showed, but ControlNet doesn't give me the depth result 1-to-1. It often fails to replicate the poses and the outputs aren't great. Am I missing something? When exactly should I set the preprocessor and model to 'none', and why aren't you doing that when generating images? Any help appreciated.
@VGHOST008 9 months ago
Update: OpenPose seems to do the trick. I have no idea why, but depth produces extremely inaccurate and inconsistent results for me, while OpenPose almost always follows what is being referenced. Does depth work well and accurately only with anime models?
@contrarian8870 11 months ago
Thanks for this. Is Midas the underlying depth algorithm or something else?
@CreatixAi 11 months ago
MiDaS is one of the four depth preprocessors (alongside LeReS, LeReS++, and ZoE). It's selected by default; it creates a depth map with the least amount of detail, but it works best for most cases. 😊
@odessaodesa5307 8 months ago
Thanks! Very useful !!! 🙏❤
@jccluaviz 9 months ago
Thank you for that great work. It's really amazing. Do you know if this can be replicated in Fooocus?
@chuckynorris616 11 months ago
How did you get the AI to do mouth clicking, popping noises?
@chuckynorris616 11 months ago
Link the LoRA please
@luciusblackheart 11 months ago
thank you!
@elvira-irinareisinger6370 11 months ago
Like anybody else in fandoms without money, I'm on Yodayo and wonder WHEN exactly to use canny or depth. Sometimes I get a nice result with canny and sometimes garbage. Only depth (though maybe not every preprocessor is good?) gives me more good than trash results.
@blender_wiki 11 months ago
Be careful which depth preprocessor you use: some implementations have a bad VRAM leak in CUDA, and you can get an unexpected system crash.
@AlexeyRodin. 9 months ago
I like your English pronunciation.
@CreatixAi 8 months ago
That’s the first time I received a compliment on my English pronunciation. Thanks a bunch 🥰 It’s my second language, can you tell?)