How to run Stable Diffusion on AMD Graphics Cards | AI on AMD Graphics

19,837 views

RisingPhoenix

Comments: 169
@stellarluna659 10 months ago
If you get the error "RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check", add "--use-directml --reinstall-torch" to the COMMANDLINE_ARGS in the webui-user.bat file (edit it with Notepad). That way SD will run off your GPU instead of your CPU.
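For reference, here is a minimal sketch of what the edited webui-user.bat could look like. It assumes the stock layout of the launcher script that ships with the DirectML fork, so in an existing file only the COMMANDLINE_ARGS line should need changing.

@echo off

set PYTHON=
set GIT=
set VENV_DIR=
set COMMANDLINE_ARGS=--use-directml --reinstall-torch

call webui.bat

The --reinstall-torch flag is generally only needed for the first launch after switching backends, so it can be removed once torch has been reinstalled for DirectML.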
@artymusoke1352 10 months ago
Took me a whole day of trying different tutorials to find this one. Thank you, it worked.
@visiblydisturbed1688 10 months ago
I think I'm doing something wrong, but I'm so incredibly incompetent that I can't figure out exactly what. I now have the error "launch.py: error: unrecognized arguments: --use-directml".
@xangre 10 months ago
@@weichaog5585 Be sure you paste this line: --use-directml --reinstall-torch ...without the " ".
@KingOfGameworld 10 months ago
Thank you for explaining how to fix something that THREE DIFFERENT TUTORIALS I've watched failed to explain. I stayed up til 4am last night trying to figure out why SD was using my CPU instead of my RX 6700 XT.
@Spudicus-jv3kz 10 months ago
You are a god, thank you.
@twostepghouls 11 months ago
My dude, I cannot thank you enough. Every other tutorial I followed resulted in frustration and errors. Yours is perfect.
@moncyn1 1 year ago
I like that it's an actual human voice and not a synthesizer, unlike other videos about DirectML SD.
@DaddyNameless 11 months ago
I have a 6600 XT and your video didn't work as expected for me; only my CPU was being utilized. I switched my Python to 3.10.11 and used these launch options: "--medvram --backend directml --no-half --precision full --opt-sub-quad-attention --opt-split-attention-v1 --disable-nan-check --theme dark --autolaunch". That seemed to fix my issue. *Pin this as it may help others.*
@Sum5thefirst 11 months ago
Yes, I've been having this problem too. Where do I edit those launch options?
@Sum5thefirst 11 months ago
If I edit them in the batch file, it says unknown arguments: --opt-split-attention-v1 --backend directml ??
@kahl452 1 year ago
I can't believe this worked. I'm running a 6750 and holy crap it worked! Thanks man!
@thaido1750 1 year ago
Excuse me, but how many it/s does it get, if I may ask?
@kahl452 1 year ago
@@thaido1750 If you mean "VRAM", it has 12 gigabytes.
@Mootai1 1 year ago
@@thaido1750 Hi! How and where can you see your it/s, please? I can't find this information.
@mchawleyii 11 months ago
My 6750 XT is running out of VRAM with medvram but not with lowvram. Any similar issues there?
@rwarren58 8 months ago
How much RAM do you have?
@jgtully 10 months ago
This repository now sets up for an NVIDIA card and gives errors about lack of CUDA compatibility during installation. Trying to skip past that and run it results in a "RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'" error when trying to generate anything.
@PanKrewetka 11 months ago
Hope to see more about AI and AMD GPUs.
@seymoria3972 1 year ago
This is the best tutorial so far. I didn't get even a single error, different models are working, and I can use LoRAs and embeddings. Thank you so much. And btw, I have an RX 570, so if someone is wondering "is it gonna work", yep, it is.
@Snlth48 11 months ago
I'm using an RX 580 and it says insufficient VRAM.
@NorHouda-h6x 8 months ago
I got an RX 580 8GB and it says "Failed to automatically patch torch with ZLUDA. Could not find ZLUDA from PATH."
@hitmanehsan 9 months ago
Hello, I got this error on my 6800 XT:
  File "I:\AI SD AMD\stable-diffusion-webui-directml\launch.py", line 48, in <module>
    main()
  File "I:\AI SD AMD\stable-diffusion-webui-directml\launch.py", line 39, in main
    prepare_environment()
  File "I:\AI SD AMD\stable-diffusion-webui-directml\modules\launch_utils.py", line 560, in prepare_environment
    raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
@sandmannneil1 10 months ago
I get this error; I followed your tutorial step by step, and other ones, but keep getting it: "RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check. Press any key to continue . . ."
@sandmannneil1 10 months ago
Never mind, found it in the comments. The best tutorial by far, I tried a lot. Well done.
@delos2279 1 year ago
Great tutorial, both for setting it up and using it. Working great on a 7800 XT, no issues so far.
@夜々宮 11 months ago
How? I'm using the same setup, but the VRAM consistently reaches 16GB, the processing speed is barely 3 it/s for a 512x512 image, and it usually runs out of VRAM. I need some help.
@delos2279 11 months ago
@@夜々宮 I had VRAM issues for the Inpaint and Inpaint Sketch features, or if I tried a batch size over 1. So first make sure batch size is 1. Then try this:
1. Open webui-user (.bat file) in a text editor (make a backup of the original).
2. Find the "set COMMANDLINE_ARGS=" line and add: --lowvram --precision full --no-half --autolaunch
The full line should be: set COMMANDLINE_ARGS=--lowvram --precision full --no-half --autolaunch
Save the file and try running it. Or you can just try with "--lowvram". If that doesn't help, idk, since it works for me.
@ToddGoo 10 months ago
I am using a 7800 XT too. I tried several and none of them worked for me. Will try this one later. Thanks for your info.
@jeremyvolland8508 10 months ago
I'm using a Radeon 6700 and keep getting "RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check". If I do that, though, it will use the CPU instead of the GPU. Any suggestions?
@Colreza 10 months ago
I have the same one, did you find something to fix it?
@gv-art15 4 months ago
I have the same problem. Did you manage to fix it?
@uki555 11 months ago
This is the best tutorial I've ever watched, thank you so much. Your video is so underrated, brother. I wish you all the best.
@unclekracker2684 10 months ago
How do I fix the "Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check" error? Even when I used a guide like kzbin.info/www/bejne/rKKQgoR9i5KBhck to add "--precision full --no-half --skip-torch-cuda-test" to the command args, it generated images very slowly, as if it was using my CPU and not my GPU. I have a 7800 XT, can I have some help?
@cancel8559 10 months ago
I'm having the same issue with the same GPU 😭
@Colreza 10 months ago
Any updates bro? Having the same problem xd
@bluewizard420 10 months ago
I have Git and TortoiseGit and Python 3.10.6 and used the DirectML version of Stable Diffusion, but I still get this error:
Traceback (most recent call last):
  File "E:\ai AMD\stable-diffusion-webui-directml\launch.py", line 48, in <module>
    main()
  File "E:\ai AMD\stable-diffusion-webui-directml\launch.py", line 39, in main
    prepare_environment()
  File "E:\ai AMD\stable-diffusion-webui-directml\modules\launch_utils.py", line 560, in prepare_environment
    raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
I have a 6700 XT.
@mikeraft3694 10 months ago
Looks like the latest Git update directs the install to CUDA. Means no AMD.
@mikeraft3694 10 months ago
Finally, this helped me: set COMMANDLINE_ARGS=--use-directml --reinstall-torch
@luisveliz6411 10 months ago
@@mikeraft3694 You, my friend, are a hero. Thanks.
@bluewizard420 10 months ago
@@mikeraft3694 I don't know where I would put this? Could you be more specific? Thank you for your help.
@NalomYT 10 months ago
@@mikeraft3694 THX MAN!!!
@nsf3smm833 11 months ago
This is the most underwatched video of all time. My AMD 6700 XT is now working hard at creating images. Thank you sir!
@Colreza 10 months ago
Mine is using the CPU instead of the GPU, do you have any info to fix it? I have the same GPU.
@gerry._.y 11 months ago
This runs surprisingly well, but as you mentioned, when I checked my CPU/GPU usage it was actually using my CPU to generate images. How do I switch to the GPU?
@XBIssues 8 months ago
Any ideas for this error during the install: "Failed to automatically patch torch with ZLUDA. Could not find ZLUDA from PATH."?
@Silvane911 11 months ago
Confirmed working great on my laptop AMD 6850M.
@WillPl4Y 10 months ago
I follow every step, but when I try to install it (double-click webui-user) there is an error that says "RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test". If I do that, I can install Stable Diffusion, but it will use my CPU to generate. Specs: R5 5500, RX 6650 XT, 16 GB dual channel.
@ulterdamin5318 3 months ago
I guess it's 6000 series and above, because I have a 5700 XT and it did not work either. Same message.
@DeepakSingh-vn6ls 11 months ago
RuntimeError: Couldn't install torch.
@SpaceandUniverse72 10 months ago
It doesn't work for me, it stops here and gives me this error. Does anyone have a solution?
E:\stable-diffusion-webui-directml>git pull
Already up to date.
venv "E:\stable-diffusion-webui-directml\venv\Scripts\Python.exe"
fatal: No names found, cannot describe anything.
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: 1.7.0
Commit hash: d500e58a65d99bfaa9c7bb0da6c3eb5704fadf25
Traceback (most recent call last):
  File "E:\stable-diffusion-webui-directml\launch.py", line 48, in <module>
    main()
  File "E:\stable-diffusion-webui-directml\launch.py", line 39, in main
    prepare_environment()
  File "E:\stable-diffusion-webui-directml\modules\launch_utils.py", line 560, in prepare_environment
    raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
Press any key to continue . . .
My configuration: i7-10700K, AMD 6800 16GB, 32 GB DDR4-3600 RAM, AMD driver 23.12.1, Windows 10 Pro 64-bit.
@brother-one-1 17 days ago
I keep running into issues when I try to clone the repository using your TortoiseGit method. Is there a workaround? I can't get anything to work. (exit code 128)
@Limmo1337 10 months ago
Nice vid, but how do I increase my VRAM usage? I have a 7900 XTX and only use about 2.4 GB of my 24 GB of memory when I render an image.
@IRG_Production 10 months ago
Hello, I did everything by following you, but I got an error at the end. stderr: the system cannot find the path specified. Wrote
@greatturki6229 11 months ago
You are a fu**ing legend. The GOAT. I wish everybody made tutorials like you. Thank you so much.
@ShacoChampagne 1 year ago
Thanks, it worked fine on my RX 6700 XT.
@gaviera4657 11 months ago
I followed all the steps and, like the other tutorials, it didn't work. Basically it shows how to install, but the problem is that, at the end of the installation, it always gives an error. It always displays at the end: "raise Exception(f"Invalid device_id argument supplied {device_id}. device_id must be in range [0, {num_devices}).") Exception: Invalid device_id argument supplied 0. device_id must be in range [0, 0)". I'm exhausted from trying so many times.
@Daniel-jh7pl 10 months ago
Did it like in your guide on AMD, but Stable Diffusion uses my CPU, not the GPU. What could be wrong?
@User-h9l3m 10 months ago
I've followed everything, but it says Python was not found. How do I fix that? And yes, I did check the box to add it to PATH.
@shadyb834 11 months ago
Should I use medvram or lowvram for an RX 6600 XT (8GB)?
@romina_minka 10 months ago
I have an RX 6950 XT 16GB and I still have this error: "RuntimeError: Could not allocate tensor with 12615680 bytes. There is not enough GPU video memory available!" :(
@xenormxdraws 11 months ago
I'm getting the error "git did not exit cleanly (exit code 128)" when I attempt to clone. It always stops at 37%. Any help?
@seraphin01 1 year ago
The part I still don't understand is how come my 16 GB 7800 XT, with 16 GB of RAM, gets the "out of memory" error if I try to use hires fix, while my 2080 8 GB does that just fine and renders images faster as well. For the speed I can understand, due to CUDA optimization, but the VRAM I just don't get.
@fly1063 1 year ago
Hi, how did you not get this error with 8 GB of VRAM? On my 3070 it was impossible to hires fix straight after the image generation. I am now on a 7800 XT and still have the same problem.
@DJESHGAMING 1 year ago
Hey, it stopped working for me since today. It seems to not use my GPU anymore, which has 24GB. It gives an error when trying to generate: AttributeError: 'DiffusionEngine' object has no attribute 'is_onnx'. Also it tries to use something with 14GB of memory, maybe my system RAM. Any idea how to solve this?
@RisingPhoenix96 1 year ago
I've never encountered this error, but it seems like a few people have. I've looked around and found a temporary fix: github.com/lshqqytiger/stable-diffusion-webui-directml/issues/296#issuecomment-1751820370
Go to the Stable Diffusion folder and right-click in there to open the context menu. Select "Open Git Bash here" and a command window will appear. In the window, type "git checkout f935688" (without quotes) and then press Enter. This will change the active branch of the Stable Diffusion repository. The contents of the folder will, apparently, contain code that fixes the error you've mentioned.
After that, edit the webui-user batch file and remove the "git pull" text to prevent Git from trying to pull the latest changes each time you run the file. Save the file after removing the text. Hopefully, you should be good to go after this.
This is only a temporary fix. After a while, you should switch back to the "master" branch to receive the latest updates. You basically have to undo all of the above as follows:
1. Go to the Stable Diffusion folder and right-click.
2. Select "Open Git Bash here".
3. Type "git checkout master" without quotes and press Enter to switch to the master branch.
4. Edit the webui-user batch file and add "git pull" at the top so Git pulls the latest changes from the master branch.
5. Run the webui-user batch file.
I hope that makes sense.
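As a rough summary of the pin-to-a-commit workaround above, the Git Bash commands could look like the following. This is only a sketch: the install path is hypothetical, and f935688 is the specific commit referenced in the linked issue, so check the issue before relying on it.

cd /e/stable-diffusion-webui-directml    # your own install path; this one is a placeholder
git checkout f935688                     # pin to the older, working commit
# ...then remove "git pull" from webui-user.bat and launch as usual

# later, to return to the latest code:
git checkout master
git pull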
@humansvd3269 1 year ago
When I run the BAT file for the webui I keep getting a socket error.
@benhough 1 year ago
Thanks for the tutorial. I tried to do this with a 7900 XTX, but it doesn't seem to work on the GPU at all. I have tried with and without xformers. It only uses the CPU and takes forever. Any suggestions?
@thelaughingmanofficial 11 months ago
You messed up the installation somewhere then, because it's running just fine on my 7900 XTX.
@benhough 11 months ago
@@thelaughingmanofficial I redid the installation many times; it did not want to install. I didn't feel like refunding the GPU, so I switched to Linux and now everything works perfectly. Thanks for your help though.
@Sum5thefirst 11 months ago
@@benhough Damn, I'm having the same problem you were, but I don't want to go to Linux 😭😭
@NorHouda-h6x 8 months ago
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
@Snlth48 11 months ago
Is there any way I can use this with Fooocus?
@glassmarble996 11 months ago
Hey, do Dreambooth and Kohya LoRA training work with AMD?
@Blue_Razor_ 1 year ago
Do you have the issue where, after it applies the Doggettx optimization, it seems to use roughly 2 GB more VRAM for seemingly no reason?
@computerjoy 10 months ago
What is the minimum spec to run this on a PC?
@RosaMunwalker 1 year ago
Thanks for the video! Is there a way to use only the CPU instead of the GPU for Stable Diffusion?
@RisingPhoenix96 1 year ago
Try adding this to the webui-user config file: --precision full --no-half
@ultralaggerREV1 11 months ago
I wouldn't recommend using the CPU… it's gonna take way longer.
@_mult 11 months ago
Using a Ryzen APU, the 5600G?
@Not_Hans 10 months ago
Doesn't work. Gives a CUDA error... tells you to bypass the CUDA test and will only use the CPU instead of the GPU.
@OtakuDYT 1 year ago
I would say just install Stability Matrix and install the Stable Diffusion package with DirectML enabled, and done.
@RisingPhoenix96 1 year ago
Interesting idea.
@andytangaming2705 1 year ago
Hmm.. can you elaborate a bit more? Noob here haha 😅😅
@culledmusic6395 9 months ago
Can it run on the AMD Z1 Extreme?
@weichaog5585 10 months ago
I have a 7800 XT. First it asked me to add --skip-torch-cuda-test to the arguments to skip a test, and then it said RuntimeError and could not clone Stable Diffusion, with error code 128.
@lucarollin6033 10 months ago
Same problem and same GPU. I have solved it, but I'm not at home now. I'll write to you as soon as possible to tell you which commands to add to the batch file.
@weichaog5585 10 months ago
@@lucarollin6033 Thank you very much, appreciate it.
@pigsymcpigface 11 months ago
Excellent tutorial! Thank you!
@lespretend 10 months ago
I keep getting "Could not allocate tensor with 134217728 bytes. There is not enough GPU video memory available!" using a 6600, even if I have it at lowvram or medvram at 512x512 with no upscaling. Wtf is going on?
@visiblydisturbed1688 10 months ago
I can't even get it to recognize my GPU.
@mysamirapenta5659 3 months ago
Same here, any ideas?
@sungsukim692 9 months ago
Error after running webui-user.bat:
Traceback (most recent call last):
  File "D:\stable-diffusion-webui-directml\launch.py", line 48, in <module>
    main()
  File "D:\stable-diffusion-webui-directml\launch.py", line 39, in main
    prepare_environment()
  File "D:\stable-diffusion-webui-directml\modules\launch_utils.py", line 560, in prepare_environment
    raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
@user-mfsc-2024 11 months ago
All AMD cards, or just 7000 series cards?
@nextgodlevel 11 months ago
I am not getting all the styles that you have.
@ThomasMeier-c9v 7 months ago
Doesn't work: RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
@mathwiz1260 11 months ago
I have an AMD card with 8 GB of VRAM. I installed DirectML and I can see the UI; it loads models with no issues, but when I click generate image it goes into "Unspecified error"!! Absolutely no idea why it doesn't work.. any idea why is much appreciated!... AMD is really lagging behind Nvidia on this, it runs smoothly on the other side. The Automatic version works fine but only uses the CPU at a rate of 20 s/it... too slow.
@parryhotter3138 11 months ago
I have an 8 GB Vega 64 card and I get 2-7 it/s in Automatic1111... so I really don't know what you're doing there. You have to convert the models to ONNX; if you don't do that, you'll get an error, very low speed, or weird results. But yes, it's a pain in the ass with AMD cards older than the 6000 series, because you're not able to run ROCm, so there is no inpainting available; also you are not able to run many extensions like ControlNet or Dreambooth... but just creating images works really fast.
@mathwiz1260 11 months ago
@parryhotter3138 Thx, I found a way... actually DirectML works fine, but only for samplers that are not Karras... no idea why lol. Euler and Euler a work fine and generate at a good speed of 3-4 s/it... I get the unspecified error on DPM++ 2M Karras and other Karras samplers... so I don't use them.. ControlNet is working fine with Euler.
@parryhotter3138 11 months ago
@@mathwiz1260 That's true, you can't use the "newer" samplers. I'm going for DDPM, DPM, or, like you, Euler. Under Linux with ROCm and Automatic1111, or ComfyUI, you're able to use all samplers, but you'll need at least a 6xxx card for that. I'm going for the 7800 XT soon... Nvidia is just way too expensive for me. I don't want to go for a 4070 with 12 GB, which is way more expensive with less VRAM... so I really hope developers will support AMD better in the future; AMD has made their turn with ROCm.
@TheLoneQuester 1 year ago
After I generate an image, GPU memory remains at 100% (8GB) until I restart my PC. Any further attempts to generate images result in a memory error. Do you have any tips?
@RisingPhoenix96 1 year ago
Try setting the "--lowvram" flag in the config file.
@jacobyrassilon 10 months ago
I somehow got a newer version of Python installed and cannot get it to uninstall no matter what. Anyone have any clues on how to install 3.10.6?
@lespretend 10 months ago
You ran the Python installer, then clicked 'Uninstall Python' when it launches? I had to delete my old Python installation this way to install this one.
@jacobyrassilon 10 months ago
@@lespretend Yes, I tried that several times. Installed the newer version again, then ran the uninstaller in Python, then tried the older version and I still get the message that I am running a newer version. Damned frustrating.
@KPopTato 10 months ago
Had the same issue; run Modify or Repair on the Python install first, and then Uninstall.
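If uninstalling the newer Python proves stubborn, another option that may help with the version problem above is to leave both versions installed and point the launcher at the 3.10 interpreter explicitly via the PYTHON variable in webui-user.bat. This is a sketch assuming the stock launcher layout; the install path shown is hypothetical, so replace it with wherever your Python 3.10 actually lives.

@echo off

set PYTHON=C:\Users\you\AppData\Local\Programs\Python\Python310\python.exe
set GIT=
set VENV_DIR=
set COMMANDLINE_ARGS=--use-directml

call webui.bat

If a venv was already created with the wrong Python version, deleting the venv folder inside the Stable Diffusion directory before relaunching lets it be rebuilt against the interpreter set above.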
@aaaadorime2017 1 year ago
I can't run it on a 6700 XT.
@RaiserFPS 1 year ago
Hello sir, why is it saying "RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'"?
@RisingPhoenix96 1 year ago
Hi. Try adding the following to the "set COMMANDLINE_ARGS=" section of the config file seen here ( 38:28 ): --precision full --no-half
@RaiserFPS 1 year ago
@@RisingPhoenix96 By adding that command the error is fixed, but now it is using 100% of my CPU; it's not using the GPU. GPU usage is 2%. My GPU is an RX 6700 XT.
@RisingPhoenix96 1 year ago
@@RaiserFPS OK, I'll see if I can reproduce this error on my end and I'll get back to you as soon as I can.
@RaiserFPS 1 year ago
@@RisingPhoenix96 Thanks for helping, sir.
@RaiserFPS 1 year ago
@@RisingPhoenix96 Have you found any solution?
@Guiff 11 months ago
If someone has problems because they bought a new AMD GPU after previously having an NVIDIA one: uninstall Python and reinstall it.
@xenotron1138 11 months ago
I'm running SD on an RX 6600 with no issues other than the amount of time it takes to render. If I add another 6600, would it speed up my generations? Considering the price point and the fact that I'm already invested in one 8 GB card, it seems this is a good option. Not really looking to play games on this machine. Just SD and some video editing.
@jeremyvolland8508 10 months ago
Are you sure you are actually using the GPU and not the CPU for Stable Diffusion?
@xenotron1138 10 months ago
@@jeremyvolland8508 100% sure.
@HNS-007 1 year ago
You saved my life dude!
@RisingPhoenix96 1 year ago
Glad I could help.
@general123ist 1 year ago
Will older laptop 4 GB GPUs support this??
@RisingPhoenix96 1 year ago
If the GPU in question supports DirectX 12, specifically the DirectML library, then yes, it should work. If you manage to get it working, make sure you set the "--lowvram" argument in the config so you don't get (or at least reduce) VRAM errors.
@onigirimen 1 year ago
Your GPU is capable of using ROCm; don't bother with DirectML, it'll eat up your VRAM+RAM since DirectML memory optimization is so bad.
@kedicchi 1 year ago
How do we use ROCm on Windows?
@onigirimen 1 year ago
@@kedicchi At this time, straight on Windows is a no, but you can use WSL2 on Windows.
@kedicchi 1 year ago
@@onigirimen Tried it, but couldn't find a video for that.
@ryanlevin-pj6ux 10 months ago
Doesn't fucking work, same error as everything else, nothing works. My next GPU is gonna be NVIDIA so I don't have to deal with this bullshit. If someone has any ideas, mention them please.
@timgeurts 10 months ago
moooooood
@UngaBunga-zo7gu 1 year ago
I have an RX 5700, will it work?
@kappa173 1 year ago
Works for me.
@ersin6761 11 months ago
@@kappa173 Can you give your settings?
@Eduard_Kolesnikov 11 months ago
Thank you for the help, bro.
@ThomasMeier-c9v 7 months ago
Fatal: No names found, cannot describe anything. Did I forget something?
@kopidoo 1 year ago
It is working, but damn, so slooooooow :( Just moved to a 7800 XT from a 3060 Ti and the speed went from 2-3 it/s to 6-7 sec/it... UPDATE: with these command line arguments it gets a boost: --upcast-sampling --medvram --no-half --precision=full --opt-sub-quad-attention --opt-split-attention-v1 --disable-nan-check
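For anyone wanting to try the flag set from the comment above, everything goes on the single COMMANDLINE_ARGS line of webui-user.bat. A sketch, assuming the DirectML fork and that you launch through that file (results will vary by card):

set COMMANDLINE_ARGS=--upcast-sampling --medvram --no-half --precision=full --opt-sub-quad-attention --opt-split-attention-v1 --disable-nan-check

If your install also needs --use-directml, as discussed earlier in the thread, append it to the same line. Note that --medvram trades some speed for lower VRAM use, so on a 16 GB or 24 GB card it is worth testing with and without it.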
@xenormxdraws 11 months ago
Hey man, I failed to clone the repository because it keeps throwing an error at me when it reaches 37%. I'd really appreciate it if you could share the one you cloned with me via Google Drive or something. Thanks.
@andre_tech 1 year ago
Hi. I have an RX 7600 with 8 GB VRAM. Stable Diffusion keeps saying I have insufficient VRAM ("RuntimeError: Could not allocate tensor with 4915840 bytes. There is not enough GPU video memory available! Time taken: 11.1 sec."), even when it is the first image I'm making in the session. I tried the "--medvram-sdxl" argument.
@andre_tech 1 year ago
Ah, I saw the "--lowvram" argument and now it's working =)
@TheLoneQuester 1 year ago
@@andre_tech For me, I can generate an image, but then all VRAM is used and it doesn't refresh unless I restart. Do you know how to fix that?
@andre_tech 1 year ago
@@TheLoneQuester Idk how to fix this. But I close the cmd and reopen it right away without closing the tab in the browser, so I don't lose my prompts there.
@andre_tech 1 year ago
@@TheLoneQuester Then close the new Stable Diffusion tab every time you restart the CMD; the old one with the prompt continues to work. That's the only way I know to have faster use of the tool.
@Eleganttf2 1 year ago
No, no, lowvram is literally only for 2 GB VRAM users and it will really SLOW your generation time by a lot. Are you trying to generate SDXL?
@ukrainian333 10 months ago
This video: Ultrasonic Sega Google Mega Drive 3D Ultimate HDR Fullscreen 14bit 8K Exclusive Technology 32:9 format IPS AMOLED 60FPS. Me: a Chinese guy looking at the fucking smallest piece of paper LOL. Thanks for this video, btw.
@awttygaming2510 8 months ago
And you don't even send links, like tf.
@ghosthunters.network 6 months ago
Getting errors, can't get it to run.. command line error.. "model failed to load" and "AttributeError: object has no attribute 'lowvram'". I had to put 2 lines into my webui-user.bat command file: set COMMANDLINE_ARGS=--skip-torch-cuda-test --lowvram. The CUDA test was the original error in the command line.. then the lowvram error and the model not loading.. I even get "can't connect" at times.. WTF.. anyone know what the hell is going on? I've been at this for too long! HELP! Thanx..
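If the "2 lines" above means two separate set COMMANDLINE_ARGS entries, that is worth double-checking: in a batch file a later set statement overwrites the earlier one, so all flags need to sit on one line. A sketch of what the single line could look like, using the flags from the comment above (not verified for that specific setup):

set COMMANDLINE_ARGS=--skip-torch-cuda-test --lowvram

Also, going by the earlier comments in this thread, --skip-torch-cuda-test on its own tends to leave generation running on the CPU; swapping it for --use-directml (with --reinstall-torch added once) is the change most people here reported getting their AMD card working with.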