LLM in ComfyUI Tutorial

16,616 views

Sebastian Kamph

1 day ago

Comments: 113
@michaelkircher9094 2 months ago
Sebastian you are the golden standard for AI creators. Top notch. IDK how but you keep getting exponentially better with each upload.
@sebastiankamph 2 months ago
That's very kind of you, thank you :) 💫
@GenoG 2 months ago
It's funny how I think of things that would be good or helpful in the AI world and then BOOM, you have a new tutorial video on exactly that thing!! I've been thinking about how to do this for a while... perfect that it's bolted right into ComfyUI!! Great video! Up and running immediately... Kind of a pain that the text can't really be edited without cutting and pasting into a regular prompt window, etc... But that's not on you friend! 5 by 5! You earned FiDolla!! Thank you!
@sebastiankamph 2 months ago
Thank you very much for the continued support, so kind of you! 😊💫
@matze2001 2 months ago
Thanks. I already use Ollama and Florence in ComfyUI. This LLM is a nice resource-efficient alternative.
@SebAnt 2 months ago
Can Ollama be used for anything else?
@kironlau 2 months ago
@@SebAnt Image to prompt (llava:7b-v1.6-mistral-q5_K_M), or prompt enhancement (you input just a sentence, and the LLM outputs a detailed prompt).
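[Editor's note: for anyone curious what the Ollama route mentioned above looks like in practice, here is a minimal sketch, not from the video, using the Ollama Python client for image-to-prompt. It assumes Ollama is running locally and the llava model tag from the comment has already been pulled; "input.jpg" is just a placeholder path.]

    import ollama  # pip install ollama; talks to a locally running Ollama server

    response = ollama.chat(
        model="llava:7b-v1.6-mistral-q5_K_M",  # model tag mentioned in the comment above
        messages=[{
            "role": "user",
            "content": "Describe this image as a detailed text-to-image prompt.",
            "images": ["input.jpg"],  # placeholder: path to the image to caption
        }],
    )
    print(response["message"]["content"])  # the generated prompt text

[The prompt-enhancement case is the same call without the "images" field: pass a short sentence as the content and ask the model to expand it into a detailed prompt.]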
@SebAnt 2 months ago
@@kironlau Thank you. I had previously seen a video about Ollama and was planning to install it this weekend, and now I'm wondering if Searge will suffice.
@lordlucifer989 1 month ago
Thanks, this works really well with wildcard processor to feed the text into it.
@Cu-gp4fy 2 months ago
Thanks, appreciate the local and cloud option recommendations for those without the fancy hardware!
@VaiTag08 2 months ago
Can Searge LLM be used in img2img for Flux? I want an LLM that can read my input image and generate a prompt for img2img.
2 months ago
Very exciting solution, again!
@nishu4288 2 months ago
Best AI YouTuber... never asks for Patreon for workflows like most others do.
@sebastiankamph 2 months ago
Thank you, very kind! But some of my posts are locked even if this wasn't ;)
@LouisGedo 2 months ago
👋 Looking forward to this video
@RodrigoAGJ 1 month ago
Hello Sebastian, is there an alternative method to incorporate a positive prompt (clip text encoder) into this workflow to enhance the visual output?
@alg4668 2 months ago
You can do more with the Long-CLIP ComfyUI node - it extends the token length from 77 up to a max of 248.
@eduardmart1237 2 months ago
Does Fooocus do something similar when expanding your prompts?
@dustinjohnsenmedia1889 2 months ago
I got an error: "The procedure entry point ggml_backend_cuda_log_set_callback could not be located in the dynamic link library"
@excido7107 2 months ago
As did I
@nikgrid 2 months ago
Yeah same
@THbeto8a 2 months ago
same
@Melike-oh1ir 2 months ago
Did you solve it?
@excido7107 2 months ago
@@Melike-oh1ir No sorry I haven't tried for a while
@MustRunTonyo 2 months ago
Tutorial on how to create the thumbnail pic? It's gorgeous!
@xyy2759 2 months ago
These two Searge nodes are a great addition. I integrated them into a workflow with 1 Flux LoRA + flux1-dev-Q8_0.gguf + t5-v1_1-xxl-encoder-Q8_0.gguf + Mistral-7B-Instruct-v0.3.Q4_K_M.gguf, and it works at 5 s/it, about 1.25 minutes to generate. Thank you.
@zRegicideTVz 2 months ago
Can you share that WF?
@hatuey6326 2 months ago
huge thanks !!
@sebastiankamph 2 months ago
Happy to help!
@SimpleTechAI 2 months ago
So I'm still missing this... CheckpointLoaderNF4 - where is this?
@sebastiankamph 2 months ago
Update to the latest Comfy. There's an NF4 guide on my channel too, btw.
@mrbendy 2 months ago
I want to try this so badly, but I can't get Searge-LLM to install. I get the message "(IMPORT FAILED) Searge-LLM for ComfyUI v1.0" in my Manager. When I load your workflow, I get missing node types for Searge_Output_Node and Searge_LLM_Node. Has anyone had this and found a fix?
@christofferbersau6929 2 months ago
I get it too
@JarvisDroid-o5t 29 days ago
For me, the install fails because I've configured my conda env with Python 3.12... but Searge-LLM is only compliant with Python 3.10 or 3.11.
@anywayanytime3202 2 days ago
Hey, has anyone fixed this problem? I got the same message. Please help, I couldn't find a forum that explains this problem :(
@christofferbersau6929 2 days ago
@@anywayanytime3202 Tell ChatGPT that you need help installing llama-cpp-python for your system.
@baheth3elmy16 2 months ago
I wonder if you ran into the llama.dll error and how you resolved it. There is no resolution or fix on the GitHub page for that node.
@kritikusi-666 1 month ago
How did you add the height/width INT inputs in purple with "control_after_generate"? Is that a special node you need to install from the ComfyUI Manager? I keep seeing it in samples but cannot find it.
@ElSarcastro 2 months ago
Now I just hope this makes its way into Forge.
@GenoG 2 months ago
I finally had to move over to ComfyUI... I resisted forever because it seemed too ridiculous, but now that I'm using it, I really like it! I use it for Flux, Pony and XL... I had never tried Pony or XL, but in Comfy they are really easy to use... only took a couple of days, and there are TONS of workflow examples so that you don't have to reinvent the wheel! So, my advice: jump in, the Comfy water is... Comfy!! @sebastiankamph, see what I did there! 😛
@joromask66 2 months ago
🙌🙌🙌🙌
@GenoG 2 months ago
Thanks!
@sebastiankamph 2 months ago
You bet! Thanks again ☺️💫
@Alchete 2 months ago
You should do another Seb Ross Discord weekly challenge video, but this time with Flux. I really enjoyed those.
@sebastiankamph 2 months ago
Thank you for the suggestion! I'll try again and see how the views are for those nowadays :)
@user-hi3ke6qh7q 8 days ago
Just wondering, I'm new to ComfyUI. Couldn't you just queue the output for the text only, copy-paste it into the prompt, disable and unload the models (empty VRAM option), and then run the rest of the workflow? It's only a few extra seconds. Maybe I don't understand because I'm a noob.
@ronnykhalil 2 months ago
lovely, thanks for sharing! btw, how'd you get that pretty little workflow icon on the sidebar?
@sebastiankamph 2 months ago
Probably just the new UI. I'm showing how to load it in the video if you don't have it already.
@hsuan2323 2 months ago
Flux loves long prompts? I am always cutting my prompts shorter and shorter until I stop getting this weird error: "RuntimeError: stack expects each tensor to be equal size, but got ...". I can't figure out what it means, but shortening the prompt a little usually fixes it; if not, shortening it some more does.
@jonathanzeppa 2 months ago
Will you be doing a video on animation in Flux using ComfyUI? Most of the tutorials I've seen are using external websites, rather than a local machine.
@therookiesplaybook 2 months ago
This is cool, but you can't edit the created prompt afterwards. If you love the prompt it creates but want to change the subject or one word, you can't. You have to copy and paste it into the previous node, edit it there, then bypass the LLM node, then generate. So not impossible, but an extra step.
@vandaloart7131 1 month ago
How do I add the Mistral model, or any other model? That's the only thing I'm missing.
@chenlin322 2 months ago
love you so much, Seb
@sebastiankamph 2 months ago
🥰
@henroc481 2 months ago
What server service do you use to run Comfy, if any?
@rodrimora 2 months ago
Would adding something like "for a T5 encoder" improve the output even more for flux?
@Zuluknob 2 months ago
Try "FLUX-Prompt-Generator" on hugging face. you can select different LLM's in the right hand generating window
@deandresnago2796 2 months ago
Hello, when I run it, it seems to go through, but I don't see the output in the output text field?
@SyamsQbattar 1 month ago
What app did you use? ComfyUI? Why doesn't my ComfyUI look like yours?
@ghr1965 2 months ago
I have an error when loading the workflow. It is related to the CheckpointLoaderNF4 node. "When loading the graph, the following node types were not found: CheckpointLoaderNF4"
@gohan2091 2 months ago
How does that Mistral LLM compare to Florence large?
@jakkalsvibes 2 months ago
Thanks for the great videos! It's odd how the new Flux Model can generate explicit content without issue, but when it comes to something as simple as showing the middle finger, it always ends up with the index finger instead. And what's this thing about the Flux female chin? Does anyone know how to crack this so it works as intended?
@Gaby-om5rd 2 months ago
Hello Sebastian, is it possible to load an LLM with an image and have it captioned "ChatGPT" style? Or is there any method that could caption images similar to ChatGPT, but for free or as the cheapest option? Thanks.
@SimpleTechAI 2 months ago
Also how did you get your manager to stick across the top? Thanks.
@sebastiankamph 2 months ago
I'm showing it in the video.
@JohnVanderbeck 2 months ago
Does anyone know how I would map this llm_gguf folder in the extra_model_paths.yaml? Is it just that key?
@earthequalsmissingcurvesqu9359 2 months ago
Crystools doesn't show up in my top bar. How did you get that working? Thanks.
@douchymcdouche169 2 months ago
How did you get the system usage stats on top of the menu bar?
@sebastiankamph 2 months ago
Crystools
@eduardmart1237 2 months ago
How much VRAM do you get?
@neme-ye5kl 2 months ago
Pleeeeeaaaaase let someone put this into Forge, pleeeeeaase!
@bootinscoot5926 2 months ago
I'd love to see a video about how to use custom LoRAs for Flux or other models, 'cause I have no idea how that works! Great video btw, subbed!
@msampson3d 2 months ago
It was just a few months ago that a ComfyUI node, allegedly for integrating LLMs into your workflow, was out there that executed malicious code on your machine. Be careful out there, folks.
@sebastiankamph 2 months ago
Always be careful! This node is created by Searge who is a well known good guy in the community (and also a moderator in my discord). That's of course not a 100% guarantee, but it's almost as good as it can get on the internet I suppose.
@idoshor4470 2 months ago
Guys, I know it's not part of this topic, but I tried to get answers everywhere and could not find anything... please help if you have a free moment, thanks 🙏🙏🙏😬: I tried installing ComfyUI_UltimateSDUpscale through the Manager, updating it, manually installing it through Git, and downloading the raw files and placing them in the correct folder, but all methods failed. The node is considered missing in Comfy and the installation fails. Does anyone else have this problem? Maybe after a recent ComfyUI update or something? Thanks.
@arothmanmusic 2 months ago
"Octopuses." ;)
@SimpleTechAI 2 months ago
I figured it all out myself, but it does not work for me. I had to add a checkpoint loader because yours was red and said undefined, and while it does generate the new prompt after that, it spits out a whole list of mismatched size errors, so probably not for me. Thanks.
@sebastiankamph 2 months ago
It's built on Flux, in this instance NF4
@poldilite 2 months ago
For me Searge is not loading in Think Diffusion :(
@sebastiankamph 2 months ago
Sorry to hear that. Go hop on their Discord, there's a very active support chat there.
@Rustmonger 2 months ago
Hey man, great as always, but one thing I think a lot of people would love to see is a straightforward Flux LoRA training tutorial. Is that in the works?
@themarlez 2 months ago
Doesn't work. It gets stuck trying to download a 312 MB file through git.
@Radarhacke 2 months ago
"Generate a random image prompt" Oh no! More floods of images that fill the civitai database. LOL!
@ThoughtFission 2 months ago
How does it compare to Florence2?
@sebastiankamph 2 months ago
Haven't done a comparison, but you can load any .gguf LLM.
@LouisGedo 2 months ago
👋
@dkemil 2 months ago
It gives (IMPORT FAILED) on the latest ComfyUI.
@vaishnav7 2 months ago
Same for me, with something like this: Python.exe: Entry point not found - The procedure entry point ggml_backend_cuda_log_set_callback could not be located in the dynamic link library C:\ComfyUI\venv\lib\site-packages\llama_cpp_cuda\lib\llama.dll.
@cfcrow 2 months ago
@@vaishnav7 same
@sebastiankamph 2 months ago
There are some troubleshooting tips on the official github regarding missing llama-cpp. Could check that out: github.com/SeargeDP/ComfyUI_Searge_LLM/tree/main
@vaishnav7 2 months ago
@@sebastiankamph thankyou 🤍✌️
@dkemil 2 months ago
python -m pip install llama-cpp-python did the trick
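[Editor's note: a quick sanity check, my own suggestion rather than something from the thread, is to run the following with the same Python interpreter ComfyUI uses, to confirm llama-cpp-python imports cleanly before restarting ComfyUI. A failed import here points to the same broken install behind the llama.dll errors above.]

    # Verify that llama-cpp-python is installed in ComfyUI's Python environment.
    import llama_cpp

    # Prints the installed version (e.g. 0.2.x); an ImportError instead means
    # the package (or its bundled llama.dll) did not install correctly.
    print(llama_cpp.__version__)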
@naeemulhoque1777 2 months ago
What are your system specs?
@sebastiankamph 2 months ago
RTX 4090 with 24 GB VRAM, 64 GB RAM.
@naeemulhoque1777 2 months ago
@@sebastiankamph thanks. 1 gpu or 2?
@sebastiankamph 2 months ago
@@naeemulhoque1777 1. Not much use for 2 as of yet. I mean you CAN, like in Swarm etc. But it's really not very useful.
@ShakenBacon. 2 months ago
This is a very informative video. I had no idea an LLM could integrate with Comfy. Concerning the usage of other models, I seem to be getting a NotImplementedError for 4-bit quantization with any model other than the Flux NF4 models. I am still researching this on my machine, but it could be related to me using Comfy through SwarmUI.
@ShakenBacon. 2 months ago
Solved it. I feel dumb. I didn't notice that CheckpointLoaderNF4 was being used in the workflow.
@AlexanderGarzon 2 months ago
I prefer the Ollama node.
@sebastiankamph 2 months ago
Why do you prefer it? 😊
@AlexanderGarzon 2 months ago
@@sebastiankamph It has more options, and the Ollama service can run on a different computer, saving you VRAM.
@gatwick127 2 months ago
So what does this do exactly, in simpler terms? I'm wasted and don't have time to watch the whole video. Would appreciate it, thanks :)
@alexblrus9825 2 months ago
again flux... okay
@sebastiankamph 2 months ago
You can run it for all the models, but it's extra powerful for Flux specifically.
@2008spoonman 2 months ago
Where is the creative input?! So you type two or three words and... that's it. Not sure if I like this way of working.
@gilgamesh..... 2 months ago
But then you aren't even writing the prompt. One way AI art still takes imagination and effort is in figuring out the prompts. This just makes it as lazy as people that are against AI art say it is. Now you don't even have to think.
@yinodiaz4290 2 months ago
flux1-dev-bnb-nf4 from Hugging Face is needed, if anyone is wondering why it isn't working.