Is it possible to open a browser tab and download large files? I was trying to run a Hunyuan Video workflow, but the model is 24 GB and the only way to download it is from the Civitai web page; there is no curl available for the download.
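(For anyone stuck on this: a minimal sketch of a chunked download in plain Python, assuming the environment can run scripts even though curl is missing. The URL pattern and token below are placeholders, not real values; Civitai requires an API token for many model downloads.)

```python
# Hypothetical chunked download for a large model file when curl/wget
# are unavailable. URL and token are placeholders - substitute your own.
import requests

url = "https://civitai.com/api/download/models/<MODEL_VERSION_ID>"  # placeholder
headers = {"Authorization": "Bearer <YOUR_CIVITAI_API_TOKEN>"}      # placeholder

with requests.get(url, headers=headers, stream=True, allow_redirects=True) as r:
    r.raise_for_status()
    with open("hunyuan_video_model.safetensors", "wb") as f:
        # Stream in 1 MiB chunks so the 24 GB file never sits in memory.
        for chunk in r.iter_content(chunk_size=1 << 20):
            f.write(chunk)
```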
@geomaikara · 3 months ago
I can't seem to get it right. I'm on an M1 Mac and Zoe depth is not running. I tried without depth and I'm still getting weird results.
@PeanutButter-c1u · 3 months ago
I've seen many tutorials and tried to find websites to help me use ComfyUI for architectural rendering. Most of the time they only show you the basics and then ask you to join their Patreon. This is the first cohesive workflow I've found that is simple for architects to understand and isn't behind a paywall. Kudos to you for keeping this information free. You don't know how much it has helped people like me.
@farouktaoulilit7942 · 4 months ago
Keep going🎉
@henroc481 · 4 months ago
Uh, what if I don't have a Python process in my list of tasks? (For restarting by killing the process.)
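(A possible workaround, sketched here rather than taken from the video: instead of hunting for a process named "python" in the task list, find whatever process is listening on ComfyUI's port. This assumes the default port 8188 and the third-party psutil package; on macOS, listing other processes' connections may require running as root.)

```python
# Hypothetical helper: terminate whatever process is serving ComfyUI's
# default port (8188), regardless of what the process is named.
import psutil

def kill_comfyui(port: int = 8188) -> None:
    for conn in psutil.net_connections(kind="tcp"):
        if conn.laddr and conn.laddr.port == port and conn.pid:
            proc = psutil.Process(conn.pid)
            print(f"Terminating {proc.name()} (pid {conn.pid})")
            proc.terminate()  # sends SIGTERM; use proc.kill() if it hangs

if __name__ == "__main__":
    kill_comfyui()
```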
@AIPI1 · 5 months ago
Thank you, very helpful!
@vishalchouhan07 · 5 months ago
Hi there. I downloaded the workflows, but the IPAdapter Unified Loader and IPAdapter Advanced nodes are showing errors. Do you have a fix for this?
@seyiram3783 · 5 months ago
Same issue. Is there a fix?
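(The thread never got an answer, but a frequent cause of these IPAdapter node errors, offered here as an assumption, not a confirmed fix, is an outdated ComfyUI_IPAdapter_plus custom node or model files in the wrong folders. A minimal sketch that checks the folders the loader conventionally reads from; the root path and file names are illustrative examples only, so match them to your own install and checkpoint:)

```python
# Hypothetical sanity check for IPAdapter model placement. The folder names
# (models/ipadapter and models/clip_vision) follow ComfyUI conventions; the
# file names are examples only - adjust them to your setup.
from pathlib import Path

COMFYUI_ROOT = Path(r"C:\ComfyUI\ComfyUI_windows_portable\ComfyUI")  # example path

EXPECTED = {
    "models/ipadapter": ["ip-adapter-plus_sd15.safetensors"],
    "models/clip_vision": ["CLIP-ViT-H-14-laion2B-s32B-b79K.safetensors"],
}

for folder, files in EXPECTED.items():
    for name in files:
        path = COMFYUI_ROOT / folder / name
        status = "ok" if path.exists() else "MISSING"
        print(f"{status:8} {path}")
```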
@Freestyle_ZVC · 6 months ago
Hi, thanks so much. For some reason I'm really struggling with your workflow: it gets to the KSampler and gives me the error below. Do you know what's causing this? Thanks anyway.

Error occurred when executing KSampler:

'NoneType' object has no attribute 'shape'

File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1373, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1343, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 43, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 829, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 729, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 716, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 695, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 600, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 635, in sample_dpmpp_2m_sde
    denoised = model(x, sigmas[i] * s_in, **extra_args)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 682, in __call__
    return self.predict_noise(*args, **kwargs)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 685, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 202, in calc_cond_batch
    c['control'] = control.get_control(input_x, timestep_, c, len(cond_or_uncond))
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\controlnet.py", line 161, in get_control
    control_prev = self.previous_controlnet.get_control(x_noisy, t, cond, batched_number)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\controlnet.py", line 200, in get_control
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(dtype), y=y, **self.extra_args)
File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\cldm\cldm.py", line 420, in forward
    assert y.shape[0] == x.shape[0]
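(A hedged reading of the traceback above, since the thread never got an answer: the final frame fails on `assert y.shape[0] == x.shape[0]` inside ComfyUI's ControlNet model, where `y` is the extra conditioning that SDXL-class models pass in. If `y` is `None`, which usually means the loaded ControlNet and the base checkpoint come from different model families, for example an SDXL ControlNet with an SD 1.5 checkpoint, evaluating `y.shape` raises exactly this 'NoneType' object has no attribute 'shape' error. A minimal sketch of a pre-flight check along those lines; the function is hypothetical, not part of ComfyUI:)

```python
# Hypothetical pre-flight check mirroring the assertion in comfy/cldm/cldm.py.
# `x` is the noisy latent batch, `y` the extra (SDXL-style) conditioning.
def check_controlnet_inputs(x, y):
    if y is None:
        # Likely cause: ControlNet and base checkpoint are from different
        # model families (e.g. SDXL ControlNet + SD 1.5 checkpoint).
        raise ValueError("ControlNet received y=None; check that the "
                         "ControlNet matches the base checkpoint.")
    assert y.shape[0] == x.shape[0], "batch size mismatch between y and x"
```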
@Username56291 · 6 months ago
Hi, new sub here, just subscribed. Thanks for your content🎉
@soren.larsson · 6 months ago
Could you update the link for the workflow? Thanks!
@LightSketch-ai · 6 months ago
Thanks for pointing this out! The links should be working now.
@23rix · 6 months ago
This won't work in real projects, because, for example, there's no control over textures (scale, placement, etc.). I still render images in 3D and just make some improvements in Comfy.
@renwar_G · 6 months ago
I work in archviz too, and I use a workflow like this a lot: you don't need to think about color and composition as much, since ComfyUI gives new ideas every time. I also use it to give the client a quick render.
@antgrkh588 · 6 months ago
@renwar_G What happens when the client says "I want that floor" and you have to say you can't, because it's a fantasy?
@renwar_G · 6 months ago
@antgrkh588 Then I use a render engine, or try img2img.
@dzynity-designuniversity · 5 months ago
@antgrkh588 Do inpainting with a mask, or use Generative Fill in Photoshop.
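(For readers who want to see the masked-inpainting idea in code rather than as a ComfyUI graph, here is a minimal sketch using the diffusers library. It is a stand-in for ComfyUI's inpaint nodes, not the workflow from the video; the model id, file paths, and prompt are examples. White areas of the mask are regenerated, for instance to swap a floor material, while black areas are kept.)

```python
# Illustrative masked inpainting with diffusers - a stand-in for ComfyUI's
# inpaint nodes. Model id and file paths are examples, not recommendations.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # example model id
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("render.png").convert("RGB")       # example input render
mask = Image.open("floor_mask.png").convert("RGB")    # white = area to redo

result = pipe(
    prompt="light oak parquet floor, photorealistic interior render",
    image=image,
    mask_image=mask,
    strength=0.95,  # how strongly the masked region is re-imagined
).images[0]
result.save("render_new_floor.png")
```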