Join the Discord and tell me your Discord username to get a special rank: SECourses Discord

Patreon exclusive posts index

18 February 2024 Update:

  • The auto installer now installs ComfyUI IPAdapter Plus by default

6 February 2024 Update:

  • Updated to the latest versions of Torch and xFormers
  • It will automatically download SDXL 1.0 Base, the SDXL Refiner 1.0, the SDXL VAE, Realistic Vision V6 and the best SD 1.5 VAE into the correct folders (a sketch of these download lines follows this list)
  • It will auto install ComfyUI Manager
  • Download comfy_ui.sh and upload it into the workspace folder
  • Install commands and instructions are posted in this GitHub readme file - just copy-paste them into the terminal
  • Also check out the screenshot at the end of this post
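
For reference, the download step inside the script presumably boils down to wget calls that save each file into the matching ComfyUI models folder. A rough sketch of two such lines (the exact URLs and file names in comfy_ui.sh may differ; these are the public Hugging Face links for the SDXL 1.0 base model and the SDXL VAE):

  • wget "https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/resolve/main/sd_xl_base_1.0.safetensors" -O /workspace/ComfyUI/models/checkpoints/sd_xl_base_1.0.safetensors
  • wget "https://huggingface.co/stabilityai/sdxl-vae/resolve/main/sdxl_vae.safetensors" -O /workspace/ComfyUI/models/vae/sdxl_vae.safetensors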

17 January 2024 Update:

15 August 2023 Update:

Installation is now faster. Don't forget to select the fast-stable-diffusion template. It also now auto installs ComfyUI Manager. Tested on an RTX 3090 machine, the speed for SDXL 1024x1024 is 3.93 it/s. It also downloads the best SDXL VAE file, so make sure to use that VAE.

Video published: ComfyUI Master Tutorial - Stable Diffusion XL (SDXL) - Install On PC, Google Colab (Free) & RunPod

If you also upvote this Reddit thread, I would appreciate it very much.

How To Install

Register your RunPod account: https://runpod.io?ref=1aka98lq

Select the Fast Stable Diffusion template or any template you wish, such as PyTorch (if you use that one, make sure to customize the pod and add HTTP port 3001 to connect via proxy): https://www.runpod.io/console/explore

Actually, the PyTorch template is better since it is more lightweight.

Make sure that the pod is not broken - here is a tutorial: https://www.patreon.com/posts/97919576

Download comfy_ui.sh and upload it to the workspace folder.

Install commands and instructions are posted in this GitHub readme file.

SDXL workflows are attached. Download them and use the Load button.

This script will also automatically download the SDXL models for you.

Upload your LoRAs into /workspace/ComfyUI/models/loras

Upload models into /workspace/ComfyUI/models/checkpoints
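
If you prefer not to upload from your local machine, you can also download directly on the pod with wget and point -O at the same folders (the URLs and file names below are placeholders - substitute your own links, e.g. from Hugging Face or Civitai):

  • wget "https://example.com/some_model.safetensors" -O /workspace/ComfyUI/models/checkpoints/some_model.safetensors
  • wget "https://example.com/some_lora.safetensors" -O /workspace/ComfyUI/models/loras/some_lora.safetensors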

All the workflows below are updated for the SDXL 1.0 base and refiner models.

The SDXL_1.json workflow has the SDXL setup with the refiner and the best settings. It works amazingly. Here is a screenshot.

The SDXL_With_LORA.json workflow has a LoRA included. The refiner is disabled since I used my own trained LoRA, and the LoRA is currently trained only on the base model. You can enable/disable nodes by selecting them and pressing CTRL+M.

SDXL_LoRA_InPAINT.json has a LoRA and inpainting. I used inpainting to inpaint my face to improve quality. Hopefully a tutorial video is coming that shows the best workflow for LoRA training and inpainting.

SDXL_Inpaint-No-Lora.json: no LoRA, inpaints with SDXL

SDXL_Refiner-Inpaint-No-Lora.json: no LoRA, inpaints with the SDXL refiner

Here are step-by-step screenshots of how to use it: step1.png - step2.png - step3.png - step4.png - step5.png

How to kill a running instance?

First, install the package below:

  • apt update
  • apt install psmisc

Then, for example, to kill the process listening on port 3001, run the command below:

  • fuser -k 3001/tcp
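
If you first want to see which process is holding the port, running fuser without -k only lists the PID instead of killing it:

  • fuser 3001/tcp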


Files

Comments

Anonymous

Looks like we're missing `comfy_ui.sh`.

Rafał Ryniak

Hi, how can I install ComfyUI Manager on RunPod?

Furkan Gözükara

Have you tried this? Make sure that you move into the custom_nodes folder of your ComfyUI installation:

cd custom_nodes
git clone https://github.com/ltdrdata/ComfyUI-Manager.git

Then restart ComfyUI.
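
On a RunPod pod set up with the script above, that usually means using the full path (assuming ComfyUI lives in /workspace/ComfyUI):

cd /workspace/ComfyUI/custom_nodes
git clone https://github.com/ltdrdata/ComfyUI-Manager.git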

Anonymous

I'm using Jupyter Lab to connect, and when I follow your tutorial I get stuck at the end with this error: OSError: [Errno 98] error while attempting to bind on address ('0.0.0.0', 3001): address already in use (venv) root@94526f2983a6:/workspace/ComfyUI#

indievish

Hi - do you have this for Windows too? Thanks

Furkan Gözükara

On Windows they already have a pre-installed version, so I don't have one. It's shown in the video: https://youtu.be/FnMHbhvWUhE

Pallavi Chauhan

Hi, on RunPod I am unable to go inside the "checkpoints" folder. I can go into other folders easily, but not checkpoints. You have written that all models should go inside "/workspace/ComfyUI/models/checkpoints". Do I need to open the checkpoints folder to run the wget command using the terminal? Can I run it elsewhere, giving it the checkpoints folder path?

Furkan Gözükara

That is a Jupyter Lab error. Use cd checkpoints in the terminal, or with the wget command give the download path, like -O checkpoints/model.safetensors - of course, give the full path.
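
For example (the URL here is a placeholder - use the real download link of your model):

wget "https://example.com/model.safetensors" -O /workspace/ComfyUI/models/checkpoints/model.safetensors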

Pallavi Chauhan

Hi, I want to install ControlNet with all its nodes on RunPod. I was able to download the models, but the preprocessors are not showing up anywhere.

Furkan Gözükara

Hi. I have an installer for that here: https://www.patreon.com/posts/84896373 I may update it again soon with the newest models if there are any new ones, but it will download over 60 available models for you and update both ControlNet and Automatic1111 for you.

Pallavi Chauhan

Hi, I wanted to add some additional checkpoints, models, VAEs, upscale models, IPAdapter models and LoRAs in the initial "comfyui.sh" file so that I don't have to run multiple wget commands later on. What code should I add? Some models are from Civitai. Can you add an example for each of these? Also, the ControlNet models aren't installed inside the ComfyUI "controlnet" folder; it is a separate installation.

Furkan Gözükara

Yes, you can add them all into your .sh file with wget. For ControlNet, I think you need to install it manually - I haven't done it before on ComfyUI.
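
For illustration, appended lines could look like the ones below - one wget per file, each saving into the matching ComfyUI models subfolder (all URLs and file names are placeholders; the ipadapter folder assumes the layout used by the IPAdapter Plus custom node):

wget "https://example.com/my_vae.safetensors" -O /workspace/ComfyUI/models/vae/my_vae.safetensors
wget "https://example.com/my_lora.safetensors" -O /workspace/ComfyUI/models/loras/my_lora.safetensors
wget "https://example.com/my_upscaler.pth" -O /workspace/ComfyUI/models/upscale_models/my_upscaler.pth
wget "https://example.com/my_ipadapter.safetensors" -O /workspace/ComfyUI/models/ipadapter/my_ipadapter.safetensors

For Civitai models, the download link is the "https://civitai.com/api/download/models/<id>" URL, as in the examples further down this thread.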

Pallavi Chauhan

Hi, if I want to create subfolders for different kinds of models within the checkpoints folder, can I add code for that in the comfyui.sh file? For example, saving SDXL models into an sdxl folder and SD 1.5 models into an sd1.5 folder. This is the code for the SDXL model:

wget "https://civitai.com/api/download/models/288982" -O ./models/checkpoints/juggernautXL_v8Rundiffusion.safetensors

and this is for the SD 1.5 model:

wget "https://civitai.com/api/download/models/132760" -O ./models/checkpoints/absolutereality_v181.safetensors

They are both stored in the "checkpoints" folder. I want to create a subfolder inside the checkpoints folder for SDXL and SD 1.5. If you could give me an example code for each, it would be very helpful. Thanks!
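
For reference, a minimal sketch of what that could look like in the .sh file - create the subfolders first with mkdir -p, then point each wget at them (this assumes ComfyUI picks up models placed in subfolders of checkpoints; it scans that folder recursively, so the files then show up as sdxl/... and sd1.5/... in the checkpoint dropdown):

mkdir -p ./models/checkpoints/sdxl ./models/checkpoints/sd1.5
wget "https://civitai.com/api/download/models/288982" -O ./models/checkpoints/sdxl/juggernautXL_v8Rundiffusion.safetensors
wget "https://civitai.com/api/download/models/132760" -O ./models/checkpoints/sd1.5/absolutereality_v181.safetensors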