Oh wow. What a video. I know you have already provided the workflow, but out of respect and appreciation for your hard and wonderful work, I have to watch your videos to the end. I actually enjoy watching how you build the workflow; it's good education. Thanks a bunch!!!
@dameguy_90
A day ago
This is a really helpful video. If a pure Flux workflow that removes the SDXL model comes out, please make a tutorial like this for it too. ^^
@baheth3elmy16
A day ago
Surprisingly, the Flux1.Dev.FP8 model gave me an error. You seem to have used the full FP8 model; I used the FP8 UNET model only and added the DualCLIP and VAE nodes. Also, I had to download the SDXL Union and then the Promax Union ControlNet to get the workflow working. I added an image, prompted "A woman in a restaurant", changed the seed, and fixed it. At the end, the image I input in the first group came out as-is in the last output; nothing happened to it and it wasn't outpainted. The image I used seemed to be too big: I used a 512x512 one and it worked. Thanks again.
@Dany-w3g
A day ago
It doesn't work for me either. Maybe the secret is in his (Flux dev + VAE + CLIP) model, but he didn't give us a link to download that model, so it won't work for anyone.
@gardentv7833
A day ago
I had to download a 17 GB model. It works.
@user-fo9ce3hr5h
6 hours ago
@@gardentv7833 Bro, can you share the controlnet-union-sdxl-1.0 and RealvizXL_V5_BakedVAE.safetensors so I can download them?
@PixelArt_YW
A day ago
Directly use alimama's Flux ControlNet inpainting to do outpainting; the effect is also very good.
@Alex_Niko_Y
A day ago
Thanks for the video! Been waiting a long time for Flux and outpainting )
@CgTopTips
A day ago
Thanks! It's a tricky method that I want to share.
@yklandares
A day ago
Thanks, bro, for the workflow, you awesome dude
@tetsuooshima832
A day ago
I have a few questions: 1. What's the difference between using ImageBlendAdvance V2 and Pad Image for Outpainting (the built-in node)? 2. Why do you invert the mask when there's already an invert_mask option on the node? Maybe show us the output to make it clearer what's happening.
@erichearduga
A day ago
That ComfyUI LayerStyle pack breaks my ComfyUI every time: File "E:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init raise AssertionError("Torch not compiled with CUDA enabled")... This happens after it uninstalls a bunch of packages, including Torch.
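That traceback usually means a custom node's installer replaced the CUDA build of PyTorch with the CPU-only wheel. A common repair (sketch only; the folder layout and the cu121 index URL are assumptions, so match them to your ComfyUI install and CUDA version) is to reinstall the CUDA wheels into the portable build's embedded Python:

```shell
# Run from inside the ComfyUI_windows_portable folder.
# Remove the CPU-only wheels the node installer pulled in...
python_embeded\python.exe -m pip uninstall -y torch torchvision torchaudio
# ...then reinstall the CUDA builds (cu121 shown; pick the index
# matching your CUDA runtime from pytorch.org's install selector).
python_embeded\python.exe -m pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
```

Afterwards, `python_embeded\python.exe -c "import torch; print(torch.cuda.is_available())"` should print `True`.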
@electronicmusicartcollective
A day ago
Good video, thanks, but please change the music to something more ambient or lo-fi so it's easier to focus on the topic. Peace.
@thesledge3481
A day ago
This one breaks for me. The main error appears to be "Given groups=1, weight of size [320, 4, 3, 3], expected input[2, 16, 112, 187] to have 4 channels, but got 16 channels instead" in the KSampler SDXL Outpaint group.
@thesledge3481
21 hours ago
I see the problem now: at 5:00 you fail to connect the Load Checkpoint VAE output to the VAE input on the VAE Encode (for Inpainting) node.
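A short sketch of why that miswiring produces exactly the "4 channels, but got 16" error above (illustration only, not from the video): SDXL's UNet expects 4-channel latents, while Flux's VAE encodes to 16-channel latents, so routing a Flux-encoded latent into the SDXL KSampler fails at the model's first convolution.

```python
# SDXL's UNet input conv expects 4-channel latents; the Flux VAE
# produces 16-channel latents, hence the channel-count error.
SDXL_LATENT_CHANNELS = 4
FLUX_LATENT_CHANNELS = 16

def latent_matches_model(model_channels: int, latent_channels: int) -> bool:
    """True if a latent with latent_channels can feed the model's input conv."""
    return model_channels == latent_channels

# Flux latent into the SDXL sampler -> the reported error:
print(latent_matches_model(SDXL_LATENT_CHANNELS, FLUX_LATENT_CHANNELS))  # False
# Latent encoded with the SDXL checkpoint's own VAE -> fine:
print(latent_matches_model(SDXL_LATENT_CHANNELS, 4))                     # True
```

This is why connecting the SDXL checkpoint's own VAE output to the VAE Encode (for Inpainting) node fixes the group: the latent then has the 4 channels the SDXL sampler expects.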
@Dany-w3g
A day ago
WOW!
@CgTopTips
A day ago
🙏
@yngeneer
A day ago
Thanks a lot
@CgTopTips
A day ago
🙏🙏
@gardentv7833
A day ago
Nice work, thank you! I noticed quality loss in the face and deformed teeth. Is there any way to repair this?
@bwheldale
23 hours ago
I noticed that too, so I temporarily set the scale in the LayerUtility node from 0.5 to 1; the quality improved, but then the outpainting area was reduced.
@user-fo9ce3hr5h
6 hours ago
@@bwheldale Bro, can you share the controlnet-union-sdxl-1.0 and RealvizXL_V5_BakedVAE.safetensors so I can download them?
@geekboystudio2405
16 hours ago
I got this error: KSampler unsupported operand type(s) for *: 'float' and 'NoneType'. What should I do now?
@FilipBejtner-wg1oj
A day ago
How much VRAM is needed?
@CgTopTips
A day ago
Minimum 6 GB of VRAM
@KasperskyGroup
A day ago
Hi Prof, there's an error: KSampler "mat1 and mat2 shapes cannot be multiplied (10528x16 and 64x3072)"... Why? Thanks in advance for all your projects, they're all the best.
@gastonboigues4124
23 hours ago
Same thing happens to me
@gastonboigues4124
23 hours ago
It works fine now!!
@KasperskyGroup
20 hours ago
@@gastonboigues4124 Hi, is it generating the outpainted image correctly for you now? ... For me, I only get a green background in the enhanced Flux nodes with the 0.95.
@gastonboigues4124
20 hours ago
@@KasperskyGroup Yes, it worked fine for me after changing the ControlNet model
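The fix reported here (swapping in a compatible ControlNet model) matches the nature of the error: "mat1 and mat2 shapes cannot be multiplied (10528x16 and 64x3072)" is a plain matrix-shape mismatch between tensors from incompatible models. A minimal sketch (illustration only, shapes taken from the error message):

```python
# A matmul A @ B requires A's column count to equal B's row count.
# Tensors from mismatched models (e.g. the wrong ControlNet for the
# checkpoint) violate this, and PyTorch raises the shape error.
def shapes_can_matmul(a_shape, b_shape):
    """True if a matrix of shape a_shape can be multiplied by one of b_shape."""
    return a_shape[1] == b_shape[0]

print(shapes_can_matmul((10528, 16), (64, 3072)))  # False: the reported error
print(shapes_can_matmul((10528, 64), (64, 3072)))  # True: compatible shapes
```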
@KasperskyGroup
14 hours ago
@@gastonboigues4124 Hi Gaston, can you send me the link to the ControlNet, please?
@MdNoman-tl6yf
A day ago
Bro, can I use Flux with an RTX 4060 Ti 16 GB?
@erichearduga
A day ago
I run it on an RTX 3060 with 12 GB, both Dev and Schnell, 8- and 16-bit. Works fine. A regular generation takes maybe 30 seconds; the more complex version of this workflow took about 2:00 using the Hyper 8-step LoRA.
@erichearduga
A day ago
I just ran this workflow at 30 steps... 3:00 minutes... it used 50 GB of RAM and 11.4 GB of VRAM (16-bit model and CLIP).
@tetsuooshima832
A day ago
@@erichearduga Haha, that means the more RAM you have, the more RAM gets eaten xD I have 32 GB; maybe I don't need to upgrade after all xDD
@therookiesplaybook
21 hours ago
Comfy is so unnecessarily complicated for completing a simple task.
@cjosejaen
A day ago
I cannot continue from the "Load Checkpoint" node. ERROR: Could not detect model type of: B:\Data\Models\StableDiffusion\flux1-dev-fp8.safetensors
Comments: 40