theUnlikely

joined 2 years ago
[–] theUnlikely@sopuli.xyz 2 points 1 year ago* (last edited 1 year ago) (1 children)

I added an extra note to the post 😉
Although I notice we already got a nice non-gun entry a few hours before you even wrote this.

[–] theUnlikely@sopuli.xyz 2 points 1 year ago (1 children)

Hmm, perhaps I should remove my two bonus examples? I was just trying to show what Midjourney and DALL·E can create from my own prompt attempting to recreate the scene I mentioned.

Let's see your version of payback 😊

[–] theUnlikely@sopuli.xyz 2 points 1 year ago

Alright! That's a neat little trophy. I'm working on a concept for the next challenge right now.

[–] theUnlikely@sopuli.xyz 2 points 1 year ago

Looks really cool! I'm getting some vibes of The Aerie in NieR Replicant.

[–] theUnlikely@sopuli.xyz 4 points 1 year ago* (last edited 1 year ago) (1 children)

Okay soooooo, that took a lot longer than I anticipated, but I think I got it. It turns out to be a problem with the VAE encoding process, and it can be handled with the ImageCompositeMasked node, which composites the padded image with the new outpainted area so the pre-outpainted area isn't affected by the VAE. I learned this here: https://youtu.be/ufzN6dSEfrw?si=4w4vjQTfbSozFC6F&t=498. The whole video is quite useful, but the part I linked to is where he talks about that problem.
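In case it helps, here's a rough sketch of what that masked composite is doing conceptually (plain Python with Pillow/NumPy, not the actual ComfyUI node code, and the images here are just dummies): only the newly outpainted region is taken from the VAE-decoded output, and the original pixels are pasted back untouched.

```python
# Conceptual sketch of ImageCompositeMasked for outpainting:
# keep the original pixels wherever the mask is black, and take the
# VAE-decoded result only where the mask is white (the new area).
import numpy as np
from PIL import Image

def composite_masked(original_padded, decoded, mask):
    orig = np.asarray(original_padded, dtype=np.float32)
    dec = np.asarray(decoded, dtype=np.float32)
    m = np.asarray(mask.convert("L"), dtype=np.float32)[..., None] / 255.0
    out = orig * (1.0 - m) + dec * m
    return Image.fromarray(out.astype(np.uint8))

# Tiny runnable demo with dummy images; in the real workflow these would be
# the padded source image, the decoded outpaint, and the outpaint mask.
padded = Image.new("RGB", (256, 256), (40, 40, 40))
decoded = Image.new("RGB", (256, 256), (200, 120, 60))
mask = Image.new("L", (256, 256), 0)
mask.paste(255, (128, 0, 256, 256))  # right half is the "new" area

composite_masked(padded, decoded, mask).save("composited.png")
```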

The next problem I ran into is that around the fourth-from-last outpainting, ComfyUI would stop; it just wouldn't go any further. The system I'm using has 24GB of VRAM and 42GB of RAM, so I didn't think that was the problem, but just in case I tried it on a beastly RunPod machine with 48GB of VRAM and 58GB of RAM. It had the exact same problem.

To work around this, I first bypassed everything except the original gen and the first outpaint. Then I enabled each outpaint one by one until I got to the fourth from the last. At that point I saved the output image, bypassed everything except the original gen and the first outpaint again, enabled only the last four outpaints, and loaded the saved image manually.
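If it's easier to picture that workaround as a script instead of node bypassing, here's a minimal sketch of the same idea: run the outpaint chain in two batches and hand the intermediate result across through a saved file. `outpaint_once` is just a stand-in (it only grows the canvas here) for whatever one real outpaint stage does in the workflow.

```python
# Split a long outpaint chain into two passes with a manual hand-off on disk,
# mirroring the "bypass groups, save, reload" workaround described above.
from PIL import Image, ImageOps

def outpaint_once(img):
    # Stand-in for a real outpaint stage; here it just pads the canvas.
    return ImageOps.expand(img, border=64, fill=(0, 0, 0))

total_steps = 8                 # assume eight outpaint stages in the chain
split_at = total_steps - 4      # stop before the point where ComfyUI stalled

img = Image.new("RGB", (1024, 1024))  # stand-in for the original generation

# First pass: original gen plus the outpaints up to the split point.
for _ in range(split_at):
    img = outpaint_once(img)
img.save("intermediate.png")

# Second pass: reload the intermediate image and run the remaining stages.
img = Image.open("intermediate.png")
for _ in range(total_steps - split_at):
    img = outpaint_once(img)
img.save("final.png")
```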

I used DreamShaper XL Lightning because there was no way I was going to wait for 60 steps each time with FenrisXL 😂 I tried two different ways of using the same model for inpainting. The first was the Fooocus Inpaint node together with the Differential Diffusion node. This worked well, but when ComfyUI stopped working I thought maybe that was the problem, so I switched those out for some model merging. Basically, it subtracts the base SDXL model from the SDXL inpainting model and adds the DreamShaper XL Lightning model to that, which creates a "DreamShaper XL Lightning inpainting model". The SDXL inpainting model can be found here.
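For anyone curious, the merge boils down to `dreamshaper + (sdxl_inpaint - sdxl_base)` applied weight by weight. Here's a rough script version of that arithmetic using safetensors state dicts; the file names are placeholders, and in the actual workflow this is done with ComfyUI's model-merge nodes rather than a script.

```python
# Rough sketch of the merge: DreamShaper + (SDXL inpaint - SDXL base),
# applied per weight tensor. File names are placeholders.
from safetensors.torch import load_file, save_file

base = load_file("sd_xl_base_1.0.safetensors")
inpaint = load_file("sd_xl_inpainting_0.1.safetensors")
dreamshaper = load_file("dreamshaperXL_lightning.safetensors")

merged = {}
for key, w in dreamshaper.items():
    if key in base and key in inpaint and base[key].shape == w.shape:
        # Add the "inpainting delta" (inpaint minus base) onto DreamShaper.
        delta = inpaint[key].float() - base[key].float()
        merged[key] = (w.float() + delta).to(w.dtype)
    else:
        # Weights the inpainting model changes structurally (e.g. the 9-channel
        # UNet input conv) are taken from the inpainting model when available.
        merged[key] = inpaint.get(key, w)

save_file(merged, "dreamshaperXL_lightning_inpaint.safetensors")
```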

You should be able to use this workflow with FenrisXL the whole time if you want. You'll just need to change the steps, CFG, and maybe the sampler at each KSampler.

Image with ImageCompositeMasked: https://files.catbox.moe/my4u7r.png

Image without ImageCompositeMasked: https://files.catbox.moe/h8yiut.png

[–] theUnlikely@sopuli.xyz 2 points 1 year ago

Very cool idea to use outpainting like that! I'm wondering if something happened to the image along the way. A lot of the details look burnt out by the final outpainting. Looking at the workflow, I counted 12 VAE decode/encode pairs. I know that changing between latent and pixel space is not a lossless process, so that might be it, but I'm not sure. I'm going to see if I can get a workflow going that maintains the original quality.
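As a quick sanity check on that theory, something like this (diffusers with the standalone SDXL VAE; the model ID and test image path are assumptions on my part) will show how much an image drifts after repeated encode/decode round trips:

```python
# Measure how much an image degrades over repeated VAE encode/decode cycles.
import numpy as np
import torch
from diffusers import AutoencoderKL
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
vae = AutoencoderKL.from_pretrained("stabilityai/sdxl-vae").to(device)

img = Image.open("test.png").convert("RGB").resize((1024, 1024))
x = torch.from_numpy(np.asarray(img)).float().permute(2, 0, 1)[None] / 127.5 - 1.0
x = x.to(device)
original = x.clone()

with torch.no_grad():
    for i in range(12):  # 12 decode/encode pairs, as counted in the workflow
        latent = vae.encode(x).latent_dist.mean
        x = vae.decode(latent).sample.clamp(-1.0, 1.0)
        mse = torch.mean((x - original) ** 2).item()
        print(f"round trip {i + 1}: MSE vs original = {mse:.5f}")
```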

Comparison

[–] theUnlikely@sopuli.xyz 4 points 1 year ago

If there were an open sign-up for colonists for Mars, Titan, etc., I'd put my name on there without hesitation.

[–] theUnlikely@sopuli.xyz 4 points 1 year ago (1 children)

And so it begins...

[–] theUnlikely@sopuli.xyz 11 points 1 year ago (4 children)

I had to rely on Regional Prompter to get the composition even remotely right. SD really didn't feel like putting any kind of large planet in the sky without it. I also learned today that SDXL models don't know what Saturn is 🫠

Gen info
futuristic, scifi, intricate, elegant, highly detailed, majestic, vast expanse of the Titan's surface, evoking a sense of wonder and awe at the beauty of space <lora:add-detail-xl:0.3> ADDCOMM
exterior of a dream home is perched on a cliff overlooking the vast expanse of the Titan's surface, ADDCOL
neon planet in the sky ADDROW
winged spaceship on landing pad
Steps: 21, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 4019266864, Size: 1216x832, Model hash: d8fd60692a, Model: leosamsHelloworldXL_helloworldXL50GPT4V, VAE hash: 235745af8d, VAE: sdxl_vae.safetensors, Variation seed: 3182209455, Variation seed strength: 0.05, Denoising strength: 0.4, Clip skip: 2, RP Active: True, RP Divide mode: Matrix, RP Matrix submode: Rows, RP Mask submode: Mask, RP Prompt submode: Prompt, RP Calc Mode: Attention, RP Ratios: "1,1;1,2,1", RP Base Ratios: 0.2, RP Use Base: False, RP Use Common: True, RP Use Ncommon: False, RP Options: ["[", "F", "a", "l", "s", "e", "]"], RP LoRA Neg Te Ratios: 0, RP LoRA Neg U Ratios: 0, RP threshold: 0.4, RP LoRA Stop Step: 0, RP LoRA Hires Stop Step: 0, RP Flip: True, Hires upscale: 1.5, Hires steps: 10, Hires upscaler: 4x-UltraSharp, Lora hashes: "add-detail-xl: 9c783c8ce46c, add-detail-xl: 9c783c8ce46c, add-detail-xl: 9c783c8ce46c", Downcast alphas_cumprod: True, Version: v1.8.0, Hashes: {"vae": "235745af8d", "lora:add-detail-xl": "0d9bd1b873", "model": "d8fd60692a"}

[–] theUnlikely@sopuli.xyz 2 points 1 year ago

So you're not allowed to look in the passenger's general vicinity at all then?

[–] theUnlikely@sopuli.xyz 3 points 1 year ago (5 children)

Wow what does Oman have going on over there?
