27 points | by schopra909 6 hours ago
7 comments
Post it on r/StableDiffusion
Rad! huggingface link gives 404 on my side though.
Oh damn! Thanks for catching that -- going to ping the HF folks to see what they can do to fix the collection link.
In the meantime here's the individual links to the models:
https://huggingface.co/Linum-AI/linum-v2-720p https://huggingface.co/Linum-AI/linum-v2-360p
Looks like 20GB VRAM isn't enough for the 360p demo :( need to bump my specs :sweat_smile:
Should be fixed now! Thanks again for the heads up
All good, cheers!
Per the RAM comment, you may be able to get it running locally with two tweaks:
https://github.com/Linum-AI/linum-v2/blob/298b1bb9186b5b9ff6...
1) Free up the t5 as soon as the text is encoded, so you reclaim GPU RAM
2) Manual Layer Offloading; move layers off GPU once they're done being used to free up space for the remaining layers + activations
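A minimal sketch of the two tweaks above, assuming a PyTorch pipeline. The module names (`text_encoder`, stand-in blocks for the DiT layers) are hypothetical placeholders, not the repo's actual API; the point is the pattern: delete the encoder after use, and shuttle each layer onto the GPU only for its forward pass.

```python
import torch

# Use the GPU if present; fall back to CPU so the sketch runs anywhere.
device = "cuda" if torch.cuda.is_available() else "cpu"

# --- Tweak 1: free the T5 as soon as the prompt is encoded ---
text_encoder = torch.nn.Linear(8, 8).to(device)  # hypothetical stand-in for T5
with torch.no_grad():
    prompt_emb = text_encoder(torch.randn(1, 8, device=device))
del text_encoder                 # drop the weights...
if device == "cuda":
    torch.cuda.empty_cache()     # ...and return the memory to the allocator

# --- Tweak 2: manual layer offloading ---
# Keep all layers on CPU; move each one to the GPU only while it runs.
layers = [torch.nn.Linear(8, 8) for _ in range(4)]  # stand-in transformer blocks
x = prompt_emb
with torch.no_grad():
    for layer in layers:
        layer.to(device)         # bring this block onto the GPU
        x = layer(x)
        layer.to("cpu")          # evict it to make room for the next block
        if device == "cuda":
            torch.cuda.empty_cache()
```

The trade-off is extra host-to-device copies per layer, so it's slower, but peak VRAM drops to roughly one layer's weights plus activations.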