Whisdeer 10 points 1 year ago

I run Stable Diffusion locally. It's free, but you need a decent video card.

[email protected] 1 point 1 year ago

Any news of a llama.cpp equivalent for SD? It would be handy to run it without a GPU, even slowly, and it might be competitive with other free options in terms of images generated per day.

Whisdeer 1 point 1 year ago

What's llama.cpp?
[email protected] 3 points 1 year ago

It's a portable, standard-C++, CPU-based implementation of the inference code (i.e. text generation) for the LLaMA language model. You get a command-line tool that takes a prompt and the model weights and eventually outputs more text.
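
For a concrete picture, this is roughly what it looks like through the llama-cpp-python bindings, which wrap the same C++ code; a minimal sketch (the bindings and the model path are my example, not something from this thread):

```python
# Sketch using the llama-cpp-python bindings (pip install llama-cpp-python).
# The model path is a placeholder for wherever your GGML-format weights live.
from llama_cpp import Llama

llm = Llama(model_path="./models/7B/ggml-model-q4_0.bin")

# Text in, more text out, same idea as the command-line tool.
output = llm("Building a website can be done in 10 simple steps:", max_tokens=64)
print(output["choices"][0]["text"])
```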

You could do the same thing and run Stable Diffusion on the CPU at some relatively slow speed, but I don't know if anyone has code for it.
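
That said, a stock Stable Diffusion checkpoint will already run on the CPU through Hugging Face's diffusers library, just without llama.cpp-style optimizations; a minimal sketch, assuming diffusers and torch are installed and using an example model ID:

```python
# Sketch: CPU-only Stable Diffusion via Hugging Face diffusers
# (pip install diffusers transformers torch). Expect minutes per image.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("cpu")  # no CUDA device required

image = pipe("a watercolor fox in a forest", num_inference_steps=25).images[0]
image.save("fox.png")
```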

[email protected] 2 points 1 year ago

There are a few UIs that run SD on the CPU. You can do it with Auto1111 (AUTOMATIC1111's web UI).

[email protected] 2 points 1 year ago

Wow, that actually claims to be really fast on the CPU. Why aren't people using this all the time instead of the annoying services?

[email protected] 1 point 1 year ago (edited)

That guide references this version for better CPU performance. I haven't tried using the CPU myself, but from my experience with raytraced rendering on a CPU, it's probably very slow compared to a GPU. It might still be faster than online services with a queue, though.
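
If you want a number rather than a guess, timing one generation on your own hardware is easy enough; a rough sketch with Hugging Face's diffusers (the model ID and step count are arbitrary choices, not taken from the guide):

```python
# Rough timing sketch: how long does one CPU-rendered image actually take?
import time
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to("cpu")

start = time.time()
pipe("a test prompt", num_inference_steps=25)
print(f"one 512x512 image took {time.time() - start:.0f} seconds on CPU")
```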