this post was submitted on 02 Jul 2023
19 points (100.0% liked)
Stable Diffusion
I would agree, but the rate of innovation in AI is so unpredictable that it could go either way.
I don't really agree.
Recent AI innovations are pretty modest and mostly lean on the innovation of raw fucking compute power to achieve their goals.
GPT-4 reportedly uses on the order of 230B parameters, whereas running even a 7B LLM already needs about 16 GB of VRAM, and transformer attention scales as O(n²) in sequence length on top of that. I'll let you do the maths.
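For a sense of scale, here's a minimal back-of-the-envelope sketch in Python. It only counts the memory to hold the weights, assuming fp16 (2 bytes per parameter), and ignores activations, KV cache, and framework overhead, which all add more on top:

```python
# Rough VRAM estimate for just loading an LLM's weights.
# Assumption: fp16 weights, i.e. 2 bytes per parameter.

def weights_vram_gib(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate VRAM (GiB) needed to hold the weights alone."""
    return num_params * bytes_per_param / 1024**3

for name, params in [("7B model", 7e9), ("230B model", 230e9)]:
    print(f"{name}: ~{weights_vram_gib(params):.0f} GiB for weights alone")
```

That works out to roughly 13 GiB for a 7B model (hence the ~16 GB card) and over 400 GiB for a 230B one, before you've computed a single token.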
Stable Diffusion (latent diffusion, to be more precise) is about the same: the initial training required billions of teraflops of compute, and while that was relatively cheap (~$100k), it still rides entirely on modern GPU technology.
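To see why "relatively cheap" still means industrial-scale hardware, here's a rough back-of-the-envelope in the same spirit. Every number below (total training FLOPs, sustained per-GPU throughput, hourly rental price) is an assumption chosen only to illustrate the order of magnitude, not an official figure:

```python
# Very rough training-cost sketch; all constants are illustrative assumptions.

ASSUMED_TOTAL_TRAINING_FLOP = 5e22   # "tens of billions of teraflops" in total
ASSUMED_GPU_FLOP_PER_SEC = 150e12    # ~150 TFLOP/s sustained on an A100-class GPU (fp16)
ASSUMED_PRICE_PER_GPU_HOUR = 1.5     # assumed $ per GPU-hour of rented cloud compute

gpu_hours = ASSUMED_TOTAL_TRAINING_FLOP / ASSUMED_GPU_FLOP_PER_SEC / 3600
cost_usd = gpu_hours * ASSUMED_PRICE_PER_GPU_HOUR
print(f"~{gpu_hours:,.0f} GPU-hours, on the order of ${cost_usd:,.0f}")
```

Under those assumptions you land somewhere around a hundred thousand GPU-hours and a six-figure bill, which is the point: "cheap" here only exists because of modern data-centre GPUs.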