I was curious: do you run Stable Diffusion locally? On someone else’s server? What kind of computer do you need to run SD locally?

  • korewa · 11 months ago

    I run locally too. I have a 10 GB 3080.

    I haven’t had VRAM issues; could you elaborate on your statement?

    I know that with local LLaMA I have been limited to 13B models.

    • TheForvalaka@lemmy.world · 11 months ago

      Stable Diffusion loves VRAM. The larger and more complex the images you’re trying to produce, the more it’ll eat.

      My line of thinking is that if you have a slower GPU it’ll generate slower, sure, but if you run out of VRAM it’ll straight up fail and shout at you.

      I’m not an expert in this field though, so grain of salt, YMMV, all that.
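
      A minimal sketch of that failure mode, assuming PyTorch and the diffusers library (the checkpoint name and the resolutions are just examples, not a recommendation): a slow GPU only costs you time, but running out of VRAM raises an out-of-memory error outright, so one workaround is to catch it and retry at a smaller resolution.

      ```python
      import torch
      from diffusers import StableDiffusionPipeline

      # Example checkpoint; fp16 roughly halves VRAM use versus fp32.
      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      def generate(prompt, sizes=((1024, 768), (768, 576), (512, 512))):
          """Try progressively smaller resolutions until one fits in VRAM."""
          for width, height in sizes:
              try:
                  return pipe(prompt, width=width, height=height).images[0]
              except torch.cuda.OutOfMemoryError:
                  torch.cuda.empty_cache()  # free the failed allocation, retry smaller
                  print(f"OOM at {width}x{height}, trying smaller")
          raise RuntimeError("out of VRAM even at the smallest size")

      generate("a lighthouse at dusk, oil painting").save("out.png")
      ```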

    • lloram239@feddit.de · 9 months ago

      > I know that with local LLaMA I have been limited to 13B models.

      You can run llama.cpp on the CPU at reasonable speeds, making full use of normal system RAM, which lets you run much larger models.
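
      For example, with the llama-cpp-python bindings (a sketch; the GGUF file name is a placeholder for whatever quantized model you have downloaded):

      ```python
      from llama_cpp import Llama  # pip install llama-cpp-python

      llm = Llama(
          model_path="./models/llama-2-13b.Q4_K_M.gguf",  # placeholder file name
          n_ctx=2048,      # context window
          n_gpu_layers=0,  # 0 = pure CPU: ordinary RAM, not VRAM, is the limit
          n_threads=8,     # tune to your physical core count
      )

      out = llm("Q: Why does CPU inference allow larger models? A:", max_tokens=64)
      print(out["choices"][0]["text"])
      ```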

      As for 10 GB in SD: I run out of VRAM quite regularly when overdoing it, e.g. 1024×768 with multiple ControlNets and some other stuff is pretty much guaranteed to overflow it, so I have to reduce the resolution when making use of ControlNet. Dreambooth training didn’t work at all for me due to lack of VRAM (it might be possible to work around, but the defaults at least weren’t usable).
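
      A rough sketch of that kind of setup with diffusers (the ControlNet and base model names are examples, and the conditioning-image URL is a placeholder). Each ControlNet adds its own weights and activations on top of the base model, which is why high resolutions overflow a 10 GB card:

      ```python
      import torch
      from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
      from diffusers.utils import load_image

      controlnet = ControlNetModel.from_pretrained(
          "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
      )
      pipe = StableDiffusionControlNetPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",
          controlnet=controlnet,
          torch_dtype=torch.float16,
      ).to("cuda")

      control = load_image("https://example.com/canny_edges.png")  # placeholder

      # 1024x768 with ControlNet tends to overflow 10 GB; drop the resolution.
      image = pipe(
          "a cozy cabin in the woods",
          image=control,
          width=768,
          height=576,
      ).images[0]
      image.save("cabin.png")
      ```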

      10GB is still very much usable with SD, but one has to be aware of the limitations. The new SDXL will also increase the VRAM requirements a good bit.
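
      If you are bumping into those limits, diffusers also exposes a few switches that trade speed for VRAM; a sketch, with `pipe` being any pipeline like the ones above:

      ```python
      pipe.enable_attention_slicing()  # compute attention in chunks: lower peak VRAM
      pipe.enable_vae_tiling()         # decode large images tile by tile
      # Requires the accelerate package; call this INSTEAD of pipe.to("cuda"):
      # idle submodules stay in system RAM and move to the GPU only when needed.
      pipe.enable_model_cpu_offload()
      ```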