• @Naz@sh.itjust.works
    12 months ago

    You need a lot of VRAM and a large visual model for higher complexity.

Lower VRAM means you're limited to models that only do one thing consistently well.

    See: FLUX

    • @bradd@lemmy.world
      12 months ago

I have two 24 GB 3090s, but IIRC ComfyUI doesn't support multiple GPUs. Does that seem too low?