• REDACTED@infosec.pub · 4 days ago

      As a poor easterner, best I could do is RTX 4060Ti with 16GB of VRAM advertised for AI.

      • branch@lemmy.world · 3 days ago

        RTX 4060Ti

        Glad to hear I don’t need the latest and greatest to generate images. My budget is still far from affording even that, so I will have to wait. :-)

        • REDACTED@infosec.pub · 3 days ago (edited)

          Note that the main requirement for running models is memory size. Theoretically you could buy the cheapest/slowest GPU out there, as long as it has enough memory for whatever models you want to use. Performance only determines how long you’re willing to wait for one image, which can vary anywhere from a few seconds to several minutes.
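          A rough back-of-the-envelope sketch of why memory is the gating factor (the parameter count and overhead figure here are illustrative assumptions, not measurements):

```python
# Rough VRAM estimate for running a model at inference time.
# Rule of thumb: weights = parameter count x bytes per parameter,
# plus some working memory (activations, framework overhead).

def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     overhead_gb: float = 2.0) -> float:
    """Very rough estimate; real usage varies by framework and image resolution."""
    weights_gb = params_billion * 1e9 * bytes_per_param / 1024**3
    return weights_gb + overhead_gb

# Example: a ~3.5B-parameter image model in fp16 (2 bytes per parameter)
# needs roughly 8-9 GB, so it fits in a 16 GB card with room to spare.
print(round(estimate_vram_gb(3.5, 2.0), 1))
```

          If the model fits, a slow GPU just takes longer per image; if it doesn’t fit, no amount of speed helps.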

          I’ve heard of people buying Mac minis just for their (unified) memory size. The performance isn’t great, but you can run larger models than on an RTX 4060 Ti.