• Stalinwolf@lemmy.ca · 1 year ago

    I’ve tried to install this multiple times but always manage to fuck it up somehow. I think the guides I’m following are outdated or pointing me to one or more incompatible files.

    • barsoap@lemm.ee · 1 year ago (edited)

      Tough luck running any code published by the people who put out the models; it’s research-grade software in every sense of the word. “Works on my machine” and “the source is the configuration file” kind of thing.

      Get yourself ComfyUI. They’re always very fast at supporting new stuff, and it’s generally quicker and easier on VRAM than A1111. The prerequisite is torch (the Python package) built with CUDA (Nvidia), ROCm (AMD), or whatever Intel uses. Fair warning: getting ROCm to run on cards that aren’t officially supported is an adventure in itself. I’m still on torch-1.13.1+rocm5.2; newer builds just won’t work, because the GPU I’m telling ROCm I have (so that it runs in the first place) supports instructions that my actual GPU doesn’t, and they started using them.
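
      For what it’s worth, a quick sanity check that your torch build actually sees the GPU is a few lines of Python. The HSA_OVERRIDE_GFX_VERSION bit is the usual way of “telling ROCm you have a different card”; the 10.3.0 value here is only an example and has to match the family you’re spoofing, so treat this as a rough sketch, not gospel:

      ```python
      import os

      # ROCm on a not-officially-supported card: pretend to be a supported GPU.
      # Has to be in the environment before torch initializes the GPU runtime.
      # (Example value only; ignored entirely on Nvidia/CUDA builds.)
      os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

      import torch

      print(torch.__version__)          # e.g. "1.13.1+rocm5.2" or a "+cuXXX" build
      print(torch.cuda.is_available())  # True if the CUDA/ROCm device is usable
      if torch.cuda.is_available():
          # ROCm builds also report through the torch.cuda API
          print(torch.cuda.get_device_name(0))
      ```

      If `is_available()` comes back False, the torch install itself is the problem (CPU-only wheel or mismatched CUDA/ROCm version), not ComfyUI.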