Slow June, people voting with their feet amid this AI craze, or something else?

  • Magiwarriorx@lemmy.world · 1 year ago

    I still use free GPT-3 as a sort of high-level search engine, but lately I’m far more interested in local models. I haven’t used them for much beyond SillyTavern chatbots yet, but some aren’t terribly far off from GPT-3 from what I’ve seen (EDIT: though the models are much smaller at 13bn to 33bn parameters, vs GPT-3’s 175bn parameters). Responses are faster on my hardware than on OpenAI’s website, and it’s far less restrictive: no “as a large language model…” warnings. Definitely more interesting than sanitized corporate models.
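    For context, here is a minimal sketch of what running one of these local models looks like. It assumes the llama-cpp-python backend and a hypothetical quantized 13B file; SillyTavern itself just talks to a backend like this over an API, so this is an illustration rather than the exact setup described above.

    ```python
    # Sketch: run a local quantized 13B model with an 8k context window.
    # The package (llama-cpp-python) is real; the model filename is a placeholder.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/13b-instruct.Q4_K_M.gguf",  # hypothetical local file
        n_ctx=8192,       # 8k context, as discussed above
        n_gpu_layers=-1,  # offload every layer to the GPU if VRAM allows
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Explain what a context window is."}],
        max_tokens=200,
    )
    print(out["choices"][0]["message"]["content"])
    ```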

    The hardware requirements are pretty high, around 24GB of VRAM to run 13bn-parameter models with 8k context, but unless you plan on using it for hundreds of hours you can rent a GPU on RunPod or something for cheaper than a used 3090.
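    A rough back-of-envelope for why 24GB is the usual bar, assuming a LLaMA-13B-style architecture (40 layers, hidden size 5120) and fp16 KV cache; actual usage depends on the backend and quantization scheme:

    ```python
    # Rough VRAM estimate for a 13B model at 8k context (sketch, not exact).
    params = 13e9
    n_layers = 40     # LLaMA-13B
    hidden = 5120     # LLaMA-13B
    n_ctx = 8192

    def gib(n_bytes):
        return n_bytes / 2**30

    weights_fp16 = params * 2    # 16-bit weights
    weights_4bit = params * 0.5  # 4-bit quantized weights (e.g. GPTQ / q4 GGML)

    # KV cache: keys + values, per layer, per token, 2 bytes each in fp16
    kv_cache = 2 * n_layers * n_ctx * hidden * 2

    print(f"fp16 weights:  {gib(weights_fp16):.1f} GiB")  # ~24.2 GiB, already over budget
    print(f"4-bit weights: {gib(weights_4bit):.1f} GiB")  # ~6.1 GiB
    print(f"KV cache @ 8k: {gib(kv_cache):.1f} GiB")      # ~6.2 GiB
    ```

    In other words, full-precision 13B weights alone blow past a 24GB card, so quantized weights plus the long-context KV cache (and runtime overhead) are what the 24GB budget actually goes to.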