vluz@kbin.social to Technology@lemmy.world • X/Twitter has updated its Terms of Service to let it use Posts for AI training
1 year ago: HateLLM will be a smash. /s
That’s wonderful to know! Thank you again.
I’ll follow your instructions; this implementation is exactly what I was looking for.
Absolutely stellar write-up. Thank you!
I have a couple of questions.
Imagine I have a powerful consumer GPU to throw at this solution, a 4090 Ti for the sake of example.
- How many containers can share one physical card, assuming total VRAM is not exceeded?
- What does a virtual GPU look like inside the container? Can I run standard stuff like PyTorch, TensorFlow, and CUDA code in general?
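For what it's worth, with the NVIDIA Container Toolkit any number of containers can be pointed at the same physical card; VRAM is shared at the process level rather than partitioned, so the workloads together must fit in the card's memory. A minimal Compose sketch of two containers sharing GPU 0 (service names, images, and commands are illustrative, and it assumes Docker plus the NVIDIA Container Toolkit are installed):

```yaml
# Two containers sharing one physical GPU (device 0).
# VRAM is NOT partitioned: both containers see the whole card,
# so the workloads themselves must stay within total memory.
services:
  worker-a:
    image: pytorch/pytorch:latest          # stock PyTorch image, runs unmodified
    command: python -c "import torch; print(torch.cuda.is_available())"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["0"]            # same physical card as worker-b
              capabilities: [gpu]
  worker-b:
    image: tensorflow/tensorflow:latest-gpu
    command: python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ["0"]
              capabilities: [gpu]
```

Inside each container the card appears as an ordinary CUDA device, so standard PyTorch/TensorFlow/CUDA code works as-is. Hard per-container VRAM limits need either framework-level caps (e.g. `torch.cuda.set_per_process_memory_fraction`) or MIG, which data-center cards support but consumer cards like the 4090 do not.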
If true at all, this would be world-changing news.
Oh my Gwyn, this comment section is just amazing.