

foolproof for now
The post chooses to use Python, where single and double quotes are equivalent and swapping them is not a semantic change. In other languages it might be, but that’s not the point of the article. A semantic diff is language-dependent.
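A quick illustration of why the swap is safe to ignore in Python (a minimal sketch, not taken from the post itself):

```python
# Single- and double-quoted literals compile to the same str value,
# so swapping one for the other is a purely textual change.
a = 'hello'
b = "hello"
assert a == b              # same value
assert type(a) is type(b)  # same type: plain str
```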
Probably a mistake, considering the current generation follows the RX 7x00 naming pattern.
I bet he just wants a card to self-host models and not give companies his data, but the amount of VRAM is indeed ridiculous.
yeah, I’ve been wanting a card like that to run local models since 2020 when I got a 3080. Back then I’d have spent a bit more to get one with the same performance but some 20GB of VRAM.
Nowadays, if they released an RX 9070 with at least 24GB at a price between the 16GB model and an RTX 5080 (also 16GB), that would be neat.
They said smart, not a good person.
I hate it, but that’s what happens without competition.
Fine by me. I self-host a lot of stuff but never used 0.0.0.0 for browsing, so I just disabled it here to try it out.
I’m already annoyed when someone is using their phone in the dark and doesn’t adjust the brightness settings.
If you do this during night flights, sincerely, fuck you.
yeah, also why I do both
Absolutely nothing wrong. Their whole argument is that committing these files delivers no guarantees about the settings they contain, but these presets are more about convenience than enforcing an identical development environment.
Whoever needs to enforce things like formatting and linting at the project level should be using a CI step.
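As a sketch of what that step could look like (assuming a Python project that happens to use black and ruff; substitute whatever tools the project actually uses):

```python
# Minimal CI gate: run each tool in check-only mode and fail the build
# if any of them reports a problem. Assumes black and ruff are installed.
import subprocess
import sys

checks = [
    ["black", "--check", "."],  # formatting: report diffs, don't rewrite
    ["ruff", "check", "."],     # linting
]

results = [subprocess.run(cmd).returncode for cmd in checks]
sys.exit(1 if any(results) else 0)
```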
Can we stop with the absolutes?
It’s okay to commit IDE config if your team uses mostly one editor.
It’s also okay to include extension recommendations. While extensions may not be needed to run the code, depending on the editor and language they’re highly desirable, and those are exactly the kind worth recommending. I’m sure there’s a setting to disable the prompt if, for some reason, the editor keeps asking you.
sshhhh let the kid develop a bit of impostor syndrome first
what do you want to do?
as in printing \a
in C? It was one of my first ones too.
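For anyone wondering: \a is the ASCII BEL escape, and the same trick works outside C; in Python, for instance:

```python
# "\a" is ASCII BEL (0x07); most terminals beep (or flash) when it's printed.
print("\a", end="")
```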
you’re not yet in college and have been coding for 4 years?
You already have a head start. Most of my Comp Eng classmates hadn’t written a hello world when they started. Get through your CS degree, land one or more internships, and you’ll gain some perspective.
> Until last week, you absolutely NEEDED an NVidia GPU equipped with CUDA to run all AI models.
mate, that means they are using PTX directly. If anything, they are more dependent on NVIDIA and the CUDA platform than anyone else.
to simplify: they are bypassing the CUDA API, not the NVIDIA instruction set architecture and not CUDA as a platform.
Then a semantic diff is not for you; just use the old method.