Let’s just shut this down right now. If you ever built FPGAs, it was in college in the 90s, at an awful program of a US university that trained you in SQL on the side and had zero idea of how hardware works. I’m sorry for that.
The world has changed since 30 years ago, and the future of integer operations is in reprogrammable chips. All the benefit of a fab chip, and none of the downside in a cloud environment.
The very idea that you think all these companies are looking to design and build their own single purpose chips for things like inference shows you have zero idea of where the industry is headed.
You’re only describing how ASIC is used in switches, cool. That’s what it’s meant for. That’s not how general use computing works in the world anymore, buddy. It’s never going to be a co-proc in a laptop that can load models and do general inference, or be a useful function for localized NN. It’s simply for the single purpose uses as you said.
I mean, you’re such an absolute know-nothing that it’s hilarious. Nice xenophobic bullshit sprinkled in too. Sorry, no university for me, let alone FPGA in university in the 90s. When my friends were in university they were still spending their time learning Java.
The world has changed since 30 years ago
Indeed. And people like me have been there every step of the way. Your ageism is showing.
and the future of integer operations is in reprogrammable chips
Yes, I remember hearing this exact sentiment 30 years ago. Right around the time we were hearing (again) how neural networks were going to take over the world. People like you are a dime a dozen and end up learning their lessons in a painfully humbling experience. Good luck with that, I hope you take it for the lesson it is.
All the benefit of a fab chip
Except the amount of wasted energy, and extreme amount of logic necessary to make it actually work. You know. The very fucking problem everybody’s working hard to address.
The very idea that you think all these companies are looking to design and build their own single purpose chips
The very idea that you haven’t kept up with the industry and how many companies have developed their own silicon is laugh out loud comedy to me. Hahahaha. TSMC has some news for you.
You’re only describing how ASIC is used in switches
Nope, I actually described how they are used in SoCs, not in switching fabrics.
That’s not how general use computing works in the world anymore, buddy
Except all those Intel processors I mentioned, the ARM chips in your iPhones and Pixels, the ARM processors in your MacBooks. You know. Real nobodies in the industry.
It’s never going to be a co-proc in a laptop that can load models and do general inference, or be a useful function for localized NN.
Intel has news for you. It’s impressive how in touch you pretend to be in “the industry” but how little you seem to know about actual products being actually sold today.
Hey, quick question. Does Nvidia have FPGAs in their GPUs? No? Hmm. Is the H100 just a huge set of FPGAs? No? Oh, weird. I wonder why, since you, in all your genius, have said that’s the way everybody’s going. Strange that their entire product roadmap shows zero FPGA on their DPUs, GPUs, or on their soon-to-arrive SoCs. You should call Jensen, I bet he has so much to learn from a know-it-all like you that has some amazing ideas about US universities. Hey, where is it that all these tech startup CEOs went to university?
Tell you what. Don’t bother responding, nothing you’ve said holds any water or value.
They can be a xenophobic, ageist jagoff all they want. I’m not engaging with them anymore. They’re the carpenter that thinks a hammer solves all problems, if we pretend they actually did anything with FPGA as their day job.
Social media will always devolve into that. I like seeing arguments personally but when it devolves into name calling and ego stroking it gets annoying real quick
They didn’t say anything xenophobic. They may have played into the ageist stuff, but that was after you tried to play the “I’ve been doing this since before you were born” card, and that makes it fair game imo. You were being unnecessarily aggressive from the start of this exchange and I think they were matching your energy. This is my outside perspective but you may lash out at me too. I don’t have a dog in this fight, I’d have to do research before figuring out if either of you knows what you’re talking about.
Aaand we’re back on Reddit again…
Granted, it was a very controlled Reddit argument. It had all the elements, but with a bit of class. 😆
Sure does 😮💨 Take care!
You should research, then.
Kinda skipping over everything else he said but 👍
Not really. Not worth responding to the rest.
Uh huh… Bet it’s not. Because that would make things difficult.
And the classic downvote for calling it out. 😆👍 Hard to shake the Reddit ways.
Have a good one!
Because literally everyone else saw the writing on the wall and is preparing FPGA chips EXCEPT for NVIDIA. 🤦
NVidia is just now trying to make their own ARM chips ffs. 5 years late. You’re dated and outmoded. Get with the future.