ANTI-UPGRADE?? WHAT THE FUCK
They’ve been pulling this shit since the early days. They employed similar tricks back in the 486 days to force chip swaps, and again in the Celeron days. I think they switched to the slot style intentionally to keep selling chips, at least up to a point lol
me when capitalism
That’s why we are in dire need of open source hardware.
We have open source designs (RISC-V, and there are open GPU designs too), but we don’t have open source manufacturing capability yet.
Are there any projects to develop that capability that you know of?
No, there isn’t yet. That’s the most I could find, but it’s not the machines themselves.
I dream of a world where the process gets cheap enough, like PCB fabrication, where you can just submit the design you want and they’ll fab it for you.
With more players coming into the game because of sanctions, I hope we’re now on that path.
Yes, I hope so too. For now, semiconductor lithography at home is impossible due to how big and complex these machines are, so I’m of the same opinion as you.
https://www.cia.gov/readingroom/docs/DOC_0000498114.pdf
Soviet Computer Technology: Little Prospect for Catching Up
We believe that there are many reasons why the Soviets trail the United States in computer technology:
- The Soviets’ centrally planned economy does not permit adequate flexibility for the design or manufacturing changes frequently encountered in computer production; this situation has often resulted in a shortage of critical components, especially for new products.
If your only response to criticism of capitalism is ((communism)), you may just be a cog in the corporate propaganda machine.
I mean, they went with a literal CIA link.
Thanks for the link to the unbiased study by… the CIA? Huh. Yeah I trust them.
The paper was from 1985. Was the CIA correct?
Marginally. The paper analyzes the capabilities as they existed in the 1980s but doesn’t draw strong conclusions as to why. It does demonstrate how reliance on central planning results in inadequacies when that central planning is not operating well, though.
The paper doesn’t really mention it, but the USSR’s central planning was actively reeling from Brezhnev dying, Andropov dying, and Chernenko either dying or about to die at the time the CIA report was written. So yeah, “correct” is an accurate if imprecise way to put it.
Yeah, it’s more a criticism of the USSR in the ’80s. Central planning with more tech focus and more democracy would likely not face that specific issue.
But also there’s room for shit like kanban communism, which definitely wouldn’t have these problems.
IIRC, the slot CPU thing was because they wanted to get the cache closer to the processor, but hadn’t integrated it on-die yet. AMD did the same thing with the original Athlon.
On a related note, Intel’s anticompetitive and anti-consumer tactics are why I’ve been buying AMD since the K6-2.
They had already put the L2 on-package before that with the Pentium Pro on Socket 8 (the cache was a separate die sealed into the same package as the CPU, running at full speed). IIRC the problem was that yields on those Pentium Pros were exceptionally low, and it was specifically the cache failing. Every part with bad cache had to be discarded or binned as a lower-spec part, and because the cache die was sealed in, a bad cache killed the whole CPU. The slot and SECC form factor allowed them to use separate cache silicon on a larger node while keeping it on-package (on the SECC board) instead of inside the CPU package.
AMD followed suit for the memory bandwidth part after the K6-2 architecture; Intel had no reason of that kind to do so.
It’s been going on at least since the “big iron” days.
The technician comes out to upgrade your mainframe, and it consists of installing a jumper to enable the extra features. For only a few million dollars.
Turns out the difference in the socket is just a few pins here and there, and you can make an 8th or 9th generation Coffee Lake CPU work on your Z170/Z270 board if you apply a few Kapton tape fixes and mod your BIOS.
Modders giving me a new reason to keep my ye olde Z170 mobo instead of just building a new machine with all the nice hardware.
Hackers are the only saving grace in this increasingly dystopian world.
Reminds me of drawing lines on old AMD processors with a graphite pencil to unlock the multiplier.
When will it stop? No, really.
Vote with your wallet, go AMD
Until AMD does the same and it’s back to square one?
It’s been their MO for a long time to keep using the same socket for as long as possible. If they stop, then stop giving either of them money and just don’t upgrade; not that it really matters with the diminishing returns each generation.
It’s really unfortunate they kinda screwed over Threadripper customers so badly in this way, but they’re still the lesser evil by a country mile.
It’s in the article; newer gen chips will have extra DRM that will prevent the hacks from working.
Oh, you meant when will the anti-hacks stop?
Bless your heart….
DRM for CPUs.
All normal, nothing to see here, folks!
That’s cool, but is there a subset of features or CPU-bound operations or something that makes it worth going through the trouble just to run a faster(?) CPU with slower memory?
deleted by creator
But Kaby Lake and Coffee Lake have the same hardware codecs.
Ah intel, never change!
Stellar work.
For some reason, I don’t think I even knew Intel made motherboards.
They don’t, but they define the socket the processor slots into, and they probably did this to market the newer chips as more advanced than they are (by bundling a minor CPU upgrade with an additional chipset upgrade that may offer more uplift).
I see no other reason to kneecap upgrades like this when upgrading entails the consumer buying more of your product.
That’s exactly what it is. I previously had Intel hardware for a few generations, but I got seriously pissed off that every time I wanted to upgrade, they had come up with a new incompatible socket and discontinued everything older, so I had to also buy a new motherboard.
I think they might be a bit better at supporting older sockets these days, but still, too many sockets and incompatible chipsets.
I wish there were something like this for HP 800 G3s. I bought mine used after a lot of deliberation and would love to keep it running for as long as I can without losing out on functionality.