The article title is straight up misinformation at present. From the article itself:
The FuryGPU is set to be open-sourced. “I am intending on open-sourcing the entire stack (PCB schematic/layout, all the HDL, Windows WDDM drivers, API runtime drivers, and Quake ported to use the API) at some point, but there are a number of legal issues,” Barrie wrote in a Hacker News post on Wednesday. Because he works in a tangentially related field, he wants to make sure none of this work would violate his employment contract, licensing terms, etc.
Nothing against OP who simply copied the title, nor the project author. This is impressive but it’s not yet open source and there may be legal hurdles preventing it from becoming so.
That’s fair. I’m hoping for the best outcome.
Ah, so that's why it supports Windows. OK.
I’d be willing to help fund a lawyer reviewing things to ensure it can be open sourced.
All this text, yet nowhere is it mentioned whether it runs Doom. Clearly the most important thing to run on any device.
OG Doom does not support (or need) hardware 3D acceleration. It’s not a polygonal rendering engine.
Relatedly, and probably not to anyone’s surprise, this is why it’s so easy to port to various oddball pieces of hardware. If you have a CPU with enough clock cycles and memory to run all the calculations, you can get Doom to work, since it renders entirely in software. In its original incarnation (modern source ports have since worked around this) it’s nonsensical to run Doom at high frame rates anyhow, because it has a locked 35 FPS frame rate tied to the 70 Hz video mode it ran in. Running it faster would make it… faster.
(Quake can run in software rendering mode as well with no GPU, but the OG DOS version only supports 320x200, and at that resolution I think any modern PC could run it well north of 60 FPS with no GPU acceleration at all.)
The OG Doom engine uses pre-built lookup tables for fixed-point trigonometry (the table captures the full 360 degrees for sine and cosine with 10240 elements).
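Roughly the idea: precompute sine once at a fixed angular resolution, then index the table instead of ever calling a trig routine at runtime. A minimal Python sketch of the technique — the table size matches the 10240 figure above, but the 16.16 fixed-point scale and layout here are illustrative, not Doom’s actual `finesine` layout:

```python
import math

TABLE_SIZE = 10240  # entries covering a full 360 degrees
FRACBITS = 16       # 16.16 fixed point, in the spirit of Doom's fixed_t

# Precompute once: sine scaled up to integer fixed point.
SINE = [round(math.sin(2 * math.pi * i / TABLE_SIZE) * (1 << FRACBITS))
        for i in range(TABLE_SIZE)]

def fixed_sin(angle_index):
    """Sine for an angle expressed as a table index (0..TABLE_SIZE-1)."""
    return SINE[angle_index % TABLE_SIZE]

def fixed_cos(angle_index):
    """Cosine is just sine shifted a quarter turn through the same table."""
    return SINE[(angle_index + TABLE_SIZE // 4) % TABLE_SIZE]
```

At 10240 entries the worst-case angular error is under 0.02 degrees, which is plenty for a 320x200 renderer.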
Tons of software did this for the longest time. Lookup tables have been a staple of home computing for as long as home computers have existed.
And CPUs still do it to this day. Nasty, nasty maths is involved in figuring out the optimal combination of lookup table size and refinement calculations, because the output can’t be approximate; it has to behave the way IEEE floats are supposed to. Pure numerology.
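A toy version of that tradeoff (purely illustrative, nowhere near a real libm, which also has to worry about exact rounding): a coarse table plus a cheap polynomial correction, where table size and correction degree trade off against each other.

```python
import math

N = 256  # coarse table: one entry per 2*pi/N radians
SIN_TAB = [math.sin(2 * math.pi * i / N) for i in range(N)]
COS_TAB = [math.cos(2 * math.pi * i / N) for i in range(N)]

def sin_refined(x):
    """sin(x) via the nearest table entry plus a first-order Taylor step:
    sin(a + d) ~= sin(a) + d*cos(a) for the small residual angle d."""
    t = x % (2 * math.pi)
    i = round(t * N / (2 * math.pi))
    d = t - 2 * math.pi * i / N   # residual, |d| <= pi/N
    i %= N                        # wrap the index after computing d
    return SIN_TAB[i] + d * COS_TAB[i]
```

With a 256-entry table the first-order correction already gets the error below about 1e-4; a real implementation would use more table entries and a higher-degree polynomial to close the remaining gap.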
I really like reading people talk about how better programmers in a more civilized age could do things.
Anyone who has ever had to maintain old code will tell you that this more civilized age is right now and that the past was a dark and terrible time.
Seriously, there were no standards, there was barely any documentation even in large organizations, and people did things all the time that would get you fired on the spot today. Sure, you had the occasional wunderkind performing amazing feats on hardware that had no business running these things, but this was not the norm.
I didn’t have to maintain it; actually, I’ve never worked as a programmer. But as a hobby I’ve patched a few FOSS things that had been abandoned 15-25 years earlier to get them working again, a kind of digital archeology.
I think people also do things now for which they’d be fired on the spot 20 years ago. Everything changes.
I suspect what you call “no standards” means in fact “different standards”, but that’s just a cultural difference. Some project from 1995 may use “Hungarian notation” in variable names, well, that was normal then.
That adequate version control and documentation are a bit more of a norm now? Yes, granted.
I remember those old games that would run faster to the point of hilarity if you put them on anything more modern than they were originally intended to run on. Like the game timing is tied to the frame rate.
This was by and large the reason for the “turbo” buttons on all those 286 and 386 computers back in the day. Disengaging the turbo would artificially slow down your processor to 8086 speed so that all your old games that were timed by processor clock speed and not screen refresh or timers would not be unplayably fast.
Quite a few more modern games have their physics tied to frame rate: if you manage to run them much faster than the hardware available at release could, they freak out. The PC port of Dark Souls was a notorious example, as is Skyrim (at least the OG version, not the “Legendary Edition” or SE).
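The bug pattern is easy to sketch: if the update step moves things a fixed amount per frame, total speed scales with frame rate, while scaling each step by the frame’s duration decouples the two. A minimal illustration with made-up numbers, not any particular game’s code:

```python
def frame_locked(seconds, fps, speed=10.0):
    """Buggy pattern: a fixed step per frame, tuned assuming 30 fps.
    Doubling the frame rate doubles the distance covered per wall second."""
    step = speed / 30.0
    return int(seconds * fps) * step

def delta_timed(seconds, fps, speed=10.0):
    """Fixed pattern: scale each step by the frame's duration (dt),
    so the result depends only on elapsed wall time."""
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(int(seconds * fps)):
        pos += speed * dt
    return pos

# One wall-clock second: frame_locked gives 10 units at 30 fps but 48
# units at 144 fps; delta_timed gives 10 units either way.
```

Engines that care also use a fixed internal timestep with interpolation, which is basically what Doom’s 35 Hz tic system was.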
GTA San Andreas, III, and Vice City too
On launch, Spyro: Reignited Trilogy had a level you couldn’t complete unless you changed the settings to lock it to 30fps. It’s probably been patched by now, but was that ever infuriating.
Oh you mean fallout 4?
There used to be DIP switches on some older machines (386/486 era), and eventually “turbo buttons” that accomplished the same thing: toggling would cut the clock speed so software timed by clock speed would stay compatible. Those turbo buttons were more a “valet mode” than anything, but it all died out before the Pentium/Athlon era, to say the least.
Command &amp; Conquer: Generals lets you choose the game speed for skirmish matches, with a natural cap of 60 and an option to uncap. You need superhuman reflexes to play at uncapped speed on modern hardware!
Interesting, learned something new from my silly comment!
Also, this’ll blow your mind too: Doom wasn’t actually 3D. It was a clever trick, enabled by the lack of the ability to look up and down. They used some sort of algorithm (I forget exactly how it works) to turn the 2D walls, doors, and platforms that appear in the top-down map view into vertical stacks of lines that “look” like 3D objects in front of you. The sprites are also all just 2D projections overlaid onto the game.
This system introduced all kinds of weird quirks in the game, like the trippy effect you get when you activate no-clipping and clip through the edge of the map.
Like for instance, monsters and other sprite objects in the original incarnation of the Doom engine have infinite height. So you can’t step on top of, or over, any monsters if e.g. you are on a ledge high above them. That’s because they’re 2D objects, and their vertical position on the screen is largely only cosmetic. This is why you can’t run under a Cacodemon, for instance.
“Actors” (monsters, etc.) in Doom do have defined heights, but presumably for speed purposes the engine ignores this except for a small subset of checks, namely for projectile collision and checking whether a monster can enter a sector or if the ceiling height is too low, and for crush damage.
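The classic blocking check can be sketched as a purely 2D overlap test: heights exist on the actors, but the test never consults them. A simplified illustration of the idea (Doom-style square bounding boxes, made-up numbers — not the actual Doom source):

```python
from dataclasses import dataclass

@dataclass
class Actor:
    x: float
    y: float
    z: float       # vertical position exists...
    radius: float
    height: float  # ...and so does height, but blocks() ignores both

def blocks(a, b):
    """Doom-style 'infinite height' blocking: only the horizontal
    square bounding boxes are compared; z and height play no part."""
    return (abs(a.x - b.x) < a.radius + b.radius and
            abs(a.y - b.y) < a.radius + b.radius)

# A player on a ledge 200 units above a Cacodemon still collides with it:
player = Actor(x=0, y=0, z=200, radius=16, height=56)
caco = Actor(x=10, y=0, z=0, radius=31, height=56)
```

Source ports that remove infinite height essentially add `a.z + a.height > b.z and b.z + b.height > a.z` to that test.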
This was rectified in later versions of the Doom engine as well as most source ports. By the time Heretic came out (which is just chock-a-block full of flying enemies and also allows the player to fly with a powerup) monsters no longer had infinite height.
Most notably, perspective only gets calculated on the horizontal axis; vertically there is no perspective projection. Playing the OG graphics with a mouse gets trippy fast because of that. Doom doesn’t use much verticality, which hides it. Duke Nukem’s level design uses it more, and it’s noticeable but still tolerable. Modern level design with that kind of funk? Forget it.
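In column-based renderers of this style, each screen column gets a wall slice whose on-screen height scales only with horizontal distance, and “looking” up or down is faked by shifting the horizon line (y-shearing) rather than by a real vertical projection. A rough sketch of the idea — the screen size and focal constant are hypothetical, not Doom’s actual renderer:

```python
SCREEN_H = 200  # 320x200-style screen height
FOCAL = 160.0   # projection-plane distance in pixels (made up)

def wall_column(wall_height, distance, horizon=SCREEN_H // 2):
    """Project one wall slice for one screen column: height scales with
    1/distance (horizontal perspective only), and the slice is centered
    on a movable horizon. Shifting `horizon` is y-shearing, the trick
    that fakes looking up or down. Returns clamped (top, bottom) rows."""
    h = int(wall_height * FOCAL / distance)
    top = horizon - h // 2
    bottom = horizon + h // 2
    return max(0, top), min(SCREEN_H - 1, bottom)
```

Note that vertical lines stay perfectly vertical no matter where the horizon goes, which is exactly why free mouselook on this kind of renderer looks warped.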
Here’s a video that explains the limitations of the DOOM engine and with it also briefly how the rendering part of it works (from 4:08 onward) in a very accessible manner:
If you want a more in-depth explanation with a history lesson on top (still accessible, but much heavier), there’s this excellent video:
Ooooo I absolutely want these—thank you!
Original Doom was not GPU accelerated.
Neither was original Quake. GLQuake was a later update. The original was for DOS using only software rendering.
Yup, and most people played it at something like 10 to 15 fps on hardware of the time. Same with DOOM a couple of years earlier.
At 320x200 resolution. And I do not remember how many colors. 256?
Yes, with a fixed 256-color palette of mostly shades of brown and green.
Nowadays I’ve been seeing lots of people porting Super Mario 64 as the challenge, since Doom is honestly beyond trivial at this point. I’m totally on board: SM64 is a fantastic game, it shows off traditional shaded-polygon rasterization performance pretty well, and it’s just plain fun to spite Nintendo.
what about Crysis?
Asking the real questions. Anything can run Doom nowadays. I’ve seen it run on a pregnancy test.
Yep, Doom runs on literally anything: fridges, toasters, pregnancy tests, and more.
But Crysis… that’s a litmus test.
Do we even have anything that can run Crysis yet?
At this point I’m convinced Doom can run on anything
Subsequently, the project got a boost from the debut of Xilinx Kria System-on-Modules (SoMs), which combine “insanely cheap Zynq UltraScale+ FPGAs with a ton of DSP units and a (comparatively) massive amount of LUTs and FFs, and of particular interest, a hardened PCIe core,” enthused Barrie.
Yes, I understand, the bippity uses mumps in order for the many lutes to flips those zupps in their pacas.
FPGA
Awww, I thought this was an ASIC. Slapping an FPGA on a PCIe card is decidedly less cool. Still, props for creating a usable GPU circuit description, that must have been a nightmare.
In a situation where there are affordable (for this purpose) FPGAs, it’s more cool, not less. ASICs you have to actually order from somewhere, somehow, to get produced.
And one can order ASICs from that description, no?
I keep reading the word Fury as Furry
FurryGPUwU
Did you mean MIAOW and Nyuzi?
Same, I wasn’t surprised either. I could see a furry making a GPU for fun
Or at least funding it.
Something something furmark
Yeah, why don’t they lean into it and call it FurryGPU? Much better.
one of us…?
Yes
That would be disgusting.
*awesome
Nah nothing awesome about furries.
lots awesome about blocking you
Damn, a dog fucker’s gonna block me.
What year is it?!
The way people dress, Quake, it’s 1996!
Somebody rolled a five or an eight!
RISC-V plans to make ISA extensions that will enable it to work better in graphics applications. Looking forward to truly open-source graphics.
I’m confused: he made a homemade GPU that can’t be mass-produced, it runs a 30-year-old game at 44 fps, it may (or may not) actually become open source, and I’m supposed to be excited about it?
You’re not supposed to be anything. It’s a pretty cool feat by one person though.
You should be impressed. Integrated circuits are insanely complex, and any general-purpose processing hardware since the 90s is way too complicated for the human mind to comprehend.
Open-sourced physical technology is only in its infancy; you may be excited about this trend.
I’ve seen open-sourced hacking tools, openassistant wireless connectors, complete keyboards.
It’s about time someone started on open-sourced proper PC hardware, no matter how small-scale it starts.
Imagine a future where you can 3D print a 2D printer and its refillable cartridges at home, with extensive manuals on DIY repair and maintenance, and no costs beyond the raw resources and your time.
Open source demonstrates humans cooperating with no profit incentive, exactly what capitalism calls impossible. When I first learned about Linux it felt incredibly lacking compared to Windows; nowadays it’s my main OS, and it’s surpassed Windows in everything except good Nvidia drivers.
If you’re not interested then no, you shouldn’t be excited about it.