• Redkey@programming.dev · 7 days ago

    A couple of other commenters have given excellent answers already.

    But on the topic in general, I think that the more you learn about the history of computing hardware and programming, the more you realise that each successive layer added between the relays/tubes/transistors and the programmer was mostly there to reduce boilerplate coding overhead. The microcode in integrated CPUs took care of routing your inputs and outputs to where they needed to be, and of triggering the various arithmetic operations as desired. Assemblers calculated addresses and relative jumps for you, so you could use human-readable labels and worry less that a random edit to your code would break something just because it got moved.
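
    To make that label trick concrete, here is a minimal two-pass assembler sketch in Python. The tiny three-instruction "ISA" is invented purely for illustration (it isn't any real machine's instruction set): the first pass records where each label lands, and the second pass turns label operands into relative offsets, so moving code around doesn't break the jump.

    ```python
    # Toy two-pass assembler sketch (hypothetical mini-ISA, for illustration only).
    # Pass 1 records label addresses; pass 2 turns labels into relative offsets,
    # so a jump keeps working even if the surrounding code gets moved.

    program = [
        ("loop:",),         # label
        ("DEC", "r0"),      # decrement register r0
        ("JNZ", "loop"),    # jump back to 'loop' while r0 != 0
        ("HLT",),           # stop
    ]

    # Pass 1: note the address (instruction index) of every label.
    labels, address = {}, 0
    for line in program:
        if line[0].endswith(":"):
            labels[line[0].rstrip(":")] = address
        else:
            address += 1

    # Pass 2: emit instructions, replacing label operands with offsets
    # relative to the *next* instruction.
    machine_code, address = [], 0
    for line in program:
        if line[0].endswith(":"):
            continue
        op, *args = line
        if op == "JNZ":
            machine_code.append((op, labels[args[0]] - (address + 1)))
        else:
            machine_code.append(line)
        address += 1

    print(machine_code)  # [('DEC', 'r0'), ('JNZ', -2), ('HLT',)]
    ```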

    More complex low-level languages took care of the little dances that needed to be performed in order to do more involved operations with the limited number of CPU registers available, such as advanced conditional branching and maintaining the illusion of variables. Higher-level languages freed the programmer from having to keep such careful tabs on their own memory usage, and helped to improve maintainability by managing abstract data and code structures.
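
    As a rough sketch of that "illusion of variables" (hand-written pseudo-assembly generated from Python, not the output of any real compiler): even a one-line statement like c = a + b has to be broken into loads, an add, and a store once values live in memory slots and only a couple of registers are available.

    ```python
    # Hand-written sketch of how a compiler might lower `c = a + b` when
    # variables live in memory and only two registers are free.
    # The mnemonics are invented for illustration, not any real ISA.

    def lower_add(dest, lhs, rhs):
        return [
            f"LOAD  r0, [{lhs}]",   # fetch lhs from its memory slot
            f"LOAD  r1, [{rhs}]",   # fetch rhs from its memory slot
            f"ADD   r0, r1",        # r0 = r0 + r1
            f"STORE [{dest}], r0",  # write the result back to dest's slot
        ]

    for line in lower_add("c", "a", "b"):
        print(line)
    ```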

    But ignoring the massive improvements in storage capacity and execution speed, today’s programming environments don’t really do anything that couldn’t have been implemented on those ancient systems, given enough effort and patience. It’s all still just moving numbers around and doing basic arithmetic and logic. But a whole lot of it, really, really fast.

    The power of modern programming environments lies in how they allow us to properly implement and maintain a staggering amount of complex minutiae with relative ease. Such ease, in fact, that sometimes we even forget that the minutiae are there at all.

    • Mikina@programming.dev · 6 days ago

      To add to this excellent answer: one thing that really made me understand how CPUs actually work, and why so much of this stuff is the way it is, was playing through the amazing “Turing Complete” puzzle game.

      The premise is simple - you start with basic AND/OR/NOT gates and slowly build things up. You make a NAND, and then you can reuse that design. Then you make a counter, and can use that. A one-bit memory cell. An adder. A multiplexer. All built from the component designs you have already made.
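
      For a feel of what that kind of composition looks like outside the game (a plain-Python sketch of the same idea, not the game's own interface), here are a few gate functions wired together into a multiplexer and a full adder:

      ```python
      # Plain-Python sketch of composing small gates into bigger parts,
      # in the same spirit as the game (not its actual interface).

      def NOT(a):      return 1 - a
      def AND(a, b):   return a & b
      def OR(a, b):    return a | b
      def NAND(a, b):  return NOT(AND(a, b))
      def XOR(a, b):   return AND(OR(a, b), NAND(a, b))

      def MUX(sel, a, b):
          """Return a when sel is 0, b when sel is 1."""
          return OR(AND(NOT(sel), a), AND(sel, b))

      def full_adder(a, b, carry_in):
          """Add three bits; return (sum_bit, carry_out)."""
          sum_bit = XOR(XOR(a, b), carry_in)
          carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
          return sum_bit, carry_out

      print(MUX(1, 0, 1))         # 1: selector picks the second input
      print(full_adder(1, 1, 1))  # (1, 1): 1 + 1 + 1 = 0b11
      ```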

      Eventually you build up to an ALU and RAM, until you end up with a working CPU. Later levels even have you create your own instruction set and assembly language, but I never got very far into that part.
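
      The end point the game steers you toward can be caricatured in a few lines of Python: a toy fetch-decode-execute loop over a four-instruction set invented here for illustration (it is not the game's own architecture).

      ```python
      # Toy fetch-decode-execute loop for an invented four-instruction machine.
      # Opcodes and register layout are made up for illustration only.

      def run(program):
          regs = [0, 0, 0, 0]          # four general-purpose registers
          pc = 0                       # program counter
          while pc < len(program):
              op, *args = program[pc]
              pc += 1
              if op == "LOADI":        # LOADI rd, imm  -> regs[rd] = imm
                  regs[args[0]] = args[1]
              elif op == "ADD":        # ADD rd, rs     -> regs[rd] += regs[rs]
                  regs[args[0]] += regs[args[1]]
              elif op == "JNZ":        # JNZ rs, target -> jump if regs[rs] != 0
                  if regs[args[0]] != 0:
                      pc = args[1]
              elif op == "HLT":        # stop the machine
                  break
          return regs

      # Sum 1..5 into r0, counting r1 down to zero.
      prog = [
          ("LOADI", 0, 0),   # 0: r0 = 0 (accumulator)
          ("LOADI", 1, 5),   # 1: r1 = 5 (counter)
          ("LOADI", 2, -1),  # 2: r2 = -1 (step)
          ("ADD", 0, 1),     # 3: r0 += r1
          ("ADD", 1, 2),     # 4: r1 -= 1
          ("JNZ", 1, 3),     # 5: loop back while r1 != 0
          ("HLT",),          # 6: done
      ]
      print(run(prog))       # [15, 0, -1, 0]
      ```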

      It also works really well as a puzzle game - you have clear goals, and everything is pretty approachable and very well paced. I had no idea how memory is implemented at the circuit level, but the game made me figure it out, and it had hints for when I got stuck.

      And seeing a working CPU that you’ve designed from scratch is pretty cool, but most importantly - even though I’d had courses on hardware, CPU architecture and the like in college, there was a lot of stuff I kind of understood but that never really clicked. This game helped tremendously in that regard, and it was full of “aha” moments that finally connected a lot of what I know about low-level computing.

      I’m not even into puzzle games that much, but this was just a joy to play. It was so fun I sat through it in one session, up until I got to a complete CPU. I very highly recommend it to anyone.

      • Trigg@lemmy.world · 4 days ago

        Thanks for this suggestion. I bought it last night and then forgot to sleep. Thoroughly enjoying wiring up my own opcodes.