Do any of you feel like we’ve become so fixated on graphics and performance that the actual game part of a video game is often overlooked, or at least underemphasized? I don’t know about the rest of you, but all I come across on social media regarding gaming is resolution, ray tracing, DLSS/FSR, frame rates, frame times, CPU and GPU utilization, and all of that stuff, and I’m honestly sick of it! Performance metrics have always been discussed when it comes to PC gaming, but now even console gaming is getting this treatment. Don’t you miss the days when you just installed the game and played it? I know I do. What do you think?

  • averyminya@beehaw.org
    6 days ago

    You may as well have typed this in 2009 or 2015.

    It used to be that people argued it was worth getting the new game console because of “better graphics”. The console wars haven’t gone anywhere; they’ve just expanded.

    In any case, as for just installing a game and playing it: no, not really. When I was playing games in college in 2012, it was still a time when you would open a game and go to the settings menu to adjust things.

    Sometimes it was just turning off motion blur, but there were always settings to change to try to reach a stable 60FPS.

    Nothing changed; it just expanded. Now instead of 60FPS it’s a variable 60-240FPS. Instead of just 720p-1080p resolution, unless it’s a portable, it’s 1080p minimum, otherwise variable up to 4K. Instead of “maxing out”, we now have ray tracing, which pushes software further than our hardware can handle.

    These aren’t bad things; they’re just now 1) slightly marketed, and 2) better known in the social sphere. There isn’t anything stopping you from opening up a game and playing right away, and there’s nothing stopping other people from wondering about frame timings and other technical details.

    Sure, focusing on little things like that can take away from the wider experience, but people pursue things for different reasons. When I got Cyberpunk 2077, I knew there were issues under the hood, but my experience with the game at launch was still pretty much perfect because I was focused on different things. I personally don’t think a dip here and there is worth fretting over, but for some people it ruins the game. Other people just like knowing that they’re taking full advantage of their hardware, hence figuring out the utilization of their components.

    There’s one last aspect not mentioned: architectures. 10 years ago games would just boot up and run… but what about games from 10 years before then? Most players not on consoles had to do weird CPU timing shenanigans to be able to boot up a game from (now 20) years ago. We’re in the same boat now with emulation, and while emulation is faring better, X360/PS3-generation games that had PC ports are starting to have issues on modern Windows. Even just 5 or 6 years ago, games like Sleeping Dogs wouldn’t play nice on modern PCs, so there’s a whole extra aspect of tinkering on PC that hasn’t even been touched on.

    All this to say, we are in the same boat we’ve always been in. The only difference is that social media now knows more about these aspects of gaming, so they get more attention.

    The one thing I do agree with, though, is that this is all part of software development. Making users need better hardware, intentional or not, is pretty crazy. The fact that consoles themselves now have Quality vs. Performance modes is also crazy. But I will never say no to more options. I actually think it’s wrong that console versions of games are often missing settings adjustments when the PC counterpart has full control. I understand when it’s to keep performance at an acceptable level, but it can be annoying.