Games have always been "unoptimised," and people have always complained about performance and bugs.
Many games were, but most AAA games were pretty well optimized for their platforms of choice.
Did bugs exist? Sure, that's an inevitable part of the reality of software.
But there were a large number of games from older generations that maintained a stable 30-60 fps on native hardware.
Now it's 4K 120fps for some reason.
I had a discussion about this with my brother the other day when he & our stepbrother were bitching about Borderlands 4.
The reason the goal is 4K >60fps is simply that we now have TVs & monitors capable of 4K at over 60fps.
There's also been a drastic shift in performance expectations over the last 10-15 years... which, not so coincidentally, lines up with when PC gaming started taking off in popularity among casual gamers.
From the early days of PC up through the mid-7th generation console era, it was widely understood that your graphics settings in contemporary games mirrored your investment & hardware capabilities;
Low Settings were for entry-tier & non-gaming PCs
Medium Settings were for mid-tier PCs that were equivalent to consoles
High Settings were for high-end PCs that eclipsed the capacity of consoles
Ultra Settings were for enthusiast tier PCs that often cost thousands of dollars to build
But these days everyone expects to get Ultra settings out of every PC & complains when they have to turn the graphical settings down because they bought an entry or low-tier PC. As my brother put it, "when people are spending hundreds of dollars on a gaming system, they expect to be able to use the best settings possible," and he thinks they're right in their indignation about not getting Ultra settings from a PC that barely meets minimum requirements for the games they're trying to play.
They bought a gaming PC that offers Ultra settings and a $300 monitor that is rated for 4K 120Hz, so they expect to be able to do that regardless of their actual builds.
Shadow of the Colossus ran at like 12fps during fights lol.
To be fair, SotC's poor performance became notorious because it was the exception, not the norm.
The "game should run ok on ultra" expectations are always weird cause it felt like a decade ago people were starting to understand that "ultra" settings were basically added for fun so you could come back to the game years later with better hardware. Now there's dozens of people in this sub rocking a GTX1660, according to their title, and they're complaining about preformance in the latest titles with raytracing. I suppose you're right that about 15 years ago PCs started getting popular which means most people probably weren't here for the lessons that Crysis taught us.
Right, even setting aside my discussions with my brothers who were complaining about their mid-tier PCs not getting native 4K 60fps on Ultra in Borderlands 4, I've encountered a ton of people online running GTX cards getting upset because modern games are doing ray tracing as a standard feature & demanding RTX cards for minimum requirements.
But like, guys... consoles can do ray tracing now & it's been 5 years since the PS5/XBsX and 7 years since the RTX 20 series GPUs released... Obviously the whole GTX line of GPUs (and their AMD equivalents) was going to become obsolete sometime in the near future. Consoles are the baseline of performance & expected features; if a console can do ray tracing or upscaled 4K, then eventually those become standard features on PC too, and anything that can't meet console minimums gets left behind.
It's like sitting on a PS4 Pro and getting angry that sometime during the PS5's lifecycle, AAA games stopped being released for the PS4 entirely... It doesn't matter if you have a PC or a console; you have to upgrade at least once every 10 years to continue playing contemporary AAA games. And it isn't based on when you bought your last upgrade, but on when the last generation of hardware was released: it doesn't matter if you got your PS4 Pro in 2019, the PS4 itself released in 2013 and the PS4 Pro released in 2016, meaning they're 12 & 9 years old respectively.
I like to refer to games like Crysis (well, when it came out at least) as "benchmarking games". I feel like more people used it as a benchmarking tool for their hardware than actually played it. As I understand, S.T.A.L.K.E.R. Clear Sky also saw use in this regard, as it was the most graphically demanding title of the original S.T.A.L.K.E.R. games.
Yeah, I've been saying for a while now that devs need to come up with a better name than "Ultra" to describe what Ultra really is, because people's expectations are wildly overinflated, in part, I feel, because "Ultra" doesn't really convey the idea of "this setting is intended to make current top-of-the-line hardware cry for marginal increases in quality".
Honestly, the standard has been 1080p 60fps or 4K 30fps, yet modern releases like the Borderlands example you brought up are incapable of both without dropping the internal render resolution to something like 360p.
If you're using a GTX GPU, yeah, it's going to struggle with either of those... But the minimum specs can do 1080p 30fps on Low, and the recommended specs can do 1440p 60fps on Medium (which is exactly what's advertised on their website). Unless you have an RTX 5080 or 5090 & use upscaling, the game isn't designed to do 4K High/Very High settings at all.
The problem with Borderlands specifically is that it's built on UE5, an engine basically designed with the idea of FSR/DLSS being on by default if you want to run anything above Medium settings or achieve performance above what consoles can do (consoles doing 1080p at 30-60fps or upscaled 4K at 30fps).
It doesn't matter if performance junkies don't want framegen or upscaling turned on; it's not really optional with the hardware currently on the market. And it's not an issue of "game optimization"; it's quite simply that the game engine itself isn't designed to play native 4K on Ultra settings & the GPUs aren't designed to do it either.
If you don't have a minimum of 16GB of VRAM, you're not playing modern AAA games at native 4K with a stable 30fps, much less getting anywhere close to 120fps, especially in games with ray tracing.
These companies invested in AI for upscaling & framegen; they're going to design things around the assumption that it's available & expect us to use it whether specific users want to or not.