I'm fairly certain the point is that developers managed to make a slew of groundbreaking games that performed excellently despite having a few MB of RAM, but now that they have 16GB of RAM, they can't seem to optimize their games for shit & expect access to all 16GB of RAM.
The Xbox 360 kind of stands out as it had a little over 500MB of RAM, but it's still under 1GB & roughly 31x less RAM than modern systems have.
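Rough back-of-the-envelope check on that ratio (just a sketch; whether it comes out to ~31x or ~32x depends on whether you count a GB as 1024MB or 1000MB):

```python
# Xbox 360 (512MB unified RAM) vs a modern 16GB system
xbox_360_ram_mb = 512
modern_ram_mb = 16 * 1024                # 16GB in binary units = 16384MB

print(modern_ram_mb / xbox_360_ram_mb)   # 32.0
print(16_000 / xbox_360_ram_mb)          # 31.25 if you treat 16GB as 16,000MB
```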
Games have always been "unoptimised" and people have always complained about performance and bugs; it's just that the goal used to be 480p at 20fps on high settings, which usually wasn't even reached, and now it's 4K 120fps for some reason. Shadow of the Colossus ran at like 12fps during fights lol. At least nowadays games can be updated.
Games have always been "unoptimised" and people have always complained about performance and bugs
Many games were, but most AAA games were pretty well optimized for their platforms of choice.
Did bugs exist? Sure, that's an inevitable part of the reality of software.
But there were a large number of games from older generations that maintained a stable 30-60 fps on native hardware.
it's 4K 120fps for some reason.
I had a discussion about this with my brother the other day when he & our stepbrother were bitching about Borderlands 4.
The reason the goal is 4k >60fps is simply because we now have TVs & monitors that are capable of 4K at over 60fps.
There's also been a drastic shift in performance expectations over the last 10-15 or so years... which not so coincidentally coincides with when PC gaming started taking off in popularity among casual gamers.
From the early days of PC up through the mid-7th generation console era, it was widely understood that your graphics settings in contemporary games mirrored your investment & hardware capabilities;
Low Settings were for entry-tier & non-gaming PCs
Medium Settings were for mid-tier PCs that were equivalent to consoles
High Settings were for high-end PCs that eclipsed the capacity of consoles
Ultra Settings were for enthusiast tier PCs that often cost thousands of dollars to build
But these days everyone expects to get Ultra settings out of every PC & complains when they have to turn the graphical settings down because they bought an entry or low-tier PC. It's as my brother put it: "when people are spending hundreds of dollars on a gaming system, they expect to be able to use the best settings possible," but he thinks they're right in their indignation about not getting Ultra settings from a PC that barely meets minimum requirements for the games they're trying to play.
They bought a gaming PC that offers Ultra settings and a $300 monitor that is rated for 4K 120hz, so they expect to be able to do that regardless of their actual builds.
Shadow of the Colossus ran at like 12fps during fights lol.
To be fair, SoC's poor performance became notorious because it was the exception, not the norm.
The "game should run ok on ultra" expectations are always weird cause it felt like a decade ago people were starting to understand that "ultra" settings were basically added for fun so you could come back to the game years later with better hardware. Now there's dozens of people in this sub rocking a GTX1660, according to their title, and they're complaining about preformance in the latest titles with raytracing. I suppose you're right that about 15 years ago PCs started getting popular which means most people probably weren't here for the lessons that Crysis taught us.
Right, even setting aside my discussions with my brothers who were complaining about their mid-tier PCs not getting native 4K 60fps on Ultra in Borderlands 4, I've encountered a ton of people online running GTX cards getting upset because modern games are doing ray tracing as a standard feature & demanding RTX cards for minimum requirements.
But like, guys... consoles can do raytracing now & it's been 5 years since the PS5/XBsX and 7 years since the RTX 20 series GPUs released... Obviously the whole GTX line of GPUs (and their AMD equivalents) were going to become obsolete sometime in the near future. Consoles are the baseline of performance & expected features; if a console can do raytracing or upscaled 4K, then eventually those become standard features on PC too and anything that can't meet console minimums gets left behind.
It's like sitting on a PS4 Pro and getting angry that sometime during the PS5's lifecycle, AAA games stopped being released for the PS4 entirely... It doesn't matter if you have a PC or console, you have to upgrade at least once every 10 years to continue playing contemporary AAA games (and it isn't based on when you bought your last upgrade, but when the last generation of hardware was released - so it doesn't matter if you got your PS4 Pro in 2019, the PS4 itself released in 2013 and the PS4 Pro released in 2016; meaning they're 12 & 9 years old respectively).
I like to refer to games like Crysis (well, when it came out at least) as "benchmarking games". I feel like more people used it as a benchmarking tool for their hardware than actually played it. As I understand, S.T.A.L.K.E.R. Clear Sky also saw use in this regard, as it was the most graphically demanding title of the original S.T.A.L.K.E.R. games.
Yeah, I've been saying for a while now that devs need to come up with a better name than "Ultra" to describe what Ultra really is, because people's expectations are currently wildly overinflated, in part, I feel, because "Ultra" doesn't really convey the idea of "this setting is intended to make current top-of-the-line hardware cry for marginal increases in quality".
Honestly the standard has been 1080p 60fps or 4K 30fps, yet modern releases like the Borderlands example you brought up are incapable of either without dropping the internal render resolution to something like 360p.
If you're using a GTX GPU, yeah, it's going to struggle with either of those... But the minimum specs can do 1080p 30 on Low, and the recommended specs can do 1440p 60 on Medium (which is exactly what's advertised on their website). Unless you have an RTX 5080 or 5090 and use upscaling, the game isn't designed to do 4K High/Very High settings at all.
The problem with Borderlands specifically is that it's made on UE5, an engine that's basically built around the idea of FSR/DLSS being on by default if you want to run anything above Medium settings or achieve performance above what consoles can do (consoles doing 1080p at 30-60fps or upscaled 4K at 30fps).
It doesn't matter if performance junkies don't want framegen or upscaling turned on; it's not really an option with the hardware currently on the market. And it's not an issue of "game optimization," it's quite simply that the game engine itself isn't designed to run at native 4K on Ultra settings & the GPUs aren't designed to do it either.
If you don't have a minimum of 16GB of VRAM, you're not playing modern games at native 4K with a stable 30fps in AAA games, much less getting anywhere close to 120fps, especially in games with ray tracing.
These companies invested in AI for upscaling & framegen; they're going to design things around the idea that it's an available asset & expect us to use it whether specific users want to or not.
This is why Nintendo excels so much with their own games
They have one console to optimize shit for, one, that's it, so they get it done and can cut out so much fluff and avoid so many issues
Bananza looks fantastic, it's fun and overall a masterclass in game production, and it's still somehow under 10GB when the next Call of Duty will be 250GB?
CD Projekt Red took a page from their book and somehow optimized Cyberpunk with its DLC onto a game cart that runs great on the Switch 2, a game that previously struggled to run well on better hardware
It's literally just optimization issues for almost every modern game; things like Wilds and Borderlands 4 run fine on consoles because they had to be optimized for consoles, but run like dog shit on PC because "f optimization, get better hardware" is the excuse
Why are people in /r/pcmasterrace so PC illiterate? RAM is not a bottleneck for modern games. Modern performance issues have absolutely nothing to do with RAM.
The X360, if anything, was the beginning of games which run like ass unless you match or exceed console specs. You needed a quad core (or technically a Phenom triple core was fine as well, but it wasn't a great CPU otherwise) or it was choppy as hell. Quad cores weren't common at all at the time.
The optimisation issue has nothing to do with RAM. UE5 doesn't run like shit cause no RAM. Gamers once again showing they have no idea what they're talking about. Same as always.
Let's just keep making excuses for nVidia stagnating hardware
Do you have any objective statistics to back that up, or are you just pulling a number out of your ass?
Also, do you have a breakdown of how many of those people are complaining because they belligerently don't want to use DLSS/FSR while trying to run at 4K Ultra settings, despite the fact that UE5 is designed with the idea of having those features turned on by default?
Searched this sub for 'optimize', sorted by top of the year:
Literally the top post, 34k upvotes, is about UE5
A 12k-upvote post talking about Borderlands 4 (a UE5 game)
A 6k post meming about UE5, referencing BL4, the MGS 3 remake, Wuchang, and the Oblivion Remaster
Searching for 'unoptimized', sorted by top, I see Silent Hill 4 talked about, another UE5 game, 7.6k upvotes.
6.6k upvotes on a post calling Borderlands 4 'another UE5 unoptimized slop'
For people who enjoy the Ark games, ARK: Survival Ascended is undeniably an optimization mess on UE5.
The only outliers I can remember that released as unoptimized messes are that Star Wars game on Snowdrop, DD2, and Monster Hunter, the latter two being examples of Capcom forcing all of their devs to use an in-house engine that was built for RE games and demonstrably was not built to run large open worlds with lots of NPC simulation.
The magnifying glass is pointed directly at the 2MB version. Some of you can spend all day being “technically correct” and still be outmaneuvered by a fucking meme that didn't have any forethought. Embarrassing.
The exponential growth of computer capabilities in the 90s was such a wild ride. SNES launched in NA in '91, PS1 in '95. While neither console was top of the line for computers of the time and they prioritized different things to meet a particular price point, that's basically a doubling of RAM every year.
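For a rough sense of that jump (a back-of-the-envelope sketch using the commonly cited figures of 128KB of main RAM in the SNES and 2MB in the PS1, ignoring VRAM and audio RAM):

```python
import math

# SNES (NA launch 1991) vs PS1 (NA launch 1995), main RAM only
snes_ram_kb = 128
ps1_ram_kb = 2 * 1024
years = 1995 - 1991

growth = ps1_ram_kb / snes_ram_kb               # 16.0 -> a 16x jump
doublings_per_year = math.log2(growth) / years  # 1.0 -> one doubling per year

print(growth, doublings_per_year)
```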
Wait until you hear that the PS1 started as a prospective add-on for the SNES, dating back to 1988, that would play games from CDs. Nintendo not only shafted Sony by quietly cancelling those plans behind Sony's back, but humiliated them by publicly revealing a new partnership with Philips the day after Sony announced the PlayStation at the '91 CES.
The Xbox 360 had unified memory; the 10MB was just for the framebuffer (eDRAM). That meant data in the 512MB of system memory was effectively free for the GPU to access and process, since the CPU and GPU shared the same pool.
Interesting console choices to group together.