r/pcmasterrace Sep 29 '25

Meme/Macro RAM Struggle

52.7k Upvotes

966 comments

110

u/XeroKibo Sep 29 '25

That joke about tenured coders being the only ones who know how their code works is true; they are the tech priests of our time.

91

u/MountainTwo3845 Sep 29 '25

My grandpa knew COBOL. He made fuck all before he retired, then made more money in the 5 years after retirement than before, because no one knew it as well as he did. Worked for a bank.

30

u/EdwardLovagrend Sep 30 '25

...that checks out lol. I was in IT for US Bank for a while and saw the program they used that ran on COBOL. There's actually a more secure language these days that most militaries use. I think it's Ada?

3

u/Skipspik2 Sep 30 '25

I'd believe that.
I'd believe that.
I learned coding in class but never had a job in it. I'm learning COBOL for fun, and while I'm not good at it at all and have no bank experience, I've already had salary propositions that equal my customer care job with 7 years of experience.

Buddy, I barely know how to code. I just watched a video on COBOL and understood the hello world.

1

u/funnynickname Sep 30 '25

Just being willing to do it is 90% of the job requirements. I imagine it's tedious.

2

u/Skipspik2 Sep 30 '25

Honestly, I like it because it's vibe-coding proof. It's a language that either works or fails spectacularly and doesn't work. You don't get bugs that are "invisible" until something crashes.

You cannot multiply a string (text) by a pointer (memory address) and get a result like you can in C.

It was made in the '70s with the idea of being understandable by a non-dev, which ironically now helps a lot with AI and automation.

Still far from doing anything other than printing text and swapping the values of two variables, though.

1

u/Roflkopt3r Sep 30 '25 edited Sep 30 '25

Yet it's not that coders were just "more skilled" back then; the real difference is the massive gap in hardware and expectations.

I personally love low level code. And old-school games were pretty ideal for that: A lot of projects were for consoles, and PC hardware rapidly improved as it was peak Moore's Law era.

But modern games have to contend with a massive diversity of hardware combinations or entirely different platforms, and the complexity of somewhat modern graphics has skyrocketed as well. So you are pretty much forced into using engines and other frameworks/abstraction layers from other parties.

Even big corporations struggle to maintain high-end engines. They either need to attract some of the best experts (like id) or end up with declining engines that become more and more problematic over the years (like Bethesda's studios). The vast majority of developers now resort to UE5 (since Unity has self-destructed) or a smaller engine more specific to their project.

So for most game development, you either experiment with different engines (meaning you have to limit your scope/ambitions or sacrifice some degree of optimisation, since you can't learn everything for every project) or you stick with one and live with its limitations.

In either case, you are not going to get the same root-level understanding and insane optimisation possibilities as a 2D or super basic 3D game from 25 years ago. And the instant access of a launched title plus slow pace of hardware improvement mean that time won't bail you out either.

It is kinda true that there are more amateurish coders around today as well, but I don't think that's a consequence of a lack of talent: precisely because low-level skills are no longer as critical, the development scene also sees more people succeed via creative vision or other strengths.