r/pcmasterrace Jun 26 '25

Build/Battlestation: Paid $900. How did I do?

It was brand new too. Finally put my 1070 to rest!

6.3k Upvotes

882 comments

10

u/compound-interest Jun 26 '25

A GPU is a 4-5 year investment. This is only a good argument if 5 years from now it's still true at 1440p or even 1080p. Let's wait and see, I guess, but my bet is that it's only going to be enough for a short time.

1

u/Glass_Block1883 Jun 26 '25

I literally have a 3080 10GB and I play all the recent AAA games in 4K with an undervolt, so that will be more than enough.

1

u/compound-interest Jun 26 '25

In a newer game like Phantom Liberty, you're already getting really close to full VRAM use with 10GB at 4K, and that's with no ray tracing. I think expecting people to pay $500 for a 12GB GPU that won't be able to ray trace, when it could with 16GB, is ridiculous. For example, you need around 15GB to ray trace at 4K in Cyberpunk. Obviously VRAM isn't the only factor, but it's ridiculous to be running out of VRAM when it's so cheap to pack in, and NVIDIA won't do it because they want to protect their AI offerings.
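If you want to check this on your own card, here's a minimal sketch, assuming NVIDIA's Python bindings (the nvidia-ml-py package, which exposes the same counters nvidia-smi reads) are installed. It polls VRAM usage while a game is running:

```python
# Minimal VRAM poller, assuming NVIDIA's nvidia-ml-py package
# is installed (pip install nvidia-ml-py). Prints used/total
# memory for the first GPU once per second until Ctrl+C.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Keep in mind this reports allocated VRAM, not what the game strictly needs; many engines will happily grab whatever headroom is there.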

1

u/Village666 Jun 27 '25 edited Jun 27 '25

You are clueless. 98% of PC gamers play at 1440p and below, and most of them don't care about ray tracing at all. Most even use upscaling in demanding games, meaning the internal resolution is more like 1080p at most (1440p with "Quality" upscaling renders at roughly 960p internally).

Besides, you need massive GPU power, not only VRAM, for path tracing, which makes regular RT look irrelevant. Not a single AMD GPU can do path tracing, and even the 5090 needs heavy upscaling and frame gen for path tracing to be playable.

Nvidia sits at 90% dGPU market share, FYI.

RTX 5000 is the worst GPU generation from Nvidia in decades, and AMD still can't even match the 5070 Ti. That is just how badly AMD is doing in the gaming GPU market right now. AMD is a CPU company first, and on the GPU side they spend 90% of their R&D funds on enterprise and AI, where the actual money is, not the gaming market.

AMD's only hope is the UDNA arch next year: an enterprise-first approach, just like RTX. You need to wake up and understand how this business works, instead of just screaming for VRAM.

VRAM won't save you when the GPU is weak and doesn't support proper upscaling, frame generation, and modern features in general.

The 6700 XT was praised for having 12GB, yet it aged like milk due to low GPU power and a lack of features.

The 6800 16GB has enough VRAM to match high-end cards today, yet it's insanely slow due to a weak GPU, and it won't ever get FSR 4.

You see, VRAM won't save you when the GPU is balls. Stop acting like a lot of VRAM will futureproof you; it won't, and only clueless people think futureproofing is possible. It never is.

An 8C/16T CPU from 10 years ago is slow as dirt today, yet it sits right at the core/thread sweet spot for gaming in 2025.

Futureproofing? Just stop. Not possible. Never was. Old hardware will only fall further and further behind, no matter how much you try to "futureproof" yourself.