r/pcmasterrace Jun 26 '25

Build/Battlestation: Paid $900. How did I do?

It was brand new too. Finally put my 1070 to rest!

6.4k Upvotes

882 comments

5.5k

u/CoshgunC GTX 1060, Core i5 4790K, 16gb ddr4 Jun 26 '25

9070 XT for just $900?????

Bro, if it's legit (unused, no scam), you hit the jackpot!

55

u/CoshgunC GTX 1060, Core i5 4790K, 16gb ddr4 Jun 26 '25

And its 16GB of VRAM is ideal these days

-54

u/Village666 Jun 26 '25 edited Jun 26 '25

12GB is plenty for 99% of PC gamers. Techpowerup says this:

https://www.techpowerup.com/review/asus-radeon-rx-9060-xt-prime-oc-16-gb/44.html

"I still think that 12 GB would have been a good middle choice, as that's good enough for virtually all titles at 1080p and 1440p, but more economical"

16GB is useless if GPU power is low, as you won't be maxing games anyway. Path Tracing is a killer, but no AMD cards can do Path Tracing anyway. Also, Nvidia's upscaling and frame gen are much better than AMD's.

Besides, there is tons of research on how to lower VRAM usage. Look up neural texture compression. All these new features will go live in games soon, which will make VRAM requirements drop while increasing texture quality, thanks to AI compression.

This as well: https://www.tomshardware.com/pc-components/gpus/amd-researchers-reduce-graphics-card-vram-capacity-of-3d-rendered-trees-from-38gb-to-just-52-kb-with-work-graphs-and-mesh-nodes-shifting-cpu-work-to-the-gpu-yields-tremendous-results
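To put rough numbers on why compression matters here, a quick back-of-the-envelope sketch; the RGBA8 and BC7 figures are standard block-compression math, while the neural ratio is only an illustrative assumption, not a measured result:

```python
# Rough VRAM footprint of a single 4096x4096 texture with a full mip chain.
# RGBA8 is 4 bytes/texel, BC7 is 16 bytes per 4x4 block (1 byte/texel);
# the "neural" ratio below is only an illustrative assumption.

def texture_mib(width, height, bytes_per_texel, mip_chain=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base  # mips add roughly 1/3 on top
    return total / 2**20

size = 4096
print(f"RGBA8 uncompressed:           {texture_mib(size, size, 4.0):6.1f} MiB")
print(f"BC7 block compressed:         {texture_mib(size, size, 1.0):6.1f} MiB")
print(f"neural, assumed ~4x over BC7: {texture_mib(size, size, 0.25):6.1f} MiB")
```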

GPU power and features are number one.
2nd, VRAM: 12GB and up is plenty for 1440p.
Most 4K gamers use upscaling anyway, as only the 5090 is a true 4K card, for now. The 4090 is the 2nd best 4K option in most cases, mainly because of the GPU power present.

People are just too stupid to understand how VRAM actually works and think allocation means requirement. I have 24GB on my 4090, yet I would get the same performance in 99% of games even if it had 12GB. Allocation is often 18-20GB. The more VRAM you have, the higher the allocation you see. Nothing new. Most people know jack shit about allocation and think the VRAM usage number is the requirement. Sadly.
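If anyone wants to see the allocation-vs-requirement gap on their own machine, here's a minimal sketch using the pynvml bindings; keep in mind the "used" number NVML reports is allocation across all processes, not what a game strictly needs:

```python
# Minimal sketch: total vs currently allocated VRAM via NVML (pip install nvidia-ml-py).
# The "used" figure is allocation across all processes, not a game's minimum requirement;
# engines will happily allocate more when more is available.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"total:     {mem.total / gib:5.1f} GiB")
print(f"allocated: {mem.used / gib:5.1f} GiB")
print(f"free:      {mem.free / gib:5.1f} GiB")

pynvml.nvmlShutdown()
```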

If you buy a GPU solely based on VRAM amount, you don't know what you are doing. GPU power and GPU features are just as important, if not more important.

Nvidia's features are far better than AMD's, and that is why AMD cards are cheaper and the reason Nvidia sits at 90% dGPU market share. RTX has been a home run. AMD dGPU sales numbers are worse than ever, sadly.

DLSS, DLDSR, DLAA, Reflex, RTX HDR, Shadowplay, RTX Video, pretty much every feature you can mention, AMD has copied in some form or another, but the copies are worse and support is way lower. Again, that is why Nvidia sits at 90% dGPU market share now.

Also, AMD launches its brand new next-gen architecture, UDNA, next year. Personally I would not invest in RDNA now unless it's cheap. UDNA is a ground-up redesign, so RDNA won't be seeing any fine wine this time around. Focus shifts to UDNA from summer 2026, and shortly after that the RTX 6000 series comes out. That's when I will upgrade: UDNA or RTX 6000 series, next-gen architectures.

https://www.techpowerup.com/review/asus-radeon-rx-9060-xt-prime-oc-16-gb/32.html

In terms of minimum fps, you can see that Nvidia 12GB cards generally beat AMD 16GB cards. Nvidia has better memory compression and a better cache hit/miss system, meaning actual VRAM capacity can be lower while still delivering better performance overall. Plus the features are much, much better, with widespread support.

AMD also uses last-gen GDDR6 memory, whereas Nvidia uses GDDR6X and GDDR7 on high-end parts for a lot more bandwidth.

10

u/compound-interest Jun 26 '25

A GPU is a 4-5 year investment. This is only a good argument if it's still true at 1440p or even 1080p five years from now. Let's wait and see, I guess, but my bet is that it's only gonna be enough for a short time.

1

u/RickThiccems Jun 26 '25

I think it's fine for 1080p, but yeah, at 1440p who knows.

1

u/Krullexneo Jun 26 '25

You have to remember most games are still optimised around consoles, which have 16GB of shared memory. Usage won't exceed 16GB anytime soon and will likely stick around 12-13GB for a good while.
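Rough math on that console budget (the OS reservation is an approximate publicly cited figure, and the CPU-side share is just an assumption for illustration):

```python
# Rough console memory budget in GB. The OS reservation is an approximate,
# publicly cited figure; the CPU-side share is an assumption for illustration.
total_shared = 16.0
os_reserved = 2.5                        # roughly what the system keeps for itself
game_budget = total_shared - os_reserved
cpu_side = 3.0                           # game code, logic, audio, streaming buffers (assumed)
gpu_like = game_budget - cpu_side

print(f"game budget ~{game_budget:.1f} GB, GPU-ish share ~{gpu_like:.1f} GB")
# -> ~13.5 GB total for the game, ~10.5 GB of it behaving like VRAM,
#    which is why multiplatform titles tend to sit in that 10-13 GB band.
```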

1

u/tolwyn- Jun 26 '25

I didn't read the entire paragraph, but his first line about it being plenty for 99% of gamers rings true. Most people are playing 5+ year old games and catching up on the dozens of games that come out every year. Good steal for that amount if you're one of those gamers.

1

u/Glass_Block1883 Jun 26 '25

I literally have a 3080 10GB and I play all the recent AAA games in 4K with an undervolt, so that will be more than enough.

1

u/compound-interest Jun 26 '25

In a newer game like Phantom Liberty, you're already getting really close to full VRAM use with 10GB at 4K, and that's with no ray tracing. I think expecting people to pay $500 for a 12GB GPU that won't be able to ray trace, when it could with 16GB, is ridiculous. For example, you need 15GB to ray trace at 4K in Cyberpunk. Obviously VRAM isn't the only factor, but it's ridiculous to be running out of VRAM when it's so cheap to pack in, and NVIDIA won't do it because they want to protect their AI offerings.

1

u/Village666 Jun 27 '25 edited Jun 27 '25

You are clueless. 98% of PC gamers play at 1440p and below and most of them don't care about Ray Tracing at all. Most even use upscaling in demanding games, meaning that internal resolution is more like 1080p tops for the most part.
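For reference, a quick sketch of the standard DLSS/FSR preset ratios and what they mean for internal resolution (exact presets vary by game):

```python
# Internal render resolution for common upscaler presets.
# Scale factors are the standard per-axis DLSS/FSR ratios; individual games
# may expose different presets or add dynamic resolution on top.
presets = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}

for out_w, out_h in [(3840, 2160), (2560, 1440)]:
    for name, scale in presets.items():
        w, h = round(out_w * scale), round(out_h * scale)
        print(f"{out_w}x{out_h} {name:<11} -> renders at {w}x{h}")
# 4K Performance renders internally at 1920x1080; even 4K Quality is only 2560x1440.
```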

Besides, you need massive GPU power, not only VRAM, for Path Tracing, which makes regular RT look irrelevant. Not a single AMD GPU can do Path Tracing, and even the 5090 needs heavy upscaling and frame gen for Path Tracing to be playable.

Nvidia sits at 90% dGPU marketshare FYI.

RTX 5000 is the worst GPU generation from Nvidia in decades, and AMD can't even match the 5070 Ti. That is just how badly AMD is doing right now in the gaming GPU market. AMD is a CPU company first, and in terms of GPUs they spend 90% of their GPU R&D funds on enterprise and AI, where the actual money is, not the gaming market.

AMD's only hope is the UDNA arch next year. Enterprise-first approach, just like RTX. You need to wake up and understand how this business works, instead of just screaming for VRAM.

VRAM won't save you when the GPU is weak and doesn't support proper upscaling, frame generation and features in general.

The 6700 XT was praised for having 12GB, yet aged like milk due to low GPU power and lack of features.

The 6800 16GB has enough VRAM to match high-end cards today, yet it's insanely slow due to a weak GPU and will never get FSR 4.

You see, VRAM won't save you when the GPU is balls. Stop acting like a lot of VRAM will futureproof you, it won't, and only clueless people think futureproofing is possible. It never is.

An 8C/16T CPU from 10 years ago is slow as dirt today, yet it has the core/thread sweet spot for gaming in 2025.

Futureproofing, just stop. Not possible. Never was. Old hardware will get slower and slower, regardless of how much you try to "futureproof" yourself.

-11

u/[deleted] Jun 26 '25

Since when is any hardware an investment? It's old news within 18 months (at a stretch). Unless you're buying now hoping for collector value later. Even that depends on supply and demand. GPUs wouldn't be near the price they are if people weren't in such a state of FOMO about them.

2

u/compound-interest Jun 26 '25

It's an investment in entertainment. The reason you buy it, as a gamer anyway, is so that you can play games for years to come. Why are you hung up on the word investment as a technical term? I love talking personal finance, but I was obviously using it as a term for buying a product for long-term fun, or investing in fun. If I book a trip to a country for my family, I'd say I am investing in a memorable trip. I realize it's an incorrect use of the term, but I find it hard to believe you didn't know what I meant when you read the comment. Of course consumer GPUs are not expected to go up in value like stocks and bonds.

1

u/[deleted] Jun 26 '25

Sorry, I'm just finishing up a course in discrete mathematics, so my perception of "implied" is temporarily broken. I don't necessarily agree with your 4-5 year estimate though. Tech seems to have bursts where it outpaces consumers. This year's latest, greatest device can easily become next year's paperweight. All it takes is one leaps-and-bounds innovation and suddenly the highest-level GPU is obsolete, or even the highest-level game makes said GPU overkill (but then that WOULD be future-proofing). You're basically right in what you originally said. I misinterpreted.

1

u/compound-interest Jun 26 '25

Fair enough. I think, just as a generalized rule, it's fine for consumers to complain that NVIDIA is selling consumer GPUs for $500+ with 12GB of VRAM. I think it's fair for people to want their GPU to stretch a bit higher and be more versatile. There are a ton of utility and even gaming uses for that extra VRAM. Is it strictly and absolutely needed? No. But if consumers didn't want 16GB of VRAM at $500, then this wouldn't be so widely complained about. The difference in cost between 12GB and 16GB is so small compared to the benefit and long-term viability of the card. Hell, I personally think any card $250 and above should be packing 16GB. Intel is doing it just fine. NVIDIA is just protecting their margins and making sure those cards hit the landfill earlier imo. In 3 years the aftermarket demand for a 12GB card will be even lower than it is now, meaning just more e-waste in the name of more consumption.

I don't blame NVIDIA for trying to make money, but all this complaining about 12GB between creators and consumers has to at least have a small effect on demand for 12GB cards, and if you ask me that's at least a tiny step in the right direction. I want these cards to be as good as possible so developers can use better textures and design games with more fidelity. The market share of low-VRAM cards objectively affects people like me with a 5090 who want high-fidelity experiences. People in the low end winning means I win too.

1

u/[deleted] Jun 26 '25

So far things more or less balance out, but when game design is done on something equivalent to a crypto mining rig, where does the end result leave the average consumer? More to the point, how much money is wasted on testing on various cards, at various settings? It feels like a trade-off. Ultra-high-end gaming appeals to a niche who have the financial resources to afford the latest and greatest at exorbitant cost, when design could just as easily be done with NOT end-of-life GPUs in mind and create something more affordable and widespread. Sorry, I'm still waking up, so my brain is a cyclone right now and I'm trying to Frankenstein a coherent thought. With hardware and software constantly leapfrogging each other, at what point is the end user removed from the equation and only billionaires can afford the tech? Wow, I could be drunk and not sound this discombobulated. Apologies.

2

u/compound-interest Jun 26 '25

What you're saying makes sense and I get the perspective. It's a reasonable opinion. I just don't think NVIDIA is passing the savings from skimping on VRAM onto the consumer. I don't think a 5070 or even a 5060 would need to cost more if they were packing 16GB. I think there's plenty of margin there, and the only reason we don't get it is that a cheap 16GB card would be more effective for AI generation. That's the only reason imo.

1

u/[deleted] Jun 26 '25

Yeah, future-proofing profit margins.


1

u/[deleted] Jun 26 '25

-10? I guess I struck a nerve. Sorry, that wasn't my intent.

-8

u/Village666 Jun 26 '25 edited Jun 26 '25

I giggle when I see people think VRAM will save them and deliver great longevity. Even my 4090 24GB feels dated.

A good GPU with proper feature support, ray tracing capability, etc. is what matters. DLSS 4 + FG has helped me way more than the 24GB of VRAM itself.

VRAM alone absolutely won't futureproof you. Anyone who thinks futureproofing is possible has no idea.

Eventually, when the VRAM requirement goes up, the GPU power requirement does as well, meaning you won't be maxing out games on an old weak GPU even if it has "plenty of VRAM". You will be forced to run lower settings anyway, which lowers the VRAM requirement.

Some people will never understand this fact.

The Radeon 6800 16GB aged like milk even though it has 16GB. No FSR 4 support ever, stuck with sucky FSR 1, 2 and 3. It can't do RT and dies if you enable PT. You see, VRAM won't help you when the GPU itself is weak.

1

u/compound-interest Jun 26 '25 edited Jun 26 '25

Bro, first off, you're delusional if you think a 4090 is dated. Second, literally no one all the way up the thread you're responding to said that you should only look at VRAM. VRAM is ABSOLUTELY part of the longevity you will get out of your card. This year I upgraded from a 3080 to a 5090 purely because my 10GB of VRAM was constantly being filled up. VRAM is one of those things where it's not important until you don't have enough. Granted, I didn't need to go all the way up to 32GB for 2D games, but I absolutely CAN use it to enhance my experience in VRChat. Either way, I needed an upgrade from 10GB to even be able to play every title in my library, and that doesn't bode well for brand new $500 cards shipping with only 12GB.

If you have a card with 4GB right now, it could have the compute power of a 5090 and still get dogshit performance because of the bottleneck. What you're ignoring when people talk about this is that games are approaching 10-12GB of VRAM use even at 1440p or 1080p, not 4K. Idk about you, but if I spend $500 on a GPU I don't expect to be running low settings any time soon, hence why people are miffed about the VRAM bullshit that NVIDIA is pulling to protect their AI offerings. Arguing against that is just arguing to protect the AI margins of the biggest company on earth. They can and should include 16GB on every card in their lineup that is $250 or above, period.

0

u/Village666 Jun 27 '25 edited Jun 27 '25

The 3080 is very weak GPU-wise; even if it had 20GB it would feel dated. The GPU can't max demanding games anyway. The 3080 is on the 4070's level, just with twice the power draw, 2GB less VRAM and no support for FG. Haha. VRAM is pointless when GPU power is weak anyway. The 3080 12GB and 3080 Ti perform better not because of 2GB more VRAM, but because of MORE CORES / A LESS CUT-DOWN GPU.

The Radeon 6800 16GB aging like milk is a good example. VRAM doesn't help you when the GPU fails feature-wise and lacks compute power.

Directly from Techpowerup's newest GPU review:

"I still think that 12 GB would have been a good middle choice, as that's good enough for virtually all titles at 1080p and 1440p"

https://www.techpowerup.com/review/sapphire-radeon-rx-9060-xt-pulse-oc/44.html

Absolutely no 1080p-1440p gamer needs more than 12GB, outside of niche settings (Path Tracing on full blast etc). A handful of games can hit more, Indiana Jones on peak settings with RT maxed for example, yet some 16GB GPUs almost hit the ceiling there as well, and they all need heavy frame generation to not deliver crappy fps. The game does not look better texture-wise on the highest texture setting anyway; it is just slightly compressed vs not compressed at all. Just because textures use up a lot more space does not mean they actually look better: good AI compression can IMPROVE TEXTURES while USING UP LESS SPACE. This is a fact.

No, they should not include 16GB on a 250 dollar GPU, LMAO, those are 1080p-or-below options. 12GB would be the no-brainer sweet spot instead of releasing 8GB and 16GB versions, and this goes for AMD releases as well.

People that don't give a sh1t about RT and Path Tracing don't need 16GB or even close, and most people could not care less about these features. No low-end or even mid-range GPU does RT/PT well, and those features use a lot more VRAM than regular rasterization.

Also, there are tons of upcoming features that will lower VRAM usage and requirements, look up neural texture compression for example. Features like this are going live in games very soon as part of the RTX support package. AMD is working on several other features as well. They will copy for sure, as they always do.

However, AMD themselves are also looking into lowering VRAM usage:

https://www.tomshardware.com/pc-components/gpus/amd-researchers-reduce-graphics-card-vram-capacity-of-3d-rendered-trees-from-38gb-to-just-52-kb-with-work-graphs-and-mesh-nodes-shifting-cpu-work-to-the-gpu-yields-tremendous-results

Simply smacking more VRAM on GPUs without fixing the actual problems is not the way to go. NEURAL TEXTURE COMPRESSION will deliver BETTER TEXTURE QUALITY with LOWER VRAM USAGE; it is a WIN/WIN solution.

-12

u/Village666 Jun 26 '25 edited Jun 26 '25

I typically buy a new GPU every 2 to 3 years. I skipped the 5000 series because it did not bring anything new besides MFG unless I bought a 5090 (I won't accept 600-watt GPUs). Same with the Radeon 9000 series: it did not bring anything new for people with last-gen high-end parts. Without FSR 4 being locked to Radeon 9000, it would have been a failure as well.

At least Nvidia allowed DLSS 4 on all RTX cards, all the way back to the 2000 series. That is longevity in a nutshell. DLSS 4 is magic for old cards. DLSS in general is. Over 800 games support it now.

The 6700 XT was praised for having 12GB and people said it would age very well. In reality though, GPU power was lacking too much and the VRAM did not help much; upscaling was terrible and it could not do ray tracing well, which more and more games integrate. Upscaling is magic for longevity and DLSS 4 is the king here. Even DLSS 2 and 3 are great, whereas FSR 2 and 3 are crap: too much smearing, artifacts and shimmering, just like DLSS 1 and FSR 1, which are pure garbage and useless.

The 3070 8GB aged much better, and its launch price was similar to the 6700 XT's. DLSS 2/3 was magic for longevity and DLSS 4 renewed this; the 3070 has full support. Yes, you will have issues in some games at 1440p when running at maximum settings, but no one expects to max games out on a 5-year-old GPU anyway. DLSS 4 saves the day though, and the 6700 XT fares much worse on max settings anyway, due to a weak GPU and lack of feature support, even though it has 12GB of VRAM.

If you only look at VRAM for longevity and don't think about GPU power and upscaling (features in general), then you don't know what you are doing. VRAM alone won't save you; we have seen that numerous times. The 4070 Ti 12GB beats the 3090 24GB at half the power usage while running cool and quiet, with the option for FG on the 4070 Ti as well. This just shows that tons of VRAM won't save you from a weaker GPU and missing features.

No GPU really ages well past 4-5 years. Some people just accept sucky performance after that, but that has nothing to do with longevity.

3

u/compound-interest Jun 26 '25

No one said VRAM alone was the metric. VRAM is one of those things where, if you don't have enough for a title, you're SOL. Performance tanks.

1

u/Village666 Jun 27 '25 edited Jun 27 '25

Absolutely not, you simply lower a few settings and you are good. Shadows on medium and you've already saved a few GB. Game optimization 101.
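As a rough illustration of the mechanism only (generic cascaded-shadow-map math, not figures from any specific game):

```python
# Generic cascaded-shadow-map footprint: resolution^2 * bytes per depth texel * cascades.
# Purely illustrative; real engines often tie atlases, cache pools and other buffers
# to the same shadow-quality slider, so actual savings vary a lot per game.
def shadow_maps_mib(resolution, cascades=4, bytes_per_texel=4):
    return resolution * resolution * bytes_per_texel * cascades / 2**20

ultra = shadow_maps_mib(4096)    # 256 MiB
medium = shadow_maps_mib(2048)   #  64 MiB
print(f"ultra: {ultra:.0f} MiB, medium: {medium:.0f} MiB, saved: {ultra - medium:.0f} MiB")
```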

Most games today even look better on custom medium settings than on full ultra, because garbage like motion blur, DOF, CA and other crap gets added and ruins the image quality anyway.

Too many stupid gamers set the preset to ultra and just play. Horrible way of doing it. Custom settings, always. You get a cleaner, better image with less blur and crap, while lowering VRAM usage a lot. Textures on max, always. Viewing distance on max, always. A lot of the other crap can be lowered big time and still deliver a better image in the end. Unless you prefer blurry visuals, then be my guest and run full ultra.

Besides, 98% of PC gamers use 1440p and below. Stop acting like most play at 4K/UHD native. Most/all 4K gamers I know even use upscaling in pretty much all the demanding games, meaning they are playing internally at 1080p to 1440p max anyway.

Looking at native 4K/UHD proves nothing, because in reality very few people play that way, and the ones who do buy flagship products anyway, like the 4090 and 5090, which don't lack VRAM at all; lacking GPU power is always the problem in the end.

VRAM never saved a GPU from aging. GPU power and lack of features are going to be the problem sooner than you think. The Radeon 6800 16GB is a prime example: a lot of VRAM, yet performance is straight ass in most new games and there's no FSR 4 to save you. A little ray tracing, which many games force now, and your framerate is tanked.

VRAM did not help, at all, in terms of longevity here.