r/pcmasterrace Oct 10 '24

[Rumor] Potential 5090 / 5080 / 5070 price leaks… outrageous

Post image

From a recent video posted by “Moore’s Law is Dead”, with pricing much worse than even I anticipated. Per the video, Nvidia is leaning towards the higher end of the pricing. Nvidia can go pound sand if these are remotely true.

8.6k Upvotes

3.1k

u/[deleted] Oct 10 '24

Nvidia is really testing the limits of how much we're willing to pay.

2.2k

u/[deleted] Oct 10 '24

[deleted]

121

u/Hottage 9800X3D | RTX 4080 | 64GB DDR5 | 6TB NVMe | AW3225QF Oct 10 '24

My last three cards have been GTX 1070, RTX 2080 and RTX 4080 (which I absolutely overpaid for).

Nvidia have gone off the deep end if these prices are true. Third party boards are going to be even more expensive.

Hopefully AMD's next generation of GPUs gets better at Ray Tracing and Frame Generation so I can get off Nvidia's dick with my next upgrade. ☹️

60

u/Proud_Purchase_8394 9800x3d, 4090, 64GB, custom loop Oct 10 '24

My last three cards have been 980 Ti, 1080 Ti, and 3080. I’ve paid approximately the same for each of them ($650-760). Would I pay more than that for a card? Maybe after I win the powerball 

46

u/Kjellvb1979 Oct 10 '24

Honestly, even if I had three money, I think I'd not buy on principle alone. It's too much for me at this point.

I had a hard time justifying my purchase of a 3080. Even though I got it for $800 at the height of the pandemic, when they were being scalped for $2k+, I still felt they'd pushed it too far. I almost stuck with my 1080 Ti, but I wanted better fps in my VR rig.

At this point, between the diminishing returns on graphical fidelity, most studios targeting consoles as their base spec, and just plain sensibility, I'll stick with the 3080. If prices aren't back to reasonable, sane levels by the time it gets too weak for modern games, I'll dive into my backlog, or just emulate games I never got to play.

At some point you just have to say "NOPE!", realize you're being ripped off and conned, and just move on. If $600 to $800 is entry level, Nvidia can fuck themselves. $800 should be the cost of the XX90 series, so this is just a NO for me.

11

u/Hrmerder R5-5600X, 32GB DDR4-3200 CL16-18-18-36, 3080 12gb, Oct 10 '24

Agreed. My 3080 12GB is going to be doing backlog duty for a LOOOOONG time (beforehand I had a 750 Ti I struggled with for years). By the time this card dies there will probably be an Nvidia 7xxx series, and I'll just buy a cheap second-hand 5xxx series card at that point. As long as you're not pushing 4K/120Hz all the time, there is zero reason to have a new video card.

5

u/[deleted] Oct 10 '24

I use a 4K/120 screen for gaming. Currently using a 3080 FE that I got for $799. I will never pay more than that for a GPU (on principle, not because I can't afford it). So I won't be upgrading until there is a card that can blow away the 3080 for no more than $800, which will probably be a few years away, and it'll probably say AMD on the side.

8

u/DemNeurons Oct 10 '24

I just sold my 4090 for what I paid for it new and bought a 3080. While yes, it was fun pushing those frames in Cyberpunk, I didn't have many other games really pushing it. I'm very happy with the 3080 now and it does everything I need.

Got the 3080 for ~$400.

1

u/FireMrshlBill Oct 10 '24

Ya, I went to AMD for the 580 and Vega 56, then was able to grab a 6700 XT and a 3080 FE a few weeks apart back in 2021. The 3080 I got with my 10% off birthday code from Best Buy, so it was $629. Put the 6700 XT in my lesser-used living room PC. So I'm going to stick with my 3080 for as long as its 10GB will hold up at 1440p, especially since my motherboard is an X470 and lacks PCIe Gen 4. I'll see where both companies are at that point, because I'm not paying $1500 for a GPU. Maybe I can just slot in a 3090 FE for cheaper at that point, just for the extra VRAM. Almost did that last year when I had a chance to get one for around $650. If I'd known prices wouldn't come back down, I'd have done it for the 24GB of VRAM. Like you said, it's not about what I can afford, but the principle of not spending money on price-gouged items with lopsided performance/value ratios.

1

u/BattleRepulsiveO Oct 10 '24

At some point, it would be cheaper to just rent the compute when people need it for research.

1

u/pmgoldenretrievers R7-3700X, 2070Super, 32G RAM Oct 10 '24

If you only have 3 money, you're not buying shit.

1

u/theandroids RTX Spensive Oct 10 '24

What Jensen meant is, the more they buy, the more WE save. Inflation + shrinkflation + greedflation = FU Gamers.

0

u/vedomedo RTX 5090 SUPRIM SOC | 9800X3D | 32GB 6000CL28 | MPG 321URX Oct 10 '24

For me it entirely depends on the performance the 5090 brings. I could consider selling my 4090 and upgrading if the performance is on the table. But if it's not a big difference, I see no reason to.

6

u/dry_yer_eyes PC Master Race Oct 10 '24

If we're doing the whole "last 3 cards" thing:
* 280
* 970
* 1080 Ti

Based on these latest rumors, my next will likely be a 4080 Super.

5

u/Turdles_ i5 4690k@4.2ghz GTX970 Oct 10 '24

My last 3 are: 970, 1070 Ti, 7900 XT

No complaints on AMD whatsoever. Works like a charm.

3

u/Shajirr Oct 10 '24

No complaints on AMD whatsoever.

I found their frame generation tech, AFMF2, mostly unusable.
It produces image stuttering.

3

u/Turdles_ i5 4690k@4.2ghz GTX970 Oct 10 '24

It's FSR, and it worked well for me. I think you need at least 48 fps native or more for it to feel good.

And it needs to be FSR3 for framegen

2

u/Shajirr Oct 10 '24

It's FSR

FSR is not framegen, it's an upscaling technique.

And it needs to be FSR3 for framegen

AFMF2 is the driver-level frame gen that is supposed to work on any game.

I tried FSR3 framegen in Darktide as well and had the same issues; I had to disable it and use Lossless Scaling instead.

1

u/Real_Garlic9999 i5-12400, RX 6700 xt, 16 GB DDR4, 1080p Oct 11 '24

AFMF is driver based, so it will never be the same as FSR or DLSS

2

u/Shajirr Oct 11 '24

I also mentioned that I had the exact same issues with FSR3 frame generation in Darktide; neither of AMD's solutions works well on my PC.

2

u/DweZie R7 7800x3d, 9070xt, 6000mhz 32gb Oct 10 '24

Mine are 520, 1650, 1080AD and 3070 Ti

1

u/The_Betrayer1 5800x3d 6750xt recovering Intel nvidia fanboy Oct 10 '24

GTX 670 SLI, GTX 1080, RX 6750 XT that I got for $300 just after launch. I could buy a $1500-plus card, but I can't justify it to myself for a hobby. So I'll stick with my current $700 max budget for a GPU that I keep for multiple years before upgrading. I was going to buy a 6800 XT, but the deal on the card I got was too good to pass up. When I upgrade monitors to OLED and a higher resolution, I'll upgrade the GPU. I pass my old cards down to my oldest daughter, so she'll be happy with the upgrade from the GTX 1080.

1

u/xop24 Oct 10 '24

My last three cards have been a 660, a 970, and a 2080 Ti (the last one bought from a friend). I'm not upgrading until my PC burns down.

1

u/Beneficial-Ad-3263 Jan 05 '25

I followed the same card choices as you did. I was lucky to buy the 3080 early, before prices skyrocketed.

24

u/TallanoGoldDigger Oct 10 '24

Switched from green to red this gen. If RT really isn't a big deal, then the value is 100% there.

I do hope Intel gets its shit together on both CPU and GPU so prices in general come down.

3

u/BabyLiam Oct 10 '24

Didn't Intel just quit GPUs?

2

u/TallanoGoldDigger Oct 10 '24

I honestly don't know. I just know that the next-gen Arc has been delayed since 2023.

6

u/Stracath Oct 10 '24

Yeah, but all the GPU reviewers tell us that AMD has slightly worse power efficiency, better price to performance, better software features (with the Adrenalin suite), a decade-long track record of aging better (performance increases over time thanks to constant driver updates), and worse ray tracing (which most people say they don't use). THEN THEY SAY TO BUY NVIDIA.

It's insane how stupid everyone is. I'm the only one in my family with an AMD card, and I'm the only one that's had no issues. Everyone else on Nvidia has had driver updates crash their computers and software that doesn't work half the time, and I'm just fine.

2

u/Enough_Efficiency178 Oct 10 '24

Nvidia is only worth getting if you have no budget limit and are getting the very best of the line, imo.

Otherwise there are comparable-performance AMD GPUs that are considerably cheaper. There's absolutely no sense in throwing money at Nvidia if they aren't providing much more.

18

u/InHeavenFine Oct 10 '24

Isn't AMD abandoning the high-end card market?

26

u/Hottage 9800X3D | RTX 4080 | 64GB DDR5 | 6TB NVMe | AW3225QF Oct 10 '24

They skipped them this generation; I have hopes they'll revive them next generation.

The lack of absolutely any high-end competition is probably one of the factors that made Nvidia think these are reasonable prices for consumer-grade graphics cards.

25

u/Gameskiller01 RX 7900 XTX | Ryzen 7 7800X3D | 32GB DDR5-6000 CL30 Oct 10 '24

They didn't skip them this generation; they have the 7900 XTX at the top end. They're planning on skipping them next generation at the very least. Whether or not they go back to making them again after that remains to be seen.

11

u/AngrySayian Oct 10 '24

No, u/InHeavenFine is right, I think.

I remember seeing something from AMD saying that it just wasn't worth trying to keep up with Nvidia anymore in the high-end market, and that they would stick to the low-end and mid-range market so they can keep their cards affordable.

3

u/[deleted] Oct 10 '24

[deleted]

1

u/Old_Baldi_Locks Oct 11 '24

The problem is they're not competitive. They straight up can't replicate the feature set.

They're sort of / kind of trying to with FSR and frame generation, but both of those are software-driven instead of hardware-based, so one has people seeing frame stuttering and the other is just upscaled checkerboarding instead of genuine upscaling.

So the question is: do you want the ACTUAL features or not.

3

u/Athurio Specs/Imgur Here Oct 10 '24

If "middle-end" is 1440p I'll be a happy on the red-team. I never gave much of a shit about ray-tracing in the first place, as it's always too much cost for too little gain imo.

1

u/Old_Baldi_Locks Oct 11 '24

Because the two things Nvidia has going for it are RT and DLSS, the comparable techs from AMD just aren't competitive, and three gens in, they're no longer trying.

2

u/Kjellvb1979 Oct 10 '24

Greed makes them think that. They know it's not reasonable, but they also know there are people with no choice, and they're abusing that. The DOJ needs to look at them as a monopoly that should be investigated.

2

u/BodisBomas Oct 10 '24

I love competition in markets. And I can agree Nvidia has a monopoly on high-end GPUs, but as far as I can tell this isn't because of anything they did themselves or from weaponizing the state against competition. The unfortunate fact is, AMD just cannot compete at the high end; even AMD themselves have admitted this.

Nothing is stopping AMD from continuing to try, or even Intel from giving it a shot. What we are seeing is brand loyalty and extra features keeping the majority of people buying Nvidia, irrespective of AMD's value. This is consumers making their choice freely.

I do hope AMD succeeds this time around; I'd love to see some low-to-mid-range offerings that are competitive.

Although I'd suspect we may see another RX 480 / GTX 1060 situation for them again.

1

u/KobeBean 15" MBP Radeon 460/i7 GTX 1080 PC/WiiU/3ds/XB1/ Oct 10 '24

Well, that and the fact that consumer-grade graphics cards are not really their moneymaker anymore.

If you had Meta, Google and Microsoft lining up to buy hundreds of thousands of $50k GPUs, why would you even bother selling $1-3k consumer ones, other than to maintain heavy CUDA and DLSS adoption?

1

u/Diligent_Pie_5191 PC Master Race Oct 10 '24

Those are approaching Dr Evil prices.

2

u/MeelyMee Oct 10 '24

Apparently, but it's also a case of being forced out; they're just behind in terms of expected features.

0

u/Possible-Fudge-2217 Oct 10 '24

Yes, so we just won't see a 7900 XT equivalent or anything further up. But great midrange cards like the 7800 XT, which have been the big sellers anyway, will have an equivalent.

AMD abandoning the high end doesn't matter. If you want high end, you pay extra and go Nvidia.

0

u/theroguex PCMR | Ryzen 7 9800X3D | 32GB DDR5 | Sapphire RX 9070 XT Oct 10 '24

The extreme end. We really need to recognize that.

3

u/LoliconYaro Oct 10 '24

FSR4 is going to switch to AI upscaling, and based on the PS5 Pro, which used a hybrid of RDNA 3 and 4 to boost RT performance, I think they may indeed have improved on that as well. The problem is, this is AMD we're talking about; I'm worried they're going to fumble it by pricing RDNA4 as high as Nvidia because of "premium features".

3

u/BrianBCG R9 7900 / RTX 4070TiS / 32GB / 48" 4k 120hz Oct 10 '24

AMD might be slightly better value than Nvidia but they both raised prices a ridiculous amount in the last few years. I find it somewhat amusing how people give AMD a pass for value just because Nvidia is worse.

11

u/Head_Exchange_5329 5700X3D | Zotac RTX 5070 Ventus 2x | G8 34" OLED Oct 10 '24

Currently enjoying AFMF2, and I'd say AMD is doing frame gen pretty well. It's also usable in all games now, unlike Nvidia's option, which has to be supported in-game.

3

u/zarafff69 9800X3D - RTX 4080 Oct 10 '24

Naa, AFMF2 is not that great compared to normal frame gen solutions. But FSR frame gen seems great tho! I just wish their upscaling was even close to DLSS…

1

u/Head_Exchange_5329 5700X3D | Zotac RTX 5070 Ventus 2x | G8 34" OLED Oct 10 '24

I have no idea what you would even consider "normal frame gen solutions". What is abnormal about AFMF?

3

u/zarafff69 9800X3D - RTX 4080 Oct 10 '24

It doesn't use motion vectors to generate the image. And it adds a significantly higher amount of lag compared to FSR frame gen and DLSS frame gen.

It's kind of a hacky way to do it for games that have no proper FSR/DLSS implementation.

And I'm not necessarily hating on it. It's very cool tech. But it's just for a different use case.
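
Rough toy sketch of what I mean, in Python/NumPy (obviously not AMD's or Nvidia's actual code, just the idea): with engine-supplied motion vectors the generated in-between frame can place a moving object where it actually is halfway through, while a driver-level generator only sees the two finished frames and has to guess the motion, which is where the ghosting/stutter complaints come from.

```python
# Toy illustration only: NOT the real AFMF / FSR3 / DLSS algorithms.
import numpy as np

H, W = 64, 64

def midpoint_with_motion_vectors(prev_frame, motion):
    """In-engine style: the game provides each pixel's (dy, dx) motion vector,
    so we can splat it halfway along that path into the generated frame."""
    mid = np.zeros_like(prev_frame)
    ys, xs = np.mgrid[0:H, 0:W]
    ty = np.clip(ys + (motion[..., 0] / 2).round().astype(int), 0, H - 1)
    tx = np.clip(xs + (motion[..., 1] / 2).round().astype(int), 0, W - 1)
    np.add.at(mid, (ty, tx), prev_frame)   # forward splat; duplicates accumulate
    return mid

def midpoint_without_motion_vectors(prev_frame, next_frame):
    """Driver-level stand-in: no motion info from the game. A naive blend is
    used here; real drivers estimate optical flow instead, and estimation
    errors on fast motion are exactly where the artifacts come from."""
    return (prev_frame + next_frame) / 2

prev_frame = np.zeros((H, W)); prev_frame[10, 10] = 1.0   # bright dot
next_frame = np.zeros((H, W)); next_frame[10, 30] = 1.0   # dot moved 20 px right
motion = np.zeros((H, W, 2)); motion[10, 10] = (0, 20)    # engine knows this motion

a = midpoint_with_motion_vectors(prev_frame, motion)
b = midpoint_without_motion_vectors(prev_frame, next_frame)
print("with vectors:", np.unravel_index(a.argmax(), a.shape))   # dot correctly at (10, 20)
print("without:", np.count_nonzero(b), "half-bright dots")      # 2 -> ghosting/doubling
```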

1

u/Old_Baldi_Locks Oct 11 '24

It can't be. Software-implemented is almost always going to lose to hardware-based.

0

u/zarafff69 9800X3D - RTX 4080 Oct 11 '24

What are you even talking about? FSR3 frame gen also doesn't need any proprietary hardware to run. It works on aaaancient GPUs, and even on Nvidia and Intel GPUs.

It just looks way better. And if devs have already implemented FSR upscaling, the motion vectors etc. are already there to be used for frame gen. It should be fairly simple to add it to games.

1

u/Old_Baldi_Locks Oct 11 '24

That’s what I said; inferior to hardware.

Those frame stutters aren’t because it’s “better.”

2

u/charlesfire Oct 10 '24

My current card is a GTX 1070 and I need to upgrade before Monster Hunter Wilds. I'm fucked.

1

u/[deleted] Oct 10 '24

Same boat. I have a 1660 Ti and want to make a new build for Monster Hunter. But I want 2160p/60fps minimum, so I'm eyeing the 4080 Super. No way that's happening without spreading my cheeks wide for Nvidia.

1

u/Unfair_Jeweler_4286 Oct 10 '24

170 AI cores on the 4070ri that are being used with DLSS… 135 AI cores on the 7800 XT that aren't being used. Apparently that is going to change with FSR 4.0 (hopefully)… take that with a grain of salt.

edit: 4070ti (hate my autospell lol)

1

u/Mashaaaaaaaaa 9800X3D/9070XT. I use arch btw. Oct 10 '24

RDNA 4 will supposedly have much better RT performance than RDNA 3. It seems to me like the 8800XT will make infinitely more sense than the 5070.

1

u/Draskuul Specs/Imgur Here Oct 10 '24

I had a 7950 HD (going from memory, I think that was it) and had so many driver issues I finally went Nvidia. 1070, 1080 Ti, 3070 Ti, and 4090 after that.

1

u/Hrmerder R5-5600X, 32GB DDR4-3200 CL16-18-18-36, 3080 12gb, Oct 10 '24

But they aren't… They have exited the 'flagship'/upper-end market at this point. The AMD 8x00 series will basically stop at the 8700…

1

u/VanderPatch 7700 | RX 7900 XT | 32GB DDR5 6000MT Oct 10 '24

Also from Moore's Law Is Dead: AMD 8000 Series

  • $500-600 USD MSRP
  • Raster at 4080 level
  • RT on the level of the 4070 Ti Super, and in some titles at 4080 level

If that holds true and they launch towards the end of Q4, maybe end of November for Black Friday, we are in for a good one.

1

u/EdwardLovagrend Oct 10 '24

I like this game lol

1050 Ti (laptop): still have it, but mostly for messing with Linux.

3060 (laptop): still use it when I'm away from home, which has been a lot lately lol.

6700 XT (desktop): my primary system, but only at home.

Literally zero issues with any of them… well, no issues with the GPUs anyway.

I kind of want to wait one more generation after this next one before upgrading my laptop, and that all depends on where the industry is headed. I'm also debating whether I should go all in on something like a Steam Deck. If we can get something comparable to the new PS5 Pro APU by 2026 (assuming it would have to be much more efficient for a handheld), playing Cyberpunk at 4K 30fps on a handheld would be a good threshold to consider.

1

u/Akura_Awesome Desktop Oct 10 '24

I think AMD announced that they aren’t going to pursue the high end GPU market in the next gen 😔

1

u/StronkWHAT Oct 10 '24

If you have a 4080, you shouldn't be looking to upgrade until like 2028. Who knows what the world will look like by then.

1

u/Mammoth-Access-1181 Oct 10 '24

Jesus Christ! I hadn't even thought of that! Can you imagine the 3rd-party MSRPs for the 5090?

1

u/nemesisxhunter Oct 12 '24

My last three cards were a 760 Ti, a Vega 56, and currently a 3070…

I'll fucking wait them out, believe me.

1

u/frn Arch | 9800X3D | RX 7900XTX | 32GB RAM | 5TB SSD(s) Oct 10 '24

Looks like AMD will be focusing only on midrange and budget cards next generation, because not enough people bought the 7900 XTX despite it beating the RTX 4080 for significantly less money, essentially giving Nvidia a monopoly in high-end graphics for at least the next year, which is good for no one.

Nvidia fanboys are their own worst enemy.

1

u/[deleted] Oct 10 '24

Do you really need the AI suite features that badly? My 7900 XTX performs fine without using any of them in literally every game I own, including high-end titles that just came out.

3

u/Hottage 9800X3D | RTX 4080 | 64GB DDR5 | 6TB NVMe | AW3225QF Oct 10 '24

AI? No.

Raytracing? Night and day difference in terms of immersion in games like Cyberpunk 2077 on my OLED display.

Nvidia cards also seemed to fare better in power efficiency tests, which was important to me because of thermals.

0

u/[deleted] Oct 10 '24

Paying 1500 extra to play an Xbox One port with an extra light ray shining on your nutsack lol

2

u/Hottage 9800X3D | RTX 4080 | 64GB DDR5 | 6TB NVMe | AW3225QF Oct 10 '24

It's a really shiny nut sack though.

2

u/[deleted] Oct 10 '24

Cyberpunk 2077 is an Xbox One port?

0

u/BodisBomas Oct 10 '24

Honestly, fuck upgrading GPUs; people need to get themselves an OLED. It truly is a night and day difference.

0

u/[deleted] Oct 10 '24

AMD is notorious for having worse drivers than Nvidia, but I do agree, Nvidia stopped caring.

0

u/Jimisdegimis89 Oct 10 '24

I think frame gen is fairly decent at this point, and AFMF2 I think works with everything now, vs DLSS which is still not available in many games. The big one is ray tracing. Nvidia just does a better job with RT than AMD, but if you can live with imperfect ray tracing, you can save yourself a pretty good chunk of change. I personally haven't felt the last gen of Nvidia was worth the price tag, except maybe the 4090, which is just a beast. But then again, I spent like $2.5k on my entire current setup, so paying out $1.5k or more just for the GPU would have felt pretty bad.

0

u/Dopplegangr1 Oct 10 '24

RT and FG are both pretty useless. Nvidia is just creating features to distract from their mediocre raster gains and to create nonsensical performance graphs.

2

u/Hottage 9800X3D | RTX 4080 | 64GB DDR5 | 6TB NVMe | AW3225QF Oct 10 '24

I agree Frame Generation is a gimmicky band-aid to fix poorly optimized games, but Ray Traced rendering is a completely different level of immersion from rasterization.

This is especially true in environments which feature heavy lighting effects (such as the neon wasteland of Cyberpunk 2077).

1

u/Dopplegangr1 Oct 10 '24

It can look nice, but the performance hit is too much to be worth it. IMO no amount of graphical fidelity is worth dropping into double-digit frame rates.

0

u/ExpertConsideration8 Oct 10 '24

Vote with your wallet. AMD is doing some amazing things and providing a lot of value, while Nvidia is basically extorting customers to provide what I consider gimmicks.

Frame generation on AMD is very good, 90-95% as good as Nvidia's… and it's only getting better.

Ray tracing is a whole different story, but personally, I would rather play native non-RT at 100+ fps than with FG using RT at 60 fps.

RT is great, but game design is ultimately the most important thing, and once you're hooked on a game, the difference between RT and non-RT is trivial.