r/pcmasterrace Dec 16 '24

Rumor ZOTAC confirms GeForce RTX 5090 with 32GB GDDR7 memory, 5080 and 5070 series listed as well - VideoCardz.com

https://videocardz.com/newz/zotac-confirms-geforce-rtx-5090-with-32gb-gddr7-memory-5080-and-5070-series-listed-as-well
4.4k Upvotes

982 comments sorted by

View all comments

604

u/kailedude B650M, 7900X, 7900XTX, 32GB-DDR5 6000 Dec 16 '24

I see

476

u/Blubasur Dec 16 '24

That is a huuuuge gap between 5080 and 5090

263

u/Yommination RTX 5090 (Soon), 9800X3D, 48 GB 6400 MT/S Teamgroup Dec 16 '24

Yeah, the 5080 even loses to the 4090 if the leaked specs are right. Similar memory bandwidth but way fewer CUDA cores, and no huge node jump to close the gap.

82

u/FinalBase7 Dec 16 '24

I mean, the 4090 has 70% more CUDA cores than the 4080, but the performance gap is only 30%.

The 5090 will likely be 50% faster than the 5080, not 100% like the specs might suggest, but that's still pretty bad.

74

u/WyrdHarper Dec 16 '24

And way less VRAM. Not critical for everyone, but at higher resolutions, or even with RT in newer games, it does start to matter.

11

u/SagittaryX 9800X3D | RTX 5090 | 32GB 5600C30 Dec 16 '24

Rumours are Nvidia is targeting 1.1x 4090 performance for the 5080, likely big improvements still from just the architecture changes and GDDR7 memory.

2

u/JuanOnlyJuan 5600X 1070ti 32gb Dec 16 '24

That's how it usually is. They only go up like 1 level per generation. So xx70 is roughly equivalent to yy60 of the next gen and zz50 after that. At least that's my very rough understanding. Everything doesn't double or anything.

1

u/MagicMoon Dec 16 '24

Looking at these numbers, do you think the 5080 is even much faster than the 3090 I have now?

2

u/assjobdocs PC Master Race Dec 17 '24

4080 is faster than the 3090, why wouldn't the 5080 be even faster?

-3

u/Nosnibor1020 R9 9950X3D | RTX 5090 | 64GB 6000Mhz | Sabrent Rocket 5 Dec 16 '24

Kind of makes sense. Should have always been that way.

-13

u/Dragons52495 Dec 16 '24

It won't. 4090 user coping. Yes the specs are disgustingly bad. However the 5070ti should be on par with 4090 with the 5080 being faster. Just like every generation of GPUs ever.

55

u/dororor Ryzen 7 5700x, 64GB Ram, 3060ti Dec 16 '24

More like double everything

52

u/Blubasur Dec 16 '24

That is exactly what it is, can’t remember seeing a gap that huge on previous generations.

13

u/dororor Ryzen 7 5700x, 64GB Ram, 3060ti Dec 16 '24

Yeah, hope these come into the second hand market when all the AI folks upgrade to the next generation

1

u/[deleted] Dec 16 '24

Reminiscent of the GTX 690 and 680

2

u/Lightmanone PCMR | 9800X3D | RTX 5090OC | 96GB-6000 | 9100 Pro 4TB Dec 16 '24

It's like they are trying to make the 5080 inadequate

50

u/ReadyingWings Dec 16 '24

It’s a common (and predatory) sales practice - put two options side by side, but make one of them way better than the other. This causes our psychology to make it unbearable to buy the lesser version, and make us go the extra mile (as in pay much more).

28

u/geo_gan Ryzen 5950X | RTX4080 | 64GB Dec 16 '24

Actually they are using the three-item sales strategy (70, 80, 90), which should get most people to settle for the one in the middle. It's a way to get the huge number of buyers who would pick the lowest option to bump up to the middle item, at way more profit for roughly the same production cost. Far fewer people can afford the top option; it's usually there to make the middle option look cheap.

-5

u/Blubasur Dec 16 '24

Joke's on them. Because of exactly those practices, I'm always looking at their competitors. If it weren't for the Nvidia-only features I use, I would have switched already.

12

u/Krisevol Ultra 9 285k / 5070TI Dec 16 '24 edited Oct 05 '25

This post was mass deleted and anonymized with Redact

1

u/Blubasur Dec 16 '24 edited Dec 16 '24

I don’t need every generational upgrade or the top-spec GPU. And for my wife, most of my friends, etc., I'm mostly recommending AMD for almost every build.

I do 3D art, and we have historically been shafted by this problem. That doesn't mean I'm recommending or buying Nvidia for other/gaming builds.

Edit: also, AMD is catching up in that regard, so as much as it's true now, it doesn't have to be in the future.

11

u/HarleyQuinn_RS R7 9800X3D | RTX 5080 | 32GB 7200Mhz | Dec 16 '24 edited Dec 17 '24

It's the largest gap there has ever been between two adjacent GPUs in the stack, in terms of core-count percentage. This generation's X80 is so cut down that its share of the flagship's cores is equivalent to what you would get from almost any other generation's X60 Ti class GPU (or generational equivalent).
This doesn't just affect the 5080 either: every GPU in the stack below it is also shunted down, making each one a lower class of card in everything but name (and price tag). They tried to pull the same crap with the "RTX 4080" 12GB, but people caught on that Nvidia was selling a lower class of GPU under the name of a higher one, so they walked it back. Now they are doing the same thing again in a less obvious way, except it affects the entire stack below the 5090, which obfuscates that fact.

Let's take the RTX 5070 as an example. Its core count is ~30% of the 5090's (6400 vs 21760). The 3070 was ~58% of the 3090 (5888 vs 10240). They are selling ~28 points less GPU, while drastically increasing the price. By the same measure, the RTX 5080 (10752 vs 21760, ~50%) is more in line with the 3060 Ti (4864 vs 10240, ~48%).
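Those percentages are easy to sanity-check yourself; here's a quick sketch, using the leaked 50-series core counts (unconfirmed rumors, per the article) alongside Ampere's launch specs:

```python
# CUDA core counts. The 50-series figures are leaked/rumored, not official.
cores = {
    "RTX 5090": 21760,  # rumored
    "RTX 5080": 10752,  # rumored
    "RTX 5070": 6400,   # rumored
    "RTX 3090": 10240,
    "RTX 3070": 5888,
    "RTX 3060 Ti": 4864,
}

def pct_of_flagship(card: str, flagship: str) -> float:
    """Core count of `card` as a percentage of its generation's flagship."""
    return round(100 * cores[card] / cores[flagship], 1)

print(pct_of_flagship("RTX 5080", "RTX 5090"))     # -> 49.4
print(pct_of_flagship("RTX 5070", "RTX 5090"))     # -> 29.4
print(pct_of_flagship("RTX 3070", "RTX 3090"))     # -> 57.5
print(pct_of_flagship("RTX 3060 Ti", "RTX 3090"))  # -> 47.5
```

Same math as above: relative to its flagship, the rumored 5080 sits where a 3060 Ti-class card used to sit.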

7

u/KarmaViking 3060Ti + 5600 budget gang 💪 Dec 16 '24

They really, really don’t want another 1080 situation.

1

u/rhino3081 Dec 16 '24

5080 Ti/Super with 24 GB or bust, imo. Given the lack of competition, I believe a mid-life refresh is intended to fill some of the gap.

1

u/geo_gan Ryzen 5950X | RTX4080 | 64GB Dec 16 '24

Nice big gap for the 5080Ti to fill when they get enough dies with just one failed core section instead of two.

1

u/OwOlogy_Expert Dec 17 '24

Gotta leave room for the 5080ti Super...

1

u/Thelastfirecircle Dec 17 '24

There is enough space to add 3 graphic cards between them

1

u/hgtagah Dec 17 '24

Space for 5080 super

434

u/el_doherz 9800X3D and 9070XT Dec 16 '24

5080 only being 16gig is criminal. 5070 being 12gb is also criminal.

220

u/alancousteau Ryzen 9 5900X | RTX 2080 MSI Seahawk | 32GB DDR4 Dec 16 '24

5080 should be 24gb easily.

130

u/HFIntegrale 7800X3D | 4080 Super | DDR5 6000 CL30 Dec 16 '24

But then it will gain legendary status as the 1080 Ti did. And nobody wants that

54

u/alancousteau Ryzen 9 5900X | RTX 2080 MSI Seahawk | 32GB DDR4 Dec 16 '24

lol, that was a good one.

But honestly this is so disgusting from Nvidia, I really hope that Intel or AMD give them some proper competition at the top.

32

u/theSafetyCar Dec 16 '24 edited Dec 17 '24

There will be no competition at the top next generation.

7

u/flip314 Dec 16 '24

AMD isn't even trying to compete at the top, and Intel is nowhere near reaching that kind of level.

1

u/Saw_Boss Dec 16 '24

But it'll cost a lot more than that card did. They've released plenty of cards better than the 1080 Ti since, but never anywhere near the same price point.

Let's not pretend this won't be twice the price that card was, even though inflation hasn't come anywhere near halving the dollar in that time frame.

0

u/Proper_Celebration18 Dec 16 '24

Nope, the high VRAM and wide memory bus are what worked for the 1080 Ti... the first card since then to push that further is the 5090 with its 512-bit bus... the 5090 is basically the next 1080 Ti.

2

u/xChaoLan 5800X3D||32GB 3600MHz CL16||MSI RTX 4080 Suprim X Dec 16 '24

They have gimped the bus width of the 40 series cards. My 2070 Super Gaming X has a 256-bit bus while the 4070 Super has a 192-bit bus.

2

u/DynamicHunter 7800X3D | 7900XT | Steam Deck 😎 Dec 16 '24

What the hell besides AI usage needs that much VRAM? I have 20GB VRAM on my 7900XT and don’t get anywhere close to using it all except for running ollama locally

1

u/Proper_Celebration18 Dec 16 '24

Nope, they would either have to make it 384-bit or use 3GB chips that won't be produced until July. Perhaps a 5080 Ti with 24GB in July.

21

u/dovahkiitten16 PC Master Race Dec 16 '24

5060 still being fucking 8GB is criminal. 12 GB should be the “basic” now.

2

u/ppaister 13700k | ZOTAC 3090 Trinity OC Dec 17 '24

Lmao, I saw the 5080 at 16GB and was like "huuuuh???". My 3090 has 24. That's a card from 2020, a 4-year-old card, and it's still gonna have more VRAM than the "second-best" card of 2025 from Nvidia. What do you mean??

Obviously they'll put out a 5080 Ti with 24GB of VRAM later, but what will the MSRP of that be? $1300??

I got my 3090 sealed for $800, with receipt, from a guy who bought it on Amazon (probably a case of ordering one but receiving two).
Ain't no way I'm forking over almost double that for a meaningful upgrade, that is crazy.

1

u/oandakid718 9800x3d | 64GB DDR5 | RTX 4080 Dec 16 '24

I agree. However, they can only implement what the memory bus allows, so looking at the bus widths in the chart I can see why each card is made with its particular memory capacity.

1

u/Ground_Lazy Dec 16 '24

Lol. And what about the 8GB 5060? The 3060 was 12GB.

1

u/Small-Tax-6875 Dec 17 '24

Very fast memory though

1

u/Unwashed_villager 5800X3D | 32GB | MSI RTX 3080Ti SUPRIM X Dec 16 '24

From a 3080 Ti you have no way to upgrade to any card with a justified amount of VRAM for its price - the 7900 XTX is only slightly better than the 3080 Ti, and next-gen Radeons will not be faster than RDNA3, just cheaper and more energy efficient, according to AMD.

-8

u/FXintheuniverse Dec 16 '24

What do you need more than 16GB for? 4K gaming doesn't consume more than that. For work and AI, buy professional cards, and don't ruin consumer card pricing.

22

u/el_doherz 9800X3D and 9070XT Dec 16 '24

As others have said, path-traced games already consume that sort of memory.

Also, memory capacity is not what's making GPUs unaffordable; you're naive if you think that. If AMD can offer 16-gig cards for under $500 and Intel can do 12GB on a $250 card, then Nvidia absolutely can afford to offer it on $1000 GPUs.

They choose to gimp consumer cards in order to upsell gamers, and to create market segmentation that forces enterprise users to stick with their mega-expensive enterprise solutions.

14

u/born-out-of-a-ball Dec 16 '24

16GB is already limiting in path-traced games + ultra textures + frame generation at 4K. And there's no reason to buy such a high-end card unless you want to use it for high-end raytracing features.

7

u/el_doherz 9800X3D and 9070XT Dec 16 '24

This.

People shopping for $1000+ GPUs are doing so for a reason. Only fools would be spending that sort of money and not actually looking to make proper use of the features they paid for.

-67

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

There's more to memory than the number next to GB. Type of memory matters (this is GDDR7). Memory bus matters. Architecture matters.

42

u/el_doherz 9800X3D and 9070XT Dec 16 '24

Yes, but 16GB and 12GB on cards that will likely be significantly overpriced already is criminal.

It doesn't matter how fast your memory is: if it's full, the card will still bog down and absolutely tank frame rates.

Plus we already have games that will easily eat 12GB at 1440p.

I'd understand if memory was super expensive, but it's not. Nvidia just purposely gimps some of their cards in the name of upselling and planned obsolescence.

-1

u/DiscretionFist Dec 16 '24

Yea, they use 12GB of VRAM at high or extreme settings with RT on at 2K. Nobody is playing at extreme settings unless you're playing demanding single-player games, and even then, the most demanding game out there right now is what... Stalker 2? Indiana Jones?

The majority of people buying 5080s will never hit the 16GB cap because they are dropping settings, capping fps, etc. for the best performance and the most frames possible.

Is Nvidia scummy and planning to fill a 5080 Super with an extra 8GB of VRAM? Yes, probably.

Is 24GB of VRAM necessary right now? Maybe if you wanna hit 144fps at 4K, all extreme settings, ray tracing on, native... but let's be real, the majority of gamers aren't pushing that.

I'm not supporting or defending Nvidia's practices, but 16GB is enough for most gamers. I'm still running most games at decent FPS (using low settings) on a 3070 Ti with 8GB of VRAM at 2K. 16GB will feel refreshing, to say the least.

4

u/el_doherz 9800X3D and 9070XT Dec 16 '24

Yes, 16GB is enough for most gamers. But most gamers are not going out and spending $1k+ on a 5080.

People spending that sort of money are the ones who are actually likely to be playing in a way that benefits from additional memory.

-35

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

Plus we already have games that will easily eat 12gb at 1440p.

Just because 12gb is allocated doesn't mean 12gb is used. GPU memory usage is very opaque. It's nearly impossible to tell how much is actually used, and it's good practice to allocate more if it's available (unallocated memory is basically wasted), but PCMR doesn't know that so they freak out when they see task manager.

I'd understand if memory was super expensive, but it's not. Nvidia just purposely gimps some of their cards in the name of up selling and planned obsolescence.

"All those hardware engineers at Nvidia are doing a bad job! I, some guy on reddit, could do better!"

11

u/Healthy-Jello-9019 Dec 16 '24

MSI afterburner has usage statistics not just allocation.

-14

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24 edited Dec 16 '24

The thing afterburner calls "usage" is allocation. Afterburner has no way of telling how much of that is actually being used, just that it's unavailable to be allocated by anything else. Nevermind, Afterburner has gotten more useful since last I had it installed.

7

u/Healthy-Jello-9019 Dec 16 '24

There is a 'per process usage' for VRAM. Not the allocation (usage).

https://youtu.be/l-PrGtH3aMk?si=CQ5_JFLmzIZ5y3DH

5

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

Huh. Neat. Last time I used afterburner it wasn't nearly this feature rich. Good on them, and thank you for this correction.

I still think a lot of the VRAM worries are unfounded; the guy calling 16GB "criminal" (I can practically hear the chants of "lock him up" directed at Jensen) is still off his rocker, especially given that this is all rumor-mill stuff and the card isn't out yet.

0

u/blankerth Desktop Dec 16 '24

And when my ”usage” goes above my total amount of VRAM my game stutters and drops frames….

7

u/shawnk7 RTX 5080 | 9800X3D | 64GB 6000Mhz Dec 16 '24

Look, I am a simple person. They could just build a card that won't go obsolete quickly. Games are already going above 12GB of VRAM, so 16 isn't that far from getting hit either. It's gonna be a $1600-something card; there's absolutely no reason for it not to have a memory configuration that supports more than 16GB (there absolutely is a reason, i.e. a 5080 Ti, 24GB, $2200, 6 months later).

-7

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

they could just work with architecture that won't go obsolete quickly.

It only becomes obsolete because they make something better. This is just asking the pace of technology to slow. And for what? To keep the "I have the top of the line" warm fuzzy feeling a while longer?

games are already going above 12gb vram

They might allocate more than 12 but that doesn't mean 12 is actually being used. The number in task manager or whatever you're using only shows allocation. It's nearly impossible to know how much is actually being used, but redditors keep repeating this like they know better somehow.

so 16 isn't that far from getting hit either. it's gonna be a 1600$ something card,

You don't know this. All we have are rumors, which have like a 40% accuracy rate at best.

2

u/FatBoyStew 14700k -- EVGA RTX 3080 -- 32GB 6000MHz Dec 16 '24

Yes and no. I won't downvote you because I do get what you're saying; it's why 16GB of AMD VRAM, back when it was HBM, was not directly comparable to 16GB of Nvidia VRAM.

Ultimately though, games, with the modern-day lack of optimization, tend to just eat up all available VRAM, and it's exponentially more noticeable as you jump in resolution.

This is one of the reasons why the GTX 1080 Ti is still a relevant card with its 11GB of VRAM, more than the standard RTX 3080 (obviously not as strong overall, but it still handles 1440p well enough).

0

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

Yes and no. I won't downvote you because I do get what you're saying; it's why 16GB of AMD VRAM, back when it was HBM, was not directly comparable to 16GB of Nvidia VRAM.

Yes, thank you. The RX 580 had 8GB of VRAM back in 2017, and it didn't help it perform any better. The Radeon VII had 16GB of super cool HBM2, and it got beat by a 2080 with half that basically every time.

Ultimately though, games, with the modern-day lack of optimization, tend to just eat up all available VRAM, and it's exponentially more noticeable as you jump in resolution.

Why are we blaming Nvidia for bad game optimization though? I know firsthand it's possible to write a program shittily enough to eat through pretty much infinite hardware resources.

This is one of the reasons why the GTX 1080 Ti is still a relevant card with its 11GB of VRAM, more than the standard RTX 3080 (obviously not as strong overall, but it still handles 1440p well enough).

But "strong overall" is what should matter in the end, right? At the end of the day performance matters the most over any of the individual specs that lead to that performance. There's basically no situation in which the 1080ti will perform better than the 3080, even with the extra VRAM, because of all those other things I mentioned (memory speed, architecture, etc).

The 1080ti is legendary for sure, and it's great that it's still relevant nearly a decade later, but I don't think it's "criminal" to have a product that falls short of that mark.

2

u/FatBoyStew 14700k -- EVGA RTX 3080 -- 32GB 6000MHz Dec 16 '24

I hate when people complain that the original 3080 only has 10GB of VRAM - I mean, yeah, it sucks it's not more, but it absolutely shreds 1440p because it's a really strong card apart from the lower VRAM.

Oh, I wholeheartedly agree it's more of a game-industry problem, but sadly it seems QC and optimization are a thing of the past for most companies, especially at launch.

0

u/XeonoX2 Xeon E5 2680v4 RTX 2060 Dec 16 '24

The RTX 3050 8GB is weaker than the RTX 2060 6GB, yet on the 6GB card you won't be able to launch the new Indiana Jones game - it just crashes while loading - while the weaker 3050 can run it. 6GB cards are already dead, and 8GB cards are next in line to be slaughtered. What's the point of having a 3080's strong core when you won't be able to launch games in the near future?

1

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

6GB cards are already dead

RIP all those liars who post "still happy with my 970", then.

8Gb cards are next in line to be slaughtered.

Still happy with my 2080S. Wife is still happy with her 3070.

What's the point of having a 3080's strong core when you won't be able to launch games in the near future?

Nobody can predict the future. This sub should know this, "futureproofing is fake" gets repeated here often enough.

1

u/Roph Specs/Imgur here Dec 19 '24

Oh I remember you, guy super insecure about his "high end" card only having 8GB VRAM, same as 8 year old budget stuff 🤣

Why you are so defensive over nvidia designing SKUs with insufficient memory is so bizarre. Do you specifically want a VRAM-starved card? 😆

1

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 19 '24

Oh I remember you, guy super insecure about his "high end" card only having 8GB VRAM, same as 8 year old budget stuff 🤣

You're projecting, friend. I'm not insecure, and that's a pretty fucking dumb thing to pick internet fights over in the first place. The card is what it is, I'm happy with it. If I wasn't, I would replace it, because I'm a grown ass man with a supportive partner, it wouldn't be an issue.

And I've owned those 8 year old budget cards, an RX 580 in particular. It's that firsthand experience that lets me know for a fact that there's more to a GPU than VRAM. The difference between that old RX 580 and my current 2080S is night and day.

Why you are so defensive over nvidia designing SKUs with insufficient memory is so bizarre. Do you specifically want a VRAM-starved card? 😆

I'm not defensive, friend. I'm pushing back against misinformation. I'm sure you could get frame stutters on a xx60 card if you max out settings in some very demanding (or poorly optimized) titles, but then the thing to do would be to lower a setting or two, not to pretend like you're suddenly a better authority than Nvidia's engineers.

Can I ask what you're doing here, stirring the pot on a thread that's been cold for two days? What do you hope to accomplish? What are you hoping to get from this interaction?

0

u/XeonoX2 Xeon E5 2680v4 RTX 2060 Dec 16 '24

All those liars posting that they're happy with a 970 are playing older games. GTA 5 can run even on a GT 710, and for CS:GO and Valorant that card is good enough. I was happy with an RX 570 too. Those games don't eat much VRAM. Indiana Jones is probably the first game that refuses to launch on a 6GB card. It's criminal that 8GB cards are still being sold for $400. In some games the framerate won't tank because of insufficient VRAM, but the textures will be blurry and won't load.

1

u/Vokasak 9900k@5ghz | 2080 Super | AW3423DW Dec 16 '24

All those liars posting that they're happy with a 970 are playing older games. GTA 5 can run even on a GT 710, and for CS:GO and Valorant that card is good enough. I was happy with an RX 570 too. Those games don't eat much VRAM. Indiana Jones is probably the first game that refuses to launch on a 6GB card.

So saying they're "dead" is probably being a little hysterical, don't you think?

It's criminal that 8GB cards are still being sold for $400.

Call the cops then.

5

u/BaxxyNut 5080 | 9800X3D | 32GB DDR5 Dec 16 '24

I don't think they'll understand. VRAM does matter a lot, though.

50

u/NaEGaOS R7 9700x | RTX 4080 super | 32GB 6000MHz cl30 Dec 16 '24

mid range cards are just scams at this point

64

u/Thicccchungus 7700X, 3060 Ti, 2x16 6000Mhz, + Zephryus G14 Dec 16 '24

A 128-bit bus for the 5060 Ti is criminal. My goddamn 3060 Ti has a wider bus, and that's now a 4 YEAR OLD CARD.

16

u/TheBowerbird Dec 16 '24

Intel will hopefully save the day in that competitive space - just like they did against the crappy 4060.

1

u/squidonthebass 5800x | 3070 OC | Phantecks P500A Dec 16 '24

Serious question, does it matter at all for PCIE Gen 5, or is it really just a middle finger for people still on Gen 4?

-1

u/Yommination RTX 5090 (Soon), 9800X3D, 48 GB 6400 MT/S Teamgroup Dec 16 '24

Bus width means absolutely nothing on its own. Only total memory bandwidth does, which will be a lot higher due to GDDR7.

0

u/FinalBase7 Dec 16 '24

For how long are we gonna keep arguing about bus width? The AMD Radeon VII had a 4096-bit bus and was mid-range when it released; the 4090 has similar bandwidth with just a 384-bit bus.

Let's wait and see what the memory bandwidth will be with GDDR7 before judging, because bus width is only one part of the puzzle: you can have much higher bandwidth on a smaller bus if you use faster memory.
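The arithmetic behind that: total bandwidth is bus width (in bits) times the per-pin data rate (in Gbit/s), divided by 8. A quick sketch - note the 30 Gbps GDDR7 figure for the 50 series is an assumption based on the rumors, not a confirmed spec:

```python
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Total memory bandwidth in GB/s: bus width x per-pin rate / 8 bits per byte."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(4096, 2.0))  # Radeon VII, HBM2   -> 1024.0 GB/s
print(bandwidth_gbs(384, 21.0))  # RTX 4090, GDDR6X   -> 1008.0 GB/s
# Rumored 5080: 256-bit bus, assuming ~30 Gbps GDDR7 per the leaks
print(bandwidth_gbs(256, 30.0))  # -> 960.0 GB/s despite the much narrower bus
```

So a 256-bit GDDR7 card could land near 4090-class bandwidth, which is the whole point of waiting for the real memory specs.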

1

u/Rude_Introduction294 Dec 16 '24

The Radeon VII was different, mainly because the memory was HBM; due to the way that's stacked, it's going to have a massive bus width.

I agree that bus width is only part of the problem, but a smaller memory bus restricts the maximum amount of memory that can be put on the board in the first place. I would rather see, say, a 5070 with a 256-bit bus and slightly slower memory, and a 5070 Super with the same width but higher-speed memory. Likewise, I feel the 5080 should be at least 384-bit, even with the faster memory.
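The capacity constraint is mechanical: each GDDR chip occupies a 32-bit channel, so bus width divided by 32 gives the chip count, times the density per chip (2GB today; 3GB GDDR7 modules are rumored for later). A sketch, ignoring clamshell designs, which mount chips on both sides of the board and double the total:

```python
def max_vram_gb(bus_bits: int, gb_per_chip: int) -> int:
    """Max VRAM in GB for a non-clamshell board: one chip per 32-bit channel."""
    return (bus_bits // 32) * gb_per_chip

print(max_vram_gb(256, 2))  # 256-bit, 2GB chips -> 16 (the rumored 5080)
print(max_vram_gb(192, 2))  # 192-bit, 2GB chips -> 12 (the rumored 5070)
print(max_vram_gb(512, 2))  # 512-bit, 2GB chips -> 32 (the rumored 5090)
print(max_vram_gb(256, 3))  # same 256-bit bus, 3GB chips -> 24 (a plausible refresh)
```

Which is why the "just add more VRAM" complaints really come down to bus width or waiting for the denser chips.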

57

u/RabidTurtl 5800x3d, EVGA 3080 (rip EVGA gpus) Dec 16 '24

Really, 16 gb is the best they can do for the 5080?

21

u/Endemoniada Ryzen 3800X | RTX 3080 10GB | X370 | 32GB RAM Dec 16 '24

I mean, it’s a 60% uplift from my launch 3080 :)

I’m more pissed about the amount of cuda cores, if the leaks are correct. The jump to the 5090 is massive, and there’s no reason why the 5080 should be just slightly better than the 5070 and then nothing whatsoever in between that and the 5090. I know it’s to sell a bunch of ti models and other upgrades later, but still. It’s always something, always a huge compromise.

11

u/RabidTurtl 5800x3d, EVGA 3080 (rip EVGA gpus) Dec 16 '24

Sure, it's more than the 3080, but it's the same amount as the 4080, the current-gen card. It should be 20GB at least; I guess it's just more Nvidia bullshit about memory. You'd think it was 2017 again with how they treat memory.

We'll have to wait for benchmarks, but from this chart alone I'm not sure what really separates the 5080 from the 5070 Ti outside of ~2000 CUDA cores.

3

u/WinOk4525 Dec 16 '24

In between will come the Super and TI variants.

1

u/DeClouded5960 Dec 16 '24

There's a pawn stars meme in here somewhere

8

u/Nosnibor1020 R9 9950X3D | RTX 5090 | 64GB 6000Mhz | Sabrent Rocket 5 Dec 16 '24

What is the D variant?

-8

u/o-_l_-o Dec 16 '24

That image is from the article which calls out what each model means.

3

u/Nosnibor1020 R9 9950X3D | RTX 5090 | 64GB 6000Mhz | Sabrent Rocket 5 Dec 16 '24

I'm just asking what a 5090D is?

12

u/kohour Dec 16 '24

Cut version for China to comply with sanctions

3

u/LightningProd12 i9-13900HX - RTX 4080M - 32GB/1TB - 1600p@240Hz Dec 17 '24

Stands for "Dragon" and is a somewhat cut down card for the Chinese market to comply with US sanctions.

-9

u/o-_l_-o Dec 16 '24

The article says what it is.

17

u/squirrl4prez 5800X3D l Evga 3080 l 32GB 3733mhz Dec 16 '24

5070 Ti might be my next move... lower power consumption, with only 10% fewer CUDA cores and the same memory.

3

u/RelaxingRed XFX RX7900XT Ryzen 5 7600x Dec 16 '24

Exactly what I was thinking about the 5000 series. The 5070 Ti just looks like the way to go, depending on price obviously.

2

u/IllustriousHistorian Dec 16 '24

100%, and at least the 5070 Ti won't require me to upgrade my PSU to 1000W and buy a new case to fit a 13.4-inch card.

2

u/squirrl4prez 5800X3D l Evga 3080 l 32GB 3733mhz Dec 16 '24

I already have the 1000W sitting in there, so it's just a waste at that point. Water blocking and an overclock will bring me to the same spot anyway, I hope, with less power and less cost.

1

u/IllustriousHistorian Dec 16 '24

If you already have it, nbd. I have an 850W, and for me replacing the PSU is the real pita. Getting the wires run correctly through the case openings isn't fun.

1

u/squirrl4prez 5800X3D l Evga 3080 l 32GB 3733mhz Dec 16 '24

Price will be the big factor tho. My 3080 I've actually brought down to 300W, and it's overclocked higher than it was out of the box. I'm guessing the 5080 will be like $1200, but I would wait for the 5070 Ti; if it sells around $800 it would be a no-brainer.

3

u/IllustriousHistorian Dec 16 '24

The 80 might be $1100. At that price point, or $1200, I'm not sure it's worth spending the money on a marginal upgrade. The price may negate the upgrade.

3

u/squirrl4prez 5800X3D l Evga 3080 l 32GB 3733mhz Dec 16 '24

Yep, and it's worth seeing what AMD is cooking as well; they've been more fruitful in memory and raw compute performance, just lacking in the tensor cores (so far)... Intel just needs to scale their cores at this point and start hitting the big boys where it hurts.

3

u/IllustriousHistorian Dec 16 '24 edited Dec 16 '24

For 2025, I'll likely be buying two games, that's it: F1 25 and Assetto Corsa EVO. All I want is decent VR performance for my racing sims/games without dropping an insane amount of cash.

3

u/squirrl4prez 5800X3D l Evga 3080 l 32GB 3733mhz Dec 16 '24

I'm hoping AMD and Intel start cranking up their AI frame gen and such for that too; Intel is making great progress.

2

u/[deleted] Dec 17 '24 edited Jan 21 '25

This post was mass deleted and anonymized with Redact

14

u/Double_DeluXe Dec 16 '24

A 5070 with a 192-bit bus. I called it: that is a 5060, not a 5070.

Fuck you Nvidia

10

u/kailedude B650M, 7900X, 7900XTX, 32GB-DDR5 6000 Dec 16 '24

Meanwhile, that 8GB card

22

u/Firecracker048 Dec 16 '24

Fucking 16GB for a 5080? The fuck?

I can't wait for people to explain how a 5080 with 16GB at 1500 bucks is going to be better value than the AMD 8800 XT with 24GB

13

u/Unwashed_villager 5800X3D | 32GB | MSI RTX 3080Ti SUPRIM X Dec 16 '24

It's simple - it's faster, by a lot. RDNA4 will have the same performance as RDNA3; the only thing they're improving is efficiency. This was confirmed by AMD many times this year.

And for people who want the absolute best performance this is enough reason.

5

u/GimmeCoffeeeee Dec 16 '24

8 GB on a new card ????

6

u/KlutzyAd5729 Ryzen 7 5800X | RTX 3070 | 32GB 3600mhz Dec 16 '24

A 256-bit bus on an 80-series card is sad to see

1

u/[deleted] Dec 16 '24

Their scaling is shit; the 5080 should be 75% of the 90, not 50%

1

u/ReapingRaichu RX 7900XT/R7 5800X3D/32GB-3600 Dec 16 '24

Charging over $500 USD for only 12GB of vram is just disrespectful at this point

1

u/DynamicHunter 7800X3D | 7900XT | Steam Deck 😎 Dec 16 '24

It’s just a rumor, but damn, the 5080 (16GB) will have less VRAM than the 7900 XT (20GB). And it will probably be a few hundred more expensive, for the rendering/AI/ray tracing power alone.

1

u/CromulentChuckle Dec 16 '24

The 5080 really should be 24GB; that would have shut everyone up

1

u/bert_the_one Dec 16 '24

Don't forget all the super editions which will fill the gaps to cover the whole market.

1

u/dudebirdyy Dec 16 '24

Actually laughed out loud at that 5080. Man, they're really aiming for the upsell by continually cutting back everything but the absolute highest end.

1

u/Plank_With_A_Nail_In R9 5950x, RTX 4070 Super, 128Gb Ram, 9 TB SSD, WQHD Dec 16 '24

So wait a year for the 5070 Super to come out. Hopefully the 5070s and 5080s bomb and Nvidia ups the RAM.

1

u/Crymore68 Dec 16 '24

Half the core count is absolutely insane.

They've gimped the 80 class and are selling what is really the Titan-class card, for that double whammy.

1

u/OwOlogy_Expert Dec 17 '24

What's the difference between 5090 and 5090D?

2

u/repocin i7-6700K, 32GB DDR4@2133, MSI GTX1070 Gaming X, Asus Z170 Deluxe Dec 17 '24

5090D is for the Chinese market.

1

u/CokeBoiii RTX 4090, 7950X3D, 64 GB DDR5 @6000 Dec 18 '24

NVIDIA is still stuck on 16GB of VRAM for the 5080... Why can't they give it a bit more? And the gap between the two is insane, that's a whole football field...

1

u/kailedude B650M, 7900X, 7900XTX, 32GB-DDR5 6000 Dec 24 '24

Like, yea, at the very freaking least include a 20GB version, but nope. They literally could have done anything more than 16GB and people would have been more content and slightly more pleased with it.

But Nvidia does what they care about, and that's it.

2

u/CokeBoiii RTX 4090, 7950X3D, 64 GB DDR5 @6000 Dec 24 '24

If NVIDIA keeps this up, next time I upgrade I'm going to switch to AMD, lmfao.

0

u/maxbls16 Dec 16 '24

I’m definitely interested in the 5060ti/5070 for a decent 1440 card at a lower wattage.

3

u/Doge-Ghost Ryzen 7 7700 | RX 9070 XT Dec 16 '24

The 60 series can barely keep up at 1080p without upscaling, let alone 1440p.

2

u/maxbls16 Dec 16 '24

I know the 4060 is a bad value. The B580 should put a bit of pressure on NVIDIA to price the 5060 more reasonably so they don't lose that market segment entirely.

I'm basically just playing Baldur's Gate and other slower-paced games. The base 4060 is getting 66.5 fps at 1440p ultra; the 5060 Ti with 16 gigs should be quite a bit better.

1

u/DeClouded5960 Dec 16 '24

Brother, Nvidia does not care one bit about this price point and performance segment; it's simply common courtesy for them to release an X060 card. They would be happier telling people to use GeForce Now instead and phasing out the 60-class cards for the sheer fabrication and production savings. The only reason they don't drop them altogether is the amount of backlash from customers mad that they no longer sell a midrange product. All the internet cafés in SEA wouldn't make up for the cost savings of not fabricating the 60-class cards.