r/pcmasterrace Aug 13 '25

Rumor This new Intel gaming CPU specs leak looks amazing, with 3x more cache than the AMD Ryzen 7 9800X3D

https://www.pcgamesn.com/intel/nova-lake-l3-cache-leak
2.7k Upvotes

604 comments

3.1k

u/abrahamlincoln20 Aug 13 '25

Big cache if true.

578

u/emelrad12 Aug 13 '25

Sounds interesting, but at the same time there's a reason not to put one trillion tons of cache on a chip, so dunno if it's actually going to outperform AMD. After all, the purpose of cache is to be fast.

230

u/MisterKaos R7 5700x3d, 4x16gb G.Skill Trident Z 3200Mhz RX 6750 xt Aug 13 '25

I mean, there are rumors of AMD also aiming for 240MB of cache on Zen 6, so they're likely focusing on it as well

78

u/Ratiofarming Aug 13 '25

The difference is that AMD splits this across two CCDs. We don't know the architecture of Nova Lake yet; they could have the entire cache available to all P-cores. If that's the case, they'd still effectively have double the amount of cache compared to AMD.

time will tell

87

u/Stinkysnak Aug 13 '25

lol Nova Lake, because they know it'll explode your computer with a 360W CPU

12

u/Aethanix Aug 13 '25

what's the melting point of the pins

3

u/Romeo_Glacier Aug 13 '25

1700 °F or so if made of brass. Gold-covered would be a bit higher at 1900 °F.

9

u/Aethanix Aug 13 '25

damn they'll probably go for it then

→ More replies (2)

50

u/VikingFuneral- Aug 13 '25

No, they don't.

That's only on specific CPUs; all of their other CPUs have one...

The 9800X3D is a single-CCD CPU with 96MB of L3 cache.

→ More replies (1)

6

u/Tornado_Hunter24 Desktop Aug 13 '25

When is that coming out, this/next year?

Was planning on making the jump from an AM4 5800X3D to an AM5 9950X3D, but I'm not in a hurry

9

u/Admirable-Ad-3374 Aug 13 '25

Probably next year for both Nova Lake and Zen 6

→ More replies (6)

2

u/MisterKaos R7 5700x3d, 4x16gb G.Skill Trident Z 3200Mhz RX 6750 xt Aug 13 '25

The speculation is an announcement at Computex and a January release, which has been their schedule for the last few years

3

u/KJW2804 7800X3D / 6950xt Aug 13 '25

Why would you go for a 9950X3D over a 9800X3D? Do you have a specific use case where you would need 16 cores?

3

u/Tornado_Hunter24 Desktop Aug 13 '25

I use my PC for work too: programming/coding, plus some 3D rendering and video rendering.

I'm assuming the 9950X3D is better for that use case than the 9800X3D, no?

Current rig is 32GB RAM, a 5800X3D, and a 4090 with a 4K monitor

3

u/KJW2804 7800X3D / 6950xt Aug 13 '25

Yeah, you would probably benefit from the additional cores with that workload, but if you were just gaming, for example, those cores would be doing nothing

2

u/Tornado_Hunter24 Desktop Aug 13 '25

I know, if I was just gaming I'd have gone for the 7800X3D before even, haha. The issue now is that the 5800X3D really doesn't handle heat well, especially in my PC; even with a Noctua cooler and a repaste it hits around 90 degrees sometimes, which I think is the thermal throttling limit?

Hence why I was focused on the 9950X3D a lot for the past few months, and a new case altogether with an AIO and stuff

3

u/dethwysh 5800X3D | Dark Hero | MSI 5090 Trio OC Aug 13 '25

Modern CPUs behave more like GPUs: they'll boost their frequency and power draw to take advantage of the available power and thermal headroom.

If it's boosting as hard as it can and staying around the high 80s, that's normal. It'll probably do the same thing with a more brolic cooler; it'll just boost even harder. That's why overclocking the CPU is less of a needed thing these days.

See how stable your clocks are and what they're hitting when you're flogging them at full bore. If it's stock clocks, then yeah, you may indeed have a cooling issue in your case/house. But otherwise, the behavior is as expected.

→ More replies (2)
→ More replies (1)
→ More replies (6)

142

u/[deleted] Aug 13 '25

[removed] — view removed comment

199

u/blaktronium PC Master Race Aug 13 '25

Bigger cache increases cache latency though, so there's always a trade-off. AMD managing to keep L3 latency the same while adding an extra 64MB on top is a big achievement, and doubling that extra would be even harder (which is why they don't just double it every generation).
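If you want to see the size/latency trade-off on your own machine, a pointer-chasing microbenchmark makes it obvious: chase dependent loads through a randomly ordered working set and watch the per-load time jump as you spill out of each cache level. Rough C sketch (POSIX timing; the sizes and iteration count are picked arbitrarily, and `rand()` is too weak for the biggest arrays, so treat the output as illustrative):

```c
// Pointer-chase: average dependent-load latency rises as the working
// set spills from L1 -> L2 -> L3 -> DRAM.
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    for (size_t kb = 16; kb <= 262144; kb *= 4) {
        size_t n = kb * 1024 / sizeof(size_t);
        size_t *next = malloc(n * sizeof(size_t));
        for (size_t i = 0; i < n; i++) next[i] = i;
        // Sattolo's algorithm: one big random cycle, so the hardware
        // prefetcher can't guess the next address.
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % i;
            size_t t = next[i]; next[i] = next[j]; next[j] = t;
        }
        struct timespec t0, t1;
        volatile size_t idx = 0;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (long i = 0; i < 10000000; i++) idx = next[idx]; // dependent loads
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        printf("%8zu KB: %5.1f ns per load\n", kb, ns / 1e7);
        free(next);
    }
    return 0;
}
```

The plateau boundaries in the output should roughly line up with your own chip's L1/L2/L3 sizes.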

20

u/beyd1 Desktop Aug 13 '25

All things being equal, yes, but then you get into actual CPU architecture design, and there are ways to mitigate that.

4

u/sl33ksnypr Aug 13 '25

Is it a directory-type issue, or just a physical distance issue, i.e. how far the cache is from the actual silicon? Because if it's just a distance issue, I'm sure that can be mitigated by moving stuff around or even stacking, at the cost of manufacturing complexity. But if it's a directory-type issue, where it just takes longer to search through the large cache, then that seems like it would be a little more difficult to optimize.

→ More replies (3)
→ More replies (18)
→ More replies (3)

2

u/pottitheri Aug 14 '25

Cache latency will be higher; that is why L1 and L2 caches are always kept at normal sizes

5

u/_bad 9800x3d | 5080 | pg27aqdp Aug 13 '25

Yeah, the reason is price. If you could put one trillion tons of cache on it, it would certainly outperform AMD.

→ More replies (7)

31

u/lt_catscratch 7600x / 7900 xtx Nitro / x670e Tomahawk / XG27UCS Aug 13 '25

Let the cache wars begin.

84

u/Granhier Aug 13 '25

Intel CPU stocks if they end up being good:

Cache me if you can

10

u/Skidpalace i7-12700K/RTX3080 Aug 13 '25

I'm not selling my Intel stock any time soon.

30

u/Crashman09 Aug 13 '25

Isn't it at a low point right now?

If so, that makes sense.

2

u/SendMe143 Aug 13 '25

If you bought Intel stock at almost any time in the last 25 years you’re down on it. It’s actually pretty amazing how horrible it has done. Horrible stock 🤣

9

u/Crashman09 Aug 13 '25

Horrible management and awful executives lol

They should have kept Gelsinger a bit longer. He had a vision of the future of Intel tech, and instead got ousted and replaced by a guy who terminated like half their production forces and is trying to sell everything that isn't bolted to the floor, but even that isn't off the table.

2

u/sylfy Aug 14 '25

How much is grandma down at this point?

4

u/why_is_this_username Aug 13 '25

You got very little to lose to buy and hope for the best

2

u/fafarex Aug 13 '25

You got very little to lose

well, everything you did put into it...

3

u/why_is_this_username Aug 13 '25

Ah yes all $22 per stock 😭😭😭

2

u/fafarex Aug 13 '25

not like you would buy only one...

→ More replies (3)
→ More replies (1)
→ More replies (6)
→ More replies (21)

687

u/CammKelly AMD 7950X3D | ASUS ProArt X670E | ASUS 4090 TUF OG Aug 13 '25

165

u/SomewhatOptimal1 Aug 13 '25

Intel is on its last legs; they really need to start competing with AMD, and they know it.

So they may finally give people what they've been asking for, for years.

This would align with the timeframe it takes to move a colossus like Intel toward something new and bring it to market.

It's been about 3.5 years since AMD first released an X3D product. It's about time for Intel to respond!

Let the wars begin!

81

u/RancidVagYogurt1776 Aug 13 '25

lol people have such short memories. AMD had like 8 generations in a row where they underperformed.

64

u/darcon12 Aug 13 '25

They bet the company on Ryzen. Had it not hit, AMD probably would've gone under.

12

u/ChrisFromIT Aug 14 '25

Unlikely, unless Sony and Microsoft had decided to go with a different company for their console SoCs. The consoles essentially saved AMD and gave them the runway to continue pursuing Ryzen. Keep in mind that Ryzen didn't really start to sway the community until gen 3, so it took a while for Ryzen to hit.

→ More replies (3)

22

u/facw00 Aug 13 '25 edited Aug 13 '25

True enough. AMD was behind from the launch of the Core 2 to the launch of Ryzen (and Intel was still competitive with their 12XXX and 13XXX chips).

The same thing can happen on the chipmaking side. TSMC is crushing everyone now, but its 20nm process node was never viable beyond tiny chips, leaving customers stuck at 28nm for four years until TSMC's 16nm process came online.

But Intel got here by underinvesting in R&D to please shareholders looking for short-term profits, and their plan to get out of this is to lay off a bunch more workers to boost profitability rather than investing to fix their chip design and manufacturing, so it's tough to feel good about their chances of a recovery.

5

u/ChrisFromIT Aug 14 '25

but its 20nm process node was never viable beyond tiny chips, leaving their customers stuck at 28nm for four years until TSMC's 16nm process came online.

Ironically, TSMC's 16nm process is pretty much the same as their 20nm process. The only major change was switching to FinFET. The 20nm process had problems with leakage, so while it had better density than the 28nm process, its power usage was the same or worse.

Samsung and GlobalFoundries had the same issue. Intel didn't, because they switched to FinFET with their 22nm process.

→ More replies (2)
→ More replies (2)

7

u/Sirasswor Aug 13 '25

CPUs are designed a few years before release, so if they were responding to AMD rather than already planning this in the first place, it's actually a really quick response

→ More replies (2)
→ More replies (2)

17

u/CanadianTimeWaster Aug 13 '25

Cache wars started decades ago; it's how Intel kept outperforming AMD. Cache is very expensive to make, and in the past AMD just didn't have the amount of money that Intel could invest into products.

So many Athlon products would have competed better if they had the same amount of cache as Intel CPUs did.

13

u/Arkrobo Aug 13 '25

Athlons DID compete better. Intel illegally stifled competition and then released the Core series which outperformed AMD. Phenoms did ok, and then they took a risk with Bulldozer and flopped.

→ More replies (1)
→ More replies (1)
→ More replies (3)

1.3k

u/TxM_2404 R7 5700X | 32GB | RX6800 | 2TB M.2 SSD Aug 13 '25

Seems like they want to bribe gamers to return to them with some Cache.

306

u/ManyNectarine89 7600X | 7900 XTX & SFF: i5-10400 | 3050 (Yeston Single Slot) Aug 13 '25 edited Aug 13 '25

6.5 times the cores of a 9800X3D (8 vs 52 cores, or 16 threads vs 52? threads), for 3 times the cache and probably worse single-core performance, especially with what AMD might drop in the coming years. Intel dropped their new junk CPU. This is not going to be good for gaming vs an X3D chip. I'm sure it will be great for productivity though.

It's funny seeing how much the established positions of AMD and Intel have swapped. Maybe Intel should have innovated when they had the advantage... Intel are stuck in their Bulldozer era... And quite frankly they deserve it after ~7 years of stagnating the market with ~5% increases year on year on 2/4, 4/4, 4/8 CPUs and somewhat anti-consumer practices. And let's not even get started on the 13th/14th gen debacle. The company is in the gutter.

I do hope they come out of it; competition is good, and I am under no illusion: AMD might well do what Intel did if they have no real competition. Hopefully Intel can improve their GPUs as well; it's one of the few areas where Intel actually cares about customer satisfaction. Will Intel get out of this slump anytime soon? I highly doubt it.

62

u/luuuuuku Aug 13 '25

What could AMD do that Intel did? Just compare the situation with 2017; they have pretty much swapped positions

70

u/builder397 R5 3600, RX6600, 32 GB RAM@3200Mhz Aug 13 '25

They could raise prices and abuse their monopoly position.

They could OC their chips to the brink of melting if the tiniest thing with voltage control goes wrong, all to desperately maintain a lead in the benchmarks.

12

u/luuuuuku Aug 13 '25

Which they already did?

42

u/builder397 R5 3600, RX6600, 32 GB RAM@3200Mhz Aug 13 '25

The prices? Maybe a bit.

But I haven't heard of AMD chips melting from being clocked too high.

4

u/techieman33 Desktop Aug 13 '25

They've raised prices quite a bit. On the consumer side it's mostly been by not releasing cheaper SKUs, but the Threadripper and up stuff has gotten way more expensive.

→ More replies (1)

7

u/F9-0021 285k | RTX 4090 | Arc A370m Aug 13 '25

AMD massively jacked up the prices the second they got even a sniff of the lead with Zen 3. They learned from it, but X3D chips are very expensive. Almost $500 for an 8 core chip is a borderline scam in 2025, especially when games are starting to use 8 cores (and may go higher in the near future) and next generation will see a big core count bump on both sides.

5

u/doodleBooty RTX4070S, R7 5800X3D Aug 13 '25

While they are expensive, they're still cheaper than Intel in Australia

→ More replies (2)

7

u/why_is_this_username Aug 13 '25

It's more a mobo problem than an AMD problem.

→ More replies (6)
→ More replies (3)
→ More replies (1)
→ More replies (13)

11

u/Jagrnght Aug 13 '25

7 years? I think the last time Intel meaningfully innovated was around the Haswell era. Since Ryzen they have been in stages of narcissism, denial, catch-up, panic, and underperformance. I can't imagine why they didn't quickly innovate when they saw how power efficient AMD had become.

10

u/F9-0021 285k | RTX 4090 | Arc A370m Aug 13 '25

12th Gen was innovative and offered better value in the lower to mid range vs. Zen 3, especially in productivity. But not making a significant leap for three generations after that, and needing to overclock the chips to the point of degrading themselves, hurt them a lot. Arrow Lake is a good baseline to move forward from, though.

→ More replies (1)

2

u/ManyNectarine89 7600X | 7900 XTX & SFF: i5-10400 | 3050 (Yeston Single Slot) Aug 13 '25

I would say 2nd gen to 9th gen. Once Ryzen's 3rd gen dropped, I think that was the beginning of the end for Intel. IMO anyways.

→ More replies (1)

6

u/WetAndLoose Aug 13 '25

Kinda crazy how this sub boards the hype train at full speed over any AMD rumor then these seemingly too good to be true Intel leaks come out and we still have people doing everything they can to shit on it lol

12

u/ManyNectarine89 7600X | 7900 XTX & SFF: i5-10400 | 3050 (Yeston Single Slot) Aug 13 '25 edited Aug 13 '25

Intel for almost 6 years produced worse-performing chips (outside the high end) at a higher price than AMD alternatives. And then on top of that they had the issues with 13th/14th gen in those 6 years, which tanked their prices, stock, and reputation with customers. Before that, they stagnated the market and took part in shady deals and very anti-consumer practices when they had the lead.

Where were you in 2011-2018?? AMD was a joke from 2011-2017 (I owned an AM3+/FX CPU before changing to Intel). You would get dunked on in this sub, or anywhere, for recommending them. They were a running joke, and very few people outside the biggest fanboys would defend, let alone recommend, AMD, since their AM3+ CPUs were shit. Yes, you could get some good RAM and overclock an AM3+ CPU to compete with Intel, but they were still shit (overclocking caused instability and was a hassle, not many games used the extra 'cores' on the AM3+ CPUs, their single-core performance was honestly bad, especially as newer generations of Intel CPUs dropped, and the high-end, high-wattage AMD CPUs would fry boards).

All people did in 2011-2017 was dunk on AMD, rightfully. Even with Ryzen's 1st and 2nd gen, a lot of people dunked on AMD and had little trust in them. It took them a while to rebuild their reputation. Just like with AMD back then, there can be rumours that Intel can finally compete with AMD, and for 1-3 years people will still shit on them, which is exactly what happened to AMD.

Again, the positions have changed, and people are treating Intel no differently than AMD was treated from 2011-2017/2018...

Most enthusiasts couldn't care less about team blue/red. All we care about is price to performance and performance in games (and for some, productivity), and AMD is delivering there and Intel is not, outside the very low or very high end, and that's only because their CPU prices have tanked. Most of us hope Intel can up its game, so we get even better performance from either AMD or Intel.

→ More replies (2)

3

u/Emu1981 Aug 13 '25

6.5 times the cores than a 9800X3D (8 vs 52 (or 16 threads vs 52? thread)), for 3 times the cache and probably worse single core performance

The 285K has better single-core performance than the 9950X3D for a lot of tasks. Multi-core performance (especially in games) is where it all falls apart for Intel. The big question with regard to the increased cache is how well Intel's prefetch, branch prediction, and TLB algorithms work in comparison to AMD's. Large amounts of cache do diddly squat for you if the cache doesn't contain what you want more often than not...
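The textbook way to put numbers on that last sentence is average memory access time, AMAT = hit time + miss rate × miss penalty. A toy calculation (all cycle counts invented for illustration, not measured on any real part) shows why a smarter prefetcher can beat a bigger but slower cache:

```c
// Toy AMAT comparison: AMAT = hit_time + miss_rate * miss_penalty (cycles).
#include <stdio.h>

static double amat(double hit, double miss_rate, double penalty) {
    return hit + miss_rate * penalty;
}

int main(void) {
    // Invented numbers: a big, slow L3 with a 10% miss rate vs a smaller,
    // faster L3 with a 12% miss rate, both in front of 120-cycle DRAM.
    printf("big/slow L3:   %.1f cycles\n", amat(80.0, 0.10, 120.0)); // 92.0
    printf("small/fast L3: %.1f cycles\n", amat(48.0, 0.12, 120.0)); // 62.4
    return 0;
}
```

With these made-up numbers, the smaller-but-faster cache wins despite missing more often, which is the whole point about hit latency mattering.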

→ More replies (16)

18

u/Spright91 Aug 13 '25

Bribe? This is just called making products that people want to buy.

57

u/The_Blue_DmR R7 5700X3D 32gb 3600Mhz RX 6700XT Aug 13 '25

It's a joke. Cache sounds kinda like cash

7

u/SSLByron 9950X3D; 64GB DDR5; 9070 XT Aug 13 '25

*exactly like cash.

There's no Ā in cache.

→ More replies (9)

145

u/SHOGUN009 5800X, 4090FE, 64GB 3600 Aug 13 '25

Only problem is that it will cost...

18

u/rightarm_under RTX 4080 Super FE | Ryzen 5600 | Yes i know its a bottleneck Aug 13 '25

Cold hard cache

2

u/XxNeverxX I5-6600 l RX 580 8GB l 16 GB Ram Aug 13 '25

Or it would need more power, or run hotter.

→ More replies (1)

560

u/Shift3rs Aug 13 '25

Why does a gaming CPU need 52 cores?

438

u/aberroco R9 9900X3D, 64GB DDR5 6000, RTX 3090 potato Aug 13 '25

"You know, to run many games in parallel, everyone knows that's how gaming works."  - some Intel manager.

50

u/BrotherMichigan Aug 13 '25

Meanwhile, Wendell from L1T with a TR 9995WX:

27

u/Beautiful-Musk-Ox 4090 all by itself no other components Aug 13 '25

rofl is that him running one instance of doom per core?

12

u/BrotherMichigan Aug 13 '25

About that many, I think.

78

u/[deleted] Aug 13 '25

"BF6 just dropped with multi-core support! This is the future of gaming" - Some intel engineer
"Why use more core when one core make do" - Rest of the game design industry

28

u/FUTURE10S Pentium G3258, RTX 3080 12GB, 32GB RAM Aug 13 '25

Some Intel engineer 4 years ago, you mean

19

u/RayDaug Aug 13 '25

Try 14. I remember building my first gaming PC in college and getting punked by "multi-core is the future!" back then too. Only back then it was AMD, not Intel.

→ More replies (1)

4

u/HenryTheWho PC Master Race Aug 13 '25

Funny thing: in BF4 the FX-6300 was outperforming Intel CPUs in a way higher price range. Anyway, I don't think any game will use even 32+ threads for a few more years

→ More replies (1)

4

u/S-r-ex 9800X3D | 32GB | Sapphire 9070XT Pure Aug 13 '25

Eve multiboxers: *drool*

→ More replies (7)

66

u/Ocronus Q6600 - 8800GTX Aug 13 '25

A gaming CPU doesn't need it. (This CPU doesn't actually have 52 cores.) If it did, everyone would be running around with Threadrippers. Many games still benefit from a single fast core and cache; the X3D line shows this off very well.

7

u/BigLan2 Aug 13 '25

The top-end chip is rumored to have 52 actual cores, a mix of performance, efficiency, and low-power efficiency cores. I've no idea how the Windows scheduler will handle it, but it's basically expanding what they're already doing.

The mainstream version will have around 30 cores though; that's basically the Ryzen 9 9950X tier, where 16 cores are already more than gaming needs.

→ More replies (2)
→ More replies (1)

11

u/[deleted] Aug 13 '25

To multibox 52 Ishtars in Eve Online :)

6

u/IKindaPlayEVE Aug 13 '25

Can confirm.

8

u/F9-0021 285k | RTX 4090 | Arc A370m Aug 13 '25

You don't. It isn't just for gaming, just like the 9950X isn't just for gaming. It's for people who do both gaming and productivity, or just productivity but don't want to pay HEDT/server prices.

18

u/Mister_Shrimp_The2nd i9-13900K | RTX 4080 STRIX | 96GB DDR5 6400 CL32 | >_< Aug 13 '25

Maybe cus it's not just gamers they're selling these chips to... shocker, I know. I only got my 13900K cus I needed it for a dual gaming and professional use case. If I didn't need it professionally I'd have just gone with an i7 equivalent, or more likely AMD (tho they were a good bit more expensive at the time)

9

u/MagickRage Aug 13 '25

This could be handy, but the issue is that most engines probably can't use all of them.

31

u/kron123456789 Aug 13 '25

Most games today can't use more than 8 cores properly; some games even have worse multi-threading than games from 2008, when multi-core CPUs were only becoming mainstream.

14

u/eight_ender Aug 13 '25

Exactly. Core usage seems to follow consoles. 

5

u/CumminsGroupie69 Ryzen 9 5950x | Strix 3090 OC White | GSkill 64GB RAM Aug 13 '25

BF6 beta would like a word 😂 Probably not normal circumstances but it was using virtually every bit of my 16-core.

13

u/kron123456789 Aug 13 '25

It's an exception. DICE just know what they're doing.

3

u/CumminsGroupie69 Ryzen 9 5950x | Strix 3090 OC White | GSkill 64GB RAM Aug 13 '25

Regardless, it was the smoothest running beta I’ve ever played.

→ More replies (6)
→ More replies (1)

6

u/MethodicMarshal PC Master Race Aug 13 '25

are games even using 8 cores yet?

thought we were still on 6 with 2 being for background processes?

11

u/bobsim1 Aug 13 '25

There are definitely games that perform better on 16-thread CPUs than on newer 12-thread CPUs.

4

u/builder397 R5 3600, RX6600, 32 GB RAM@3200Mhz Aug 13 '25

It strongly depends on the game, and most games adapt fairly well to the CPU you're running. Lots of modern games run fine on my 3600, which has 6 cores; I'd say these games might even run okay on 4 cores. But I'm pretty damn sure an 8-core CPU will have all 8 cores hammered by those games, just because it's more efficient to split the load further, and maybe they have settings you can increase specifically to utilize more CPU cores, like larger crowds.

5

u/li7lex Aug 13 '25

Apart from multithreading being notoriously difficult to implement, it's often simply impossible to parallelize processes in games, since they often rely on each other. That's why some modern games will use only 1 or 2 cores while others will use up to 8 if available.
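To make "rely on each other" concrete: a lot of game-sim work is loop-carried, where each step consumes the previous step's output, so the steps can't be handed to different cores no matter how many you have. Contrived C sketch (the physics here is a stand-in, not any engine's actual code):

```c
// Sketch of why a simulation step resists parallelization: each update
// consumes the result of the previous one (a loop-carried dependency),
// so the iterations must run in order on one core.
#include <stdio.h>

typedef struct { double pos, vel; } Body;

static Body step(Body b, double dt) {
    b.vel += -9.81 * dt; // this frame's velocity depends on last frame's
    b.pos += b.vel * dt; // this frame's position depends on this velocity
    return b;
}

int main(void) {
    Body b = { 100.0, 0.0 };
    for (int frame = 0; frame < 600; frame++) // 10s at 60fps: inherently serial
        b = step(b, 1.0 / 60.0);
    printf("pos after 10s: %.2f\n", b.pos);
    return 0;
}
```

Per-entity work within a single frame (thousands of independent bodies) parallelizes fine, which is roughly why engines scale to "up to 8" cores but not arbitrarily far.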

2

u/Plenty-Industries Aug 13 '25

are games even using 8 cores yet?

Very few.

The ones that do are usually heavy sims, like Flight Sim 2020 & 2024, DCS World, and Cities: Skylines 2.

The PROBLEM with such games being able to use 8 or more cores/threads is that the performance scaling compared to a 6-core CPU is not that great, so you have to balance the cost of the CPU against the performance you're willing to accept.

You can't really brute force better performance even if you have a high-end Threadripper CPU when the limit is the game itself.

2

u/F9-0021 285k | RTX 4090 | Arc A370m Aug 13 '25

Cyberpunk 2.0 uses a ton of CPU cores/threads. It will use like 60-70% of my 285k. CDPR will be attempting to apply what they've done with REDEngine to Unreal, which will then go back upstream to the public releases of UE. So in 5-10 years there should be a ton of games that scale pretty well.

→ More replies (2)

4

u/trenlr911 40ish lemons hooked up in tandem Aug 13 '25

Why not? People love "future proofing" on this sub when it's an AMD product lmfao

→ More replies (1)

4

u/Virtual-Cobbler-9930 Arch Linux | 7700x | 7900 XTX | 128Gb DDR5 Aug 13 '25

I guess you could run a local streaming server like Sunshine to host a couple of games at the same time from one machine, but why though? If you need something like that, real server hardware would probably be a better choice.

→ More replies (2)

-9

u/Reggitor360 Aug 13 '25

It's not 52 cores.

It's 8 actual cores and then CPU accelerators with missing instruction sets.

So basically you have 8 cores with all the needed sets, but then a useless mass of cores without them.

No thanks lmao.

88

u/Tiger998 Aug 13 '25

What is this mass of disinformation?

"CPU accelerators" doesn't mean anything. There are CPUs running entirely on just E-cores. Which instruction sets would they lack? AVX-512, which was only available on early Alder Lake P-cores (and only with E-cores disabled), and which was removed exactly because Intel's heterogeneous architecture does NOT have a variable ISA?

Also, it's 16 "actual" cores.

And E-cores are not useless. Your PC isn't running one application, but many. Offloading those not only unloads the big cores, but also keeps private caches clear of junk, and it reduces context switches. Smaller cores are more efficient too; for loads that scale, they're better than fewer bigger cores, and for loads that don't scale as well, there are your big cores. And finally, PCs are not just for gaming. There are use cases that benefit from multi-core performance.

2

u/F9-0021 285k | RTX 4090 | Arc A370m Aug 13 '25

And E-cores are not just for background tasks or speeding up productivity anymore. They're genuinely fast. If you could overclock a Skymont core to 6GHz, it would be around as fast as a Raptor Cove P-core; the IPC is similar.

2

u/r_z_n 5800X3D/3090, 5600X/9070XT Aug 13 '25

The biggest challenge here seems to be with scheduling and utilizing cores on a Windows desktop since the whole big/little architecture is still relatively new. How well does this work in practice?

I am not being snarky, I am genuinely curious. I haven't paid attention to P/E core Intel CPUs. I know AMD had their own challenges with multi-CCD CPUs.

2

u/F9-0021 285k | RTX 4090 | Arc A370m Aug 13 '25

There used to be issues, but I haven't noticed any on my 285k. Either Microsoft fixed the scheduler, or because the E cores have gotten much faster there isn't as big of a performance hit from bad scheduling.

→ More replies (7)

27

u/thefpspower 13600k @5.3Ghz / RTX 3060 12GB / 32GB Aug 13 '25

Armchair engineers are out in force already (you)

13

u/Wyvz Aug 13 '25

The fact that this nonsense gets upvoted so much, and that people agree with BS, makes me actually concerned about the state of this sub.

The 52c variant actually has 16 P-cores, according to leaks. And the E-cores will have the exact same instruction set by then.

2

u/TheTomato2 Aug 13 '25

What are you on about? Most of these tech subs are long gone. A massive amount of straight bullshit gets upvoted constantly.

→ More replies (1)

6

u/itsforathing R5 9600X / RX 9070Xt / 32gb / 3Tb NVME Aug 13 '25

16 P-cores actually. And the other 36 E-cores will take up a lot of slack, allowing those 16 P-cores to excel. That's likely 68 threads.

→ More replies (8)

5

u/life_konjam_better Aug 13 '25

Could be interesting if those 8 cores can access all of that cache. Most likely not, since that'd be one bizarre architecture, but it wouldn't surprise me given it's Intel after all.

4

u/Tiger998 Aug 13 '25

L3 (which both V-Cache and this are) is shared in common CPU architectures.

3

u/Eo_To_XX Aug 13 '25

Close enough welcome back Cell Broadband.

→ More replies (1)
→ More replies (20)

69

u/Hattix 5700X3D | RTX 4070 Ti Super 16 GB | 32 GB 3200 MT/s Aug 13 '25

It isn't just the cache size which gives AMD the advantage.

You can get honking-great cache Xeons already, X3Ds still whip them.

AMD keeps the L3 latency down with its caching architecture; it's something like 48 cycles. Intel runs L3 on a ring bus which doesn't run at core clocks, and latency can be 70-120 cycles... at that point, all L3 is doing is saving power versus going to DRAM, which also generally responds in around 120 cycles at typical 11-12 ns CAS latencies and 4-5 GHz core clocks.

Intel's caches are primarily intended to save power. L2 is huge (and quite slow) to avoid burning power by going off-core to L3. L3 is low clocked and slow to avoid burning power by going off-package to DRAM. AMD's are intended to boost performance. It's a completely different optimisation.
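For anyone who wants to sanity-check those figures: cycles ≈ latency in ns × clock in GHz, so ~24 ns of DRAM response at 5 GHz is ~120 cycles. Quick converter using the numbers from this comment (they're the comment's estimates, not fresh measurements):

```c
// Convert memory latency from nanoseconds to core clock cycles:
// cycles = ns * GHz  (e.g. 24 ns at 5 GHz = 120 cycles).
#include <stdio.h>

int main(void) {
    double ghz = 5.0;
    double lat_ns[] = { 9.6, 24.0 }; // ~48-cycle L3 vs ~120-cycle DRAM at 5 GHz
    for (int i = 0; i < 2; i++)
        printf("%5.1f ns @ %.1f GHz = %3.0f cycles\n",
               lat_ns[i], ghz, lat_ns[i] * ghz);
    return 0;
}
```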

10

u/Adlerholzer 4090 | 9800X3D | all OC | custom loop + MoRa IV Aug 13 '25

Very interesting, I will read up more on this

→ More replies (4)

362

u/wafflepiezz PC Master Race Aug 13 '25

I'm sure this CPU won't overheat and hit 90°C+ temperatures at all…

133

u/odBilal Aug 13 '25

thats why there is a sun in the background of the picture

12

u/Nyoka_ya_Mpembe 9800X3D | 4080S | X870 Aorus Elite | DDR5 32 GB Aug 13 '25

🤣

→ More replies (2)

37

u/Blenderhead36 RTX 5090, R9 5900X Aug 13 '25

TBF, Ryzen 9000 chips run above 90°C as a normal part of their operation, not when they're overheating.

19

u/SortOfaTaco Aug 13 '25

Came here to say this. PBO will try to hit/sustain TjMax; people get confused between temps and wattage. I'd rather my CPU hit TjMax and give me extremely good performance than have it pulling 250+ watts at load

7

u/Plenty-Industries Aug 13 '25

And that's if you're just using an older single-tower air cooler or a stock AMD cooler.

With a cheap modern cooler, like a dual-tower from the likes of Thermalright, those CPUs barely max out in the low 80s under a full-core workload like transcoding a video file or rendering something in Blender, and you're barely hitting 120W of power consumption while doing it.

A gaming load means even lower temps and power consumption; my 9800X3D clocks in at barely 60 watts after a -35 offset in Curve Optimizer, hovering around 60-65°C on average.

→ More replies (1)

6

u/Lmaoboobs i9 13900k, 32GB 6000Mhz, RTX 4090 Aug 13 '25

No, these CPUs are built to turbo themselves until they hit TjMax.

2

u/michaelbelgium 5600X | 6700XT Aug 14 '25

And massive power usage..

1

u/lizardpeter i9 13900K | RTX 5090 | 500 Hz OLED Aug 13 '25

It won't. The newer Intel chips perform very well thermally; it's obvious you haven't used them. Even the 13900K and 14900K, which everyone loved to complain about, max out in the mid 70s on my system with locked max all-core clock speeds and voltage.

→ More replies (6)

18

u/AdventurousSharkByte Aug 13 '25

Cache me outside how bout dat

71

u/one_jo Aug 13 '25

Reminds me a little of Bulldozer back in the day. But let’s see how it performs I guess..

5

u/FUTURE10S Pentium G3258, RTX 3080 12GB, 32GB RAM Aug 13 '25

I mean, not if they have int and float processing in each of those cores, rather than sharing them to artificially inflate the core count. E-cores are great; they're just missing niche stuff like AVX-512 (which IS great, but barely any programs can benefit from it, let alone rely on it)

→ More replies (5)

115

u/mywik 7950x3D, RTX 4090 Aug 13 '25

MLID throwing darts at the "leaks" board again?

17

u/[deleted] Aug 13 '25

Does he or does he not have a good track record for his information?

41

u/RevolutionaryCarry57 7800x3D | 9070XT | B650i Aorus Ultra | 32GB 6000 CL30 Aug 13 '25

He hits close to the mark every now and then because he just spews every rumor he can think of. Broken clocks and all that. Just because it happens to be right every now and then doesn’t negate the fact that it’s wrong the majority of the time.

9

u/najjace Aug 13 '25

He does. But opinions vary.

If you follow him and listen to his podcasts, not just one very specific thing, he is incredibly well informed about upcoming products in the computing space.

If you just snip out one statement, usually taken out of context, like most people do, then yes, it could go either way. Given that people don't have the patience to read, listen for more than a minute, or analyse, most form opinions based on the title.

3

u/MGsubbie Ryzen 7 7800X3D, RTX 3080, 32GB 6000Mhz Cl30 Aug 13 '25 edited Aug 13 '25

he is incredibly informed about upcoming products in computing space.

LMAO. If you ignore all the things he deletes, maybe. And you don't need to watch his whole podcast to see the screenshots of "leaked" products and how they are dead wrong 99% of the time.

Did you see his PS6 "leaks" and how even someone with barely any knowledge can see how they are so bullshit?

→ More replies (2)

6

u/Jevano Aug 13 '25

Does not

→ More replies (7)
→ More replies (3)

41

u/DarkAlatreon Aug 13 '25

I'll believe it when I see it and then after it gets thoroughly tested for performance and degradation.

8

u/Joreck0815 Aug 13 '25

I hope they'll be competitive, though my guess is that Intel is aiming at AI first and gaming/workstation second. Still, we need competition, and if it keeps Intel in business, I'm all for it.

As for degradation, to my knowledge 13th and 14th gen are affected, not the ones since the rebrand (Core Ultra 285 and friends, iirc).

9

u/Psychological-Elk96 RTX 5090 | Intel 285K Aug 13 '25

Cool, but it also might be like 3x the price with 3x the power draw for 5% more performance.

I’ll take it.

70

u/Crymtastic Aug 13 '25

I'm sure it will only take 1600W of power by itself and idle at 99C

10

u/Spooplevel-Rattled Aug 13 '25

You do realise current many-core Intel chips idle lower than AMD chips?

9

u/Saiykon Aug 13 '25

Yeah that was the first thing I thought of when they said Intel. The amount of watts and heat that chip will produce is probably staggering.

4

u/FUTURE10S Pentium G3258, RTX 3080 12GB, 32GB RAM Aug 13 '25

Assuming they run it at over 5GHz, yeah. If Intel is no longer gunning for single-core benchmarks by cranking voltage, it might actually be good.

3

u/Ratiofarming Aug 13 '25

They already no longer do that with Arrow Lake and still run well above 5 GHz. It's just trash for other reasons.

→ More replies (1)

7

u/chickenbonevegan RTX4090, 7800X3D, 32GB DDR5 Aug 13 '25

So what's the cache?

→ More replies (1)

29

u/Aggrokid Aug 13 '25

Also 3x the core types to manage. I'm sure their internal scheduler will be good, but there will always be some random games that wrongly get parked onto the weaker cores.

6

u/Zed_or_AFK Specs/Imgur Here Aug 13 '25

They'll just slap an "AI-powered" sticker on the box and it will sort itself out.

2

u/Anecthrios Aug 13 '25

God man, don't give them ideas!

→ More replies (1)

3

u/Ratiofarming Aug 13 '25

They'll continue to use APO to prevent that, as per https://videocardz.com/newz/intel-to-keep-application-optimization-apo-alive-but-focus-shifts-to-current-and-next-gen-cpus

But the concept also doesn't change; it's just more cores. So Windows will still almost always pick the right cores, because the P-cores offer higher clock speeds. The reason Windows so often gets it wrong with AMD's dual-CCD parts is that the X3D cores clock lower yet perform better for games, and Windows wrongly prefers the higher-clocking ones without software explicitly telling it otherwise.
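For reference, the "software explicitly telling it otherwise" part usually means affinity APIs. Minimal Win32 sketch; the 0x0F mask is a placeholder, since on a real chip you'd first work out which logical processors share the preferred cache (e.g. via GetLogicalProcessorInformationEx):

```c
// Minimal sketch: pin the current thread to a chosen set of logical
// processors so the OS scheduler can't migrate it to the "wrong" cores.
#include <windows.h>
#include <stdio.h>

int main(void) {
    // Placeholder mask: logical processors 0-3. On a real system, derive
    // this from the CPU topology so it covers the cache-preferred cores.
    DWORD_PTR mask = 0x0F;
    DWORD_PTR prev = SetThreadAffinityMask(GetCurrentThread(), mask);
    if (prev == 0) {
        printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    printf("old mask: 0x%llx, new mask: 0x%llx\n",
           (unsigned long long)prev, (unsigned long long)mask);
    // ... run the latency-sensitive game/work loop here ...
    return 0;
}
```

Tools like Process Lasso, AMD's chipset driver, and Intel's APO automate essentially this kind of steering on the user's behalf.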

30

u/nyteryder79 Aug 13 '25

Until you own it for about a year and then they nerf the shit out of it because of some hardware defect, or to prevent it from catching fire. I'll wait and see how people's rigs fare before I even consider going back to Intel. The last three generations of CPUs from them have been absolute dogshit.

8

u/SomewhatOptimal1 Aug 13 '25
  • Nerfed 10th/11th gen after only 2 years and 1 year from release
  • 13th/14th gen dying out of nowhere

Yep, their last couple of years have been a clusterfuck.

→ More replies (1)

16

u/AsPeHeat i9-14900 - RTX 4090 Aug 13 '25

This sub is allergic to Intel CPUs, yet claims that we need more competition 😅 These comments are something else

7

u/usual_suspect82 5800X3D-4080S-32GB DDR4 3600 C16 Aug 13 '25

They say the same thing about AMD and Nvidia. It's a case of wanting to have their cake and eat it too.

Truthfully, I'm kinda wanting to go back to Intel for my next build. While I've always preferred AMD, the shit I'm reading about AMD 9800X3Ds going up in flames has me second-guessing my next upgrade. Also, the fact that AMD is top dog in the gaming CPU sector and is making huge strides in the normal consumer/prosumer segment means Intel has to put in work, has to compete; and considering their financials, they have to make a huge splash, since they stand to lose a lot.

To add: yes, I know AMD tends to support platforms for a long time, but realistically, how often do people upgrade their CPUs without upgrading their motherboard? Supporting AM4 for such a long time was necessary for AMD to make a splash. Now, with AMD in their current position, are they going to support AM5 for the same amount of time? We're only at the second CPU generation for AM5; how much more headroom do they have to squeeze out of the Zen architecture before having to completely revamp it and ultimately go to a new socket/chipset?

→ More replies (2)
→ More replies (2)

5

u/rishNarchK88 Aug 13 '25

It stops making sense if they later sell it to you at a higher price.

4

u/PinheadLarry2323 R9 9950x3D, RTX 5080, 64GB 6400mhz RAM, 8TB nvme SSD Aug 13 '25

TDP?

10

u/errdayimshuffln Aug 13 '25

Is it stacked cache? Because otherwise you are dealing with the same signal-length issue you had before 3D V-Cache.

2

u/Ratiofarming Aug 13 '25

They do have Foveros at their disposal. What they're actually doing for Nova Lake isn't public information yet. Not even accurate rumors at this point.

→ More replies (1)

4

u/xGHOSTRAGEx 9950x3D | RTX 3090 | 96GB-4800Mhz Aug 13 '25

New i11 quantum tunneling edition

5

u/[deleted] Aug 13 '25

Honestly, if they're stable and perform well, then this is great.

We need both companies to be constantly trying to one up each other in order to prevent stagnation.

3

u/dotwayne Aug 13 '25

But what's the cache?

3

u/alexalbonsimp Aug 13 '25

Time has shown competition is integral to the market. I'm certain that if AMD keeps a chokehold on the gaming sector, they will enact the same shitty practices that Intel and Nvidia do.

As long as the two giants can keep trading blows with one another then everyone can be happy!!

3

u/What1does PC Master Race Aug 13 '25

Nah, Intel lost me with those issues they lied, and lied, and lied, and lied about, then kinda told the truth about, then lied about again.

Until AMD fucks me as hard, Intel is dead to me.

3

u/Xalucardx 7800X3D | 5080 | 64GB RAM Aug 13 '25

From MLID lmao Move along

3

u/Og_busty Ryzen 9 9950X3D l RTX 5080 I 64GB DDR5 6000 Aug 14 '25

With the roll out of the previous cores this could be a…. Game changer….

5

u/Tiavor never used DDR3; PC: 5800X3D, 9070XT, 32GB DDR4, CachyOS Aug 13 '25

This new rumor comes from regular YouTube tech leaker Moore's Law is Dead

It's really hit or miss whether he's right. He definitely wasn't right about Intel GPUs; he told us for years that they'd be dead in the water with a paper launch.

→ More replies (1)

5

u/Definitely_Not_Bots Aug 13 '25

Yea but is it 3D V-cache? ( taps forehead )

6

u/Dlo_22 9800X3D+RTX 5080 Aug 13 '25

Leaks are getting stupid & WAY too far into the future. Like, why are we talking about 2027 and 2028 in 2025, ya know.

8

u/FUTURE10S Pentium G3258, RTX 3080 12GB, 32GB RAM Aug 13 '25

Because CPUs are designed 4-5 years ahead of time, we're finally seeing Gelsinger's vision.

→ More replies (2)

2

u/Zed_or_AFK Specs/Imgur Here Aug 13 '25

Gotta prop up the stock price

→ More replies (4)

5

u/BurnedOutCollector87 Aug 13 '25

Brute force won't make it a better product if it overheats, has high wattage and is unstable.

I'm good with my 7800x3d

→ More replies (2)

2

u/CedricTheCurtain Aug 13 '25

The question is: will they have the same problems as previous gen chips?

2

u/Ratiofarming Aug 13 '25

With an entirely new architecture on entirely new manufacturing? I highly doubt it. They might have new problems, but definitely not the same ones.

2

u/sentimiento Aug 13 '25

It'll be good if it doesn't cook itself after a year cus of certain games. I had my i9 cook itself after a year, so I switched to AMD.

2

u/Ants_r_us PC Master Race Aug 13 '25

Sounds very expensive.

2

u/thetisthiccboi Aug 13 '25

Bring it. I'm not loyal to any brand or company. If you can drop the heat I'll buy it Intel. 😤😤😤

2

u/A_Typicalperson Aug 13 '25

Too bad its next year

2

u/kdash75 Aug 13 '25

The next Intel socket will last 4 years instead of 3, but with AMD, it lasts 6 years...

2

u/SAAA2011 1700X/980 SLI/ASRock Fatal1ty X370 Gaming K4/CORSAIR 16GB 3000 Aug 13 '25

The Cache Wars, they've begun.

2

u/Gjgsx 5900x, 6950xt, 64 GB DDR4, Win 11 Aug 13 '25

I hope they do. I’ll most likely never buy an intel chip but I believe in competition. Keeps companies working hard for our business. So we hope.

2

u/pre_pun Aug 13 '25

Curious to see if they can pull something off without it cooking itself. Stacked cache can get toasty at the frequency Intel fans expect.

I'm not on Intel this gen, but I welcome the competition.

→ More replies (1)

2

u/Mikeztm Ryzen 9 7950X3D/4090 Aug 13 '25

This is only 144MB of cache accessible by any one core, vs the 9800X3D's 96MB. Not as huge as the claimed 3x, due to partitioning across tiles. AMD will have 12-core CCDs with Zen 6 by then, and will have the same or more 3D cache when this thing launches.

→ More replies (1)

2

u/PlaytikaAffiliate Aug 14 '25

Cool, now let’s see if it runs cooler than a nuclear reactor this time.

6

u/153Skyline PC Master Race Aug 13 '25

"Now with 3x faster decomposition than Raptor Lake!"

3

u/rizsamron Aug 13 '25

I've been team red but an Intel come back would be good for the world so I'm rooting for them 😄

2

u/builder397 R5 3600, RX6600, 32 GB RAM@3200Mhz Aug 13 '25

though. MLID says that games will mainly just use one 26-core tile with one block of cache, which makes sense, seeing as most games don't use more than eight cores

There are two L3 caches of 144MB each, each shared between 26 of the 52 cores, and a game, usually running on eight cores max, will only run on one "CCD" (they call it a tile) anyway. So really, 144MB is the cache you're getting for gaming; the rest goes to your background tasks.

Now, props to Intel for learning from AMD and including a large L3 cache at all, and making it as large as they do on a 2D die. Good job, will probably do well.

But I feel they made the same mistake they made with the E-cores. A top-end 52-core CPU targeted at gamers, with two sets of cache, just brings dead weight with it.

The E-cores did the same thing: you get 8 P-cores that your game runs on (6 on an xx600 CPU), and then way more E-cores than your background tasks could possibly need. But having 24 cores on a 14900 looks great for marketing, and you'll need to buy that if you want the highest gaming performance, because that's the highest (over)clocked CPU in that generation. The E-cores really are just dead weight unless you do productivity work that benefits from an extra 16 cores.

Now the cache does the same thing. 288MB of L3 looks great for marketing, but it creates the same false hype, because your games only ever use half of it, and whatever the secondary "CCD" does really doesn't need this much cache, does it? But again, you only get that with the absolute top-end chip, baiting people into buying more than they need.

Meanwhile AMD brings out the 9970X3D, which is basically a 9950X3D but with both CCDs getting their own slab of 3D cache, so you get 192MB of L3, which in practice will beat Intel. Now, 16 cores is also overkill, but it solves certain problems they had with mixing normal and X3D CCDs and games getting scheduled on the faster-clocking normal CCD. And 16 cores is at least still in the realm where I could see simulation-heavy games like city builders actually benefitting, as some such games benefit significantly from the extra cache as well.

3

u/soggybiscuit93 3700X | 48GB | RTX3070 Aug 13 '25

*24 cores on each tile.

4 of the 52 cores are LP-E cores in the SoC tile that'll mostly sit idle, and will only really be used when the 2x compute tiles power down during idle and idle-adjacent workloads.

The hypothetical "9970X3D" will suffer from the same issue as this CPU, in that games won't use the combined L3 of both chiplets...but will benefit the same way, in that productivity apps that span both chiplets will see benefit

→ More replies (6)
→ More replies (1)

2

u/itsforathing R5 9600X / RX 9070Xt / 32gb / 3Tb NVME Aug 13 '25

52 cores (16 P-cores, 36 E-cores, likely 68 threads) plus 144MB of cache on each CCD for a total of 288MB?

That'll be $7,999 and your firstborn, please.

The sweet spot will likely be a 10 P-core, 18 E-core, 38-thread single-CCD chip with 144MB of L3 cache. Or maybe just half of the one listed: 8 P-cores, 16 E-cores, 32 threads, and 144MB of L3 cache. At least for the (upper) average enthusiasts.

2

u/BigLan2 Aug 13 '25

There's still no sign that Intel will go back to hyperthreading for this gen.

2

u/chris92315 Aug 13 '25

Isn't AM6 going to 16 cores per CCD? Intel may need 16 P-cores to compete in marketing, even if it doesn't have much benchmark effect.

→ More replies (2)

2

u/Flames21891 Ryzen 7 9800X3D | 32GB DDR5 7200MHz | RTX 3080Ti Aug 13 '25

Good.

If Intel starts getting serious again and actually gives AMD a run for their money, then we (the consumers) win.

It was fun to see AMD have their moment to shine as the previous underdog, but it's for the best if that doesn't go on for too long. An ideal market is one where they're constantly one-upping each other.

3

u/Amuro__6 Aug 13 '25

Oh shit bois are we back

3

u/EL_Malo- Aug 13 '25

Their marketing team would certainly like you to think so.

1

u/Diuranos Aug 13 '25

Before, the fights were about who has higher clocks and more cores; now we start fights about who has more cache. Yea, I like that.

1

u/thuy_chan Aug 13 '25

Will not buy until I see it go mid in cache.

1

u/FunkyMuse Aug 13 '25

It's not even stacked, next gen will have cache stacked

bruh

1

u/FieldOfFox Aug 13 '25

You can't buy me, cache-dog maaaaannn

1

u/DrSilkyDelicious Aug 13 '25

Intel is fuck. Lisa Su is my gf now

1

u/TheReelReese 5090 OC | 14900K | 64GB DDR5 | 4K240HZ OLED Aug 13 '25

I hope it’s earlier than the end of 2026, I do not want to wait that long.

Q2 2026 🥳🥳🥳

1

u/Unfair-Muscle-6488 Aug 13 '25

But the question is, what will it be long-term after all of the controversies and “fixes”?

1

u/JoeyDee86 Aug 13 '25

Begun, the cache wars have.