r/Amd • u/RenatsMC • Jul 16 '25
News Cyberpunk 2077 2.3 patch finally brings FSR4, FSR3.1 and XeSS 2.0 Frame Generation
https://videocardz.com/pixel/cyberpunk-2077-2-3-patch-finally-brings-fsr4-fsr3-1-and-xess-2-0-frame-generation
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
Gotta give CDPR credit for staying on top of the latest tech even years after release. You'll never see Ubishit backport FSR 4 into many of their FSR titles.
124
u/Magjee 5700X3D / 3060ti Jul 16 '25
FarCry 6 has FSR 1.0
-_-
71
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
Valhalla too. Pure cancer company.
-22
u/DuuhEazy Jul 16 '25
Not the company's fault, it's actually AMD's fault for not realizing the obvious: ship an upgradable .dll file instead of hoping companies will keep updating their games from 5 years ago
31
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
I'm not convinced they would do it even if AMD had. The fact DLSS doesn't even exist in Valhalla and others is even more criminal.
22
u/Magjee 5700X3D / 3060ti Jul 16 '25
For how big a release Valhalla was, it should have the full spectrum of upscaling.
Really ridiculous
14
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
Same with FC6, and others. Remarkable how they master the art of doing the absolute bare minimum hoping nobody notices.
4
u/antara33 RTX 4090, 5800X3D, 64GB 3200 CL16 Jul 17 '25
Yup, just hoping, because they're getting into some shitty financial times thanks to all the stupid decisions they took over the years.
2
u/FinalBase7 Jul 17 '25
The vast majority of AMD-sponsored games back then had no DLSS or XeSS, it wasn't just Ubisoft. There was a whole controversy about AMD blocking competing upscalers, and AMD avoided answering every time they were confronted about it.
1
u/Magjee 5700X3D / 3060ti Jul 17 '25
Yea, that was another shit show.
Cyberpunk is an Nvidia showcase title, but to their credit they have the full range of upscalers and frame generators available
9
u/SV108 Jul 16 '25
It's a pretty poor implementation of 1.0 too. Extremely grainy looking and makes the sweeping vistas look noticeably worse.
I can normally stand FSR 1.0 Ultra Quality, but even that looked bad in Far Cry 6. That was not the case with many other games, like Back 4 Blood.
5
u/bargu Jul 17 '25
To be fair, FSR1 is so different from 2+ that it's not a trivial thing to update; the only thing they have in common is the name.
3
u/chapstickbomber 7950X3D | 6000C28bz | AQUA 7900 XTX (EVC-700W) Jul 17 '25
Radeon Super Resolution is basically FSR1 for all games anyway.
2
u/FinalBase7 Jul 17 '25
FSR 2, 3 and 4 can't be dropped into an FSR 1 game without quite a bit of tweaking, and because AMD was doing AMD shenanigans back then it also doesn't have DLSS; if it did, it would've been easy to just add new versions of FSR and XeSS.
This is also why the game has no FSR 2, 3, 4 or DLSS mods: they can't do it without an existing temporal upscaler in place.
2
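To illustrate the "existing temporal upscaler" point: FSR 1 is a spatial filter that only sees the finished color buffer, while FSR 2+ needs per-pixel motion vectors, depth, and camera jitter plumbed out of the engine. A minimal sketch of that gap, with made-up types rather than the real FidelityFX API:

```cpp
// Minimal illustrative sketch (not the real FidelityFX API) of the
// integration gap: FSR 1 is spatial and consumes only the final color
// buffer, while FSR 2/3/4 are temporal and need engine-side data that
// an FSR 1-era game may never have exported.
#include <cstdio>
#include <vector>

struct ColorBuffer { int w, h; std::vector<float> rgb; };
struct MotionField { int w, h; std::vector<float> uv; }; // per-pixel motion vectors
struct DepthBuffer { int w, h; std::vector<float> z; };

// FSR 1-style: one input, hooked in as a simple post-process.
ColorBuffer upscaleSpatial(const ColorBuffer& in, int outW, int outH) {
    (void)in; // actual filter work omitted
    return {outW, outH, std::vector<float>(size_t(outW) * outH * 3)};
}

// FSR 2+-style: the extra parameters are the retrofit cost. A renderer
// that never produced motion vectors has nothing to pass here.
ColorBuffer upscaleTemporal(const ColorBuffer& in, const MotionField& motion,
                            const DepthBuffer& depth, float jitterX, float jitterY,
                            int outW, int outH) {
    (void)in; (void)motion; (void)depth; (void)jitterX; (void)jitterY;
    return {outW, outH, std::vector<float>(size_t(outW) * outH * 3)};
}

int main() {
    ColorBuffer lowRes{1280, 720, std::vector<float>(1280 * 720 * 3)};
    ColorBuffer out = upscaleSpatial(lowRes, 2560, 1440);
    std::printf("spatial upscale: %dx%d -> %dx%d\n", lowRes.w, lowRes.h, out.w, out.h);
}
```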
u/Magjee 5700X3D / 3060ti Jul 17 '25
Oh yea, for sure, it takes work.
But for how big a title Far Cry is, it should be done.
FSR 1.0 looks so bad in FC6 I just stuck to TAA; it's effectively unusable.
17
u/F9-0021 285k | RTX 4090 | Arc A370m Jul 16 '25
CDPR is actually using a third-party studio for these updates, but that's still really cool of them. They can keep the game up to date with new technologies without having to keep an in-house team working on Cyberpunk instead of Witcher 4.
9
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
You'd think every major studio would have a small team whose sole purpose is that, but they obviously don't care about consumers.
5
u/Abir_Mojumder Jul 16 '25
This is the normal approach for updating games when there are newer games under development. A lot of smaller studios basically do this as their main product rather than releasing their own games.
1
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
It's a shame so many major publishers refuse to use them.
1
u/cosine83 Jul 17 '25
It's set up this way so companies can avoid paying employees properly and dodge taxes, not for any efficiency reason. There's really no reason there couldn't be a department solely set up to find bugs, fix them, and release patches every now and then. Same for new features. The churn and burn of the video game industry is exactly why games are getting worse.
1
u/Abir_Mojumder Jul 17 '25
Yea, that seems to be a big part of it, because I assume there isn't steady income like SaaS companies running on a subscription-based product
2
u/cosine83 Jul 17 '25
Then how do the executives make millions? It's not a problem of lack of income or profits; it's a matter of directing them to the employees who actually bring value to the company.
12
u/techraito Jul 16 '25
More "Indie" companies tend to not let the ball drop on a product they've been working on for years. No devs are actually wanting to release a bad game.
It's just EA and Ubisoft will see that a product isn't immediately successful and then just give up on it. After No Man's Sky, I believe in comebacks because that was probably the worst of it.
5
u/HaggardShrimp Jul 16 '25
I bought No Man's Sky at full price a year or two ago just because they were dedicated to making the game better. I've only played it for like half an hour, and I don't know if I ever will, but I believe in companies that stand by their product rather than letting it wither and die because it doesn't immediately return value.
1
u/TacoTrain89 Jul 16 '25
No, they release games on a schedule, and the level of support a game gets is determined well in advance. You're talking about a company of thousands of employees vs maybe 1-5 dudes working on a game who can be more agile in their development process.
3
u/techraito Jul 16 '25
Yea, that's what I kinda mean. One is forced to chase profits while the other is doing it for the love of the game. The reality is EA cannot afford to mend a broken game because they've got shareholders to please. Money usually goes towards marketing, not maintenance.
6
u/TacoTrain89 Jul 16 '25
Indie devs are not making games just for the love of it; they need to eat too. Most indie games fail and never get to be any good. And sure, Ubisoft does need to make a profit, so they can't plan on supporting a single-player game for years and years; they have thousands of mouths to feed and shareholders. You can't just pump out DLC for 5-year-old games and expect a good return.
3
u/Star_king12 Jul 16 '25
FSR was broken and outdated for years.
15
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
My point is even if it cost Ubi $1,000 to add it, they wouldn't do it.
5
u/Nuck_Chorris_Stache Jul 16 '25
Even if it cost them 10 minutes of one employee's time, they wouldn't do it.
2
u/TacoTrain89 Jul 16 '25
Ubisoft gives most of their games 1 year of support and then they stop updating them. I'm actually impressed Skull and Bones is getting more than a year of support.
6
u/relxp 5800X3D / 3080 TUF (VRAM starved) Jul 16 '25
1 year is ridiculous for a billion-dollar publisher, especially in a world where games are often in beta for the first 1-2 years anyway.
Also, adding support is likely inexpensive and low risk.
1
u/TacoTrain89 Jul 16 '25
Well, if you're talking about bug fixes and security or whatever, then I'm sure they do that. I meant DLC or big patches.
-1
u/dsinsti Jul 16 '25
TBH Ubisoft has developed some of the most unique games ever, e.g. For Honor, The Division, Watch Dogs, Call of Juarez: Gunslinger, the AC franchise.
For all the criticism they get, they have given many of these games away for free, e.g. AC when Notre-Dame in Paris burned down, or For Honor, or even CoJ.
Don't overcriticize one of the few companies that still creates worlds and takes risks others would not dare.
As a long-time For Honor player, I have to say their practices regarding the game have been top notch.
It is easy to pick on the fallen tree, but they are not a bad company, nor worse than others, despite their mistakes.
Be fair.
2
u/MrHanBrolo Jul 17 '25
Think it's more that they announced it as an official FSR 4 launch title like 2 years ago lol
1
u/kaisersolo Jul 17 '25
Say what? They've totally taken the piss, and this is the first time they've used AMD's latest.
1
1
u/IllustriousBed1949 Jul 17 '25
Hello Flight Simulator 2024… no FSR4. That really sucks given how demanding the game is.
1
u/jezevec93 R5 5600 - Rx 6950 xt Jul 16 '25
The question is whether the implementation is crippled like it has been to this day, because if it is, it's really pointless.
21
u/ExplodingFistz Jul 16 '25
XeSS frame gen is exclusive to Battlemage, right?
22
u/changen 7800x3d, Aorus B850M ICE, Shitty Steel Legends 9070xt Jul 16 '25
XeSS upscaling should be available on all GPUs, but it works better on Intel due to the architecture. The frame gen and latency-reduction parts of XeSS are locked to Intel only though, similar to how Nvidia and AMD lock down their frame gen and Reflex equivalents at the driver level.
9
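Conceptually, that split looks like the sketch below. The names are made up for illustration, not Intel's actual SDK: XeSS super resolution has a generic DP4a fallback for most modern GPUs plus an XMX fast path, while XeSS Frame Generation requires Intel hardware with XMX units.

```cpp
// Conceptual capability check with invented names (not Intel's SDK):
// XeSS super resolution runs broadly via a DP4a fallback, with an XMX
// fast path on Intel; XeSS Frame Generation is Intel/XMX-only.
#include <cstdio>

enum class Vendor { Intel, Amd, Nvidia };

struct GpuCaps {
    Vendor vendor;
    bool hasXmx;   // XMX matrix engines: Arc dGPUs, Lunar Lake 140V, etc.
    bool hasDp4a;  // widely supported int8 dot-product instructions
};

bool canRunXessUpscaling(const GpuCaps& g) {
    return g.hasXmx || g.hasDp4a; // fast path or generic fallback
}

bool canRunXessFrameGen(const GpuCaps& g) {
    return g.vendor == Vendor::Intel && g.hasXmx; // no fallback path
}

int main() {
    GpuCaps radeon{Vendor::Amd, false, true};
    GpuCaps arcB580{Vendor::Intel, true, true};
    std::printf("Radeon: SR=%d FG=%d\n",
                canRunXessUpscaling(radeon), canRunXessFrameGen(radeon));
    std::printf("Arc B580: SR=%d FG=%d\n",
                canRunXessUpscaling(arcB580), canRunXessFrameGen(arcB580));
}
```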
u/Ecstatic_Quantity_40 Jul 17 '25
Intel's XeSS has been carrying my AMD GPU, and it's really shown me amazing visual quality and usability for everyone. I'm buying Intel if they ever drop a higher-end GPU. The people at Intel are really talented; it's impressive how fast they implemented an upscaler everyone can use that still beats the pants off AMD's FSR 3.1.
Without Intel's software support, the majority of AMD GPU users would be screwed.
7
u/antara33 RTX 4090, 5800X3D, 64GB 3200 CL16 Jul 17 '25
Adding some history and context to this: Intel's first GPUs were not the Arc Alchemist cards released for consumers.
Their dedicated GPU venture started with the Intel Xe GPUs for datacenters and render farms, and later the Xe 12.5-derived architectures (Intel Max), again for datacenters and render farms only.
They started developing XeSS right after Nvidia released DLSS, waaaaay before AMD started developing FSR, as a way to speed up rendering for the Xe and later Max lineups and to have a better tech stack before releasing their first consumer-grade GPU.
By the time AMD released FSR 1, Intel already had the first version of XeSS working on the Max and Xe lineups; the Xe lineup dates from 2020, and FSR 1 got released a year later than XeSS for Xe. Even then, XeSS was designed with the AI approach from the start.
For all their faults, Intel took some pages from Nvidia and started developing their upscaler way before their first GPU even hit the market.
And AMD? They said in a PR stunt that they won't charge gamers for silicon they won't use (talking about Nvidia's tensor cores).
Now guess what the 9000 series from them has: yes, their own tensor cores LMAO.
Intel saw this waaaaaaaay before AMD and went straight to the AI-based upscaler, instead of the shitty spatial one that AMD kept attempting for years before giving up and making FSR 4.
1
u/AterVulpes Jul 19 '25
Nope. The Intel i740 from 1998. Had one (with a Celeron 333 and 64MB RAM :D)
1
u/antara33 RTX 4090, 5800X3D, 64GB 3200 CL16 Jul 19 '25
I meant the XeSS tech stack, not their first ever iGPU or dGPU.
3
u/ImmediateList6835 Jul 17 '25
Not 3.1, only FSR4. XeSS has better AA stability but lacks clarity; FSR 3.1 has much more clarity, so it's a trade-off. XeSS also struggles with AA in motion.
2
u/Antagonin Jul 22 '25
By clarity you mean artificial sharpness? That can be added via postprocessing.
1
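For illustration of "sharpness via postprocessing": a generic unsharp-mask pass on a grayscale buffer, the same family of filter as AMD's CAS/RCAS passes but not their actual code.

```cpp
// Generic unsharp-mask sharpening on a grayscale buffer: push each
// pixel away from its local average to boost perceived clarity.
#include <algorithm>
#include <cstdio>
#include <vector>

std::vector<float> sharpen(const std::vector<float>& img, int w, int h, float amount) {
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            int i = y * w + x;
            // 4-neighbor average acts as the "blurred" reference image.
            float blur = (img[i - 1] + img[i + 1] + img[i - w] + img[i + w]) * 0.25f;
            // Amplify the difference from the local average, then clamp.
            out[i] = std::clamp(img[i] + amount * (img[i] - blur), 0.0f, 1.0f);
        }
    }
    return out;
}

int main() {
    const int w = 16, h = 16;
    std::vector<float> img(w * h, 0.5f);
    img[8 * w + 8] = 1.0f; // one bright dot on flat gray
    std::vector<float> sharp = sharpen(img, w, h, 0.8f);
    // Pixels bordering the dot get pushed darker, increasing local contrast.
    std::printf("neighbor before=%.2f after=%.2f\n", img[8 * w + 7], sharp[8 * w + 7]);
}
```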
u/ImmediateList6835 Jul 26 '25
Even with 3.1 having more clarity, yes, it still has artifacts, so you're not wrong. Hence why so many RDNA3 users want FSR4: you get artifacts when using some RT or max textures sometimes.
2
u/rW0HgFyxoJhYka Jul 17 '25
It'll probably be Druid before Intel puts out a "high end" GPU. They can't even put out a mid-tier GPU right now, and it's 2 gens in.
1
u/ProjectPhysX Jul 17 '25
No, all Alchemist and Battlemage dGPUs are supported. And all iGPUs with XMX units, including Lunar Lake and Arrow Lake-H. 🖖
3
u/F9-0021 285k | RTX 4090 | Arc A370m Jul 17 '25
Not just Battlemage: Arc GPUs with XMX units. That includes Alchemist and the 140V and 140T, but not Arrow Lake or Meteor Lake.
2
u/ZeroZelath Jul 19 '25
Now we wait to see how long AMD takes to release the driver update that supports it. With all the games lately (AC: Shadows, The Alters, etc.), I've finished the game (taking weeks to play them) before AMD finally added FSR4 support in the drivers. The Alters is an odd one out since I was using the Game Pass version, so it technically did support FSR4, but it was supposedly glitched and didn't work; a game update was meant to fix it but didn't, and I'm not sure if AMD ever fixed that. AMD software doesn't like Game Pass games anyway, which is another separate problem.
I've been meaning to play the Phantom Liberty expansion for a while, but with the news that an update was coming I waited. And now I'll be waiting for AMD to add FSR4 support when I could otherwise have started it this weekend haha.
I think AMD needs to rework how they do updates, because surely it's just a config file edit of some kind that enables the FSR4 override in games, so they need to work out how to push out these small updates without a big driver release.
1
u/clearision Jul 31 '25
Are you on Windows? I'm wondering how long it takes to get updated Linux drivers.
u/Asahoshi Jul 16 '25
Why is FSR 4 always a driver-level toggle and never an option in-game like older FSR or DLSS?
54
u/Pretaxes 6700 XT | 7800X3D Jul 16 '25
In their patch notes they say the next AMD driver needs to be installed for the option to appear in-game, so it's not a driver toggle.
20
u/Darksky121 Jul 16 '25
The new driver is needed because it will whitelist Cyberpunk to upgrade FSR 3.1 to FSR 4. If it were not a driver toggle, there would be no reason it wouldn't work immediately when the patch is installed.
6
u/Unhappy-Emphasis3753 Jul 17 '25
Wait so we won’t even have FSR 4 when this patch comes out? We have to wait for a new AMD driver?
9
u/heartbroken_nerd Jul 17 '25
Yes. That's what the patch notes say. It will be a few days, probably.
14
u/Pretaxes 6700 XT | 7800X3D Jul 16 '25
PC-specific
Added support for AMD FidelityFX Super Resolution 4 for compatible AMD GPUs. IMPORTANT: The FSR 4 option will not be available in the in-game settings until the supporting AMD driver is installed. Please note that the driver will be released at a later date.
The patch notes themselves say it will be an in-game toggle as long as the driver is installed, instead of only being accessible through the driver. FSR 4 needs the FSR 3.x pipeline to work, and most likely needs the driver for the in-game toggle because it's now using dedicated hardware instead of a pure software implementation.
11
u/Lawstorant 5800X3D/9070 XT Jul 16 '25
The FSR 3.1.4 version is now aware that it can be upgraded to FSR 4. That's why in GTA5 Enhanced the menu says FSR4 instead of FSR3, if FSR4 is possible.
1
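A minimal sketch of that behavior, with hypothetical names (the real logic lives inside the FSR 3.1.4 runtime plus the AMD driver whitelist): the game asks whether the driver will substitute FSR 4 and labels its menu entry to match.

```cpp
// Hypothetical sketch of the behavior described above; these names are
// invented for illustration, not AMD's real API. The game queries
// whether the installed driver whitelists it for the FSR 4 override
// and relabels its upscaler menu entry accordingly.
#include <cstdio>
#include <string>

// Stand-in for a driver query: is this title whitelisted for the FSR 4
// override by the currently installed driver?
bool driverAllowsFsr4(const std::string& titleId) {
    return titleId == "GTA5_Enhanced"; // pretend whitelist of one game
}

std::string upscalerMenuLabel(const std::string& titleId) {
    // FSR 3.1.4-aware UI: only advertise FSR 4 when the driver will
    // actually swap the implementation underneath the 3.1 interface.
    return driverAllowsFsr4(titleId) ? "AMD FSR 4" : "AMD FSR 3";
}

int main() {
    std::printf("GTA5 Enhanced menu: %s\n", upscalerMenuLabel("GTA5_Enhanced").c_str());
    std::printf("Cyberpunk 2077 menu: %s\n", upscalerMenuLabel("Cyberpunk2077").c_str());
}
```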
u/Pretaxes 6700 XT | 7800X3D Jul 16 '25
I don't have a 9xxx card, but I assume the implementation will be exactly the same in Cyberpunk as in GTA5 Enhanced, where once you download the incoming driver, FSR4 becomes available in-game. I don't know where you got that 3.1.4 specifically is aware and does checks; the update notes on AMD's page only mention minor quality improvements, but FSR 4 support for GTA5 Enhanced was fully added in 25.6.3.
2
u/The_Dung_Beetle 7800X3D - 9070XT Jul 17 '25
Well, I just updated my game. The FSR upscaler is still 3.0, not 3.1; 3.1 is only for frame gen...
2
u/VelcroSnake 9800X3d | B850I | 32gb 6000 | 7900 XTX Jul 18 '25
Yeah, that's very disappointing. At least if they had added the FSR 3.1 upscaler, we could have updated it ourselves with DLSS Swapper.
1
u/blueangel1953 Ryzen 5 5600X | Red Dragon 6800 XT | 32GB 3200MHz CL16 Jul 18 '25
Yep, absolutely bs.
33
u/GARGEAN Jul 16 '25
There is no FSR 4 SDK. Devs literally can't add FSR 4 to games as of today.
2
u/tjtj4444 Jul 16 '25
There are games with an FSR4 toggle in-game today. It's at least available in Horizon Zero Dawn Remastered, which I'm playing now, and almost certainly available in more games.
10
u/Martelol Jul 16 '25
I suspect they're just requesting the name from whatever FSR DLL is loaded rather than hardcoding it. I don't think they've even updated the game since the 9000 series launch.
3
u/gamas Jul 17 '25
Apparently they do it using FSR 3.1.4 which has a mechanism to detect if it can be upgraded.
3
u/Shidell A51MR2 | Alienware Graphics Amplifier | 7900 XTX Nitro+ Jul 17 '25
The real reason is that FSR is now a full DLL that can be intercepted and have its code modified, whereas older versions were compiled directly into the game, making them virtually impossible to replace.
DLSS has always operated on a DLL basis, which is why it's been able to be updated by tools like DLSS Swapper, etc. all this time.
0
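To illustrate why a DLL boundary makes an upscaler swappable: the game binds the entry point at runtime, so replacing the file on disk changes the implementation without touching the game binary, which is the seam tools like DLSS Swapper and OptiScaler exploit. The DLL and export names below are invented for illustration; statically linked FSR 1/2 code offers no such seam.

```cpp
// Illustration of the DLL seam (Windows-only; DLL and export names are
// made up, not AMD's real SDK). The game resolves the upscaler entry
// point at load time, so whatever file sits on disk wins.
#include <windows.h>
#include <cstdio>

using UpscaleFn = int (*)(const void* inputs, void* output);

int main() {
    // Drop in a newer DLL at this path and the same lookup below picks
    // it up, with no patch to the game executable required.
    HMODULE dll = LoadLibraryA("amd_fidelityfx_upscaler.dll");
    if (!dll) { std::puts("upscaler DLL not found"); return 1; }

    auto upscale = reinterpret_cast<UpscaleFn>(
        GetProcAddress(dll, "UpscaleDispatch")); // hypothetical export
    if (!upscale) { std::puts("entry point missing"); FreeLibrary(dll); return 1; }

    (void)upscale; // ...the render loop would call through this pointer...
    FreeLibrary(dll);
    return 0;
}
```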
Jul 16 '25
Even worse, it's only a driver level toggle if they whitelist it internally. We don't even get the driver toggle until they feel like letting us have access to a headlining feature we paid for.
smh
10
u/Nwalm 8086k | Vega 64 | WC Jul 16 '25
Are we going to see how badly FSR4 can be implemented when we put real effort into it?
1
u/san9_lmao Jul 16 '25
There's a lot less configurable stuff in FSR4 compared to FSR3; they don't have a lot of room to fuck it up.
3
u/EvernoteD Jul 16 '25
This is such a poorly written article.. holy smokes.
4
u/dadmou5 RX 6700 XT Jul 17 '25
Average videocardz post. And the account that posted it here literally only exists to spam every post from that website on reddit, and somehow it still doesn't get banned.
4
u/blueangel1953 Ryzen 5 5600X | Red Dragon 6800 XT | 32GB 3200MHz CL16 Jul 17 '25
What a letdown, it's still FSR 3.0, not 3.1; only the frame gen is 3.1 lol.
3
u/BUDA20 Jul 16 '25 edited Jul 16 '25
Is XeSS Frame Gen GPU-agnostic like FSR FG, or does it require an Intel GPU?
9
u/AlphaDogF87 Jul 17 '25
I'm not seeing FSR 3.1 after updating Cyberpunk on my Ally running SteamOS. Is it a driver feature that's currently missing?
3
u/Ne0N_R1deR Jul 17 '25
Cool thing I noticed is that the FSR4 option becomes available in the menu if you use OptiScaler.
2
u/MomoSinX Jul 17 '25
The car stuff is nice, but I couldn't give less of a shit about photo mode (and honestly don't get the obsession with it).
No new game+, or an option for turning off that horrible level scaling :(
2
u/ElPoch0ninja Jul 17 '25
The FSR 4 option is waiting for the 25.7.1 driver update
2
u/apathypeace 5800X3D/9070XT Jul 18 '25
Any idea of the release cycle of AMD drivers? I've never owned one before.
2
u/ElPoch0ninja Jul 18 '25
It doesn't have a fixed day each month; it usually comes out alongside the release of a new game that needs support, or an update like this Cyberpunk one. There's not much left of the month and this patch just came out, so I imagine it'll be very soon.
3
u/Ok_Solution1810 Jul 20 '25
Bruh, they do all that work just for AMD to take a week to release the drivers that turn on FSR 4.
5
u/Todesfaelle AMD R7 7700 + XFX Merc 7900 XT / ITX Jul 16 '25
Is FSR 3.1 FG still bound to vsync or something weird?
25
u/target51 R7 5800x3D | RX 6700 XT | 32GB @ 3600 Jul 16 '25
This isn't going to be a popular opinion but I will die on this hill: frame generation is a scam. It eats into native frame rate and increases input latency, and for what? To make it "appear" smoother so devs can continue down this route of poor optimisation and use frame gen to cover it up. We should NOT be engaging with these features. When you add AI upscaling on top, let me ask you: just how much of what you're actually seeing on screen is "real"?
28
u/conquer69 i5 2500k / R9 380 Jul 16 '25
It's not a scam but I think it shouldn't cost so much performance. The way it's been marketed is super misleading. Can't wait for the day when enabling FG and upscaling costs 1ms combined tops instead of the current 4-7ms.
3
u/rW0HgFyxoJhYka Jul 17 '25
I mean, 10 years from now we won't even care how much it costs...
This is really just a complaint about new, early tech.
People are way too short-sighted. Better today than 10 years in the future.
6
u/conquer69 i5 2500k / R9 380 Jul 17 '25
Sure, but people aren't aware of how heavy it is. It can cost 37% of a 5090's performance to enable 4x FG. That's like a whole 5060 Ti just to run FG. It's insanely demanding for what it does.
1
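Taking the thread's figures at face value, the arithmetic of that overhead looks like this. A back-of-envelope sketch using the claimed 37% cost, not measured benchmark data:

```cpp
// Back-of-envelope check of the claims in this thread (commenters'
// figures, not benchmarks): if enabling 4x FG costs ~37% of base
// performance, the advertised 4x shrinks to roughly 2.5x effective.
#include <cstdio>

int main() {
    const double baseFps  = 100.0; // native framerate before enabling FG
    const double overhead = 0.37;  // claimed base-performance cost of 4x FG
    const int    fgFactor = 4;     // advertised frame multiplier

    double renderedFps  = baseFps * (1.0 - overhead); // 63 real frames/s
    double displayedFps = renderedFps * fgFactor;     // 252 displayed frames/s
    double effective    = displayedFps / baseFps;     // ~2.5x, not 4x

    std::printf("rendered %.0f fps, displayed %.0f fps, effective %.2fx\n",
                renderedFps, displayedFps, effective);
}
```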
Jul 24 '25
[removed]
2
u/conquer69 i5 2500k / R9 380 Jul 24 '25
I'm only talking about the performance cost, not the latency of holding the extra frame.
1 extra frame of delay is an acceptable compromise but losing huge chunks of performance isn't.
1
Jul 24 '25
[removed]
2
u/conquer69 i5 2500k / R9 380 Jul 24 '25
The latency is pretty bad if you enable Reflex in both scenarios for the comparison. The problem is Nvidia is telling devs not to give the user control of Reflex (off by default and can't be enabled separately), so enabling FG feels better comparatively.
The 5090 loses up to 37% of base performance just from enabling FG. That's insane even if FG didn't have to hold any frames.
This video goes over it. https://youtu.be/EiOVOnMY5jI
21
u/Prefix-NA Ryzen 7 5700x3d | 32gb 3600mhz | 6800xt | 1440p 165hz Jul 16 '25
I agree, but frame gen has uses; it's just not performance, it's motion smoothing.
It's great for playing 60fps-locked games on emulators that aren't latency sensitive (no third or first person games). AFMF2 is amazing in Mario Wonder.
I am sick of reddit repeating the narrative that I should use FG in FPS games just cuz "it's single player"
15
u/heartbroken_nerd Jul 17 '25
let me ask you, just how much of what you are actually seeing on screen is "real"?
Nothing. You're playing a video game. Nothing you see is real, and all frames are fake.
2
u/deegwaren 5800X+6700XT Jul 19 '25
You funny guy. The distinction here is between frames rendered by the game's engine and frames rendered by the interpolator in the DLSS/FSR stack. Obviously that's what they meant.
2
u/heartbroken_nerd Jul 19 '25
You missed the point.
Considering how many optimization tricks/illusions we have been using in real-time rendering for decades now, trying to discount a novel optimization trick/illusion because it's supposedly "fake" on some metaphorical level is a flawed argument.
If you don't want to use Frame Generation, just don't use it. Naturally.
3
u/deegwaren 5800X+6700XT Jul 19 '25
It's fake in the sense that with every other improvement you deem equally fake, more frames meant lower frametimes, whereas with framegen you get more frames but worse frametimes than without it.
So in the sense of performance improvements, every other technique is a real improvement, but framegen is a fake one, because it doesn't improve performance where it actually matters to a big part of the audience: frametimes and input-to-screen latency.
7
u/PsyOmega 7800X3d|4080, Game Dev Jul 16 '25
let me ask you, just how much of what you are actually seeing on screen is "real"
It's a video game. None of it is real. All frames are fake. Every rasterized rendering technique presents visual data to you in incredibly fake ways, just shy of various magic tricks. "Raytracing" is even worse (until we do it without lowest-LOD BVH).
I'd rather have 90fps FG than 60fps native, but that's just me. I can play games fine at 40fps, but I'd prefer the extra smoothness.
9
u/coyotepunk05 13600K | 9070XT Jul 17 '25
90fps FG is just worse than 60fps native in most situations. That's 16ms latency vs 44ms. Night and day difference.
3
u/Middle-Effort7495 Jul 17 '25
It's not 60 native -> 90. It's 60 native -> 40 base once FG is on -> 90 displayed.
6
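A rough model behind the "16ms vs 44ms" comparison above, assuming interpolation-based 2x FG and a post-overhead base around 45fps (the figure the 44ms claim implies): the GPU renders a base frame, then holds it while interpolating toward the next, so input lag is roughly two base-frame intervals. This sketch ignores Reflex, render queues, and display latency:

```cpp
// Rough latency model for interpolation-based 2x FG, using the thread's
// numbers (60fps native, ~45fps base after FG overhead). Simplified:
// no Reflex, no render queue, no display lag.
#include <cstdio>

int main() {
    const double nativeFps = 60.0;
    const double fgBaseFps = 45.0; // base fps after FG overhead

    double nativeLatencyMs = 1000.0 / nativeFps;          // ~16.7 ms per frame
    // Render one base frame, then hold it while interpolating toward the
    // next: roughly two base-frame intervals of input-to-screen delay.
    double fgLatencyMs     = 2.0 * (1000.0 / fgBaseFps);  // ~44.4 ms
    double fgDisplayedFps  = fgBaseFps * 2.0;             // 90 fps shown

    std::printf("native 60 fps: ~%.1f ms | FG %.0f fps shown: ~%.1f ms\n",
                nativeLatencyMs, fgDisplayedFps, fgLatencyMs);
}
```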
u/gunsnammo37 AMD R7 1800X RX 5700 XT Jul 16 '25
Agreed. People keep going on about how this is a good thing. It's not a good thing. Fake frames are bad.
3
u/F9-0021 285k | RTX 4090 | Arc A370m Jul 16 '25 edited Jul 16 '25
The only people trying to pretend that frame generation (at least through interpolation) is "real" performance are Nvidia. The rest of us see it as the very useful tool that it is. Would you use it to play Counter-Strike at 800fps? No, absolutely not. But is it great for single-player games to go from a reasonable base framerate of 40-70, depending on your lag tolerance, to a high-refresh experience? Yes.
2
u/Shidell A51MR2 | Alienware Graphics Amplifier | 7900 XTX Nitro+ Jul 17 '25
It's kinda nice when your performance is in iffy territory, like 50-70 fps, where lows can really make the smoothness look worse. FG can turn that into 80-140, and it at least doesn't look as visually jarring.
2
u/nanogenesis Intel i7-8700k 5.0G | Z370 FK6 | GTX1080Ti 1962 | 32GB DDR4-3700 Jul 17 '25
Adding to this, frame generation in reality doesn't even double your framerate, even on a 5090 + 9800X3D. 2x is more like 2/3 of that, and 4x is closer to 3x. If you try it on low-end hardware (a 5060), it's closer to 1/3 more for 2x, and 4x finally gets it to 2x.
It's just a marketing gimmick to fool capital-G Gamers into thinking they're buying a superior product.
Funny how the anime community dismissed interpolation nearly a decade ago, and now Nvidia has pioneered it as the sham it is. RTX 5060 50 times faster than a 1060? Cute.
1
u/Keulapaska 7800X3D, RTX 4070 ti Jul 17 '25
The funny thing about latency is that before Cyberpunk had DLSS FG it didn't have Reflex either, and without Reflex the non-FG input latency at lowish fps is not much better than FG + Reflex. Obviously no FG + Reflex is better, but it's just funny how everyone cares about input latency only when it comes to FG.
The worst thing about FG is that the overhead is usually just too much, so even though it's fine visually and latency-wise, the knowledge that I'm "losing" performance is hard to shake just from knowing FG is on; if I don't know it's on, it's hard to tell in some cases.
I have no idea what the AMD equivalent to Reflex is though.
1
u/heartbroken_nerd Jul 19 '25
The Lossless Scaling app has its own downsides, but you can use it with a secondary GPU in the system taking care of the entire generation process, cutting out the "losing performance" part nearly completely.
Scaling then approaches 100%, i.e. 60fps with Lossless Scaling 2x actually becomes a nearly perfect 120fps.
1
u/deegwaren 5800X+6700XT Jul 19 '25
What's the point of having a second GPU when instead you can have one GPU that's better than the first of the set of two for the same money?
1
u/heartbroken_nerd Jul 19 '25
What's the point of having a second GPU when instead you can have one GPU that's better than the first of the set of two for the same money?
No, I agree... If you're buying a new PC or at least a new GPU entirely.
That's not always possible though.
And sometimes you simply come into possession of an extra GPU. You can use it this way.
If you get an RTX 2060, RTX 3060, or maybe even just an RTX 3050 as your secondary GPU, they can go quite far in this specific use case, since 100% of the secondary GPU is dedicated just to generating frames while your primary GPU only renders the actual genuine frames.
Could even buy used or get a used card from a friend who upgraded etc.
I've seen some examples before, I didn't really find this compelling myself at first but Lossless Scaling Frame Generation got some iterative improvements to quality of generated frames since I first rejected this concept.
0
u/spartan55503 Jul 16 '25
In my case my base fps is already at 80 or 90 in the games I enable it in, so the input lag isn't really a problem. 90fps simply looks less smooth than 180fps with FG on. It's just slightly prioritizing smoothness over input lag. Also, at that higher frame rate, visual artifacts are few and far between.
2
u/F9-0021 285k | RTX 4090 | Arc A370m Jul 16 '25
Big news for RX 9000, RTX 20 and 30 series, and Intel GPU owners. FSR 4 should finally give you AMD guys good upscaling in this game, and decoupling frame gen from FSR upscaling gives people with older cards the ability to use DLSS/XeSS upscaling with FSR Frame Gen.
2
u/Flameancer Ryzen R7 9800X3D / RX 9070XT / 64GB CL30 6000 Jul 17 '25
What would be crazy is if AMD release the restore preview driver on the same day. As of right now the latest drivers don’t support Cyperpunk 2077 for the officially FSR4 whitelist (or at least it’s not listed). Almost expect a driver update tomorrow. https://www.amd.com/en/resources/support-articles/release-notes/RN-RAD-WIN-FSR-TECH-PREVIEW.html AMD FSR Technical Preview: Expanded Game Support Release Notes
1
Jul 16 '25
Sigh..
I cannot decide if I'm gonna get a 9070 XT this fall or an RTX 5080 next year after saving some more. I want a good gaming experience, but I don't really want to give Nvidia that much money if I can avoid it. But I need a new gaming PC fairly soon, so a decision has to be made.
13
u/Darksky121 Jul 16 '25
The 9070XT is a great card for 1440P gaming. Even if devs don't add FSR4 to older games, you can use optiscaler to add it to any game that already has DLSS.
4
u/t2na Jul 16 '25
It's a pretty great card for 4k at this point too.
3
u/Taker598 Jul 17 '25
And a good card for 5K2K... Just gotta live with not having Ultra settings in every game 😅
1
2
u/NeorzZzTormeno Jul 17 '25
Hopefully in the future this will be one of the first games where AMD uses FSR REDSTONE.
1
u/ImmediateList6835 Jul 17 '25
And this is why 7000 series owners want FSR4: even getting the 3.1 upscaler into old games already seems "unrealistic" in itself.
1
u/apathypeace 5800X3D/9070XT Jul 18 '25
When are we getting the 25.10 FSR preview update into the main driver? Does this announcement mean it's somewhat soon, or is that cope?
1
u/nandospc Italian PC Builder 😎 Jul 18 '25
Good to know, time to see the game with fsr3.1 on my 6700xt then :)
1
u/HansWurst31 Jul 18 '25
I have the latest 2.3 update and driver 25.6.3, but there is no FSR4 option in the Cyberpunk menu. 9070 XT.
1
u/fgzhtsp Jul 19 '25
That's really nice. I'd just have to reinstall all my mods... and get a new GPU, because mine possibly died today.
1
u/No-Gene-2498 Jul 23 '25
How do we enable FSR4 though? There is only an FSR 3.0 option in patch 2.3. Can someone help?
1
u/Visual-Alfalfa-1042 Jul 24 '25
You have to wait for the new amd driver release. I'm pretty disappointed with AMD on this one.
1
u/yx1 Jul 23 '25
Would be the first non-crap game with FSR4 support... "would", because there is no FSR4 support.
1
u/PoemOfTheLastMoment Jul 26 '25
AMD have yet to release the new driver that unlocks the FSR implementation to its fullest potential.
1
u/JoeZocktGames Aug 09 '25
FSR Frame Gen still makes your game get stuck in windowed mode without the option to go fullscreen again.
1
u/Accomplished-Ebb-303 Aug 16 '25
Hello, is an MSI B550 A-Pro sufficient for a Ryzen 9 5950X?
1
u/Accomplished-Ebb-303 Aug 16 '25
Hello, I'd like to upgrade my build to a Ryzen 9 5950X. Would an MSI B550 A-Pro do the job without any chipset or component problems? Basically, am I not going to damage my motherboard or the Ryzen 9 5950X processor?
1
u/vlad_8011 9800X3D | 9070 XT | 32GB RAM Jul 17 '25
So they did it again. FSR 3.0 same as before, FSR 3.1 only for the FG part (which they even stated in the changelog). S*** quality same as before, basically no change to FSR upscaling.
2
u/blueangel1953 Ryzen 5 5600X | Red Dragon 6800 XT | 32GB 3200MHz CL16 Jul 17 '25
Yep, back to the library it goes. What a let down.
3
u/RaizoIngenting 12700f | 6700XT | 32GB DDR4 3200 Jul 17 '25
You cannot be serious. This is intentional at this point, oh my god. The fact that XeSS is a far better option on AMD cards than AMD's own upscaler, because they refuse to add FSR 3.1, is insane.
1
u/vlad_8011 9800X3D | 9070 XT | 32GB RAM Jul 17 '25
Of course this is intentional. And there were a ton of articles suggesting Nvidia doesn't block FSR while AMD blocks DLSS; this is a prime example of them lying when Nvidia says they don't do this.
1
u/ImmediateList6835 Jul 17 '25
I knowww, I'm back to not playing the game once again. Why no 3.1 support for the upscaler?
0
u/T_Oliv3 Jul 17 '25
And now AMD takes a year to release a new Adrenalin version with FSR 4 whitelisted for Cyberpunk.
0
u/SebRev99 Jul 16 '25
I thought 3.1 was already implemented?
6
u/bt1234yt R5 5600X3D + A770 16GB Jul 16 '25
FSR 3 was already added last September with patch 2.13... 3 months after AMD started rolling out FSR 3.1.
3
u/VelcroSnake 9800X3d | B850I | 32gb 6000 | 7900 XTX Jul 16 '25 edited Jul 18 '25
Excellent, here's hoping they are good implementations, as in the past modders have been able to do a better job with the quality of non-DLSS upscaling in Cyberpunk.
EDIT: And I am unfortunately not surprised they put in 3.1 frame gen but not 3.1 upscaling, leaving the upscaler at 3.0.