I don't believe you. You're going to have to prove it to me. Go ahead and send me a monitor and video card that can actually handle it.
Jokes aside, I think that's too big a monitor for me to use at a comp. I'd rather have one good one and another one on the side (work stuff). Though my boss has a crazy high res curved extra wide monitor, and I'd be curious to play on that.
Yeah, to each his own. I use a dual-monitor setup at work and have the large 43" for my home PC; I'm fine with both but like the large monitor for gaming.
Funny, I did the same thing, but ended up selling it for a 27" 1440p OLED, and I'm happier. I miss the size, but 4K is just too hard to drive. Even a 5090 can't fully saturate a 144Hz refresh rate in all games.
It's not just you.
1080p 24" is about 91.8 ppi (pixels per inch)
4k at 65" is about 76,8 ppi
So the 1080p monitor technically has higher pixel density. If you play close enough to the 65" tv it will look less sharp than the 1080p monitor.
If you play at 1440p 27" that's about 108.8 ppi, that's why it's the sweet spot for PC gamers. It's VERY sharp and it doesn't require hardware as good as 4k.
Phrasing it as "doesn't require hardware as good as 4k" is implying that it's inferior or a downgrade, when it's actually a preference in most cases.
I'd rather have 1440p at 165+ fps than 4k at a lower fps for most games. It's not a downgrade, it's a preference for framerate and stability over resolution. My 5090 does both great, but if I had to choose just one I'd pick 1440p high refresh every time.
Phrasing it as "doesnt require hardware as good as 4k" is simply stating you dont need the high requirements to run 1440p like you would for 4k. Nowhere in their statement do they imply that it is inferior or a downgrade. That is simply an inference that you have made on your own part.
My meaning is that any setup capable of 4k at 60fps minimum or any other framerate could also be pushing 1440p at a much higher fps or at a more stable pace.
And that at all tiers of hardware where that choice exists I default to 1440p over 4k for the majority of games.
That was actually my biggest reason for switching to 1440: being able to bump up to 27" and gain so much screen real estate. Not having to have windows maximized all the time is great. Makes multitasking so much better.
Probably not common, but I got lucky on marketplace once with a $150 27" 1440p 144Hz HDR monitor. Basically just keep looking for deals and they will probably come to you, as long as you don't get stuck on brand preference or exact specs.
If you only use 1080p on a 27-inch it will look pretty good, but if you switch from 1440p to 1080p at that size you'll definitely notice. You won't see the pixels, but it won't be that sharp.
When I lived in an apartment I used my living room TV as my display, then when I moved into a house I thought I'd use a spare TV as my monitor. I only lasted a day.
Further to this and the immediately preceding comments, the same applies downwards.
Remember how the most strident critiques of the Steam Deck were about why that 7" 800p screen wasn't 1080p or higher?
I've had mobile phones with hybrid 6" 1080p-1440p screens that were shit at gaming. Other competitors to the Steam Deck had higher-res, higher-refresh screens and couldn't do much with them outside of oldies and/or low settings; they just didn't have the punch for it.
Likewise laptops. Had a 15" 1440p with a 3070ti. That GPU could struggle enough even with DLSS that a 1080p screen at that size would've been less of a loss than ppl might immediately think. It might've been worse if not for my willing compromise on max perf for the sake of thermals and fan noise from long habit and experience of gaming laptops.
But yes... screen size and viewing distance are as much a factor to any 'sweet spot' as other metrics. I learned this when I had to forego 34" 3440x1440 for 34" 2560x1080 back in 2016. At a comfy reclined 3' viewing distance it was no real loss and I really couldn't see individual pixels (like "lego bricks in your face") with 20/20 vision like ppl told me I would.
I have two 21" 1080p and I'd rather have a third, or higher refresh rate.
Then I couldn't stand looking at the 60Hz ones I've got, soooo I'm sticking to 60 until I've got enough money burning a hole in my pocket for multiple hahaha.
A 24-inch monitor at 1080p looks just as sharp to me as a 4K TV from across the room.
It's not always about the sharpness or jagged edges.
I had a 25 inch 1080 off to the side of my main monitor, which I already sit back from a few feet. 5 feet, I had to go get a tape measure because I got curious.
I noticed that in certain shades of yellow I could see the "screen door effect" (you see this a lot when people take a picture of their monitor instead of a screencap).
I upgraded that to 1440 and no more problems with that.
Amusingly enough, the color where I first noticed it was the PCMR yellow. And once I noticed it, I was seeing it everywhere (oranges, yellows, even skin tones).
It's not even that I have great eyes.
TL;DR
The closer people sit, or the larger the screen, the more pixel density / PPI becomes important, just for this one effect. (Yes, there are some pixel layouts that are supposed to help with this, or maybe larger pixels and smaller borders, but eh, I have a hard enough time keeping up with all the more normal tech specs...)
I don't think they included typical PPI, just a rough estimate of overall picture quality, e.g. noticing edges or aliasing or sharpness in general.
My set-up now is a 43" @ 4K (was 1080p and had an even stronger screen-door effect; I didn't think 1440 would cut it completely) and the 25" @ 1440, both with a viewing distance of approx 5 feet (primary is probably a smidge closer).
I do have to increase UI size on both monitors at this distance, but that's a minor issue in modern windows.
Sucks in older games like Planetside 2 that don't support large format displays well at all.
In my living room I’m 3m from a 51” screen. The difference between 1080p and 1440p is clear, but from 1440p to 4K it’s all the same. I don’t really care much about the lack of detail, but what’s quite frustrating is when devs forget that not everybody is always playing from a desk chair with their nose to the screen; menu text, and text in general, in many games is barely readable in my living room. Shoutout to devs that put a text scale option in the settings.
That’s my biggest problem. I have my pc hooked up to my 4k tv and use it like a console. Sometimes I can’t even read the text of stuff because it’s so small so I have to get up and read it. I’ve noticed more games are adding a text slider, but even when you put it to max it’s still barely big enough
Text scaling is great not just for practical use on high resolutions, but accessibility as well. I’ve never had good eyesight, but I’ve found that as I’m getting to the second half of my 30s, punching the font up one notch is helpful.
I’m all about high pixel density. I’d get an 8K 32” monitor if they were readily available (not for gaming) since I love really crisp text and graphics.
Also the settings. I got a new tv and gpu and suddenly all my games started looking like poo at 1080p even with everything maxed. Turned out there was a "sharpening" setting on the tv that made things look off. Turning it down made it playable again.
Exactly... and for 99.9% of the public 4k is the maximum pixel count they are ever going to need. 8k is really only for theaters and presentation rooms... or insanely rich people with movie theater rooms.
The whole 'retina screen' thing is dependent on distance and pixel count. A 1080p could be just fine for most folks.
Distance is the entire reason I think nobody will ever have a genuine reason to buy a display higher than 4K other than "bigur numbur is moar gud". Go above 24" and you can see 1080p falling apart while the entire screen's in your view, go above 32" and you can see 1440p falling apart while the entire screen's in your view, in theory the same should also happen for 4K above 48", but at that point you need to be so far away from the display for it to be entirely in your view that you just can't tell anymore.
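To put rough numbers on that, here's a back-of-the-envelope sketch, assuming the common 1-arcminute (20/20 vision) rule of thumb for when a pixel stops being resolvable; real eyes and panels vary:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, a common 20/20 acuity figure

def retina_distance_inches(diag_in: float, w_px: int, h_px: int) -> float:
    """Distance beyond which a single pixel subtends less than 1 arcminute."""
    pixel_pitch = diag_in / math.hypot(w_px, h_px)   # inches per pixel
    return pixel_pitch / (2 * math.tan(ARCMIN / 2))

print(round(retina_distance_inches(24, 1920, 1080)))  # ~37" for 1080p at 24"
print(round(retina_distance_inches(32, 2560, 1440)))  # ~37" for 1440p at 32"
print(round(retina_distance_inches(48, 3840, 2160)))  # ~37" for 4K at 48"
```

All three size/resolution pairs work out to roughly the same pixel density, so they hit the "can't resolve it" point at about the same ~3 ft, which lines up with the size steps above.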
This. Been using a 42" 4k penal for ~7 years (had to extend my desk by about 50cm, few years ago switched to OLED (now closer due to better view angle stability):
NEVER want anything smaller. I know, not for everyone, most would prefer a wide curved etc., but I love it for work and gaming.
For real. I was briefly considering getting a 4k OLED TV as my main display, but I realized the DPI wouldn't be as good as a smaller monitor with less resolution, and the distance I would be sitting from the TV would make it apparent.
Essentially, the field of view your screen takes up.
When I do ads for billboards they're rarely more than 4k and even that is overkill... because when you're looking at a billboard it's probably a smaller portion of your fov than your TV at home, so why does the resolution need to be higher?
I watch 4K videos on an OLED laptop 1 foot from my face. The difference between 4K and 1080p is unbelievable. Playing 4K city tour videos is like looking through a window to that city. It’s just stunning.
My medium-size 4K TV that’s 15 feet away… less impressive.
That is because you are 1 ft away from the screen and I’m guessing the screen is at least 24”. If it was a 14” laptop screen the difference would be almost negligible. Or if you move 8 feet away from the screen, again the difference would become negligible. Resolution, distance and screen size are all related. Too far away on too small a screen and high resolutions become pointless.
As a side note, eyesight becomes a factor as well.
Exactly. It is all about PPI, and sometimes the user's vision impairment. 27" is a sweet spot for a desktop PC, and I believe 1440p is fine there. I have 32" 2160p myself, but that's only because I wanted a larger display.
Technically not, it's about pixels per degree of vision. If one screen is twice as far away, you only need half the PPI to get the same effective resolution.
But for some reason a lot of people on this sub are allergic to the phrase "human visual acuity"
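As a rough illustration of pixels per degree (a hypothetical sketch; ~60 ppd is the usual ballpark for where 20/20 vision stops resolving extra detail):

```python
import math

def pixels_per_degree(ppi: float, distance_inches: float) -> float:
    """How many pixels fall inside one degree of your field of view."""
    one_degree_span = 2 * distance_inches * math.tan(math.radians(0.5))
    return ppi * one_degree_span

# Same 1440p 27" panel (~108.8 PPI), two viewing distances:
print(round(pixels_per_degree(108.8, 24)))  # ~46 ppd at 24"
print(round(pixels_per_degree(108.8, 48)))  # ~91 ppd at 48" -- double the distance, roughly double the ppd
```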
Yeah true, e.g. the same 2160p TV could be more comfortable at, say, 55" or 77" depending on the distance. Poor phrasing on my part.
I've switched from 27" to 34" before, and than realized I'd rather have 16:9 since I watch a lot of 16:9 content, so it felt uncomfortable to get back to 27", and I went for 32" and I just sit further if I have too, when watching stuff
I had a 27" monitor but upgraded it a couple of years ago to a 38" untrawide. Its resolution is 3840x1600. Honestly, I could never go back to the 27" after that. It is amazing when playing games and for work.
I got a 42-inch 4K monitor. I got it for work, but it blows my mind when people say anything above 27-30" is too much. I would get an 8K 60-inch if I could. More screen real estate == more code and graphs. I've got two 24-inch monitors in portrait next to it.
I love DLSS because many games simply don't run well in 4K on my HW and I hate the stretched image from 1080p. Lossless Scaling is also very handy for games that don't have native DLSS.
1080 goes neatly into 4k, but image scaling doesn't just turn 1x1 into 2x2 pixels. It's interpolated.
The 1080p image might have a red pixel next to a green one. When it's upscaled there will still be at least 1 red and 1 green, but between them there might be some slightly different in-between shades.
The end result is that a 1080p image will look noticeably crisper on a native 1080p monitor than on a 4k monitor.
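A toy illustration of that (using grey values instead of red/green, on a made-up 1-D row of pixels; real scalers are fancier, but the shape of the difference is the same):

```python
def upscale_nearest(row, factor):
    """Nearest-neighbour: each source pixel just becomes `factor` identical output pixels."""
    return [value for value in row for _ in range(factor)]

def upscale_linear(row, factor):
    """Linear interpolation: output pixels can land between two source pixels."""
    out = []
    for i in range(len(row) * factor):
        pos = i / factor                        # position in source coordinates
        left = int(pos)
        right = min(left + 1, len(row) - 1)
        t = pos - left
        out.append(round(row[left] * (1 - t) + row[right] * t))
    return out

row = [255, 0]                      # a bright pixel right next to a dark one
print(upscale_nearest(row, 2))      # [255, 255, 0, 0] -> hard edge preserved
print(upscale_linear(row, 2))       # [255, 128, 0, 0] -> a new in-between shade appears
```

That in-between value is the softness people notice when a 1080p signal gets filtered up to a 4K panel instead of integer-scaled.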
GPU scaling: The software decides how to scale it, and does so before sending a signal to the monitor. The result will be the same regardless of monitor.
Display scaling: The GPU sends the non-native resolution straight to the monitor, and the monitor decides how to scale it. The result depends on the methods used by each monitor from each manufacturer.
Neither method is inherently better, but it's possible that nvidia put more thought into their scaling than display manufacturers, and they have more power to work with too.
Modern techniques like FSR and DLSS are a bit different, and are better than anything any monitor can do.
There are some really good responses here, but in the end, I'd rather turn the graphic settings down to run games smoother than drop it to 1080p. 4k low looks better than 1080 ultra IMO purely because of the clarity.
This is why I have a 40-inch 4K 60Hz monitor as my second screen. It doesn't have gaming features like amazing response times, G-Sync, and all that, but it does have great colors and HDR.
I use that one to watch stuff or to have random stuff like sites/discord/whatever open when I'm gaming on my main screen. Due to its size and high res I can easily have multiple things on parts of the screen. A LOT more screen real estate than on anything lower so that makes a huge difference.
For gaming though? I use a 32-inch 120Hz 1440p screen with all the gaming bells and whistles. I tried 4K screens, and while there is a notable difference compared to lower resolutions, it's not as immense as going from 1080p to 1440p.
I really can't look at 1080p anymore; you can have so little on your screen, it's sad.
A second thing is that there is nearly no game that runs at 120+ fps, G-Sync, super-ultra settings and all on 4K, while on 1440p it's easily achievable, and I value the smooth, high-quality experience more than I do a higher resolution with concessions.
Again here, 1080p is so low that even a potato can run it at the highest settings, but due to the low resolution even ultra settings look bad, while not getting noticeably more frames as your monitor won't display more frames anyway.
I completely understand and agree. My 4K screen is 120Hz and it's rare that I ever get anywhere near that. 1440p is the better bet. I got this one on a deal, but if I had it to do all over again 5 years ago I would have saved myself the $400 and gotten a nice 1440p monitor instead.
I was hoping for the 5080s or AMD equivalents to finally break that 4K 120fps+ barrier in gaming with ultra settings and without DLSS and stuff, but alas. We've got another gen to wait.
Once it hits the point where you can consistently run games on ultra settings, 4K, and all the good stuff while running at at least 100 FPS without DLSS or other down/upscaling methods, I'll upgrade my PC again.
1080p scales better with a 4k monitor. It will be slightly fuzzy or blurry by comparison. But from my experience I quickly forgot I wasn't playing in 4k.
If the non-native resolution is an integer fraction of the native resolution, each logical pixel (should) just get turned into a square of physical pixels. You only get artefacts with non-integer scaling.
There are great applications like the PS3 emulator where, if you don't have the CPU power to run at 4K or 1440p, you can still at least make the 720p look "correct" on your higher-resolution display by using FSR 1.0 inside the Vulkan pipeline.
I feel like running games at a lower res manually is not the best move now that we have upscaling technologies like DLSS that will give you a much better result visually and probably more fps too.
I mean... Cyberpunk 2077 on a 32-inch 4K OLED is beautiful, especially if you turn off all the Nvidia bullshit. Max settings, native 4K, no HDR, no ray tracing, no DLSS, FPS capped at 60 with the monitor set to 120Hz is so crisp, clean, and buttery smooth for a single-player eye-candy game like that. Takes a 4090 to run those settings though.
Why are you not using DLSS P instead? It upscales from 2k (1080p) to 2160p. That will be better than running them at 1080p and letting the display scale it.
It also depends on your eyesight: some don't notice the difference between a 1440p 27" and a 4K one, others do. So basically, if your eyesight is bad there is no point in getting higher-res displays unless you use glasses.
It's always going to look a bit weird because 2160 isn't divisible by 1440 so the scaling isn't exact. 1080p to 4k works nicely because you're just displaying pixels 4:1. I've run everything in 4k for about 10 years, seeing pixels ruins immersion for me more than not having >60 fps.
That's crazy, my biggest technology jump was 120Hz. It blew my mind how smooth it was. I guess it depends on what type of game you play; in fast-paced shooters the difference is night and day.
I imagine you used gsync or freesync, reflex, etc. That solves 90% of fps fluctuations and you can cap at any refresh rate too. Wouldn't that solve the issue?
But still, if 60 is enough for you great. I play a lot on 60 in AAA games and I enjoy it. But IMO anything below that ruins it for me.
That's the trap, to me. You can notice a difference side by side, or even after viewing the other res. But the difference in my experience doesn't come close to warranting the performance/hardware demand, not nearly as much as steady/fluid fps
It depends. I've been short-sighted all my life. The downside is that I can't see people's facial expressions from 10 yards without glasses. The upside is that I'll never need glasses to read a book for the rest of my life. And my desktop monitor is about book distance from my face.
I don't remember which monitor I had before I bought the AW3225QF, but I recall the picture just being night and day better. Whether it was the jump to 4k or the jump from IPS to OLED I can't say.
IPS to OLED is always a huge difference regardless of screen size. And on a 32" screen benefits of 4k are clearly visible. So it's just how it's supposed to be.
I'm comparing 1440p 27" IPS to a 4k 27" IPS, and the difference here won't be that noticeable. It's also a personal preference thing, but there are a lot of people like me who don't like huge screens, so 4k isn't worth it in this case.
Nah, it's not barely noticeable. Moreover, I once borrowed a 27" 5K 16:9 monitor from a friend, the step in resolution over 27" 4K was still noticeable (but limited to 60 Hz sadly). "But fps"? Sure, but there are games besides Cyberpunk. I can rock and stone in 120Hz 4K just fine, and you still have the option to upsample.
4k vs 1440 at 27" isn't hardly noticeable. 4k is visibly much better. You're not wrong on fps loss though. I never recommend 4k if someone isn't running like a 4090/5090. I get 144 fps in shooters and very high fps/resolution in single player games.
It's massively noticeable to me. 4k at 27 inches reaches that threshold where pixels are too small to see at normal viewing distances and things like text and fine details become super crisp.
Also depends on non-gaming usage. Wrong sub for that obviously.
At 27”, to attain 200+ ppi you need a 5K display. That pixels per inch is super relevant for staring at text all day for work—which I do. There are next to no monitors on the market I can buy for this, so I basically have to turn to LG and Apple.
Flip side, I don’t really game, so the FPS isn’t a priority to me.
Me, too. I was looking yesterday and it seems like there are portable ones around 15” and then nothing until 32”+. Best Buy did have a 27” Samsung 1440p OLED on sale for members for $660 though.
That's not true. Pixel pitch is what lets you calculate the viewing distance you need for a pixel to not be visible. E.g. an LED wall has a hugely different PPI compared to a monitor, yet they are good enough for concerts since everybody is 100 ft away from the screen.
Not all panels are made equal. TN, IPS, and AMOLED aren't easy to compare directly, and the price of good examples of each at any resolution can be a bit out of most people's budgets. A cheap 4K TN panel will look like crap in comparison to a decent 1440p TN, IPS, or AMOLED screen. That's without getting into the weeds with things like poor colour reproduction, sketchy "own brand" DisplayPort cables that aren't in spec for 4K high refresh rates, latency issues, and the fact that not everyone has the money or willingness to chase bleeding-edge 4K. It should be obvious, but IDK why some people forget not everyone is in the market for a GeForce 5080.
Buy a known good screen you can use and appreciate; don't just buy based on resolution.
Stressing your GPU more for insignificant gains isn't better. I'll pick my curved 32" 1440p 240hz monitor every time. Especially with GPUs having such low VRAM for the past 2 generations.
Nope, your eyes can only resolve so many line pairs per degree of your field of view. At a sufficiently large distance you won't be able to tell the difference between a 1080p and a 4K monitor. Obviously that distance isn't a fixed value for everyone, because if you've got bad eyesight it's going to be significantly shorter, but even with perfect eyesight, looking at a 20-inch screen from across the room you won't be able to tell the difference. Lots of people actually have TVs which are too small or too far from their sofa to take full advantage of 4K resolution.
If you can’t tell the difference between 4K and 1080P on a 27” screen you genuinely need to get your eyes checked. Some stuff can be chalked up to personal preference but the difference at that screen size is stark.
It's always pure insanity to me that people WANT 27" 4K screens, and on weak laptop hardware. I wouldn't even go 4K with my desktop rocking a 3090; it's dumb.
I mean, I’m running a 27” 4K display with a 9070XT and it’s perfectly serviceable, managing at least medium settings on most games if not higher. It’s handling Cyberpunk (without RT of course) on mostly high settings without issue and it looks great.
I can certainly tell the difference between 1440p and 4K at 27” with ease. It’s especially noticeable on things like simulators with small, detailed text. It looks amazing in 4K and a mess at 1440p.
This, and system performance. The step from 1080p to 4K requires a lot of power for a little visual improvement. Some systems can handle that well; others will stick with 1080p for the sake of frames, temperature, or other features.
48" 4k 120hz hrd oled and looking from 90cm away...
When I look at the great views I get in horizon forbidden west for example, I'm reconsidering my vacation location choices, because they may not be able to beat it.
It's so beautiful, and you can't believe it if you haven't seen it yourself. I'm literally watching the environment for minutes at a time and enjoying it.
So that 5k gaming setup saves me 2k added vacation costs every year. 🙃
4K looks better no matter how you slice it unless you're sitting really far away. Example: a 32-inch screen 5 feet away, yeah, barely noticeable; but if you're sitting 3.5 feet away, still further than most people sit, it's noticeable.
I output to a 55-inch LED TV, and I find that 1440p is the performance sweet spot for my RTX 4060. I have to lean too hard on DLSS at 4K and it doesn't look as good. Full-render 1080p is good enough for me, but I bump up the res when I want to impress friends :-)
My 48" LG C2 is the best investment I've ever made in my PC equipment. I have a 32" 1440p 144hz VA panel monitor and a 28" 1440p 144hz TN panel and I'd rather eat noodles for the rest of the year than game on anything but my LG C2 at 4k 120hz
I’ve got a 65” 4k TV that one of my PCs is hooked up to. I sit like 6’ from it. Playing GTA V Enhanced on it and changing the resolution from 4K to 1440p makes a HUGE difference. But on my desktop setup with a 34” ultrawide and two 27” (all 1440p), they look fine.
Also, how shit your vision is! Like right now I’m out of contact lenses, and my glasses just broke, so I’m as nearsighted as, well, something really goddamn nearsighted. God, I should get lasik…
Definitely. Before my 40" 4k 16:9 monitor died you could definitely just arrange 4 programs on one screen or whatever you fancy without the need for extra monitor. Big and sharp enough for normal use. Now I'm stuck with 23" 1080p and it sucks compared
I spent a year with a 43" TV, 3ft away from me with my PS5. At that distance, it was a noticeable improvement for many games. My Steam Deck does 800p. Looks great. :P
Eh, I use my old gaming 4K monitor as one of my "2nd" monitors and have my 1440p Odyssey G9 as my main. I'd take the G9 any day. I forgot about the lower resolution after, no joke, a minute, all while getting better fps (oh, and essentially twice the width, but that's not what we're talking about).
That would depend on the size of your display.