I mean, it could… but if you run the math on a 4k vs an 8k monitor, you'll find that for most common monitor and TV sizes, and the distances you're sitting from them…
It basically doesn't make any perceptible difference.
Human eyes have… the equivalent of a maximum resolution: a maximum angular resolution.
You'd have to have literally superhuman vision to notice a difference in almost any setup that doesn't involve you owning a penthouse or mansion. It really only makes sense if you have a TV the size of an entire wall of a studio apartment, or use it for a Tokyo / Times Square style giant building-wall advertisement, or to completely replace projection theatres with gigantic active screens.
This graph doesn't have 8k on it, but basically, buying an 8k monitor that you use at a desk is completely pointless unless your face is less than a foot away from it, and it only makes sense for a TV in a living room if said TV is… like… 15+ feet wide and 7+ feet tall.
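A rough sketch of that math, assuming the commonly quoted ~1 arcminute of angular resolution for 20/20 vision; the display sizes below are just illustrative examples:

```python
import math

# Assumed acuity: 20/20 vision resolves roughly 1 arcminute per pixel.
ARCMIN = math.radians(1 / 60)

def pixel_pitch_mm(diagonal_in, horiz_px, aspect=(16, 9)):
    """Physical pixel pitch in mm for a 16:9 display of the given diagonal (inches)."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    return width_in * 25.4 / horiz_px

def max_useful_distance_m(diagonal_in, horiz_px):
    """Farthest viewing distance (m) at which one pixel still spans ~1 arcmin.
    Beyond this, the extra pixels are below the acuity limit."""
    pitch_mm = pixel_pitch_mm(diagonal_in, horiz_px)
    return (pitch_mm / 1000) / math.tan(ARCMIN)

for name, px in [("4k", 3840), ("8k", 7680)]:
    for size in (27, 65):
        d = max_useful_distance_m(size, px)
        print(f'{name} at {size}": pixels blur together beyond ~{d:.2f} m')
```

Under that assumption, a 27" 8k monitor only out-resolves your eye if you sit within roughly 0.27 m (about a foot) of it, which is the desk scenario described above.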
That graph is fascinating, thank you!
Yes. This. Resolution is already high enough for everything, except maybe wearables (i.e. VR goggles).
HDMI 2.1 can already handle 8k 10-bit color at 60Hz and 2.2 can do 240Hz.
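For what it's worth, a back-of-envelope check of the raw (uncompressed) bandwidth those modes imply; the 48 Gbps and 96 Gbps link rates are my assumed figures for HDMI 2.1 and 2.2, and in practice modes like these lean on DSC compression rather than raw throughput:

```python
# Rough bandwidth estimate, ignoring blanking intervals and link-encoding overhead.
def raw_gbps(width, height, bits_per_channel, fps, channels=3):
    return width * height * bits_per_channel * channels * fps / 1e9

print(f"8k 10-bit  60 Hz: ~{raw_gbps(7680, 4320, 10, 60):.0f} Gbps raw")   # ~60 Gbps
print(f"8k 10-bit 240 Hz: ~{raw_gbps(7680, 4320, 10, 240):.0f} Gbps raw")  # ~239 Gbps
# Assumed link rates: HDMI 2.1 tops out around 48 Gbps, HDMI 2.2 around 96 Gbps,
# so both modes above would rely on DSC compression rather than raw bandwidth.
```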
While it is pretty subtle, and colour depth and frame rate are easily way more important, I can easily tell the difference between an 8k and a 4k computer monitor from the usual seating position. I mean it's definitely not enough of a difference for me to bother upgrading my 2k monitor 😂, but it's there. Maybe I have above-average vision, though, dunno: I've never done an eye test.
Well, I have +1.2 with glasses (which is a lot) and I do not see a difference between FHD (1920×1080) and 4k on a 15" laptop. What I did notice, though: the background was a space image, and the stars got flatter when switching to FHD. My guess is that Nvidia's Windows driver tweaks gamma & brightness to encourage buying 4k devices, because back then they needed an external GPU. A colleague later reported that he switched to FHD because the laptop got too hot 😅. Well, that was 5 years ago.
Interesting: in that sense 4k would have slightly better brightness gradation: if you average the brightness of 4 pixels, there are a lot more distinct levels of brightness that can be represented by 4 pixels vs 1, which might explain a perceived difference even when you can't see the individual pixels.
The maximum and minimum brightness would still be the same though, so it wouldn't really help with the contrast ratio or black levels, which are the most important metrics in terms of image quality imo.
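A toy illustration of the gradation point, just counting representable levels (as noted, the minimum and maximum don't move, so contrast and black level are unchanged): averaging a 2×2 block of 8-bit pixels can land on far more distinct average brightnesses than a single 8-bit pixel.

```python
# How many distinct average brightnesses can a block of 8-bit pixels represent?
def distinct_average_levels(pixels_per_block, bit_depth=8):
    max_val = 2 ** bit_depth - 1
    # The block average is (sum of pixel values) / n; the sum can take any value
    # from 0 to n * max_val, so there are n * max_val + 1 distinct averages.
    return pixels_per_block * max_val + 1

print(distinct_average_levels(1))  # 256  -> one FHD-sized pixel
print(distinct_average_levels(4))  # 1021 -> the 2x2 block of 4k pixels covering
                                   #         the same area, blended by the eye
```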
doesn’t this suggest that my 27" monitor I sit a foot away from should just be 480p? that seems a little noticeable to me
I think you got this wrong. If you sit 0.3 m away, more than 4k is worth it.
you’re right lol. at first glance I thought the y-axis was in inches, not feet. makes sense now lol
Btw, https://en.m.wikipedia.org/wiki/Angular_resolution
And already a 4k TV would have to be, what, a 2 m diagonal, at a usual viewing distance of 3 m+.
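A quick sanity check of that figure, reusing the ~1 arcminute acuity assumption: the step up to a higher resolution only starts to pay off once the screen is big enough that the lower resolution's pixels become resolvable at your viewing distance.

```python
import math

ARCMIN = math.radians(1 / 60)  # assumed ~20/20 acuity limit

def min_diagonal_m(distance_m, horiz_px_of_lower_res, aspect=(16, 9)):
    """Smallest 16:9 diagonal (m) at which the *lower* resolution's pixels become
    resolvable at this distance, i.e. where stepping up starts to pay off."""
    pitch_m = distance_m * math.tan(ARCMIN)        # smallest resolvable pixel
    width_m = pitch_m * horiz_px_of_lower_res
    w, h = aspect
    return width_m * math.hypot(w, h) / w

# 4k over 1080p at a 3 m couch distance:
print(f"{min_diagonal_m(3, 1920):.2f} m diagonal")  # ~1.9 m, roughly 76"
# 8k over 4k at the same distance:
print(f"{min_diagonal_m(3, 3840):.2f} m diagonal")  # ~3.8 m, roughly 151"
```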