• sp3ctr4l@lemmy.zip · 9 days ago

    I mean, it could… but if you run the math on a 4k vs an 8k monitor, you’ll find that for most common monitor and TV sizes, and the distances people actually sit from them, it doesn’t make any perceptible difference.

    Human eyes have the equivalent of a maximum resolution: a maximum angular resolution.

    You’d need literally superhuman vision to notice a difference in almost any realistic setup. Unless you own a penthouse or mansion, it really only makes sense if your TV covers an entire wall of a studio apartment, or for a Tokyo / Times Square style giant building-wall advertisement, or for completely replacing projection theatres with gigantic active screens.

    This doesn’t have 8k on it, but basically: an 8k monitor at a desk is completely pointless unless your face is less than a foot away from it (rough math in the sketch below), and it only makes sense for a living-room TV if said TV is, like, 15+ feet wide and 7+ feet tall.
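
    For the curious, here’s a rough back-of-envelope version of that math in Python. It assumes roughly 20/20 acuity (about 1 arcminute per pixel, i.e. ~60 pixels per degree); the 27" and 65" sizes are just illustrative examples, not figures from the comment above.

    ```python
    import math

    def max_useful_distance(screen_width_m: float, horizontal_pixels: int,
                            pixels_per_degree: float = 60.0) -> float:
        """Farthest distance (metres) at which an eye with ~1 arcminute of
        acuity (~60 pixels per degree) can still resolve individual pixels.
        Beyond this distance, extra resolution is imperceptible."""
        pixel_width = screen_width_m / horizontal_pixels
        # One pixel must subtend at least 1/pixels_per_degree degrees to be seen.
        min_angle_rad = math.radians(1.0 / pixels_per_degree)
        return pixel_width / math.tan(min_angle_rad)

    # Illustrative 16:9 widths: 27" monitor ~0.60 m wide, 65" TV ~1.44 m wide.
    for label, width_m in [("27-inch monitor", 0.60), ("65-inch TV", 1.44)]:
        d4k = max_useful_distance(width_m, 3840)
        d8k = max_useful_distance(width_m, 7680)
        print(f"{label}: 4k stops resolving beyond ~{d4k:.2f} m, "
              f"8k beyond ~{d8k:.2f} m")
    ```

    With those assumptions, 8k on a 27" monitor only out-resolves 4k if you sit closer than about 0.27 m, and on a 65" TV closer than about 0.65 m, which lines up with the "less than a foot away" point above.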

    • modality@lemmy.myserv.one · 8 days ago

      Yes. This. Resolution is already high enough for everything, except maybe wearables (e.g. VR goggles).

      HDMI 2.1 can already handle 8k 10-bit color at 60Hz, and 2.2 can do 240Hz.
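
      As a rough sanity check on those numbers, here’s the raw (uncompressed, pre-blanking/encoding) data rate that 8k 10-bit implies, sketched in Python. The ~48 Gbit/s figure for HDMI 2.1 FRL is the commonly quoted link budget, and HDMI 2.2 is reportedly around double that; in practice 8k modes lean on DSC and/or chroma subsampling to fit.

      ```python
      def raw_video_gbps(width: int, height: int, refresh_hz: int,
                         bits_per_channel: int, channels: int = 3) -> float:
          """Uncompressed pixel data rate in Gbit/s (ignores blanking and
          link-level encoding overhead, so real link usage is higher)."""
          return width * height * refresh_hz * bits_per_channel * channels / 1e9

      # 8k (7680x4320), 10-bit RGB
      for hz in (60, 240):
          rate = raw_video_gbps(7680, 4320, hz, 10)
          print(f"8k 10-bit @ {hz} Hz: ~{rate:.0f} Gbit/s raw")
      # ~60 Gbit/s at 60 Hz and ~239 Gbit/s at 240 Hz, versus a ~48 Gbit/s
      # HDMI 2.1 link budget, hence the reliance on DSC / 4:2:0 in practice.
      ```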

    • Venator@lemmy.nz · 8 days ago

      While it is pretty subtle, and colour depth and frame rate are way more important, I can easily tell the difference between an 8k and a 4k computer monitor from the usual seating position. I mean, it’s definitely not enough of a difference for me to bother upgrading my 2k monitor 😂, but it’s there. Maybe I have above-average vision, though; dunno, I’ve never done an eye test.

      • MonkderVierte@lemmy.ml · 8 days ago

        Well, I have +1.2 with glasses (which is a lot) and I don’t see a difference between FHD (1920×1080) and 4k on a 15" laptop. What I did notice, though: the background was a space image, and the stars got flatter when switching to FHD. My guess is that Nvidia’s Windows driver tweaks gamma and brightness to encourage buying 4k devices, since those needed an external GPU back then. A colleague later reported that he switched to FHD because the laptop got too hot 😅. Well, that was 5 years ago.

        • Venator@lemmy.nz · 8 days ago

          Interesting: in that sense 4k would have slightly better brightness gradation. If you average the brightness of 4 pixels, there are a lot more levels of brightness that can be represented by 4 pixels than by 1 (quick sketch below), which might explain a perceived difference even when you can’t see the individual pixels.

          The maximum and minimum brightness would still be the same though, so it wouldn’t really help with the contrast ratio or black levels, which are the most important metrics in terms of image quality imo.
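
          A quick sketch of that averaging idea in Python, assuming 8-bit pixels: a 2×2 block of pixels can average to many more intermediate brightness levels than a single pixel can show, while the minimum and maximum stay the same.

          ```python
          # Brightness levels a single 8-bit pixel can show vs. the average of a 2x2 block.
          single_pixel_levels = set(range(256))        # 0..255 -> 256 levels

          block_average_levels = set()
          for total in range(4 * 255 + 1):             # sum of four 8-bit pixels: 0..1020
              block_average_levels.add(total / 4)      # averages in steps of 0.25

          print(len(single_pixel_levels))              # 256
          print(len(block_average_levels))             # 1021
          print(min(block_average_levels), max(block_average_levels))  # 0.0 255.0 -> same extremes
          ```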