A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)

  • arthurpizza@lemmy.world · 11 hours ago

    An overly compressed 4K stream will look far worse than a good-quality 1080p one. We keep upping the resolution without adopting newer codecs or raising the bitrate to match.
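
    A rough way to see this is bits per pixel: the same bitrate spread over four times as many pixels leaves far less data for each frame. A minimal sketch, with a made-up 8 Mbps figure rather than any real service’s numbers:

    ```python
    # Average compressed bits available per pixel per frame.
    # The 8 Mbps bitrate below is illustrative, not a real provider's setting.

    def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: int = 24) -> float:
        return (bitrate_mbps * 1_000_000) / (width * height * fps)

    print(f"4K @ 8 Mbps:    {bits_per_pixel(8, 3840, 2160):.3f} bits/pixel")  # ~0.040
    print(f"1080p @ 8 Mbps: {bits_per_pixel(8, 1920, 1080):.3f} bits/pixel")  # ~0.161
    # Same bitrate, but 1080p gets 4x the bits per pixel, so the encoder
    # doesn't have to crush detail nearly as hard.
    ```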

    • Psythik@lemmy.world · 7 hours ago (edited)

      This is true. That said, if you can’t tell the difference between 1080p and 4K from the pixels alone, then either your TV is too small or you’re sitting too far away, in which case there’s no point in going with 4K.

      At the right seating distance, there is a benefit even to an 8K TV. However, very few people sit close enough, or have a large enough screen, to benefit from going any higher than 4K (a rough sketch of the numbers follows below):

      [chart: resolution vs. screen size vs. viewing distance]
      Source: https://www.rtings.com/tv/learn/what-is-the-resolution
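
      If you want the back-of-the-envelope version of that chart: assume 20/20 vision resolves roughly one arcminute (a common rule of thumb, and the assumption doing all the work here), then work out where adjacent pixels blur together. A quick sketch:

      ```python
      import math

      # Distance at which adjacent pixels stop being separable, assuming
      # ~1 arcminute of angular resolution for 20/20 vision (rule of thumb).

      def max_useful_distance_m(diagonal_inches: float, horizontal_pixels: int) -> float:
          # 16:9 panel: width = diagonal * 16 / sqrt(16^2 + 9^2)
          width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)
          pixel_pitch = width_m / horizontal_pixels
          return pixel_pitch / math.radians(1 / 60)  # small-angle approximation

      for pixels, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
          print(f'65" {label}: pixels blend beyond ~{max_useful_distance_m(65, pixels):.1f} m')
      # On a 65" panel: 1080p ~2.6 m, 4K ~1.3 m, 8K ~0.6 m, which is why
      # 8K only pays off if you sit unusually close.
      ```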

    • Squizzy@lemmy.world · 10 hours ago

      I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I’m more confused.

      • HereIAm@lemmy.world · 41 minutes ago

        For an ELI5 explanation, this is what happens when you lower the bitrate: https://youtu.be/QEzhxP-pdos

        No matter the video’s resolution, if the amount of information per frame is so low that the encoder has to lump differently coloured pixels together, it will look like crap.
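
        You can fake that lumping in a few lines: replace each tile of pixels with its average colour. Real encoders quantize frequency coefficients per block rather than flat-averaging, so treat this as a toy model of the artifact, not how a codec actually works:

        ```python
        import numpy as np

        # Toy 'starved bitrate': every block x block tile collapses to its mean
        # colour, crudely mimicking the blocky smearing in the video above.

        def blockify(frame: np.ndarray, block: int = 16) -> np.ndarray:
            h, w, c = frame.shape
            h2, w2 = h - h % block, w - w % block  # crop to a multiple of the block size
            tiles = frame[:h2, :w2].reshape(h2 // block, block, w2 // block, block, c)
            means = tiles.mean(axis=(1, 3), keepdims=True)
            return np.broadcast_to(means, tiles.shape).reshape(h2, w2, c)

        frame = np.random.randint(0, 256, (720, 1280, 3)).astype(float)  # stand-in frame
        crushed = blockify(frame)  # high resolution in, mush out
        ```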

      • starelfsc2@sh.itjust.works · 9 hours ago

        On codecs and bitrate? Roughly: the codec is how the video is compressed (H.264, HEVC, AV1), usually wrapped in a container file (.avi, .mp4), and the bitrate is how much data is sent per second for the video. Videos mostly track only what changed between frames, so a video of a still image can be 4K with a really low bitrate, but if things are moving it’ll get really blurry at a low bitrate, even in 4K.
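
        A toy version of that “only track what changed” idea, counting how many pixels a frame would need re-sent (real codecs use motion-compensated block prediction, not raw pixel diffs, but the intuition is the same):

        ```python
        import numpy as np

        # Send frame 1 in full, then only the pixels that changed.

        def changed_pixels(prev: np.ndarray, cur: np.ndarray) -> int:
            return int(np.count_nonzero(np.any(prev != cur, axis=-1)))

        still = np.zeros((2160, 3840, 3), dtype=np.uint8)  # static 4K frame
        moving = still.copy()
        moving[0:400, 0:400] = 255                         # something moved

        print("still scene: ", changed_pixels(still, still))   # 0: costs almost nothing
        print("moving scene:", changed_pixels(still, moving))  # 160000: needs real bits
        ```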

      • null_dot@lemmy.dbzer0.com · 8 hours ago

        The resolution (4k in this case) defines the number of pixels to be shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting data to pixels.

        Suppose you’ve recorded something in 1080p (a lower resolution). You could convert it to 4K, but the scaler has to make up the pixels that can’t be computed from the data.
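
        A minimal sketch of that “making up pixels” step, using nearest-neighbour upscaling (the crudest method; real TVs interpolate more cleverly, but they still can’t recover detail that was never sent):

        ```python
        import numpy as np

        # 1080p -> 4K by repeating every source pixel 2x2. The output has four
        # times the pixels but zero new information; smarter scalers blend
        # neighbours instead of copying, which looks softer but is still a guess.

        def upscale_2x(frame: np.ndarray) -> np.ndarray:
            return frame.repeat(2, axis=0).repeat(2, axis=1)

        hd = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
        uhd = upscale_2x(hd)
        print(hd.shape, "->", uhd.shape)  # (1080, 1920, 3) -> (2160, 3840, 3)
        ```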

        In summary, the TV in my living room might be more capable, but my streaming provider probably isn’t sending enough data to really use it.