A new study by University of Cambridge researchers, published in Nature Communications, just dropped a pixelated bomb on the entire Ultra-HD market. But as anyone with myopia can tell you: take your glasses off and even SD still looks pretty good :)

  • TheFeatureCreature@lemmy.ca · 22 days ago

    Kind of a tangent, but properly encoded 1080p video with a decent bitrate actually looks pretty damn good.

    A big problem is that we’ve gotten so used to streaming services delivering visual slop, like YouTube’s 1080p option which is basically just upscaled 720p and can even look as bad as 480p.

    • Feyd@programming.dev · 22 days ago

      Yeah, I’d way rather have higher-bitrate 1080p than 4K. Seeing banding in big dark or light areas of the screen is infuriating.

    • woelkchen@lemmy.world · 22 days ago

      > A big problem is that we’ve gotten so used to streaming services delivering visual slop, like YouTube’s 1080p option which is basically just upscaled 720p and can even look as bad as 480p.

      YouTube locks the good bitrates behind the Premium paywall, and even as a Premium user you don’t get to select a high bitrate when the source video was low-res.

      That’s why videos should be upscaled before upload, to force YouTube into offering high-bitrate options at all. A good upscaler produces better results than simply stretching low-res video.

      • azertyfun@sh.itjust.works · 20 days ago

        I think the premium thing is a channel option. Some channels consistently have it, some don’t.

        Regular YouTube 1080p is bad and feels like 720p. The encoding on videos with “Premium 1080p” is catastrophic. It’s significantly worse than decently encoded 480p. Creators will put a lot of time and effort in their lighting and camera gear, then the compression artifacting makes the video feel like watching a porn bootleg on a shady site. I guess there must be a strong financial incentive to nuke their video quality this way.

    • notfromhere@lemmy.ml · 22 days ago

      I can still find 480p videos from when YouTube first started that rival the quality of the compressed crap “1080p” we get from YouTube today. It’s outrageous.

      • IronKrill@lemmy.ca · 19 days ago (edited)

        Sadly most of those older YouTube videos have been run through multiple re-compressions and look so much worse than they did at upload. It’s a major bummer.

    • deranger@sh.itjust.works · 22 days ago

      HEVC is damn efficient. I don’t even bother with HD because a 4K HDR encode around 5-10GB looks really good and streams well for my remote users.

    • SaharaMaleikuhm@feddit.org · 21 days ago

      This. The visual difference between good and bad 1080p is bigger than the difference between good 1080p and good 4K. I will die on this hill. And YouTube’s 1080p is garbage on purpose so they can get you to buy Premium to unlock good 1080p. Assholes.

      • TheFeatureCreature@lemmy.ca · 21 days ago

        The 1080p for Premium users is garbage too. YouTube’s video quality in general is shockingly poor. If there is even a slight amount of noisy movement on screen (foliage, confetti, rain, snow, etc.), the video can become practically unwatchable.

    • Omega_Jimes@lemmy.ca · 21 days ago

      I’ve been investing in my Blu-ray collection again and I can’t believe how good 1080p Blu-rays look compared to “UHD streaming”.

  • treesquid@lemmy.world · 22 days ago

    4k is way better than 1080p, it’s not even a question. You can see that shit from a mile away. 8k is only better if your TV is comically large.

  • fritobugger2017@lemmy.world · 21 days ago

    The study used a 44-inch TV at 2.5 m. The most commonly used calculator for minimum TV size versus viewing distance says that at 2.5 m the TV should be at least 60 inches.

    My own informal tests at home with a 65-inch TV, comparing a 1080p encode to a 4K remux of the same movie, seem to back up the distance calculator. At the appropriate distance or nearer I can see a difference if I am viewing critically (as opposed to casually). Beyond a certain distance the difference is not apparent.
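
    Those calculators generally work from a viewing-angle rule of thumb. A rough sketch of the idea, assuming the common ~30° recommendation and a 16:9 panel (the exact angle varies by calculator):

    ```python
    import math

    def min_diagonal_inches(distance_m, viewing_angle_deg=30.0, aspect=(16, 9)):
        """Smallest diagonal (inches) that fills the given viewing angle at a distance."""
        distance_in = distance_m * 100 / 2.54                    # metres -> inches
        width_in = 2 * distance_in * math.tan(math.radians(viewing_angle_deg / 2))
        w, h = aspect
        return width_in * math.hypot(w, h) / w                   # width -> diagonal

    print(round(min_diagonal_inches(2.5)), "inches")             # ~61 inches at 2.5 m, matching the ~60" guideline
    ```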

    • markko@lemmy.world · 21 days ago

      Exactly. This title is just clickbait.

      The actual study’s title is “Resolution limit of the eye — how many pixels can we see?”.

      • SaveTheTuaHawk@lemmy.ca · 21 days ago

        Exactly why big-box stores force you to look at TVs in narrow aisles, not at typical at-home distances. They also only properly adjust the picture on the highest-margin models.

      • definitemaybe@lemmy.ca · 21 days ago

        Can’t believe I had to scroll down this far to find this:

        > Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish. The scientists made it crystal clear: once your setup hits that threshold, any further increase in pixel count, like moving from 4K to an 8K model of the same size and distance, hits the law of diminishing returns because your eye simply can’t detect the added detail.

        On a computer monitor, it’s easily apparent because you’re not sitting 2+ m away, and in a living room, 44" is tiny, by recent standards.

  • Surp@lemmy.world · 21 days ago

    8K, no. But 4K with a 4K Blu-ray player playing actual non-upscaled 4K movies is fucking amazing.

    • Stalinwolf@lemmy.ca · 21 days ago (edited)

      I don’t know if this will age like my previous belief that the PS1 had photo-realistic graphics, but I feel like 4K is the peak for TVs. I recently bought a 65" 4K TV and not only is it the clearest image I’ve ever seen, but it takes up a good chunk of my living room. Any larger would just look ridiculous.

      Unless the average person starts using abandoned cathedrals as their living rooms, I don’t see how larger TVs with even higher definition would even be practical. Especially if you consider we already have 8K for those who do use cathedral entertainment systems.

      • brucethemoose@lemmy.world · 21 days ago (edited)

        (Most) TVs still have a long way to go with color space and brightness. AKA HDR. Not to speak of more sane color/calibration standards to make the picture more consistent, and higher ‘standard’ framerates than 24FPS.

        But yeah, 8K… I dunno about that. Seems like a massive waste. And I am a pixel peeper.

        • JigglySackles@lemmy.world · 21 days ago

          For media I highly agree. 8k doesn’t seem to add much. For computer screens I can see the purpose though as it adds more screen real estate which is hard to get enough of for some of us. I’d love to have multiple 8k screens so I can organize and spread out my work.

          • brucethemoose@lemmy.world · 21 days ago

            Are you sure about that? You likely use DPI scaling at 4K, and you’re likely limited by physical screen size unless you already use a 50” TV (which is equivalent to 4x standard 25” 1080p monitors).

            8K would only help at like 65”+, which is kinda crazy for a monitor on a desk… Awesome if you can swing it, but most can’t.


            I tangentially agree though. PCs can use “extra” resolution for various things like upscaling, better text rendering and such rather easily.

            • JigglySackles@lemmy.world · 21 days ago

              Truthfully I haven’t gotten a chance to use an 8k screen, so my statement is more hypothetical “I can see a possible benefit”.

              • brucethemoose@lemmy.world · 21 days ago (edited)

                I’ve used 5K some.

                IMO the only ostensible benefit is for computer-type stuff. It gives them more headroom to upscale content well, to avoid aliasing or blurry, scaled UI rendering, stuff like that. 4:1 rendering (to save power) would be quite viable too.

                Another example would be editing workflows, for 1:1 pixel mapping of content while leaving plenty of room for the UI.

                But for native content? Like movies?

                Pointless, unless you are ridiculously close to a huge display, even if your vision is 20/20. And it’s too expensive to be worth it: I’d rather that money go into other technical aspects, easily.

        • SpacetimeMachine@lemmy.world · 21 days ago

          The frame rate really doesn’t need to be higher. I fully understand filmmakers who balk at the idea of 48 or 60 fps movies. It really does change the feel of them and imo not in a necessarily positive way.

          • brucethemoose@lemmy.world · 21 days ago (edited)

            I respectfully disagree. Folks’ eyes are ‘used’ to 24p, but native 48 or 60 looks infinitely better, especially when stuff is filmed/produced with that in mind.

            But at a bare minimum, baseline TVs should at least eliminate judder with 24p content by default, and offer better motion clarity by moving on from LCDs, using black frame insertion, or whatever.

    • HugeNerd@lemmy.ca · 21 days ago

      I think you’re right, but how many movies are available in UHD? Not too many, I’d think. On my thrifting runs I’ve picked up 200 Blu-rays versus 3 UHDs. If we can map that ratio to the retail market, that’s ~1% UHD content.

  • OR3X@lemmy.world · 21 days ago

    ITT: people defending their 4K/8K display purchases as if this study was a personal attack on their financial decision making.

    • treesquid@lemmy.world · 21 days ago

      My 50" 4K TV was $250. That TV is now $200, nobody is flexing the resolution of their 4k TV, that’s just a regular cheap-ass TV now. When I got home and started using my new TV, right next to my old 1080p TV just to compare, the difference in resolution was instantly apparent. It’s not people trying to defend their purchase, it’s people questioning the methodology of the study because the difference between 1080p and 4k is stark unless your TV is small or you’re far away from it. If you play video games, it’s especially obvious.

      • michaelmrose@lemmy.world · 21 days ago

        Old people with bad eyesight watching their 50" from 12 feet away in their big-ass living room vs. young people with good eyesight sitting 5 feet from their 65–70" playing a game might have inherently different opinions.

        12’ 50" FHD = 112 PPD

        5’ 70" FHD = 36 PPD

        The study basically says that FHD is about as good as you can get 10 feet away on a 50" screen, all other things being equal. That doesn’t seem that unreasonable.
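
        For reference, a quick sketch of how those PPD figures fall out, assuming a 16:9 panel and 1920 horizontal pixels for FHD:

        ```python
        import math

        def pixels_per_degree(diagonal_in, distance_ft, horizontal_px=1920):
            """Horizontal pixels per degree of visual angle for a 16:9 screen."""
            width_in = diagonal_in * 16 / math.hypot(16, 9)      # diagonal -> width
            fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_ft * 12)))
            return horizontal_px / fov_deg

        print(round(pixels_per_degree(50, 12)))  # ~112 PPD: FHD on a 50" at 12 ft
        print(round(pixels_per_degree(70, 5)))   # ~36 PPD: FHD on a 70" at 5 ft
        ```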

    • Nalivai@lemmy.world · 21 days ago (edited)

      Right? “Yeah, there is a scientific study about it, but what if I didn’t read it and go by feelings? Then I will be right and don’t have to reexamine shit about my life, isn’t that convenient”

    • michaelmrose@lemmy.world · 21 days ago

      They don’t need to; this study does it for them. 94 pixels per degree is the top end of what’s perceptible. On a 50" screen 10 feet away, 1080p gives 93. Closer than 10 feet, or larger than 50", or some combination of both, and it’s better to have a higher resolution.

      For millennials home ownership has crashed, but TVs are cheaper and cheaper. For the half of motherfuckers rocking a 70" TV that cost $600 in their shitty apartment, where they sit 8 feet from the TV, it’s pretty obvious 4K is better at 109 vs. 54.

      Also, although the article points out that there are other features that matter as much as resolution, these aren’t uncorrelated factors. 1080p TVs of any size in 2025 are normally bargain-basement garbage that suck on all fronts.
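
      Taking that 94 PPD ceiling at face value, a small sketch of the break-even viewing distance for a 16:9 screen:

      ```python
      import math

      def distance_for_ppd(diagonal_in, horizontal_px=1920, target_ppd=94.0):
          """Viewing distance (feet) at which a 16:9 screen reaches the target PPD."""
          width_in = diagonal_in * 16 / math.hypot(16, 9)
          fov_deg = horizontal_px / target_ppd            # horizontal FOV the screen should span
          return width_in / (2 * math.tan(math.radians(fov_deg / 2))) / 12

      print(round(distance_for_ppd(50), 1))        # ~10.1 ft: where 1080p on a 50" hits 94 PPD
      print(round(distance_for_ppd(70, 3840), 1))  # ~6.8 ft: the same threshold for 4K on a 70"
      ```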

  • lepinkainen@lemmy.world · 21 days ago

    4K at a shit streaming bitrate is barely better than high-bitrate 1080p.

    But full-bitrate 4K from a Blu-ray IS better.

  • arthurpizza@lemmy.world · 21 days ago

    An overly compressed 4K stream will look far worse than good-quality 1080p. We keep upping the resolution without adopting newer codecs or adjusting the bitrate.

    • Psythik@lemmy.world · 20 days ago (edited)

      This is true. That said, if you can’t tell the difference between 1080p and 4K from the pixels alone, then either your TV is too small or you’re sitting too far away. In which case there’s no point in going with 4K.

      At the right seating distance, there is a benefit to be had even by going with an 8K TV. However, very few people sit close enough/have a large enough screen to benefit from going any higher than 4K:

      [chart: resolution vs. screen size and viewing distance — see source below]

      Source: https://www.rtings.com/tv/learn/what-is-the-resolution

    • Squizzy@lemmy.world · 21 days ago

      I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I’m more confused now.

      • Redex@lemmy.world · 20 days ago

        I’ll add another explanation for bitrate that I find understandable: you can think of resolution as basically the max quality of a display; no matter the bitrate, you can’t display more information/pixels than the screen possesses. Bitrate, on the other hand, represents how much information you are receiving from e.g. Netflix. If you didn’t use any compression, in HDR each pixel would require 30 bits, or 3.75 bytes of data. A 4K screen has about 8.3 million pixels. An HDR stream running at 60 fps would therefore require about 1.7 GB/s of download without any compression. Bitrate is basically the measure of that, i.e. how much we’ve managed to compress that data flow. There are many ways you can achieve this compression, and a lot of it relates to how individual codecs work, but put simply, one of the many methods effectively involves grouping pixels into larger blocks (e.g. 32x32 pixels) and saying they all have the same colour. As a result, at low bitrates you’ll start to see blocking and other visual artifacts that significantly degrade the viewing experience.

        As a side note, one cool thing that codecs do (not sure if literally all of them do it, but I think most by far) is that not every frame is encoded in its entirety. You have I, P, and B frames. I-frames (also known as keyframes) are a full frame; they’re fully defined and are basically like a picture. P-frames don’t define every pixel; instead they define the difference between their frame and the previous frame, e.g. that the pixel at x: 210, y: 925 changed from red to orange. B-frames do the same, but they use both previous and future frames for reference. That’s why you might sometimes notice that in a stream, even when the quality isn’t changing, every couple of seconds the picture will become really clear, before gradually degrading in quality and then suddenly jumping up in quality again.
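
        To put numbers on the uncompressed figure, a minimal back-of-the-envelope sketch, assuming 10 bits per colour channel (30 bits per pixel) and no chroma subsampling:

        ```python
        # Uncompressed data rate of a 4K, 10-bit-per-channel (HDR) stream at 60 fps.
        width, height = 3840, 2160
        bits_per_pixel = 30   # 10 bits each for the three colour channels
        fps = 60

        bits_per_second = width * height * bits_per_pixel * fps
        print(f"{bits_per_second / 8 / 2**30:.2f} GiB/s")  # ~1.74 GiB/s, the "about 1.7 GB/s" above
        ```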

      • starelfsc2@sh.itjust.works · 21 days ago

        On codecs and bitrate? Roughly: the codec is how the video is compressed (it usually lives inside a container/file type like .avi or .mp4), and the bitrate is how much data is sent per second for the video. Codecs mostly track what changed between frames, so a video of a still image can be 4K with a really low bitrate, but if things are moving it’ll get really blurry at a low bitrate, even in 4K.

      • HereIAm@lemmy.world · 20 days ago

        For an ELI5 explanation, this is what happens when you lower the bit rate: https://youtu.be/QEzhxP-pdos

        No matter the resolution you have of the video, if the amount of information per frame is so low that it has to lump different coloured pixels together, it will look like crap.

      • null_dot@lemmy.dbzer0.com · 20 days ago

        The resolution (4k in this case) defines the number of pixels to be shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting data to pixels.

        Suppose you’ve recorded something in 1080p (low resolution). You could convert it to 4k, but the codec has to make up the pixels that can’t be computed from the data.

        In summary, the TV in my living room might be more capable, but my streaming provider probably isn’t sending enough data to really use it.

  • the_riviera_kid@lemmy.world · 21 days ago

    Bullshit. Actual, factual 8K and 4K look miles better than 1080p. It’s the screen size that makes the difference. On a 15-inch screen you might not see much difference, but on a 75-inch screen the difference between 1080p and 4K is immediately noticeable. A much larger screen would show the same result with 8K.

        • Soup@lemmy.world · 21 days ago

          This article is literally about the study. Your “well-known” fact doesn’t hold up to scrutiny.

          • the_riviera_kid@lemmy.world · 21 days ago

            > The other important detail to note is that screen size and distance to your TV also matters. The larger the TV, the more a higher resolution will offer a perceived benefit. Stretching a 1080p image across a 75-inch display, for example, won’t look as sharp as a 4K image on that size TV. As the age old saying goes, “it depends.”

            That’s literally in the article you’re claiming is correct. Maybe you should try reading it sometime.

            • Soup@lemmy.world · 21 days ago

              Yes, but you got yourself real pissy over it and have just now admitted that the one piece of criticism you had in your original comment was already addressed in the article. Obviously, if we start talking about extreme-outlier situations there will be edge cases, but you’re not adding anything to the conversation by acting like you’ve found some failure that, in reality, the article already addressed.

              I’m not sure you have the reading comprehension and/or the intention to have any kind of real conversation needed to continue this discussion further.

        • JigglySackles@lemmy.world · 21 days ago

          So I have a pet theory on studies like this. There are many things out there that many of us take for granted as givens in our daily lives. But there are likely just as many people to whom this knowledge is either unknown or not actually apparent. The reasons can be myriad: a lack of experience in the given area, skepticism that their anecdotal evidence is truly correct despite appearances, and so on.

          What these “obvious thing is obvious” studies accomplish is setting a factual precedent for the people in the back. The people who are uninformed, not experienced enough, skeptical, contrarian, etc.

          The studies seem wasteful upfront, but sometimes a thing needs to be said aloud to galvanize the factual evidence and give basis to the overwhelming anecdotal evidence.

    • mean_bean279@lemmy.world · 21 days ago

      I like how you’re calling bullshit on a study because you feel like you know better.

      Read the report, and go check the study. They note that the biggest gains in human visibility for displays come from contrast (the largest factor), brightness, and color accuracy, all of which have drastically improved over the last 15 years. Look at a really good high-end 1080p monitor and a low-end 4K monitor and you will actively choose the 1080p monitor. It’s more pleasing to the eye, and you don’t notice the difference in pixel size at that scale.

      Sure, distance plays some role, but they also accounted for that by performing the test at a fixed distance with a fixed screen size. They’re controlling for a variable you aren’t even controlling for in your own comment.

      • SeriousMite@lemmy.world · 21 days ago

        This has been my experience going from 1080 to 4K. It’s not the resolution, it’s the brighter colors that make the most difference.

        • M0oP0o@mander.xyz · 20 days ago

          And that’s not related to the resolution, yet people have tied higher resolutions to better quality.

      • Corhen@lemmy.world · 21 days ago

        Have a 75" display, the size is nice, but still a ways from a theater experience, would really need 95" plus.

  • 4am@lemmy.zip · 22 days ago

    It highly depends on screen size and viewing distance, but nothing reasonable for a normal home probably ever needs more than 8K for a high-end setup, and 4K covers most cases.

    Contrast ratio/HDR and per-pixel backlighting technology are where the real magic is happening.

  • deranger@sh.itjust.works · 22 days ago

    If you read RTINGS before buying a TV and setting it up in your room, you already knew this. Screen size and distance to TV are important for determining what resolution you actually need.

    Most people sit way too far away from their 4K TV.

    • 7U5K3N@lemmy.dbzer0.com · 22 days ago

      My father in law loves to sit and research… that’s his thing… made a career out of it yadda yadda yadda…

      He asked me about a new TV… I was like…well have you seen rtings.com?

      My MIL had to remind him to eat… lmfao

      He just rabbit holed for days. It was like he clicked a TV tropes link or something.

      Anyway, he made a very informed decision and loves his TV. Haha

  • Baggie@lemmy.zip · 21 days ago

    Honestly after using the steam deck (800p) I’m starting to wonder if res matters that much. Like I can definitely see the difference, but it’s not that big a deal? All I feel like I got out of my 4k monitor is lower frame rates.

    • floquant@lemmy.dbzer0.com · 21 days ago (edited)

      Pixel density is what makes content appear sharp rather than raw resolution. 800p on a 7" screen is plenty, if you think about it a 50" 1080p TV is ~~almost 10x the size~~ more than 50x the size with a ~25% increase in (vertical) resolution.
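
      A minimal sketch of that density comparison, assuming the Deck’s 1280×800 panel at 7" and a 16:9 50" 1080p TV:

      ```python
      import math

      def ppi(width_px, height_px, diagonal_in):
          """Pixels per inch from resolution and diagonal size."""
          return math.hypot(width_px, height_px) / diagonal_in

      print(round(ppi(1280, 800, 7)))     # ~216 PPI on the 7" handheld screen
      print(round(ppi(1920, 1080, 50)))   # ~44 PPI on a 50" 1080p TV
      ```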

      • pirat@lemmy.world · 21 days ago

        > if you think about it

        I tried that, and I’m not totally sure about the correctness of my numbers, but your numbers intuitively seem off to me:

        a 50" 1080p TV is almost 10x the size [of a 7" screen]

        How did you arrive at this? I’d argue a 50" screen is much more than 10 times the size of a 7" screen.

        The inches are measured diagonally, and I do see how 50" is somewhat “almost 10x” of 7": 49" would be 7 times the diagonal of a 7", and 7.something is “almost” 10.

        But if we assume both screens have a 16:9 ratio, the 50" screen has a width of ≈110.69 cm and height of ≈62.26 cm, while the 7" is only ≈15.50 by ≈8.72 cm.

        The area of the 7" is 135.08 cm² while for the 50" it’s ≈6891.92 cm². The ratio between these two numbers is ≈51.02, which I believe means the 50" screen is more than 51x the physical size.

        At least, that number seems more realistic to me. I’m looking at my 6.7" phone screen right now and comparing it to my 55" TV screen, and it seems very possible that the phone screen could fit more than 50 times inside the TV screen, not just “almost 10x”.

        If I totally misunderstood you, please explain what you mean.

        My numbers for width and height were calculated using this display calculator site that someone else mentioned somewhere under this post, and I rounded the decimals after doing the calculations with all decimals included.
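
        For anyone who wants to re-check the arithmetic, a small sketch of the same calculation, assuming both screens are 16:9:

        ```python
        import math

        def screen_area_cm2(diagonal_in, aspect=(16, 9)):
            """Screen area in cm² from diagonal size and aspect ratio."""
            w, h = aspect
            diag_cm = diagonal_in * 2.54
            return (diag_cm * w / math.hypot(w, h)) * (diag_cm * h / math.hypot(w, h))

        print(round(screen_area_cm2(50) / screen_area_cm2(7), 1))  # ~51.0x more area on the 50"
        ```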

        • floquant@lemmy.dbzer0.com · 21 days ago (edited)

          Haha no, you have not misunderstood at all! I was just making a point and did no calculations whatsoever: by «50" is almost 10x 7"» I only meant that 50 is “almost” 70 and nothing else x) As your calculations show, it’s actually a much bigger difference in area, but that stat seemed enough to make my point and easier to understand :)

          Thank you for actually thinking about it and taking the time to do the math ^^

          • pirat@lemmy.world · 21 days ago (edited)

            Oh, I see. But yeah, it’s a pretty big difference.

            You’re welcome. I like to think that I like thinking about things and stuff.

      • Baggie@lemmy.zip · 19 days ago

        Yes, but wouldn’t we be using the percentage of your field of vision vs. the pixels in the display? The Steam Deck is right in front of my face and the TV is 5 or 6 metres away, etc.

        Higher res absolutely does look sharper though, which is great for movies etc. I’m coming at it more from a performance-vs-visual-fidelity angle. What I’m trying to express is that, given 800p still looks surprisingly good, I’m starting to question the industry pushing higher-resolution displays for gaming.

  • w3dd1e@lemmy.zip · 21 days ago (edited)

    I didn’t get why HD TV was relevant at all. I really did not understand that for a couple of years.

    Then I got glasses.

    I suspect 4K matters for screens of a certain size or if you sit really close, but most of us don’t, so it doesn’t matter.

  • Sauvandu60@lemmy.ml · 21 days ago

    I suspect screen size makes the difference. You won’t notice 4K or 8K on small screens.

  • wizzor@sopuli.xyz · 22 days ago

    I can barely tell the difference between 720p and 1080p. I will probably never buy another TV.

    Maybe I need glasses?

    • Lumidaub@feddit.org · 22 days ago

      I do wear glasses and I came here to post exactly your first sentence. There probably is a difference, sure, but I personally can’t see it unless I put both files next to each other and really try to see it.

      I’ve been digitising our movie collection, so I played around with resolutions to minimise the storage space needed. I did settle on doing everything in 1080p, but mostly because it feels weird to use a resolution the internet tells me is bad and I’m vulnerable to peer pressure (a voice in the back of my head going “oooh, but what if anyone ever looks at those files?? What’ll they think???” type nonsense).

      I also had a few files that came in at much higher resolutions that I re-encoded to fractions of their file sizes, and honestly, same effect.

    • Fondots@lemmy.world · 22 days ago

      I’m pretty much in the same boat: 720p looks fine to me in the vast majority of cases. And while I’m not great at going to my eye doctor regularly, the last time I had my vision checked it was fine, and that was right around the time I was shopping for a new TV and upgraded from 1080p to 4K while still having a 720p set in my bedroom.

      If I looked really hard at them, I could tell the difference between the 720p and the 4K, but truth be told, I’m just not scrutinizing the picture quality of my TV that much.

    • sugar_in_your_tea@sh.itjust.works · 22 days ago

      Yeah, I can definitely tell the difference between 720p and 1080p, but the difference isn’t so large as to make me use it everywhere, so I default to 720p unless I need a bit more definition (usually for text).

      My TV is 4k but I can’t remember the last time I actually displayed 4k content, almost everything we have is old DVDs (so 480p?) and 1080p Blurays. I don’t see the point in paying extra for Ultra HD when the picture isn’t that much better at our viewing distance.

    • Electric@lemmy.world · 22 days ago

      I was like that too before I got glasses (I knew I had vision issues, but not how severe it was). I can’t stand 720p anymore.

    • frongt@lemmy.zip · 22 days ago

      If you haven’t been tested, or are a couple of years overdue, yeah, probably. If you put a new 4K TV (with an actual 4K video, not Netflix) side by side with your current one, you’d notice. Especially if it’s OLED, because they can turn off the emitters to make blacks actually black.