Screens keep getting faster. Can you even tell?

CES saw the launch of several 360Hz and even 480Hz OLED monitors. Are manufacturers stuck in a questionable spec war, or are we one day going to wonder how we ever put up with ‘only’ 240Hz displays?

  • aaaantoine@lemmy.world

    On one hand, 360hz seems imperceptibly faster than 240hz for human eyes.

    On the other hand, if you get enough frames in, you don’t have to worry about simulating motion blur.

    • Ms. ArmoredThirteen@lemmy.ml

      That also depends on the person. Save for really fast-moving things, I can barely tell the difference between 30 and 60fps, and I cap out at 75 before I can’t notice a difference in any situation. For one of my friends, anything less than 75 gives them headaches from the choppiness.

      • Clam_Cathedral@lemmy.ml

        Yeah, personally playing games at 30fps feels disruptively laggy, at least for the first few minutes. 60 is good, but the jump to 120 is night and day. I was shocked that going from 120 to 240 was just as noticeable an improvement to me as the previous jump, especially when so many people say they don’t notice it much. It’s hard to find newer games that give me that much fps, though.

  • vext01@lemmy.sdf.org

    Reminiscent of the hi-res audio marketing. Why listen at a measly 24-bit/48kHz when you can have 32/192?!

    • vividspecter@lemm.ee

      These have an actual perceivable difference, even if subtle. The difference with hi-res audio, however, is inaudible to humans.

      • vext01@lemmy.sdf.org

        I tend to agree, but the audiophiles always have an answer to rebut it with.

        I’m into audio and headphones, but since I’ve never been able to reliably discern a difference with hi-res audio, I no longer let it concern me.

        • PastyWaterSnake@lemmy.world

          I’ve bought pretty expensive equipment, tube amplifier, many fancy headphones, optical DACs. A library full of FLAC files. I even purchased a $500 portable DAP. I’ve never been able to reliably tell a difference between FLAC and 320k MP3 files. At this point, it really doesn’t concern me anymore either, but I at least like to see my fancy tube amp light up.

          I will say, though, $300 seems to be the sweet-spot for headphones for me.

          • vividspecter@lemm.ee

            I’ve never been able to reliably tell a difference between FLAC and 320k MP3 files

            I just keep FLAC around so I can transcode them to new lossy formats as they improve. And so I can transcode aggressively for my mobile when I’m streaming from home, and don’t need full transparency.
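
            For what it’s worth, the “keep FLAC as the master, transcode aggressively for mobile” workflow is easy to script. Here’s a minimal sketch in Python, assuming ffmpeg with libopus is installed; the paths and the 96k Opus bitrate are purely illustrative:

            ```python
            # Minimal sketch: mirror a FLAC library as smaller Opus files for mobile use.
            # Assumes ffmpeg (built with libopus) is on PATH; paths and bitrate are examples.
            import subprocess
            from pathlib import Path

            SRC = Path.home() / "Music" / "flac"         # hypothetical source library
            DST = Path.home() / "Music" / "opus-mobile"  # hypothetical output folder

            for flac in SRC.rglob("*.flac"):
                out = (DST / flac.relative_to(SRC)).with_suffix(".opus")
                out.parent.mkdir(parents=True, exist_ok=True)
                subprocess.run(
                    ["ffmpeg", "-i", str(flac), "-c:a", "libopus", "-b:a", "96k", str(out)],
                    check=True,
                )
            ```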

          • barsoap@lemm.ee

            Blackmail – Evon. That’s the one song where I ever heard a difference, though that was Ogg; dunno what bitrate I used back then, but it was sufficient for everything else. Listening on YouTube, yep, that’s mushy. The noisy goodness that kicks in at 0:30 is crisp as fuck on CD.

            …just not the kind of thing those codecs are optimised for, I’d say. Also, it still sounds fine, just a bit disappointing if you’ve ever heard the uncompressed thing. Which is also why you should never try electrostatic headphones.

          • pete_the_cat@lemmy.world

            Yeah, there’s a clear difference between a pair of $25 or $50 headphones and a pair that costs a few hundred. When I first got my Sony WH-1000XM3s I let my coworker try them and he said “Wow, I didn’t know music could sound this good!”. When I upgraded to the XM4s a few years later I let my brother try them and he was similarly impressed.

            Beyond a few hundred dollars, into the thousand-dollar range, you hit diminishing returns.

        • bitwolf@lemmy.one

          Imo the biggest bump is from MP3 to lossless. The drums sound more organic on FLACs, whereas on most MP3s they sound like computer MIDI.

          The biggest bump for me, though, was the change in headphones. It made my really old 256kbps AAC music sound bad.

          • vext01@lemmy.sdf.org

            Tried FLAC vs 192k Vorbis with various headphones, e.g. Moondrop Starfield, FiiO FA1, Grado SR80x…

            Couldn’t tell a difference. Kept using Vorbis.

        • vividspecter@lemm.ee

          I’d somewhat call myself an audiophile, just one that cares about actual measurements and audibility, not snake oil. Haven’t heard a good term for that yet, though.

          Audiophiles also tend to care about some sort of audio purity, but I’m willing to go wild with EQ, room correction, and impulse responses, which is pretty much the opposite of purity.

      • otp@sh.itjust.works

        They have tests you can take to see if you can hear the difference. A lot of people fail! Lol

        • Lesrid@lemm.ee

          Usually percussion is where it’s easiest to notice the difference. But typically people prefer the relatively more compressed sound!

      • Sombyr@lemmy.zip

        I’d thought I could hear a difference with hi-res audio, but after reading up on it I’m starting to think it may have been some issue with the tech I was using, whether my headphones or something else, that made compressed audio sound veeeery slightly staticky when high notes or loud parts of the track played.
        Personally though, even if it wasn’t, the price of the equipment wasn’t worth it for a difference that was only perceptible if I was listening for it. Not to mention it’s near impossible to find hi-res tracks from most bands. Most tracks claiming to be hi-res are just converted low-res tracks and thus have no actual difference in sound quality; the only difference is that the file is way larger for no good reason.

  • morrowind@lemmy.ml

    Well, no, because most people aren’t getting them. It’s nice, but it’s difficult to justify spending hundreds on a slightly better screen.

    • 1984@lemmy.today

      This tech trickles down to mainstream in a few years. That’s always how it is.

      • TAYRN@lemmy.world

        Cool, so in a few years we’ll have a screen which isn’t better in any noticeable way?

        • Plopp@lemmy.world

          Don’t be so negative, imagine a phone screen at 480 Hz. It’ll be great for when you have too much charge left in your battery and need to drain some.

  • Lemminary@lemmy.world

    Finally, a screen with the refresh rate that my cat can enjoy! He sure is gonna love that Tom & Jerry like no other cat that ever lived.

  • ColeSloth@discuss.tchncs.de

    I don’t need or want a phone over 90Hz, or a PC screen over 180Hz. A phone above that is a waste of battery, and a PC screen above that is a waste of money.

    • Vlyn@lemmy.zip

      Then don’t buy them? With better screens coming out, the ones you do want to buy get cheaper.

      Back in the day 144Hz screens cost a premium; now you can have them for cheap.

      • Potatos_are_not_friends@lemmy.world

        I stopped buying TVs from 2000 until like two years ago, when I saw them on sale for like $200. Been living off of projectors and a home server. I skipped so many “innovations” like curved, flat, HD, 4K, TrueColor.

        Weird that it has an OS; that was a shocker.

        I look forward to what TVs bring in 2040.

        • Vlyn@lemmy.zip

          I mean, OLEDs are damn amazing image-quality-wise, but I’m also not a fan of “smart” TVs. The apps can be useful (like native Netflix, Amazon Video and so on), but 90% of the time I use my PC over HDMI.

        • devfuuu@lemmy.world

          You know what’s hot? 3D televisions!!

          I’m so glad that hype died out with people understanding it was stupid. Just thinking about all the ones who bought one.

  • Snoopey@lemmy.world

    All I want is a 27/28-inch OLED 4K monitor with good HDR. I don’t care about the refresh rate as long as it’s 60Hz+.

    • dai@lemmy.world

      Minimum for me would be 120Hz. I’ve been using 120Hz since 2012 (12 years… man) and anything less feels like a massive step backwards. My old S10+ and my cheapie laptop feel sluggish in any animation or transition scenario.

    • bitwolf@lemmy.one

      I’m sticking with IPS until MicroLED matures enough for me to afford it.

      OLED was never designed to be used as a computer monitor, and I don’t want a monitor that only lasts a couple of years.

      Researchers just designed a special two-layer OLED (thicker than current panels) that doubles the lifespan to 10,000 hours at 50% brightness without degrading.

      I’m totally with you on good HDR though. When it works, it’s as night-and-day as 60 → 144Hz felt for me.

      • Snoopey@lemmy.world

        Burn-in is a non-issue for regular all-day use. As long as you aren’t displaying a static image at 100% for literally years while actively stopping the screen from running its preventative measures, you’ll be fine.

        • bitwolf@lemmy.one

          Can desktop computers do those preventative measures? I haven’t seen any desktop interface for the mitigations Samsung puts on its phones.

          Desktops also display static images 100% of the time, unless you change your usage behavior to use full screen all the time.

  • Jumi@lemmy.world

    I splurged on a 4k 144hz monitor when I worked constant night shifts in covid times and I don’t think I will ever need something else.

    • Squizzy@lemmy.world

      What is the idea behind 144? It seems too particular a number to be arbitrary. 24, 60 and 120 seem to be based on other tech and related media.

      • Darthjaffacake@lemmy.world

        I found people online saying it’s because it’s 24 frames (the standard film frame rate) higher than 120, meaning it can be used to watch movies with integer scaling (a 1:6 frame-rate ratio rather than 1:5.5 or something strange). Take that with a massive grain of salt, though, because lots of people say there are other reasons.

        • Humanius@lemmy.world

          If consuming media with integer scaling is the main concern, then 120Hz would be better than 144Hz, because it can be divided by 5 to make 24Hz (for movies) and divided by 2 or 4 to make 30/60Hz (for TV shows).

          144Hz only cleanly divides into 24Hz by dividing it by 6. In order to get to 60Hz you need to divide by 2.4, which is not an integer.

          And with either refresh rate, 25/50Hz PAL content still doesn’t fit evenly: neither 120 nor 144 is an integer multiple of 25 or 50.
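
          To make the divisibility argument concrete, here’s a quick sketch (Python; the frame rates listed are just the common ones mentioned in this thread) that checks which content frame rates fit a given refresh rate an integer number of times:

          ```python
          # Which common content frame rates divide evenly into a given refresh rate?
          content_fps = [24, 25, 30, 50, 60]
          refresh_hz = [120, 144, 240, 600]

          for hz in refresh_hz:
              clean = [fps for fps in content_fps if hz % fps == 0]
              print(f"{hz} Hz is an integer multiple of: {clean}")
          ```

          Running it shows 120Hz covers 24/30/60, 144Hz only covers 24, and neither handles the 25/50 PAL rates cleanly.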

          • Darthjaffacake@lemmy.world

            Yeah, as I said, take that with a massive grain of salt; some people say it’s because of a limit on how much data HDMI could send, so it could be that.

        • Squizzy@lemmy.world

          Oh man, the maths didn’t click with me; of course, it’s just another 24 frames.

      • Lojcs@lemm.ee

        I think it’s because 120Hz + overclock can get to 144, so someone probably started selling factory-OC’d 120Hz screens at 144Hz and then it caught on. Then someone did the same to native 144Hz panels and we got 165Hz. I’m more curious about why 165 was chosen; it’s not a nice number like 144. Maybe since VRR is widespread now they didn’t need nice numbers.

      • Jumi@lemmy.world

        I honestly have no idea, but so far I’ve never really reached 144fps or 4K, much less both simultaneously.

  • Donkter@lemmy.world

    This says “can you tell?” as if I don’t go maybe 10 years between new screens, and even then the last one I got was used.

    • Im_old@lemmy.world

      One of my two screens is aiming for 20 years of (intensive!) service. It’s even still in 4:3 format. I will probably replace it in the next couple of years, if the magic smoke doesn’t escape first!

    • H0neyc0mb@lemmy.world

      You don’t sound like a mindless consumer. Unfortunately, that isn’t most people.

    • PLAVAT🧿S@sh.itjust.works

      Maybe this is what Jaden Smith meant when he famously stated:

      How Can Mirrors Be Real If Our Eyes Aren’t Real

      Wow, still blown away…

  • BetaDoggo_@lemmy.world

    It won’t matter until we hit 600. 600Hz integer-scales to every common media framerate, so frame timings are always perfect. Really, they should be focusing on better and cheaper variable refresh rate, but that’s harder to market.

    • patatahooligan@lemmy.world

      Well, not really, because television broadcast standards do not specify integer framerates. E.g. North America uses ~59.94fps. It would take insanely high refresh rates to be able to play all common video formats, including TV broadcasts. Variable refresh rate can fix this, but only for a single fullscreen app.
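
      A small sketch of that wrinkle (Python; the NTSC-style rates are the standard integer × 1000/1001 values): even 600Hz doesn’t divide the ~23.976/29.97/59.94fps broadcast rates an integer number of times.

      ```python
      from fractions import Fraction

      # NTSC-derived broadcast rates are integer rates slowed by a factor of 1000/1001.
      ntsc_rates = {"~23.976": Fraction(24000, 1001),
                    "~29.97": Fraction(30000, 1001),
                    "~59.94": Fraction(60000, 1001)}

      for name, fps in ntsc_rates.items():
          cycles = Fraction(600) / fps   # refresh cycles per video frame at 600 Hz
          print(f"{name} fps -> {float(cycles):.3f} cycles/frame; integer? {cycles.denominator == 1}")
      ```

      Output: roughly 25.025, 20.02 and 10.01 cycles per frame, so none of them land on an integer.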

    • Vlyn@lemmy.zip

      I mean the 240 I use already does that. So would 360 or 480. No clue why you fixate on 600.