• 6nk06@sh.itjust.works · 3 days ago

    Given these positive signals

    Those idiots waited for 4 years because they followed the hype of the moment. I’m glad I removed Google from my life.

    • panda_abyss@lemmy.ca · 3 days ago

      This must be your first time seeing what Google support looks like

      This is pretty standard unless you can get an exec’s personal attention.

    • wischi@programming.dev · 3 days ago

      Feel free to use floppy disks. Btw if you are online, you use WebP and PNG all the time 🤣

      • Aequitas@feddit.org · 3 days ago (edited)

        If you are using Firefox:

        1. Enter the following in the address bar: about:config
        2. Search for: image.webp.enabled
        3. Set it to false

        Websites will deliver JPG/PNG instead of WebP again.
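Why this works: Firefox advertises WebP support in the HTTP Accept header, and many servers pick the image format based on it. A minimal sketch of that negotiation (hypothetical function, not any specific server's code):

```python
# Sketch only: how a server typically negotiates WebP vs. JPEG from the
# Accept header. With image.webp.enabled set to false, Firefox stops
# advertising "image/webp", so servers negotiating this way fall back
# to JPEG/PNG.

def pick_image_format(accept_header: str) -> str:
    """Return 'webp' if the client advertises WebP support, else 'jpeg'."""
    # Strip quality parameters like ";q=0.8" and collect the media types.
    advertised = {part.split(";")[0].strip() for part in accept_header.split(",")}
    return "webp" if "image/webp" in advertised else "jpeg"

# With WebP enabled, Firefox sends something like:
print(pick_image_format("image/avif,image/webp,*/*"))  # webp
# With image.webp.enabled = false, image/webp is gone from the header:
print(pick_image_format("image/avif,*/*"))  # jpeg
```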
        • SleeplessCityLights@programming.dev · 2 days ago

          Maybe this should come with a warning. The purpose of WebP is to quickly serve images to the user without grabbing the entire image data. Without WebP, all images will be fully loaded, and under the right conditions a page could load really slowly.

          • [object Object]@lemmy.world · 2 days ago (edited)

            I love WebP, but your explanation is a bit confused. WebP is typically lossy, just like JPEG; it's just more efficiently compressed, meaning a smaller file for the same image quality. So there's no "entire image data"; there are only different approximations of the original image and different compressed files. Full-blown lossless images in PNG or other formats take several times more data.

            Disabling webp in favor of jpeg would use like 20-40% more data, in comparison. Which still sucks, but not as much.

            Edit: maybe more than 40%, actually. Iirc I’ve seen webps that were half the size of jpegs. It’s a good format, shame it’s adopted rather poorly.
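The two percentages point in different directions, which is easy to trip over: "WebP is 30% smaller than JPEG" means the JPEG is about 43% larger than the WebP, and "half the size" means the JPEG is 100% larger. A quick sanity check on the arithmetic (illustrative numbers only):

```python
# "WebP is X% smaller than JPEG" and "JPEG is Y% larger than WebP"
# are not the same number; converting between them:

def larger_by(smaller_fraction: float) -> float:
    """If WebP is `smaller_fraction` smaller than the JPEG, return how
    much larger the JPEG is relative to the WebP."""
    return 1 / (1 - smaller_fraction) - 1

print(round(larger_by(0.30), 3))  # WebP 30% smaller -> JPEG ~42.9% larger
print(larger_by(0.50))            # WebP half the size -> JPEG 100% larger
```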

            • SleeplessCityLights@programming.dev · 2 days ago

              I wasn’t going to get into the lossiness of the formats, so I simplified “compressed format” to “full image”. It is interesting that it only saves 20%–40%. I was under the impression that the page only rendered the image at the size necessary to fit the layout, not the full-resolution image. Forcing it to less lossy or lossless would mean the larger image would always be available to be rendered without another web request.

              • [object Object]@lemmy.world · 2 days ago (edited)

                That’s a rather interesting consideration as to whether rendering at smaller sizes skips decoding parts of the image.

                First, the presented file is normally always loaded in full, because that’s how file transfer works over the web. Until lately, there were no different sizes available, and that only became widely-ish spread because of Apple’s ‘Retina’ displays with different dots-per-inch resolution, mostly hidpi being two times the linear size of the standard dpi. Some sites, like Wikipedia, also support resizing images on the fly to some target dimensions, which results in a new image of the JPEG or other format. In any case, to my somewhat experienced knowledge, JPEG itself doesn’t support sending every second row or anything like that, so you always get a file of a predetermined size.

                First-and-a-half, various web apps can implement their own methods for loading lower- or higher-res images, which they prepare in advance. E.g. a local analogue to Facebook almost certainly loads various prepared-in-advance low-res images for viewing in the apps or on the site, but has the full-res images available on request, via a menu.

                Second, I would imagine that JPEG decoding always results in an image of the original size, which is then dynamically resized to the viewport of the target display — particularly since many apps allow zooming in or out of the image on the fly. Specifically, I think decoding the JPEG image creates a native lossless image similar to BMP or somesuch (essentially just a 2d array of pixel colors), which is then fed to the OS’s rendering capabilities, taking quite a chunk of memory. Of course, by now this is all accelerated by the hardware a whole lot, with the common algorithms being prepared to render raw pixels, JPEG, and a whole bunch of other formats.

                It would be quite interesting if file decoding itself could just skip some of the rows or columns, but I don’t think that’s quite how compression works in current formats (at least in lossy ones, which depend on the previous data to encode later data). Although afaik JPEG encodes the image in rectangles like 16x16 or something like that, so it could be that whole chunks could be skipped altogether.
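A back-of-the-envelope check on the "native lossless image" point: the decoded bitmap's memory depends only on pixel dimensions, never on the compressed file size, and JPEG's minimum coded units (8x8 to 16x16 depending on chroma subsampling) tile the whole image. Rough numbers, assuming 4 bytes per pixel (RGBA):

```python
# Decoded size is pure geometry: width x height x bytes-per-pixel,
# regardless of how small the compressed JPEG file was.

def decoded_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Memory for the decoded bitmap (RGBA assumed: 4 bytes/pixel)."""
    return width * height * bytes_per_pixel

def mcu_blocks(width: int, height: int, mcu: int = 16) -> int:
    """Number of 16x16 minimum coded units covering the image;
    dimensions are rounded up to whole blocks."""
    return -(-width // mcu) * -(-height // mcu)

# A 4000x3000 photo decodes to 48 MB of raw pixels...
print(decoded_bytes(4000, 3000))        # 48000000 bytes (~45.8 MiB)
# ...even if the JPEG file itself is only 2 MB, and it is covered by:
print(mcu_blocks(4000, 3000))           # 250 * 188 = 47000 blocks
```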

      • Endymion_Mallorn@kbin.melroy.org · 3 days ago

        No, I have WebP blocked in my about:config. And I use Pale Moon, which actually blocks the things unlike modern FF. And I don’t load PNG either.

  • reddig33@lemmy.world · 3 days ago

    I would be more excited about JPEG XL if it was backward compatible. Not looking forward to yet another image standard that requires OS and hardware upgrades simply so servers can save a few bytes.

    • Laser@feddit.org · 3 days ago

      How would a new format be backwards-compatible? At least JPEG-XL can losslessly compress standard jpg for a bit of space savings, and servers can choose to deliver the decompressed jpg to clients that don’t support JPEG-XL.

      Also from Wikipedia:

      Computationally efficient encoding and decoding without requiring specialized hardware: JPEG XL is about as fast to encode and decode as old JPEG using libjpeg-turbo

      Being a JPEG superset, JXL provides efficient lossless recompression options for images in the traditional/legacy JPEG format that can represent JPEG data in a more space-efficient way (~20% size reduction due to the better entropy coder) and can easily be reversed, e.g. on the fly. Wrapped inside a JPEG XL file/stream, it can be combined with additional elements, e.g. an alpha channel.
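The serving strategy described above (store the lossless JXL recompression, hand legacy clients the reconstructed JPEG) can be sketched roughly like this. Assumed function names and the ~20% figure from the quote; not any real server's code:

```python
# Sketch of JPEG XL content negotiation: clients advertising image/jxl
# get the recompressed file (~20% smaller per the quote above); everyone
# else gets the bit-exact JPEG reversed from it on the fly.

def choose_response(accept_header: str, jpeg_size: int) -> tuple[str, int]:
    """Pick (media type, approximate size) for a stored JXL-recompressed JPEG."""
    types = {p.split(";")[0].strip() for p in accept_header.split(",")}
    if "image/jxl" in types:
        return ("image/jxl", int(jpeg_size * 0.80))  # ~20% smaller
    return ("image/jpeg", jpeg_size)  # losslessly reconstructed original

print(choose_response("image/jxl,image/webp,*/*", 1_000_000))
print(choose_response("image/webp,*/*", 1_000_000))
```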

      • reddig33@lemmy.world · 3 days ago

        All you have to do is add a small traditional JPEG image at the start of the file. It doesn’t have to be high resolution or more than a couple of kb. The new format decoder would know this, and skip the traditional jpeg “header”, rendering the newer file format embedded in the image.

        • wischi@programming.dev · 3 days ago

          Would completely defeat the purpose of making a new smaller file format if we prefix it with the old format.

          • reddig33@lemmy.world · 3 days ago

            If you’re really saving 20% in file size with XL, adding back a very compressed preview image that takes up one or two percent isn’t going to cost you much.
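The overhead math in that argument, spelled out with illustrative numbers (the 20% and 2% figures come from the comments above, not from any measurement):

```python
# If JXL saves 20% over JPEG and a prepended JPEG preview costs 2% of
# the original size, the combined file is still ~18% smaller overall.

def net_saving(xl_saving: float, preview_overhead: float) -> float:
    """Net size reduction vs. the original JPEG after prepending a preview."""
    return xl_saving - preview_overhead

print(round(net_saving(0.20, 0.02), 2))  # 0.18 -> still ~18% smaller
```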

    • LordKitsuna@lemmy.world · 3 days ago

      It requires neither of those upgrades though? Unless you’re still using Windows XP I guess for some reason. It’s just an update to the image decoder

    • REDACTED@infosec.pub · 3 days ago

      What does backward compatibility in an image format even mean? Being able to open it in the Windows image viewer?