• SatyrSack@lemmy.sdf.org · +279 / -6 · 3 days ago

    > Immediately get noticed

    Realistically, though, we are only aware of that one because it was noticed in that unlikely scenario and then widely reported. For all we know, most open source backdoors are alive and well in our computers, having gone unnoticed for years.

    • towerful@programming.dev · +103 · 3 days ago

      Yup.
      But in open source it CAN be noticed, by anyone determined enough to dig into its side effects.
      Proprietary software? You file a regression bug that startup takes 500ms longer, and it might get looked at.

      Also, backdoors that are discovered in open source software improve automated software auditing.

      • jj4211@lemmy.world · +5 · 2 days ago

        Yeah, you open a bug like that against proprietary software and it will immediately get rationalized away as having no business case to address, likely with someone who has zero direct development responsibility writing a BS explanation like "the small impact was due to a number of architectural changes."

        Speaking as someone with years of exposure to business-managed issue handling.

      • alaphic@lemmy.world · +58 · 3 days ago

        > 500ms longer, and it might get looked at.

        Why would you even lie to the poor fellow like that? 🤣 lol

      • magic_lobster_party@fedia.io · +21 · 3 days ago

        The flaw also highlighted a social engineering exploit. It's not the first time a vulnerability has entered open source software due to social pressure on the maintainer; the event-stream exploit is a notable example.

        This is difficult to account for. You can’t build automated tooling for social engineering exploits.

    • jj4211@lemmy.world · +5 · 2 days ago

      Evidence suggests this isn’t the case.

      We know of far more closed source backdoors despite them being harder to notice in practice, whether they were found before they became a problem or only after they had been used in an attack. So we know backdoors can get noticed even without access to the source code.

      Meanwhile we have comparatively fewer backdoor-type findings in major open source software, despite (and thanks to) the increased scrutiny. So many people want to pad their resumes with “findings” that they hit up open source software relentlessly. This can be obnoxious, because many of the findings are flat-out incorrect or have no actual security implications, but among the noise is a relatively higher likelihood that real issues get noticed.

      The nature of the xz attack shows the increased complexity of trying to backdoor open source: sneaking a malicious binary patch into test data because the source code would be too obvious, and hiding the logic that applies the patch in obfuscated build scripts that would only trigger under specific circumstances. Meanwhile, closed source backdoors have frequently been pretty straightforward, yet still managed to ship and go undetected.

      Even if we fail to detect unused backdoors, eventually someone will actually want to use their backdoor, so it should be found at some point.

    • CodexArcanum@lemmy.dbzer0.com · +25 · 3 days ago

      I feel like it's a mixed bag. Certainly there's an infinitely higher chance of someone randomly noticing a backdoor in OSS than in closed source, simply because any OSS project in use has someone looking at it. Many closed systems have dusty corners that haven't had programmer eyes on them in years.

      But also, modern dev requires either more vigilance than most of us have to give or more trust than most of us would ideally be comfortable offering. Forget leftpad: I've had npm dependencies run a full Python script to compile and build sub-dependencies. Every time I run npm update, it could be mining a couple of bitcoins for all I know, in addition to installing gigs and gigs of other people's code.

      The whole industry had deep talks after leftpadgate about what needed to be done, and ultimately not much changed. NPM changed policy so that people couldn't just disappear their packages, but we didn't come up with a better way.

      Pretty much every language has its own NPM now, so the problem is more widespread than ever. With Rust, the build can run arbitrary macros and Rust code in build scripts, and it can embed C dependencies. I'm not saying it would be super easy to hide something in Cargo (I haven't tried, so I don't know), but I do think the build system is incredibly vulnerable to supply chain attacks. A dependency chain could easily pull in some backdoored native code, embed it deep into your app, and you might never realize it's even there.
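
      To make the build-script point concrete, here is a minimal, hypothetical build.rs sketch (the file name and everything in it are illustrative, not taken from any real crate). It only shows that Cargo compiles and runs arbitrary Rust code at build time, with the full privileges of whoever runs the build:

      ```rust
      // build.rs -- hypothetical sketch. Cargo compiles and runs this file
      // automatically before building the crate that contains it.
      use std::env;
      use std::fs;
      use std::path::Path;

      fn main() {
          // A build script runs with the full privileges of the user invoking
          // `cargo build`: it can read the environment, touch the filesystem,
          // or spawn processes, and none of that appears in the crate's lib.rs.
          let out_dir = env::var("OUT_DIR").expect("Cargo always sets OUT_DIR");

          // Benign stand-in for "arbitrary work at build time": generate a
          // source file that the crate will include. A malicious script could
          // just as easily change what it generates based on some obscure
          // condition, in the spirit of the obfuscated build steps above.
          let generated = Path::new(&out_dir).join("generated.rs");
          fs::write(
              &generated,
              "pub fn built_by() -> &'static str { \"build.rs\" }\n",
          )
          .expect("failed to write generated source");

          // Tell Cargo when to rerun this script.
          println!("cargo:rerun-if-changed=build.rs");
      }
      ```

      The crate's lib.rs would typically pull the generated file back in with include!(concat!(env!("OUT_DIR"), "/generated.rs")); the point is simply that whatever a dependency's build script decides to write ends up compiled into your binary, and nobody reviewing your own source tree would ever see it.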

      • SatyrSack@lemmy.sdf.org · +52 · 3 days ago

        Yes, this particular incident.

        https://en.wikipedia.org/wiki/XZ_Utils_backdoor

        > In February 2024, a malicious backdoor was introduced to the Linux build of the xz utility within the liblzma library in versions 5.6.0 and 5.6.1 by an account using the name “Jia Tan”.[b][4] The backdoor gives an attacker who possesses a specific Ed448 private key remote code execution through OpenSSH on the affected Linux system. The issue has been given the Common Vulnerabilities and Exposures number CVE-2024-3094 and has been assigned a CVSS score of 10.0, the highest possible score.[5]

        > Microsoft employee and PostgreSQL developer Andres Freund reported the backdoor after investigating a performance regression in Debian Sid.[8] Freund noticed that SSH connections were generating an unexpectedly high amount of CPU usage as well as causing errors in Valgrind,[9] a memory debugging tool.[10]

    • pinball_wizard@lemmy.zip · +9 · 2 days ago

      > For all we know…

      This isn’t something we need to speculate about. The vulnerability histories of popular closed and open source tools are both part of public data sets.

      Looking into that data, the thing that stands out is that certain proprietary software vendors have terrible security track records, and open source tools from very small teams may be a mixed bag.

        • limer@lemmy.dbzer0.com · +3 · 3 days ago

          Which is in itself worrying to me: given that there are now tens of thousands of in-use libraries and millions of programmers, the chances are high that someone has tried at least once more than we have heard about.

          And I know there have been several attempts, but there seems to be a lack of information about all of them in one easy-to-read place.

      • SatyrSack@lemmy.sdf.org · +1 / -1 · 2 days ago

        There doesn’t need to be any evidence. This is something that is impossible to prove one way or the other, like Last Thursdayism.