• rumba@lemmy.zip · 17 hours ago

    Meanwhile my NixOS install had a failure to mount an encrypted swap at boot, costing me a minute and a half of downtime on every boot. It only took 30 seconds to fix, but 6 months to get around to.

  • bender223@lemmy.today · 1 day ago

    Open source and proprietary software development have very different goals. Open source is generally about making software that’s useful. Proprietary software’s goal is to make money by any means necessary. Viewing it from that angle, open source devs and the community are more motivated to keep an eye out for backdoors, while with proprietary software, they won’t give a fuck until something affects their bottom line. Just because of that, I feel safer using open source software in general.

    • douglasg14b@lemmy.world · 1 day ago

      The sad part is that you’re right.

      And the reason it’s sad is that most of the individual developers on proprietary projects care deeply about the project itself and have the same goals as they do with open source software, which is just to make something that’s useful and do cool shit.

      Yet the business itself can force them not to take care of problems, or force them to go in directions that are counter to their core motivations.

    • UnderpantsWeevil@lemmy.world · 1 day ago

      Viewing it from that angle, open source devs and the community are more motivated to keep an eye out for backdoors.

      I think it is less an issue of motivation and more an issue of selection bias. Lots of open source projects fall out of support. Lots of them are riddled with bugs. Lots of them have clunky interfaces and high latency and a myriad of other problems that never get solved, because the original designers never put in the leg work.

      But the ones that do have a lively community and a robust design are the ones that get mainstream adoption. And this produces a virtuous cycle of new users, some of whom become new contributors, who expand functionality, and attract more new users. When you have a critical mass of participants, they collectively have an interest in seeing the project get resources to improve and overcome obstacles and keep the project alive.

      Private developers also have an elephant’s graveyard of failed software. But they don’t subsist on the same kind of critical mass of participation. A private development company really only needs one or two whale clients to sustain themselves. Microsoft had IBM. Oracle had Exxon. TurboTax has the IRS. Look at how LLM developers like OpenAI stick around with billions in funding despite enjoying no real revenue stream.

      I would say that the maxim “If you’re not the client then you’re the product” technically holds in both instances. There’s no particular reason why a social media platform like Facebook or TikTok couldn’t be open source and still ruthlessly data mine its end-users. In the same vein, a private firm like Palantir or Fidelity or AT&T has ample incentive to keep their systems secure because security is at the heart of their bottom line.

  • Plesiohedron@lemmy.cafe · 2 days ago

    This is why open source, total transparency, radical free speech, and democracy are the one and only way. Because if there’s even one little shadow, there will be a scorpion hiding in it.

    • SanityRequired@lemmy.world · 1 day ago (edited)

      Is this still true in the age of targeted social media propaganda?

      Seems to me that radical free speech, without moderating for basic accuracy or malicious disinfo, has pretty much kicked off the downfall of the American experiment.

        • Sconrad122@lemmy.world · 1 day ago

          Is this not just “the free market of ideas”? It has the same pitfalls as the free market of money: if consumers are not educated and motivated to prune out bad actors, the market is easily subverted by malicious ones. Relying on people to regulate their information diets is betting that individuals with limited resources and motivation can defend themselves and the collective against concerted, well-resourced, and well-organized efforts to abuse the market of ideas, because there is immense money and power to be gained from doing so.

  • SatyrSack@lemmy.sdf.org · 2 days ago

    Immediately get noticed

    Realistically, though, we are only aware of that one because it was noticed in that unlikely scenario and then widely reported. For all we know, most open source backdoors are alive and well in our computers, having gone unnoticed for years.

    • jj4211@lemmy.world · 23 hours ago

      Evidence suggests this isn’t the case.

      We know of so many more closed source backdoors despite them being harder to notice in practice, whether found before they became a problem or after they were used in an attack. So we know backdoors can get noticed even without access to source code.

      Meanwhile we have comparatively fewer backdoor type findings in major open source software, despite and thanks to increased scrutiny. So many people want to pad their resume with “findings” and go hit up open source software relentlessly. This can be obnoxious because many of the findings are flat out incorrect or have no actual security implications, but among the noise is a relatively higher likelihood that real issues get noticed.

      The nature of the xz attack shows the increased complexity of trying to backdoor open source: sneaking a malicious binary patch into test data, because changing the source code would be too obvious, and hiding the application of that patch in obfuscated build scripts that would only take effect under specific circumstances. Meanwhile, closed source backdoors have frequently been pretty straightforward but still managed to ship and go undetected.

      Even if we failed to detect an unused backdoor, at some point someone would actually want to use it, so it should be found eventually.

    • CodexArcanum@lemmy.dbzer0.com · 2 days ago

      I feel like it’s a mixed bag. Certainly there’s an infinitely higher chance of someone randomly noticing a backdoor in OSS than in closed source, simply because any OSS project in use has someone looking at it. Many closed systems have dusty corners that haven’t had programmer eyes on them in years.

      But also, modern dev requires either more vigilance than most of us have to give or more trust than most of us would ideally be comfortable offering. Forget leftpad, I’ve had npm dependencies run a full python script to compile and build sub dependencies. Every time I run npm update, it could be mining a couple of bitcoins for all I know in addition to installing gigs and gigs of other people’s code.

      The whole industry had deep talks after leftpadgate about what needed to be done and ultimately, not much changed. NPM changed policy so that people couldn’t just disappear their packages. But we didn’t come up with some better way.

      Pretty much every language has its own NPM now, so the problem is more widespread than ever. With Rust, Cargo can run arbitrary macros and Rust code in the build files, and it can embed C dependencies. I’m not saying it would be super easy to hide something in Cargo (I haven’t tried, so I don’t know), but I do think the build system is incredibly vulnerable to supply chain attacks. A dependency chain could easily pull in some backdoored native code, embed it deep into your app, and you might never realize it’s even there.
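      The dependency-chain point can be sketched with a toy example (all package names below are hypothetical): the handful of dependencies you declare directly are a small fraction of the code you end up trusting.

```python
from collections import deque

# Toy dependency graph with hypothetical package names. "myapp"
# declares only two direct dependencies, but transitively trusts five,
# including a native extension built during install.
DEPS = {
    "myapp": ["webframework", "leftpad"],
    "webframework": ["httplib", "templating"],
    "templating": ["native-ext"],  # imagine this one runs a native build step
    "leftpad": [],
    "httplib": [],
    "native-ext": [],
}

def transitive_deps(pkg):
    """Breadth-first walk collecting every package reachable from pkg."""
    seen, queue = set(), deque(DEPS.get(pkg, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(DEPS.get(dep, []))
    return seen

print(sorted(transitive_deps("myapp")))
```

      Real lockfiles routinely resolve to hundreds of packages this way, and each one is a place where build-time code could hide.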

    • towerful@programming.dev · 2 days ago

      Yup.
      But in open source it CAN be noticed, by anyone determined enough to dig into its side effects.
      Proprietary software? You file a regression bug that startup takes 500ms longer, and it might get looked at.

      Also, backdoors that are discovered in open source software improve automated software auditing.
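      As a rough sketch of that feedback loop (the patterns below are illustrative, not a real auditing tool), each discovered attack gives scanners concrete red flags to check build scripts for:

```python
import re

# Illustrative red flags only; real auditing tools use far more
# sophisticated analysis than keyword matching on build scripts.
SUSPICIOUS_PATTERNS = [
    r"\beval\b",           # evaluating dynamically constructed code
    r"base64",             # encoded payloads
    r"/dev/tcp/",          # shell-based network connections
    r"\btr\b.*\|\s*sh\b",  # deobfuscate-then-execute pipelines (xz-style)
]

def scan_build_script(text):
    """Return every suspicious pattern that matches the script contents."""
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, text)]

hits = scan_build_script('payload=$(tr "a-z" "A-Z" < tests/data.bin | sh)')
```

      Lists like this grow with every publicly dissected incident, which is one concrete way open disclosure feeds back into tooling.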

      • jj4211@lemmy.world · 23 hours ago

        Yeah, you open a bug like that against proprietary software and it will immediately get rationalized away as having no business case to address, likely with someone who has zero direct development responsibility writing a BS explanation like “the small impact was due to a number of architectural changes.”

        Speaking as someone with years of exposure to business managed issue handling.

      • alaphic@lemmy.world · 2 days ago

        500ms longer, and it might get looked at.

        Why would you even lie to the poor fellow like that? 🤣 lol

      • magic_lobster_party@fedia.io · 2 days ago

        The flaw also highlighted a social engineering exploit. It’s not the first time a vulnerability has entered open source software due to social pressure on the maintainer, notably the event-stream exploit.

        This is difficult to account for. You can’t build automated tooling for social engineering exploits.

    • pinball_wizard@lemmy.zip · 1 day ago

      For all we know…

      This isn’t something we need to speculate about. The vulnerability histories of popular closed and open source tools are both part of public data sets.

      Looking into that data, the thing that stands out is that certain proprietary software vendors have terrible security track records, and open source tools from very small teams may be a mixed bag.

      • SatyrSack@lemmy.sdf.org · 2 days ago

        Yes, this particular incident.

        https://en.wikipedia.org/wiki/XZ_Utils_backdoor

        In February 2024, a malicious backdoor was introduced to the Linux build of the xz utility within the liblzma library in versions 5.6.0 and 5.6.1 by an account using the name “Jia Tan”.[b][4] The backdoor gives an attacker who possesses a specific Ed448 private key remote code execution through OpenSSH on the affected Linux system. The issue has been given the Common Vulnerabilities and Exposures number CVE-2024-3094 and has been assigned a CVSS score of 10.0, the highest possible score.[5]

        Microsoft employee and PostgreSQL developer Andres Freund reported the backdoor after investigating a performance regression in Debian Sid.[8] Freund noticed that SSH connections were generating an unexpectedly high amount of CPU usage as well as causing errors in Valgrind,[9] a memory debugging tool.[10]

        • limer@lemmy.dbzer0.com · 2 days ago

          Which in itself is worrying to me: given that there are now tens of thousands of in-use libraries and millions of programmers, the chances are high that someone has tried at least once more than we have heard about.

          And I know there have been several attempts, but there seems to be a lack of information about them all in one easy-to-read place.

      • SatyrSack@lemmy.sdf.org · 1 day ago

        There doesn’t need to be any evidence. This is something that is impossible to prove one way or the other, like Last Thursdayism.

  • MrMobius@sh.itjust.works · 2 days ago

    Makes me remember, wasn’t there a well-respected dev who, out of the blue, decided to add a vulnerability to a Linux package last year?

    • mic_check_one_two@lemmy.dbzer0.com · 1 day ago (edited)

      That’s what this meme is referencing. That was the XZ Utils backdoor. The contributor spent 5 years gaining the lead dev’s trust, waited for the lead dev to get busy with other things, then basically bullied the lead dev into handing over control of the project. They quietly pushed an SSH backdoor.

      And then they were almost immediately called out by a dude who was running benchmarks and realized that his SSH requests were taking like 5ms longer than they should. That delay was because the backdoor was checking the SSH request against a table of backdoor requests, to see if it should allow the connection even if the UN/PW was wrong.

      The big concern was that the SSH system was used all over the world. But rolling back to a previous version was easy, and most systems hadn’t updated yet anyways.
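      The detection described above boils down to benchmarking: measure an operation, compare against a baseline, and flag unexplained latency growth. A minimal simulated sketch (no real SSH involved; the 5 ms sleep stands in for the backdoor’s extra per-connection lookup work):

```python
import statistics
import time

def measure(fn, runs=30):
    """Collect wall-clock latency samples for one operation."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return samples

def clean_handshake():
    pass  # a near-instant stand-in for the unmodified code path

def backdoored_handshake():
    time.sleep(0.005)  # simulated extra ~5 ms of backdoor lookup work

baseline = statistics.median(measure(clean_handshake))
current = statistics.median(measure(backdoored_handshake))

# Flag a regression when the median latency grows by more than 2 ms.
regression = (current - baseline) > 0.002
print(f"baseline={baseline:.6f}s current={current:.6f}s regression={regression}")
```

      Using the median rather than the mean keeps one slow outlier run from triggering a false alarm.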

  • pemptago@lemmy.ml · 2 days ago

    Also, a lot of proprietary software relies on open source libraries. So unless vendors catch and patch those vulnerabilities themselves, without contributing the fixes back, proprietary software will be at least as vulnerable as the OSS it depends on.

        • jj4211@lemmy.world · 23 hours ago

        Sorry, there’s no business case for rebasing those dependencies. Please focus story points on active marketing requirements instead.

  • plyth@feddit.org · 2 days ago

    Proprietary backdoors just come with spies doing much more gymnastics to gain access to the people who know the secrets behind those backdoors.

  • Samskara@sh.itjust.works · 2 days ago

    Open source software is full of bugs and security vulnerabilities. Most code doesn’t get read by more than two people.

    • MTK@lemmy.world · 2 days ago (edited)

      Your statement is even more true of closed source. As someone who has worked at multiple companies, I can tell you that 99% of the code is written, PRed, QAed, and then ignored forever.

      • nieminen@lemmy.world · 2 days ago

        I can confirm. Unless the code causes issues people notice, nobody thinks about it after the PR.

        OSS has the benefit of people WANTING to do the work, so I feel they make more of an effort to make sure it’s stable and efficient, taking the extra time to test edge cases and random scenarios. People working on corporate software will more often than not simply meet the reqs of the request, do minimal testing, and send it off to the corporate machine.

        OSS also has the benefit of randos across the whole world being able to view and audit changes.

        • squaresinger@lemmy.world · 1 day ago

          OSS, on the other hand, has the downside of being free.

          That means it’s:

          • massively underfunded, because nobody donates
          • no SLA-style contracts to hold anyone accountable
          • usually no 3rd-party security audits, because free software (especially libraries and system tools) doesn’t go through procurement and thus doesn’t require them
          • everyone expects that “someone” will have already reviewed it because the code is open and used by millions of projects, while in reality it’s maintained by some solitary hero hacking away in his basement

          If stuff like OpenSSL were closed source software, it would be at least a mid-sized company making lots of revenue (because it’s used everywhere, even small license fees would rack up to a lot), with dozens of specialists working there, and since it would go through procurement, there would be SLAs and 3rd-party security audits.

          But since it’s FOSS, nobody cares and nobody donates, and it was a single developer working on it until Heartbleed. Then some of the large corporations that had based their whole internet security on this single dude’s work realized that more funding was necessary, and now it is a company with multiple people working there.

          But there are hundreds of other similarly important FOSS projects that are still maintained by a solitary hero not even making minimum wage from it, as the xz near miss showed.

          Just imagine that: nobody in their right mind would run a random company’s web app that has just one developer working on it in their spare time. That would be stupid, even if really nothing much depended on that app.

          But most of our core infrastructure for FOSS OSes and internet security depends on hundreds of projects maintained by just a single person in their free time.