• idefix@sh.itjust.works · 7 months ago

    I know it’s a joke, and I hate to be that guy, but this meme feels old and obsolete now. I can’t remember the last time I had to tweak my Linux. The fun is gone.

    • De_Narm@lemmy.world · 7 months ago

      For real. I recently had to swap my window manager to xmonad just to feel something again.

    • RadicalEagle@lemmy.world · 7 months ago

      Cool! Maybe I can challenge you. Can you help me figure out how I can get my Hyprland session back on my Arch install? I have a Radeon 7700 XT and I recently installed an RTX 4070 to assist with some compute tasks. With both cards installed GDM doesn’t populate the Hyprland option. If I remove the 4070 everything goes back to normal.

      (This is also a joke, you don’t need to help me troubleshoot this.)

      (Unless you actually know how in which case I can pay you $20 for your time)

        • RadicalEagle@lemmy.world · 7 months ago

          Haha, I was hoping that with all my monitors plugged into my AMD card it wouldn’t cause as many issues, but I was mistaken.

          I’m looking at it as an opportunity to learn more about the Linux kernel, the order certain modules get loaded in, and environment variables.
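
          For reference, the usual Hyprland-on-Nvidia starting point I’ve seen looks roughly like this (the device paths, module list, and exact variable names are assumptions for a setup like mine, not a verified fix for the dual-GPU case):

          # /etc/modprobe.d/nvidia.conf: enable DRM kernel mode setting for the Nvidia driver
          options nvidia_drm modeset=1

          # /etc/mkinitcpio.conf: load both GPU drivers early so the order is predictable
          MODULES=(amdgpu nvidia nvidia_modeset nvidia_uvm nvidia_drm)

          # hyprland.conf: point the session at the AMD card and set the usual Nvidia env vars
          env = WLR_DRM_DEVICES,/dev/dri/card0:/dev/dri/card1   # card0 = AMD here; placeholder paths
          env = LIBVA_DRIVER_NAME,nvidia
          env = __GLX_VENDOR_LIBRARY_NAME,nvidia
          env = WLR_NO_HARDWARE_CURSORS,1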

          • Refurbished Refurbisher@lemmy.sdf.org · 7 months ago

            You should consider passing through your Nvidia GPU to a virtual machine and doing the compute tasks in there; that way, your host machine won’t be infected with proprietary Nvidia drivers (I’m assuming you need CUDA for your compute tasks). The only performance differences you’ll notice are less available system RAM (you’ll still have access to all of your VRAM) and very slightly less CPU performance from running two operating systems at the same time (barely even noticeable, TBH). This is the option I would personally recommend.
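
            Roughly, the host-side passthrough setup looks like this (the PCI IDs are placeholders; substitute whatever lspci reports for the 4070):

            # Find the 4070's vendor:device IDs and check which IOMMU group it sits in
            lspci -nnk | grep -iA3 nvidia

            # Bind the card to vfio-pci at boot (placeholder IDs):
            #   kernel command line:  amd_iommu=on iommu=pt vfio-pci.ids=10de:aaaa,10de:bbbb
            #   /etc/mkinitcpio.conf: MODULES=(vfio_pci vfio vfio_iommu_type1)

            # After a reboot this should report "Kernel driver in use: vfio-pci";
            # then attach the card to a libvirt/QEMU guest and install CUDA inside the guest
            lspci -nnk -d 10de: | grep 'Kernel driver'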

            If you want to try a super hacky solution which might not work for everything you need, you can try using the open source, recently released ZLUDA translation layer to perform CUDA tasks on your AMD GPU.

            https://github.com/vosen/ZLUDA
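
            If I’m remembering the ZLUDA README right (treat this as an assumption, not a tested recipe), using it is mostly a matter of letting its libcuda shadow the real one when you launch the app:

            # After building ZLUDA per its README (it needs a working ROCm/HIP stack),
            # run the CUDA app with ZLUDA's library directory first on the loader path.
            # The path and app name here are placeholders:
            LD_LIBRARY_PATH="/path/to/zluda:$LD_LIBRARY_PATH" ./my_cuda_app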

            The reason Hyprland doesn’t work with proprietary Nvidia drivers is due to Nvidia refusing to implement the accepted Wayland standard in favor of their own, home-rolled solution which is incompatible. AFAIK, only GNOME and KDE implement that standard.

            • nexussapphire@lemm.ee · 7 months ago

              Hyprland works fine on Nvidia; I’ve been using it for about a year now. It’s only going to improve now that Nvidia has hired people from the Nouveau team to work on Nouveau and is making the open drivers the default in version 560. Can’t wait for the 555 drivers they’ve been working on with the Wayland team and most of the major desktops to implement explicit sync, etc.

              One option would be to install only the CUDA toolkit without the drivers, but distros like Ubuntu just don’t support that. You could also switch display managers to SDDM, since Hyprland recommends it; that might work better. Hyprland also prints debug information to the TTY if you launch it from there with the Hyprland command. I’m just thinking it’s GDM being weird, tbh.
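
              Concretely, that would be something like this on Arch (package and unit names from memory, so double-check them):

              # Swap GDM for SDDM
              sudo pacman -S sddm
              sudo systemctl disable gdm.service
              sudo systemctl enable sddm.service

              # Or bypass the display manager entirely: log into a TTY and start Hyprland
              # directly, keeping the output around so you can see why the session dies
              Hyprland 2>&1 | tee ~/hyprland.log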

            • RadicalEagle@lemmy.world · 7 months ago

              Ah, I like this solution. Thanks for the suggestion! I set up GPU passthrough for a VM on a build years ago with QEMU. I’m sure I’ll be able to figure that out again.

    • MentalEdge@sopuli.xyz · 7 months ago

      Buy an Index and get into VR gaming on Linux. We livin on the edge over here. Shit breaks every day and there’s a wonky python script you have to use if you wanna be able to put the base stations into sleep mode 👍

    • JustEnoughDucks@feddit.nl · 7 months ago

      Last week for me lol.

      An AMD DRM bug in the kernel that breaks certain 3D rendering or something. Most games through WINE/Proton were broken. Had to downgrade the kernel.
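
      (On an Arch-style system, assuming that’s what’s in play here, the downgrade is just reinstalling the previous kernel from the package cache and pinning it; the version string below is a placeholder:)

      # Reinstall the last known-good kernel from the local package cache
      sudo pacman -U /var/cache/pacman/pkg/linux-6.9.1.arch1-1-x86_64.pkg.tar.zst
      # Then hold it back until the fix lands, via /etc/pacman.conf:
      #   IgnorePkg = linux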

      Wouldn’t call that fun, since it wiped out one of the very few days per month I get to play games with some of my friends.

      • idefix@sh.itjust.works · 7 months ago

        It’s interesting to read people’s issues on Linux. It seems almost all of them come from the graphics stack and gaming. Using an Intel card, I haven’t seen an issue in forever.

        • JustEnoughDucks@feddit.nl · 7 months ago

          Yeah, I think graphics development is going at a breakneck pace right now, but of course that means a lot more bugs…

    • UFO@programming.dev · 7 months ago

      Can I interest you in the deep customization of NixOS?

      Jokes aside, I don’t really use the deep patching Nix enables. The area of customization I actually want is the look and feel of applications, and that’s not really doable. Desktops are just different ways to launch a web browser T_T
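
      (The closest thing is toolkit-level theming, which is kind of my point: it reaches GTK apps and stops there. A sketch, assuming a GNOME-ish setup with gsettings and the themes already installed:)

      gsettings set org.gnome.desktop.interface gtk-theme 'Adwaita-dark'
      gsettings set org.gnome.desktop.interface icon-theme 'Papirus-Dark'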

  • rsuri@lemmy.world · 7 months ago

    Do you have multiple monitors?
    Yes - Don’t buy a mac
    No - Still don’t buy a mac

      • lud@lemm.ee · 7 months ago

        I don’t think it’s even possible to use more than two monitors on an M-series computer (except maybe if you spend extra for the Max edition).

        • KoalaUnknown@lemmy.world · 7 months ago

          That is only the case on the base model chips. The Pro, Max, and Ultra chips all support multiple monitors.

              • lud@lemm.ee · 7 months ago

                It’s still ridiculous to limit it.

                Pretty much any modern computer should be able to output to more monitors than that.

                • becausechemistry@lemm.ee · 7 months ago

                  “limit it”

                  There isn’t some software limitation here. It’s more that they only put two display controllers in the base-level M-series chips. The vast, vast majority of users will have at most two displays. Putting in more display controllers would add (minimal, but real) cost and complexity that most people wouldn’t benefit from at all.

                  On the current gen base level chips, you can have one external display plus the onboard one, or close the laptop and have two externals. Seems like plenty to me for the cheapest option.

                • areyouevenreal@lemm.ee · 7 months ago

                  Not really. There is a compromise between output resolution, refresh rate, bit depth (think HDR), number of displays, and overall system performance. Another computer might technically have more monitor outputs, but it probably sacrificed something to get there, like resolution, HDR, power consumption, or cost. Apple is doing 5K output with HDR on their lowest-end chips. Think about that for a minute.

                  A lot of people like to blame AMD for high idle power usage when they are running multi-monitor setups with different refresh rates and resolutions. Likewise, I have seen Intel systems struggle to run a single 4K monitor because they were in single-channel mode. Apple probably wanted to avoid those issues on their lower-end chips, which have much less bandwidth to play with.

    • acockworkorange@mander.xyz · 7 months ago

      I mean, yeah, don’t ever buy a Mac, but what’s up with the multiple monitors? Do they struggle with it?

      • efstajas@lemmy.world · 7 months ago

        macOS out of the box fucking sucks for monitor scaling with third party monitors. It’s honestly laughable for a modern OS. You can install some third party software that fixes it completely, but it really shouldn’t be necessary. I use an (admittedly pretty strange) LG DualUp monitor as a secondary, and out of the box macOS can only make everything either extremely tiny, extremely large, or blurry.

        Other than that, I’ve had no problems at all, and the window scaling between different DPI monitors is a lot smoother than it was with Windows previously.

      • rsuri@lemmy.world · 7 months ago

        For me it’s that, compared to Windows and Linux, handling windows across multiple screens is always problematic. It’s made worse by alt-tab bringing up all the windows for an application, so they pop up on the other monitors too, which usually isn’t what I want. Maximizing is usually not as straightforward as one would hope, and the dock moves to whichever screen your pointer is on if you leave it at the bottom, which can get annoying fast. As some point out, there’s apparently third-party software that fixes these issues, but that’s not an option for me because I use a locked-down Mac for work and can’t install third-party software, so I’m stuck with the annoying base behavior.

      • KoalaUnknown@lemmy.world · 7 months ago

        The base model chips only support 2 monitors. The Pro, Max, and Ultra chips all support more.

          • areyouevenreal@lemm.ee · 7 months ago

            They all support two monitors (one internal and one external for MacBooks, two external for the desktops). It’s not an artificial restriction: each additional monitor needs a framebuffer, which is an actual circuit that has to be present in the chip.

            • ditty@lemm.ee · 7 months ago

              TIL, thanks! 🌝

              I use a Plugable DisplayLink docking station with a base-level M1 MacBook Air and it handles multiple (3x 1080p) displays perfectly. My (limited) understanding is that it does that just using a driver. So at a basic level, couldn’t Apple include driver support for multiple monitors natively, seeing as the machine has adequate bandwidth in practice?

              • areyouevenreal@lemm.ee · 7 months ago

                Sigh. It’s not just a fricking driver. It’s an entire framebuffer you plug into a USB or Thunderbolt port. That’s why they are more expensive, and why they even need a driver.

                A 1080p monitor has one quarter of the pixels of a 4K monitor, and the necessary bandwidth increases with the pixels pushed. Apple chooses instead to use the bandwidth they have to support a couple of 5K and 6K monitors rather than, say, 8 or 10 1080p monitors. That’s a design decision they probably thought made sense for the product they wanted to produce, and honestly I agree with them for the most part. Most people don’t run 8 monitors, very few even have 3, and those who do can buy the higher-end model or get an adapter like you did. If you are the kind of person to use 3 monitors, you probably also want the extra performance.
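
                Back-of-the-envelope numbers (my own arithmetic, not from any spec sheet) show why the pixel count is what matters:

                echo $(( 1920 * 1080 ))                    # 1080p: about 2.1 million pixels
                echo $(( 5120 * 2880 ))                    # 5K: about 14.7 million pixels, roughly 7x 1080p
                echo $(( 5120 * 2880 * 60 * 30 / 10**9 ))  # 5K at 60 Hz, 30-bit colour: ~26 Gbit/s raw, before blanking/overhead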

                • ditty@lemm.ee · 7 months ago

                  Thank you for taking the time to reply, and for sharing more of your expertise in our conversation! I understand different resolutions, that the docking station has its own chipset, and why the Plugable is more expensive than other docking stations as a result. I now have a more nuanced understanding of framebuffers and how DisplayLink interfaces with an OS like macOS.

                  Allow me to clarify the point I tried to make (admittedly, I didn’t express it well previously). Rather than focusing on the technical specs, I had intended to have a more general conversation about design decisions and Apple’s philosophy. They know that consumers will want to hook up a base-tier MacBook Air to two external displays, and intentionally chose not to build in an additional framebuffer, to force users to spend more. I sincerely doubt there’s any cost saving for the customer from Apple not including that out of the box.

                  Apple’s philosophy has always been that they know what’s best for their users. If a 2020 M1 MacBook Air supports both the internal 2K display and a single external 6K display, that suggests to me it should have the horsepower to drive two external 1080p displays (that’s just a feeling I have, not a known fact). And I’ll acknowledge that Apple has improved this limitation for the newer MBAs, which allow you to disable the built-in display and use two external displays.

                  My broader point is that Apple “knows what’s best” for their users: they want customers to buy an Apple display rather than stick with the 1080p LCDs they already own, because those aren’t Retina®. Which do you honestly think is the more common use case for a MacBook Air user: connecting to two monitors (home office, university classroom systems, the numerous boardroom settings I’ve worked in, etc.), or connecting their $1200 MBA to a $1600-$2300+ Studio Display? For the latter, anyone with an iota of common sense would be using an MBP or similar, since they’re likely a creative professional who would want the additional compute and graphics power for photo/video editing, etc.

                  I don’t disagree with your explanation of the thought process behind why Apple may have made this hardware decision for MBAs, but it is effectively an arbitrary, non-cost-saving decision that will certainly trip up customers who expect two displays to just work, since they can do that on their 10-year-old Toshiba Satellite or w/e.

                  Thanks, and have a great day

              • becausechemistry@lemm.ee · 7 months ago

                What percentage of people who buy the least expensive MacBook do you think are going to hook it up to more than two displays? Or should they add more display controllers that won’t ever be used and charge more for them? I feel like either way people who would never buy one will complain on behalf of people who are fine with them.

                • Zangoose@lemmy.one · 7 months ago

                  The least expensive MacBook is still $1000, closer to $1500 if you spec it with reasonable storage/RAM. It really isn’t that much of a stretch to add $100-300 for a 1080p/1440p monitor or two at a desk.

              • areyouevenreal@lemm.ee · 7 months ago

                Not necessarily. The base machines aren’t that expensive, and the same chip is also used in iPads. They support high-resolution HDR output. The higher the number of monitors, resolution, bit depth, and refresh rate, the more bandwidth is required for display output and the more complex and expensive the framebuffers are. Another system might support 3 or 4 monitors but not support 5K output like the MacBooks do. I’ve seen Intel systems that struggled to even do a single 4K monitor at 60 FPS until I added another RAM stick to make it dual channel. Apple does 5K output. Sure, another machine might technically support more monitors in theory, but in practice you will run into limitations if those monitors require too much bandwidth.

                Oh yeah, and these systems also need to share bandwidth between the framebuffers, CPU, and GPU. It’s no wonder they didn’t put 3 or more very-high-resolution buffers into the lower-end chips, which have less bandwidth than the higher-end ones. Even if it did work, the performance impact probably isn’t worth it for a small number of users.

  • rtxn@lemmy.world · 7 months ago
    $ pacman -Si god
    error: package 'god' was not found
    

    Take that, theists!

  • justme@lemmy.dbzer0.com · 7 months ago

    I could be as rich as god and I still wouldn’t go for Windows or Apple. I’d rather invest the money in good FOSS development.

    • chatokun@lemmy.dbzer0.com · 7 months ago

      Being a support person, if I were rich enough to frivolously buy systems, I’d have at least one of each as a reference system. Yes, I know, VMs, but those are for saving money/space. For the Mac especially I’d want some real hardware too. Definitely not as a main system, though. I currently have a broken Mac and a cheap Chromebook for that reason, though being broken, the Mac is rather useless now. When it worked I often used it to help test and troubleshoot customer stuff.

    • fruitycoder@sh.itjust.works · 7 months ago

      For real. I’ve put a fraction of what I would have spent on unfree software into supporting open source, and I’ve gotten way more out of it.

      I see it as a difference between owning and renting.

  • BCsven@lemmy.ca · 7 months ago

    I have tried TempleOS. It is amazing that one guy built all of that. It feels like it needs training sessions to make better use of it, and it is also wacky as hell.

  • TheDemonBuer@lemmy.world · 7 months ago

    Everyone always forgets the “it just works,” easy, normie distributions like Fedora. I guess people figure if you’re looking for an OS like that, you might as well just use Windows, but I’d rather not.

    • TalesOfTrees@lemm.ee · 7 months ago

      Oh man, I’m so disappointed. The defaults don’t even port over any of the “really good” MLP themes. Might be because DE versions have changed, but I remember there being a fairly well done (for a stupid gag anyways) Rainbow Dash theme for KDE4.

      I mean, if I’m going to embrace my inner brony, why would I want the same generic-looking “dark” theme every other offering has…

  • flop_leash_973@lemmy.world · 7 months ago

    Haha, it amuses me to no end that ever since I watched a “Down the Rabbit Hole” video on YouTube about TempleOS a few years back, I keep seeing it crop up in various places from time to time; I don’t remember ever seeing anything about it before that.

    Makes me wonder if it was always there and I just didn’t notice it until I was familiar with it.