Edit: here is some context because people are getting mad

The reason I asked is that my friend asked me to install Linux on his laptop because he wanted to look like a cringey hacker, so I installed it. A few days later he reinstalled Windows. I'm not sure why, but he said he couldn't run BlueStacks; I suggested other VMs but he wouldn't have it any other way. I don't think that's the real reason he switched though, he was just being dismissive. Now his mic and webcam aren't working, along with some other stuff, so he keeps asking me to reinstall Linux, and I don't want to do that again. Why? My school is about 9 km / 5.5 mi from my home and I ride there by bicycle, so after school I don't want to waste my time installing Linux. I was just ranting, I didn't expect to make people mad.

  • dragontamer@lemmy.world · 1 year ago

    Because it isn’t inferior.

    Ubuntu can barely run programs from 5 years ago; backwards compatibility is terrible. Red Hat was doing well, but then it just shit the bed. To have any degree of consistency, you need to wrap all apps inside a Docker container and carry all the dependencies with you (but in practice this leads you to obscure musl bugs, because musl has different bugs than glibc).

    For better or worse, Windows programs that depend on kernel32.dll (at the C++ level) have remained consistently deployable since the early 1990s and rarely break. C# programs have had a good measure of stability. DirectX9, DirectX10, DirectX11, and DirectX12 all made major changes to how the hardware works, and yet all the hardware automatically functions on Windows. You can play StarCraft from 1998 without any problems despite it being a DirectX6 game.

    Switch back over to Ubuntu land, and Wayland is… maybe working? Eventually? Good luck reaching back to programs that depend on X.org or old systemd versions.


    Windows is definitely a better experience than Ubuntu. I think Red Hat has the right idea, but IBM is seemingly killing all the goodwill built up around Red Hat and CentOS. SUSE Linux is probably our best bet moving forward as a platform that cares about binary stability.

    Windows' networking stack is also far superior for organizations. Samba on Linux works best if you have… a Windows Server instance holding the group policies and ACLs on a centralized server. Yes, the $1000 software works better than the $0 software. Windows Server is expensive, but it's what organizations need to handle ~50 to ~1000 computers inside a typical office building.

    Good luck deploying basic security measures in an IT department with Linux. The only hope, in my experience, is to buy Windows Server and then run Samba (and deal with Samba bugs as appropriate). I'm not sure I ever got a Linux-as-Windows-server setup working well. It's not like the Linux development community understands what an ACL is in practice.
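    To be concrete, the kind of domain-member setup I mean looks roughly like this (the realm, workgroup, and share path are made up; the actual policies still live on the Windows Server side):

```ini
# Hypothetical minimal smb.conf for a Linux box joined to a Windows AD domain.
# Samba here only enforces what the Windows Server defines centrally.
[global]
    security = ads
    realm = CORP.EXAMPLE.COM
    workgroup = CORP
    # map Windows SIDs to Unix IDs so Windows-style ACLs survive on Linux shares
    idmap config CORP : backend = rid
    idmap config CORP : range = 10000-999999

[shared]
    path = /srv/shared
    # store full Windows ACLs in extended attributes
    vfs objects = acl_xattr
    map acl inherit = yes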

    • Shdwdrgn@mander.xyz · 1 year ago

      You might have some valid points here. However, bringing Ubuntu into the argument and saying things are broken is exactly like saying you are running last night's upload and complaining that you can't get it to start. You run Ubuntu if you don't care one cent about having a stable computer. If you want a system that works reliably even with older software, use something like Debian, which doesn't push beta releases to end users.

      • dragontamer@lemmy.ca · 1 year ago

        like saying you are running last night’s upload

        If only. I’m running old stuff, not by choice either.

        Ubuntu 18.04 and up literally fails to install on one of my work computers, so I've been forced to run Ubuntu 16.04. BIOS incompatibility / hardware issues happen, man, and they force me onto older versions. On some Dell workstations I bought for my org, Ubuntu 22 fails to install and we're forced to run Ubuntu 20.04 on those.

        Software compiled on Ubuntu 16.04 has issues running on Ubuntu 20.04, meaning these two separate computers have different sets of bugs as our developers build and test.

        I'm running old LTS Ubuntu instances not because I want to, mind you, but because hardware incompatibility bugs have forced me to. At least we have Docker, so the guy running Ubuntu 20.04 can install Docker and spin up an Ubuntu 16.04 container to run the 16.04 binaries. But it's not as seamless as any Linux guy thinks.
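        The 16.04-inside-20.04 trick is basically just this (the binary name is hypothetical):

```dockerfile
# Hypothetical sketch: ship the whole Ubuntu 16.04 userland (glibc and all)
# alongside a binary that was compiled on 16.04 and misbehaves on 20.04.
FROM ubuntu:16.04
COPY legacy-tool /opt/legacy-tool
CMD ["/opt/legacy-tool"]
```

        Build it once on any host and the binary sees the exact library versions it was compiled against, regardless of what the host runs.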


        CentOS is too stable, and a lot of proprietary code is designed for Ubuntu instead. So while CentOS is stable, you get subtle bugs when you try to force Ubuntu binaries onto it. If your group uses CentOS / Red Hat, that's great, except it's not the most popular system, so you end up having to find Ubuntu boxes somewhere to run things effectively.

        There's plenty of Linux software these days that forces me (and users around me) to use Linux in an office environment. But if you're running multiple Linux boxes (this box is Ubuntu, that one is Ubuntu 16.04 and can't upgrade, that other box is Red Hat for the yum packages…), running an additional Windows box ain't a bad idea anyway. You're already at like 4 or 5 computers to have this one user get their job done.


        Once you start dealing with these issues, you really begin to appreciate Windows' 30+ years of binary compatibility.

          • dragontamer@lemmy.ca · 1 year ago

            You know that doesn't matter when commercial software is often only released and tested on Ubuntu and Red Hat, right?

            I run Ubuntu / Red Hat / etc. etc. because I’m forced to. Do you think I’m creating a lab with a billion different versions of Linux for fun?


            Linux kinda-sorta works if you've got the freedom to "./configure && make && make install", recreating binaries and recompiling often. Many pieces of software are designed to work across library changes by letting the compiler/linker fix up minor issues at build time.

            But once you start having proprietary binaries moving around (and you will be facing proprietary binaries, because no office will give you their source code), you start having version-specific issues. The Linux community just doesn't care very much about binary compatibility, and they'll never care about it, because they're anti-corporate and don't want to offer good support to binary code like this (and prefer to force the GPL down your throat).
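            A concrete way to see the moving target: the C library your binaries are linked against differs per distro and per release, and you can check it from Python's standard library:

```python
import platform

# Report which C library this Python interpreter was linked against.
# On Ubuntu/Red Hat this reports glibc with a release-specific version;
# a musl-based system reports something else entirely. A proprietary
# binary built against one libc version may not load against another.
name, version = platform.libc_ver()
print(name, version)
```

            Two LTS releases apart, the version string alone is enough to explain why a binary runs on one box and crashes on the other.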

            There are certainly some advantages and disadvantages to Linux's choice here (or really, Ubuntu's / Red Hat's / etc., since each distro really is its own OS). But in the corporate office world, Linux is a very poor performer in practice.

            • chickenf622@sh.itjust.works · 1 year ago

              Fair point. I think I’m just too used to dealing with the bullshit of building the packages myself cause I find it fun. Definitely not viable for commercial use.

        • Shdwdrgn@mander.xyz · 1 year ago

          And here I just installed Debian 11 on a PowerEdge 860. No trouble at all, it just works. I've installed newer versions of Debian on older laptops as well; I have a ThinkPad T42p sitting here that runs it, although it hasn't seen much use lately. I dunno, it doesn't sound like you have a Linux problem, more like an Ubuntu problem.

          Maybe I'm just sour on Ubuntu because they literally ignored a network driver bug for a decade before closing it. The upstream kernel driver worked fine, rebuilding the kernel with their own instructions worked fine, and yet the problem persisted through all of their releases during that period. I completely ditched Ubuntu a few months later, but I kept getting emails from the bug tracker over the years as other people continued to experience the same problem. Using Ubuntu in a business environment is just insanity.

      • dragontamer@lemmy.ca · 1 year ago

        Yeah, cause the Win32 API + DirectX is more stable than the rest of Linux.

        There's a reason why Steam games prefer to emulate the Win32 API on Linux rather than compiling to native Linux binaries. Wine is more stable than almost everything else, and Windows's behavior (both documented and undocumented) has legendary levels of compatibility.

      • dragontamer@lemmy.world · 1 year ago

        Honestly, Docker is solving these problems in a lot of cases in practice.

        It's kinda stupid that so many dependencies need to be kept track of that it's easier to spin up a (VM-like) environment just to run Linux binaries properly. But… it does work. With a bit more spit-shine / polish, Docker is probably the way forward on this issue.

        But Docker is just not there yet. Because Linux is open source, there are no license penalties for just carrying an entire Linux distro around with your binaries. And who cares if a binary is like 4GB? Does it work? Yes. Does it work across many versions of Linux? Yes (for… the right versions of Linux with the right versions of Docker, but yes!! It works).

        Get those Docker images a bit more long-term stability and compatibility, and I think we're going somewhere with this. Hard drives these days are $150 for 12TB and SSDs are $80 for 2TB; we can afford to store those fat binaries, as inefficient as it all feels.


        I did have a throwaway line about musl problems, but honestly, we've already got incredibly fat Docker images lying around everywhere. Why are the OSS guys trying to save like 100MB here and there when no one actually cares? Just use glibc; stop adding incompatibilities for, honestly, tiny amounts of space savings.