Part of what's making learning Linux so difficult for me is how fragmented it all is. You can install programs with sudo apt install (program). You can get programs with snaps. You can get programs with flatpaks. You can install with tar.gz files. You can install with .deb files. You can get programs with .sh files. There are probably more I don't know about.

I don't even know where all these programs are being installed. I haven't learned how to uninstall them yet, and I'm sure that each method has a different uninstall process too.

So that brings me to my main question. Why not consolidate all this? Sure, files CAN be installed anywhere if you want, but why not make a folder like /home/programs/ where it's assumed that programs would be installed?

On Windows, programs can be installed anywhere, but the default is C:\Program Files (x86)\ or something like that. Now, you can change it all you want when you install the programs. I could install to C:\Fuckfuckfuck\ if I wanted to. I don't want to, so I leave it alone, because C:\Program Files (x86)\ is where it's assumed all the files are.

Furthermore, I see no benefit to installing 15 different programs in 7 different folders. I begrudgingly understand why there are so many different installation methods, but I do NOT understand why, as a collective community, we can't have something like a standardized setting in each distro where you can set one place for all your installed files.

Because of the fragmentation of distros, I can understand why we can't have a standardized location across all distros like Windows has. However, I DON'T see why we can't have a setting, chosen on first boot after installation, that tells every future installation which folder to install to.

I would personally pick /home/programs/, but maybe you want /root/Jamie's Files/ because you're Jamie, and those are your files.

In either case, during first boot after installation it would ask us where we want our program files installed. And from then on, no matter which install method you chose, it would default to your chosen folder.

Now, you could still install to other places too, but you would need to direct that on a per-install basis.

So what's the benefit of having programs each installed in separate locations that are wildly different?

[–] [email protected] 6 points 1 week ago

You can install programs with sudo apt install (program). You can get programs with snaps. You can get programs with flatpaks. You can install with tar.gz files. You can install with .deb files. You can get programs with .sh files.

APT, the Advanced Package Tool, uses .deb files. Speaking technically, APT is the app-store part of the system; the part that actually installs packages is called dpkg. If you have a .deb file and you want to install it, you can invoke dpkg directly. I have encountered things like printer drivers where the vendor provides them in .deb format. For one reason or another they're doing it the way the Debian/Ubuntu standard package manager does it, but not publishing the files to the repositories.
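
If you ever need to do that, a minimal sketch looks like this (the filename is hypothetical):

    # install a vendor-supplied .deb directly with dpkg
    sudo dpkg -i vendor-printer-driver.deb
    # dpkg doesn't resolve dependencies, so let APT fix any missing ones
    sudo apt-get install -f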

Snaps, flatpaks and AppImages are intended to solve the problem of fragmentation. See, APT with its .deb files is the Debian system, used by most if not all Debian/Ubuntu forks. DNF with .rpm files is Red Hat/Fedora's system. Arch uses Pacman with .pkg.tar files. openSUSE (with Zypper), Nix, and a couple of others have their own systems, and none of them interoperate.
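
To make that concrete, here's the same piece of software (the htop system monitor) installed on each major family, assuming the default repositories:

    sudo apt install htop      # Debian/Ubuntu
    sudo dnf install htop      # Fedora/Red Hat
    sudo pacman -S htop        # Arch
    sudo zypper install htop   # openSUSE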

One thing these do have in common is where they put the files they install; they are sorted by type and function into several directories under the root directory. Most of them you'll find in the /usr directory, such as /usr/bin, /usr/sbin, /usr/lib or /usr/share. bin is for binary files; this is where executables like sh or bash or cat or grep live. sbin is for binary files that are only for those with superuser privileges; you have to switch users to root or use something like sudo to execute those. Non-executable files like libraries, data, assets, etc. are variously stored in lib, local, share and so forth.

You may find there are duplicate folders, like /bin and /usr/bin. In the olden days of minicomputers these were often located on separate drives, but nowadays /bin is a symlink (basically a shortcut) to /usr/bin, maintained for backward compatibility with software that still expects the old paths. It gets a lot more nuanced than I'm going into here.
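
You can poke at this yourself; the exact output varies by distro:

    # show where an executable actually lives on your PATH
    which grep      # typically /usr/bin/grep
    # on most modern distros, /bin is just a symlink into /usr
    ls -ld /bin     # e.g. lrwxrwxrwx ... /bin -> usr/bin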

They aren't installed in /home/$USER because if you have multiple users on the machine, you probably want them all to have access to the software. Say you've got a husband, wife and two kids: would you want to have to install four copies of LibreOffice?

If you do want to install software in a user's home directory, the conventional place for executables is /home/$USER/.local/bin.
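
A quick sketch of setting that up; check first, because many distros already put it on your PATH:

    # per-user executables conventionally go in ~/.local/bin
    mkdir -p ~/.local/bin
    # make sure it's on your PATH (add this line to ~/.bashrc if it isn't)
    export PATH="$HOME/.local/bin:$PATH"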

=====

This does indeed make it a pain in the ass to publish software for Linux. So people wanted one system that would work on most if not all distros. Of course, three competitors arrived.

Snap is Canonical/Ubuntu's attempt, which IIRC started with their embedded systems but which they've been pushing as their whole deal. It's the closest thing there is to an Apple App Store: it's owned by Canonical, the back end is proprietary so no one else can host the Snap repository, etc. A lot of folks don't like that they kept some of it proprietary, plus there are performance issues with how it's implemented; each installed snap is a compressed filesystem image that gets mounted as its own virtual disk, so it gets messy.
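
You can see that clutter for yourself on a system with snapd installed:

    # list installed snaps
    snap list
    # each snap's image shows up as a mounted loop device
    lsblk | grep loop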

Flatpak, I think, sprang up in or around the Fedora side of the world, but has always had the goal of being omnicompatible. It is totally open; you can host your own Flatpak repo if you want, though the de facto standard repository is Flathub. At this point, I think Flathub has achieved the goal of being the one place where, if you have a commercial app you want to also publish a Linux version of, publishing it as a Flatpak will reach most of the Linux audience in one shot. There are a couple of downsides; for example, Flatpak isn't a great way to distribute command-line tools, but for GUI-based applications it works fine. Flatpaks also live in slightly odd places: system-wide installs go under /var/lib/flatpak, and per-user installs under ~/.local/share/flatpak.
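
The day-to-day commands, using LibreOffice's actual Flathub ID as the example:

    # install from Flathub (add --user for a per-user install instead)
    flatpak install flathub org.libreoffice.LibreOffice
    # run it
    flatpak run org.libreoffice.LibreOffice
    # remove it again
    flatpak uninstall org.libreoffice.LibreOffice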

AppImage is also around; the technique here is that it's basically like a .iso file with everything the app needs to run inside it, so you mount it and run the executable file inside. No infrastructure or installer needed. There are package managers for AppImages, but they aren't required. I find AppImage to be the one to use if you're making some small niche tool that 200 people in the world will ever want and you don't want to bother with the repositories, but I don't like using it for main distribution.
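
Using one is about as simple as it gets (the filename here is made up):

    # no install step: mark it executable and run it
    chmod +x MyNicheTool-x86_64.AppImage
    ./MyNicheTool-x86_64.AppImage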

=====

Let's change gears and talk about .tar.gz, .run, and .sh files. These are non-standard ways of installing software on Linux, and I would avoid them unless it's a last-resort situation.

A .tar.gz file is functionally similar to a .zip file on Windows: it's a TAR file (Tape ARchive, a bunch of files packaged into one continuous file, originally designed to be written to magnetic tape) which has been compressed with gzip. When software is distributed this way, it is up to the user to extract and store it...somewhere. Probably the best place to do this is /opt, which is the standard location for pre-compiled software manually added to the machine by the administrator.
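
The usual dance, with a made-up archive name:

    # unpack the tarball into /opt
    sudo tar -xzf sometool-1.2.3.tar.gz -C /opt
    # optionally link the executable somewhere on your PATH
    sudo ln -s /opt/sometool-1.2.3/sometool /usr/local/bin/sometool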

.run files have basically no standard, can do damn near anything, and are a bad idea. A .run file could be written in any language and do anything; you just don't know. The one I encountered the most was Simplify3D's installer: they directly translated their Windows installer to Linux, working the way Windows install wizards work, rather than doing things in a Linux way. Hopefully you don't have to deal with that.

A .sh file is a shell script. Once again, these can do anything, but at least it'll be a text file full of shell commands that you can read and understand. It'll probably consist of a series of commands like wget or curl, which download files from the internet, followed by commands to put them where they need to go in the filesystem. Once again, this tends to be the approach of the "I'm usually a Windows guy, but here's my Linux version" folks. The script chooses where to put the software for you, hopefully somewhere sane like /usr/local.
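
Since a script runs with whatever privileges you hand it, the sensible habit is something like this (install.sh is a stand-in name):

    # read it first; it's just text
    less install.sh
    # then run it deliberately, rather than piping curl straight into sh
    sh ./install.sh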

=====

You may also find yourself compiling software from source code, which is what's happening when you're told to go to GitHub and then run a series of instructions like git clone, make, make install. Make is basically a scripting system specifically for compiling software; it will figure out what to do. I believe locally compiled software ends up in /usr/local/bin by default.
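
The classic flow for a project that uses autotools looks like this (the URL is hypothetical, and not every project has a configure step):

    git clone https://example.com/project.git
    cd project
    ./configure --prefix=/usr/local   # /usr/local is the usual default prefix
    make
    sudo make install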

=====

Summary: there are a lot of directories where software ends up, and it's organized this way so that a file's location tells you what it is, what it does, who has access to it, who manages it, and how it arrived on the system. It also keeps automatic processes like the package manager from interfering with software manually installed or compiled by users or system administrators.