In that case, makepkg isn't compiling anything; it's just packaging the existing binaries so that they can be more easily installed and recognized by your package manager.
Open-source software is distributed primarily as source code in a human-readable programming language. Computers can't actually run these programming languages directly; they have to be translated into the machine language of the CPU (such as x86_64). For some languages, like Python, code can be "interpreted" on the fly; for others, like C, programs must be "compiled" ahead of time into machine-code binaries. Additionally, most programs consist of multiple files that need to be compiled, linked together, and installed into certain folders on your system, so build tools exist alongside the compiler to automate that process.
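As a toy illustration, assuming a single-file C program hello.c and a Python script hello.py (both made up):

    gcc -o hello hello.c    # C: translate the source into a machine-code binary
    ./hello                 # then run the binary
    python3 hello.py        # Python: the interpreter reads the source directly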
Most Linux users rarely, if ever, have to compile anything, because the developers of Linux distros, and some third parties like Flathub, curate collections of (mostly) open-source software that has already been compiled and packaged into formats that are easy to install and uninstall. As part of this process, they usually write metadata and/or scripts that automate compiling and packaging, so the whole thing takes a single command (makepkg on Arch, dpkg-buildpackage on Debian). However, some newer or more obscure software may not be packaged by your distribution or by any third-party repo.
How to compile depends on the program, its programming language, and what tools the developers prefer to use to build it. Usually the README file included with the source code explains how to compile the software. The most common process is to install all of the program's dependencies, cd into the source code directory, and run ./configure; make; sudo make install. Other programs might include the metadata needed for something like makepkg to work, be written in an interpreted language and thus require no compilation, or use a different build system, like CMake.
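For example, with a typical Autotools-based project (the tarball and directory names here are made up), the full sequence usually looks something like this:

    tar xf someprogram-1.0.tar.gz   # unpack the source
    cd someprogram-1.0
    ./configure                     # inspect your system and generate Makefiles
    make                            # compile and link everything
    sudo make install               # copy the results into /usr/local by default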
I hope these shots aren't a good representation of the whole episode, because if they are, it's a 15-minute scene of L'Ak and Moll on a planet followed by 35 minutes of close-up shots of Rayner.
Last time I used Elementary OS, it was great if you only used the official apps, which had an insane degree of polish, but things like LibreOffice were surprisingly hard to configure the way I wanted. That was a while ago, though.
There was still Wine, and PlayOnLinux helped further, but when I looked up a game I wanted to play on the WineHQ AppDB, there was no guarantee it even had an entry, and if it wasn't rated "platinum", the chance that you'd run into one of the reported issues was very high.
Not to mention, playing Steam games that weren't native was an impossibility.
Thankfully I was more of a console gamer at the time, and I got a lot of enjoyment out of the few games that received Linux ports - like Team Fortress 2!
It's just one possible future, and it can be changed. I'm not aware of any objective in-universe measure of what is and isn't the prime timeline; it's really just what the writers choose to depict as such: which events are altered by time travelers and which ones stand. Since we've seen a full two seasons in this version of the 32nd century, it's more likely that future shows will try to stay consistent with it, but it's also possible they'll be "retconned" into an "alternate reality".
I still follow Planet KDE and Planet Debian, and can vouch for both. They're great for both learning about the development processes of those projects, and finding interesting blogs on unrelated topics that happen to have been written or linked by the contributors.
I unfortunately haven't found that many I can remember. But a comment in the BusyBox cat source that linked to a talk titled "cat -v considered harmful" did send me down a rabbit hole once.
This really bothers me. Closed standards locked behind a licensing fee may as well not be standards at all, in my opinion.
I find it really fun to browse the Debian repository and its source code with their dedicated websites for doing so (https://packages.debian.org/ and https://sources.debian.org/), to find all the obscure utilities and silly code comments.
I dual-booted two Linux distros for a long time. All you need is to install GRUB once, from one of the distros, but having two or more bootloaders in the same EFI System Partition is generally harmless, and can happen because of how some distros' installers are written. In that case the firmware's boot order (what most people call the BIOS boot order) decides which one is used. Either way, you only need one boot partition.
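If you want to check or change which bootloader the firmware tries first, efibootmgr (run from either distro) can list and reorder the boot entries; the entry numbers below are just placeholders:

    sudo efibootmgr                  # list boot entries and the current BootOrder
    sudo efibootmgr -o 0001,0000     # example: put entry 0001 first in the boot order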
It is safe to delete all partitions on your hard drive if and only if you have backed up any important data on them. It's basically the same as installing a new hard drive. The installer for your distro will be able to re-create all of the partitions it needs.
I have personally never used a shared /home between multiple distros, but based on my experience switching desktop environments, there are likely to be conflicts between files that lead to bugs. Arch and Pop!_OS will have vastly different versions of most software, and it's possible that changes to a config file in one distro may break the program in the other. A shared /home is better suited to having just one OS installed, which you reinstall occasionally.
Yeah, basically. makepkg automates the process of creating an Arch package, and while usually that involves compiling source code, sometimes it just means repackaging proprietary software that has already been compiled, so that pacman can manage it.
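As a rough sketch, a PKGBUILD for that kind of package often has no build() function at all, just a package() step that copies the prebuilt files into place (every name and URL below is made up):

    # Hypothetical PKGBUILD for repackaging a prebuilt binary
    pkgname=someapp-bin
    pkgver=1.0
    pkgrel=1
    pkgdesc="Example of packaging an already-compiled program"
    arch=('x86_64')
    url="https://example.com"
    license=('custom')
    source=("https://example.com/someapp-${pkgver}-linux-x64.tar.gz")
    sha256sums=('SKIP')

    package() {
        # no build() needed: just install the prebuilt binary into the package tree
        install -Dm755 "$srcdir/someapp/someapp" "$pkgdir/usr/bin/someapp"
    }

Running makepkg -si in the same directory would then download the archive, run package(), and install the result with pacman.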