Selfhosted · 276 points (98.3% liked)
submitted 1 year ago* (last edited 1 year ago) by betternotbigger to c/selfhosted
 

I've never had so much fun self-hosting. A decade or so ago I was hosting things on Linode and running all kinds of servers for myself but with the rise of cloud services, I favored just giving everything to Google. I noticed how popular this community was on Reddit/Lemmy and now it's my new addiction.

I'm a software engineer and have plenty of experience deploying to AWS/GCP so my head has been buried in the sand with these cloud providers. Now that I'm looking around there are things like NextCloud, Pihole, and Portainer all set up with Cloudflare Zero Trust... I feel like I'm living the dream of having the convenience to deploy my own services with proper authentication and it's so much fun.

Reviving old hardware to act as local infra is so badass; it feels great turning on old machines that were collecting dust. I'm now trying to convince my brother to do monthly hard-drive swaps with me so I have some off-site backup redundancy without needing to back up to the cloud.

Sorry if this feels ranty but I just can't get over how awesome this is and I feel like a kid again. Cheers to this awesome community!

EDIT: Also just found Fission and OpenFaaS for self-hosted serverless functions. I'm jumping with joy right now!

top 47 comments
[–] [email protected] 46 points 1 year ago* (last edited 1 year ago)

Yeah, between the enshittification of the internet and how far self-hosted software has come, it's a great time to self-host, and it will only keep getting better.

Self-hosting, the Reddit drama, kbin... all of this makes it seem like the internet is having a sort of grassroots, back-to-basics movement, which I'm all for lol.

[–] Tired8281 24 points 1 year ago (3 children)

Docker is hurting my progress. I just can't seem to wrap my head around it. Is there a Docker for Dummies?

[–] pontiffkitchen0 12 points 1 year ago (1 children)

Is there a specific part that you're having trouble with? Is it more about how it works under the hood, or more about using it to spin up containers? I can try to answer any questions and post some how-tos for you.

[–] Tired8281 8 points 1 year ago (3 children)

I think I just need a general overview. Something about the concept isn't clicking for me, and it makes it hard for me to learn how to use it when I fundamentally don't get it. Is there a really good "Introduction to Docker and the tools people use with it" that I haven't found?

[–] Glitchington 8 points 1 year ago (1 children)

Say you install an app with apt and it needs a dependency that would break your setup. Docker lets you keep using your OS while containerizing the dependencies. You can also better organize which containers use your computer's network directly, and which use a virtual network where you can remap incoming ports to avoid conflicts.

Containers are like VMs, but for an application instead of a whole OS, though you can put multiple apps in one container, which is good if they need to share files.

For a more visual approach, look into Portainer. It gives you an admin page you can open in your browser to manage docker containers.

[–] Tired8281 2 points 1 year ago (2 children)

I actually have Portainer set up and running, and I even spun up a few simple containers in it. Unfortunately I did so by following a guide to complete a specific task. I completed the task successfully, but now I have a Portainer install that I don't understand in the slightest, and don't know how to update it or any of the containers in it, or really do anything that wasn't covered in the guide I followed (which I now cannot find). I found a YouTube video that tries to explain Portainer, but I don't know the terminology of Docker enough to understand what they are saying, and I haven't found a Docker video simple enough to bring me up to speed.

[–] [email protected] 7 points 1 year ago

The easiest way to think about docker is to consider it a type of virtual machine, like something you'd use VirtualBox for.

So let's say you run Windows, but want to try out Linux. You'd could install Ubuntu in a VirtualBox VM, and then install software that works on Ubuntu in that VM, and it's separate from Windows.

Docker is similar to this in that a docker container for a piece of software often includes an entire operating system within it, complete with all of the correct versions of the libraries and dependencies that the software needs to function. This all lives in a sandbox/container that does not really interact with the host operating system.

As to why this is convenient: Let's say that you have a computer running Ubuntu natively/bare metal. It has a certain version of python installed that you need to run the applications you use. But there's some new software you want to try that uses a later version of python that will break your other apps if you upgrade.

The developer of that software you want to try makes a docker version available. There's a docker-compose.yml file that specifies things like the port the application will be available on, the time zone your computer is in, the location of the image on Docker Hub, etc. You can modify this file if you like, and when you are done, you type docker compose up -d in the terminal (in the same directory as the docker-compose.yml file).

Docker will then read the compose file, download the required files from the repository, extract them, set up the network and the web server and configure everything else specified in the compose file. Then you open a browser, type in the address of the machine the compose file is on, followed by the port number in the compose file (ex: http://192.168.1.100:5000), and boom, there's your software.

You can use the new software with the newer version of python at the same time as the old stuff installed directly on your machine.

You can leave it running all the time, or bring it down by typing docker compose down. Need to upgrade to a new version? Bring the container down, type docker compose pull, which tells docker to pull the latest version from the repository, then docker compose up -d to bring the updated version back up again.
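To make that concrete, a docker-compose.yml for a hypothetical app might look roughly like this (the image name, port, and paths are placeholders, not any particular project):

```yaml
services:
  myapp:
    image: example/myapp:latest   # placeholder image pulled from a registry like Docker Hub
    container_name: myapp
    environment:
      - TZ=Europe/London          # the time zone setting mentioned above
    ports:
      - "5000:5000"               # host port 5000 -> container port 5000 (http://192.168.1.100:5000)
    volumes:
      - ./config:/config          # keep the app's settings on the host so they survive updates
    restart: unless-stopped
```

Everything described above (docker compose up -d, pull, down) is run from the folder containing that file.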

Portainer is just a GUI that runs docker commands "under the hood".

[–] Glitchington 2 points 1 year ago

Most of what I learn comes from watching videos, and when I don't understand a term I pull up the docs and search for it. Super useful in expanding your understanding of a tool.

Docker docs in case you're feeling lazy.

[–] [email protected] 5 points 1 year ago (2 children)

I think the real benefits of Docker don't become unquestionably obvious until you've tried to manage more than one installation of some kind of server software on the same machine, and inevitably learned the hard way that this comes with a lot of problems and downsides.

  • From simple things, like the fact that if the environment needs a restart, you can just restart the container without rebooting the machine or interrupting other applications.
  • To seriously dangerous and problematic things, like configuring your system to work with your new application only to realize that this configuration breaks your other server software.
[–] Tired8281 3 points 1 year ago (1 children)

So far I've avoided learning about Docker by just buying another old end-of-life Chromebook whenever I wanted to run something new. Works pretty well, except for the giant pile of Chromebooks behind my TV.

[–] Tayphix 7 points 1 year ago

I would really recommend just playing around with Docker until you understand it rather than buying old hardware for each service.

[–] [email protected] 1 points 1 year ago

Have a look at portainer?

[–] Karlmit 2 points 1 year ago

I learned the basics of docker by using Synology and Unraid. They make setting up docker apps really easy.

[–] betternotbigger 1 points 1 year ago

Are you having trouble learning it or understanding what it's used for? Much of learning Docker also comes with understanding some basics of software deployment like environment variables, ports and volumes. Happy to help answer any questions because it's an extremely powerful tool once it starts clicking.

[–] AusatKeyboardPremi 16 points 1 year ago (2 children)

Saw this post on “All”. Last I checked (sometime in 2019), self-hosting was a fairly involved process.

Has the process simplified enough for a complete beginner like me to begin self-hosting services on, say, a raspberry pi?

If yes, can you please point me to a good resource/wiki?

[–] [email protected] 18 points 1 year ago (2 children)
  1. Follow the Docker install guide for the Raspberry Pi.
  2. Browse awesome-selfhosted and find services that seem interesting to you, or ask for recs here.
  3. Follow the project's guide to do a Docker install.
  4. (Bonus) Set up a reverse proxy like nginx proxy manager so you can access your services with URLs (a rough compose sketch is below).
  5. (Bonus) Set up a domain and a service such as Tailscale so you can access your services safely from outside your home.
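To give a rough idea of what step 4 involves, nginx proxy manager's quick start is (going from memory, so double-check the project's docs) a compose file along these lines:

```yaml
services:
  npm:
    image: jc21/nginx-proxy-manager:latest
    restart: unless-stopped
    ports:
      - "80:80"     # HTTP
      - "443:443"   # HTTPS
      - "81:81"     # admin web UI
    volumes:
      - ./data:/data
      - ./letsencrypt:/etc/letsencrypt
```

After docker compose up -d, the rest is point-and-click in the web UI on port 81: add a proxy host per service and let it request certificates for you.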
[–] AusatKeyboardPremi 4 points 1 year ago (3 children)

Thanks for the steps!

I remember steps 4 & 5 were the ones that made me drop the idea. They involved a lot of configuration.

I will take a look once again; hopefully these have become simple enough.

[–] [email protected] 7 points 1 year ago (1 children)

I set up wireguard vpn and took down all my reverse proxies as it feels more secure and is easier to maintain.

From what I’ve heard tailscale is a step easier as well. So you could vpn into your network rather than accessing the services via URL.

[–] [email protected] 4 points 1 year ago

Yep, good point!

And yes Tailscale is super simple and beginner friendly, it literally installs and is ready to use in seconds.

[–] [email protected] 4 points 1 year ago (1 children)

Np, I would say dm me if you have any questions but I dunno if you can message between lemmy and kbin haha

[–] AusatKeyboardPremi 2 points 1 year ago

Thanks for the support. :-)

Will surely DM you or create a post here if I am lost during the setup process.

[–] rolaulten 2 points 1 year ago (1 children)

Getting a domain sounds more scary than it really is. In reality you fork over a small amount of cash to a company (like Cloudflare, AWS, etc.) and they give you a domain.

For the reverse proxy, 95% of the time it's a basic set of files you drop into the correct folders (or pass into your container if you're using a containerized solution). The other 5% of the time, the app in question requires something slightly less cut-and-dried (but generally still well understood).

If you need help/want some pointers dm me and I can get you going in the right direction.

[–] AusatKeyboardPremi 2 points 1 year ago

Thank you so much, I will keep this in mind when I start tinkering next weekend. In the meantime, I will search for my old Pi 3. :-)

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

Honestly, I've never used docker properly, and the one time I tried, for the *arr stack, I ran into many issues with access to storage drives and connectivity between the different services. Does it actually help with anything on an RPi? I thought it was good enough to just install the RPi OS and then install other services on it normally?

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

Nope, do whatever suits you!

I would say, though, that the example you picked is one of the infamous cases where docker is more difficult to set up than going without, because the file locations of your movies etc. need to match between containers. When I set it up I found a really good guide that not only explained how to set it up but also explained the logic and reasoning behind the issue.

https://wiki.servarr.com/docker-guide#consistent-and-well-planned-paths

Another good guide about the issue:
https://trash-guides.info/Hardlinks/How-to-setup-for/Docker/

The reason I'd initially recommend docker to a beginner is that it keeps everything clean and organized, it's easy to undo mistakes while learning, and I feel some apps are easier to set up with docker because they come with their dependencies already installed and configured properly.
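To make the path issue concrete, here's a hedged sketch of the layout those guides recommend; the images, IDs, and the /srv/data host folder are just illustrative:

```yaml
services:
  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
    volumes:
      - ./qbittorrent-config:/config
      - /srv/data:/data            # downloads go to /data/torrents
    ports:
      - "8080:8080"
    restart: unless-stopped

  sonarr:
    image: lscr.io/linuxserver/sonarr:latest
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
    volumes:
      - ./sonarr-config:/config
      - /srv/data:/data            # same mapping, so media lands in /data/media
    ports:
      - "8989:8989"
    restart: unless-stopped
```

Because both containers see the exact same /data tree, Sonarr can hardlink or atomically move what qBittorrent downloaded instead of copying it across mounts.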

[–] [email protected] 2 points 1 year ago

If I were to set up docker the way the guide explains it, could I then just back that up and reuse it every time I wanted to refresh my *arr stack installation and configuration? Could I build it first on a PC and then transfer it to the RPi; would that be "in the spirit" of docker? In my head containers are black boxes that should work regardless of their environment, but it never turned out like that in practice when I tried actually using them, so I'm still not sure of the use case 😄. I get that it's useful when you have to deploy to many different hardware configurations in prod, but that's not really an issue with self-hosting at home?

[–] dustojnikhummer 2 points 1 year ago (1 children)

Actually, I would argue the simplest way to self-host today is TrueCharts.

The problem is that when it breaks, you are SOL, because you didn't build it yourself so you have no clue how it works.

[–] AusatKeyboardPremi 1 points 1 year ago

Thanks for the pointer. Will evaluate it as well.

[–] [email protected] 14 points 1 year ago* (last edited 1 year ago) (3 children)

If you want to host things and have them accessible from outside your home, then I'd start with getting a domain and a static IP, pointing the DNS at your IP, and making sure your DNS provider is supported by Let's Encrypt DNS authentication.

Then set up nginx as a reverse proxy and get Let's Encrypt set up with auto-renewal. That way you can have secure https connections to your home.

Then install docker compose, fire up a service, and configure nginx to proxy to it.
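As a rough sketch of how those pieces can fit together (image names and paths are placeholders, and running nginx directly on the host works just as well):

```yaml
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx/conf.d:/etc/nginx/conf.d:ro    # server block with: proxy_pass http://myapp:8080;
      - /etc/letsencrypt:/etc/letsencrypt:ro   # certs issued/renewed by certbot on the host
    restart: unless-stopped

  myapp:
    image: example/myapp:latest                # placeholder for whatever you're hosting
    expose:
      - "8080"                                 # only reachable by nginx on the compose network
    restart: unless-stopped
```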

[–] [email protected] 11 points 1 year ago (1 children)

I usually cut down on domain/DNS costs by using a free dynamic DNS service called duckdns. It works super well, provides Let's Encrypt support, and gives you sub-subdomains (for example, you could have https://git.$username.duckdns.org).

As an IT noob, I've found that Caddy 2 is about as "batteries included" and boilerplate-free as it gets, which suits me because I have no idea what I'm doing. So I just let Caddy handle my encryption and reverse proxy to my actual server.
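For anyone curious, the whole thing can be a sketch like this; the duckdns name and the upstream container are made-up examples:

```yaml
services:
  caddy:
    image: caddy:2
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data            # Caddy keeps its Let's Encrypt certs here
    restart: unless-stopped
    # The mounted Caddyfile can be as short as:
    #   git.yourname.duckdns.org {
    #       reverse_proxy myapp:3000
    #   }

volumes:
  caddy_data:
```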

I'm an embedded software dev who only knows Ethernet protocols at a surface level, because we haven't needed them in previous projects, so I'm a bit lost on how to do cloud stuff. Having all these great tools available for free to try out, and being able to connect from outside to my media servers and such, is awesome!

[–] SpaceAape 2 points 1 year ago (1 children)

My old cheap Asus N66U router has a free DynDNS service built in. Super easy to set up. I use it to host a Jellyfin setup. About to set up a torrent server and a NextCloud server. I used to run an ownCloud server a few years back and loved having it.

[–] dustojnikhummer 1 points 1 year ago

Don't forget to put your torrent client behind Gluetun!
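For anyone who hasn't seen the pattern, "behind Gluetun" in compose terms looks roughly like this (provider and keys are placeholders; check gluetun's wiki for your VPN):

```yaml
services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN
    environment:
      - VPN_SERVICE_PROVIDER=mullvad     # example provider; see gluetun's wiki for yours
      - VPN_TYPE=wireguard
      - WIREGUARD_PRIVATE_KEY=changeme
      - WIREGUARD_ADDRESSES=10.64.222.21/32
    ports:
      - "8080:8080"                      # the torrent client's web UI is published here
    restart: unless-stopped

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    network_mode: "service:gluetun"      # no network of its own, all traffic rides the VPN
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
    volumes:
      - ./config:/config
      - /srv/data/torrents:/data/torrents
    restart: unless-stopped
```

The web UI port is published on the gluetun container because the torrent client shares its network namespace.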

[–] shroomato 5 points 1 year ago

https://hub.docker.com/r/linuxserver/swag is a nice image that gets you an nginx reverse proxy and Let's Encrypt automatic cert creation/renewal set up out of the box, with a bunch of sample configs for popular self-hosted services.
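For reference, the compose file from SWAG's docs is roughly this (quoting from memory, so verify against the linked page):

```yaml
services:
  swag:
    image: lscr.io/linuxserver/swag
    cap_add:
      - NET_ADMIN
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
      - URL=example.com           # your domain
      - SUBDOMAINS=www,nextcloud  # whichever subdomains you want certs for
      - VALIDATION=http           # or dns (with DNSPLUGIN=... for your provider)
    volumes:
      - ./swag-config:/config     # the sample proxy confs end up in config/nginx/proxy-confs
    ports:
      - "443:443"
      - "80:80"                   # needed for http validation
    restart: unless-stopped
```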

[–] ramblechat 1 points 1 year ago

I'm a tailscale convert - tried nginx and cloudflare and let's encrypt but was never happy with it. Tailscale is a lot easier IMHO

[–] [email protected] 13 points 1 year ago (1 children)

And it can get really low effort too.

I do very little maintenance as I just don't have time at the moment. Everything just runs.

I love paperless, immich and meanie as my top apps, with nginx proxy manager dealing with the proxying.

[–] ech0 1 points 1 year ago

Meanie? Did you mean Mealie perhaps?

I used all the apps you mention and they are fantastic at what they do.

[–] [email protected] 12 points 1 year ago

Absolutely! I've been enjoying it a lot too. Hosting Mastodon, Matrix, kbin, and a couple of game servers now from my basement 🙂

[–] [email protected] 7 points 1 year ago

Fiber-optic home internet has also really improved the available upload speeds, which is great for self-hosters.

[–] [email protected] 3 points 1 year ago (2 children)

Haven't looked into it much beyond running wireguard to get around CG-NAT. How hard is it to deal with SSL certificates when setting up your own hosting?

[–] kaktus 4 points 1 year ago

Managing certificates is fairly easy with Let's Encrypt and certbot. Just get a free subdomain from duckdns and give it a try. The one thing I wish I'd known earlier is that you don't need the whole snapd thing to install certbot, like the official documentation tells you; you can just install it from the Debian repositories (and I assume the same goes for Ubuntu).

[–] rambos 3 points 1 year ago (1 children)

How do you get around CG-NAT with wireguard? I don't know much about that, but when my ISP enabled CG-NAT on my connection, my wireguard stopped working. I fixed it by asking them to turn it off, but it would be nice to know what the workaround is. Duckdns was running the whole time in a docker container, but it didn't work with CG-NAT.

[–] [email protected] 1 points 1 year ago

I had to do a fair bit of googling but it was something like https://www.vultr.com/docs/set-up-wireguard-vpn-on-ubuntu-20-04/

This guide was also great: https://www.wireguard.com/quickstart/

Basically I got a $6-a-month VPS (with $250 of credit for some reason) and configured it to be a tunnel to computers on my network that run web and community radio stuff.

[–] [email protected] 2 points 1 year ago (3 children)

Noob here in terms of self-hosting. How do you self-host multiple apps? Wouldn't it get unmanageable at some point?

[–] Alexffjeg 2 points 1 year ago

Yeah, if you hosted them all as directly installed services it would be pretty hard to manage, but if you're running them as containers and have some management software, it's easy. I have a very simple setup with Portainer and docker-compose and it's no problem for me to manage about 10 services. I don't think I'll be adding more in the near future, but even if I did, it still wouldn't be a problem.

[–] ramblechat 1 points 1 year ago

There are some good tools to manage them - portainer, Heimdall, uptime kuma are ones I use.
