
Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Is it a bad idea to use my desktop to self host?

What are the disadvantages?? Can they be overcome?

I use it primarily for programming, sometimes gaming and browsing.

[–] [email protected] 34 points 1 year ago (1 children)

It's a terrible idea - do it anyway. Experimentation is how we learn.

If you have a reasonably modern multi-core system, you probably won't even notice a performance hit. The biggest drawback is that a single machine is holding all your eggs: if an upgrade goes wrong, or you take things down for maintenance, everything is affected. And there can be conflicts between the versions of libraries, OS, etc. that each service needs.

Separating services, even logically, is a good idea. So I'd recommend you use containers or VMs to make it easier to just "whelp, that didn't work" and throw everything away or start from scratch. It also makes library dependencies much easier to deal with.
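
To make the "throw it away and start from scratch" point concrete, here's a minimal sketch using the Python Docker SDK (docker-py); the Jellyfin image is just an example, and the host paths and container name are placeholders, not anyone's actual setup:

```python
# Minimal sketch: run a service in a disposable container so a failed
# experiment never touches the host. Paths and names are placeholders.
import docker

client = docker.from_env()

container = client.containers.run(
    "jellyfin/jellyfin",          # example image
    name="jellyfin-test",
    detach=True,
    ports={"8096/tcp": 8096},     # web UI
    volumes={
        "/srv/jellyfin/config": {"bind": "/config", "mode": "rw"},
        "/srv/media": {"bind": "/media", "mode": "ro"},
    },
    restart_policy={"Name": "unless-stopped"},
)

# "Whelp, that didn't work" -- just throw it away:
# container.stop(); container.remove()
```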

[–] [email protected] 5 points 1 year ago (1 children)

So I already host a lot of stuff on a Raspberry Pi 4B. But when I tried to host Jellyfin on it, encoding was a struggle, so as a quick fix I host Jellyfin on my desktop and use sshfs to reach the media files that still sit on the Raspberry Pi. So now I wonder: is it worth moving Jellyfin to something else? Is it worth moving the media files to the desktop?

[–] [email protected] 2 points 1 year ago (1 children)

Is it performing well as is? sshfs isn't very high performance, but if it's working, it's fine - NFS would likely perform better though. I run Jellyfin in a VM with an NFS mount to my file server and it works fine. The interface is zippy and scanning doesn't take too long. I don't get GPU acceleration, but the CPU on that system (10th-gen i7, I think) is fast enough that I haven't had much trouble with transcoding (yet).

[–] [email protected] 2 points 1 year ago (1 children)

It's actually not bad, surprisingly. I have had issues sometimes, but they're network issues related to my router. I haven't had them in a while.

[–] [email protected] 1 points 1 year ago

If it's working, that's fine. Creating dependencies can make things more complex (you now need two systems running for one service to work), but isolating "concerns" can also be beneficial. Having a single "file server" lets me rebuild other servers without worrying about losing important data, for example. It separates system libraries and configuration from application data. And managing a file server is pretty simple, as the requirements are very basic (Ubuntu install with nfs-utils, and nothing else). It also lets me centralize backups, since everything on the file server is backed up automatically.

Things can be as simple or as complex as you want. I will reiterate that keeping a "one server per service" mindset will pay off in the long run. If you only have your desktop and a Pi, then Docker can help with keeping your services well isolated from each other (as well as from your desktop).

[–] [email protected] 33 points 1 year ago

It'll affect performance. It will need to be always on. It risks unwanted interaction between your normal applications and your server services. It also puts all your eggs in one basket if something goes wrong. That said, it should be fine. Just take frequent snapshots and backups of important data.

[–] SGG 19 points 1 year ago

If you are going to use your desktop, I would suggest putting all of the self-hosted services into a VM.

This means if you decide you do want to move it over to dedicated hardware later on, you just migrate the VM to the new host.

This is how I started out before I had a dedicated server box (refurb office PC repurposed to a hypervisor).

Then host whatever/however you want to on the VM.

[–] [email protected] 17 points 1 year ago (1 children)

I mean, I use a regular desktop computer: I just installed Ubuntu on it, plugged it into ethernet in the closet, and closed the door. Now it's my server. RGB and all.

[–] [email protected] 2 points 1 year ago

> RGB and all.

Proper server

[–] CriticalMiss 11 points 1 year ago (1 children)

By hosting services on your desktop, you are increasing your threat surface. Every additional piece of software you run increases your chances of catching malware. It also means powering a beefy machine 24/7 to keep the service up, when in reality anything that isn't a media server can run on a 3rd-gen Intel CPU with a relatively low TDP.

[–] [email protected] 2 points 1 year ago (1 children)

Conversely, you could also run them on a low-end chip from a current or recent generation and get even lower power draw for equivalent or better performance.

[–] CriticalMiss 2 points 1 year ago

Not false, but older parts tend to be cheaper.

[–] Blaster_M 9 points 1 year ago

And if you're allergic to buying used, there's always the mini computers.

[–] [email protected] 8 points 1 year ago

Is power consumption a consideration? I want my self-hosted server on 24/7, so a low-power single-board computer is much more economical for me.

Also, are resources a problem? If your game is maxing out your rig and some batch job on a self-hosted service kicks off, that could be annoying, or it could be a non-issue; it just depends on your usage both as a desktop and as a server.
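
As a rough illustration of the power point, here's a back-of-the-envelope comparison; the wattages and electricity rate are assumptions for the sake of the arithmetic, not measurements:

```python
# Back-of-the-envelope 24/7 running cost. All numbers are assumptions;
# substitute your own measured idle draw and local electricity rate.
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.30  # assumed price per kWh

for name, watts in [("single-board computer", 7), ("idle desktop", 60)]:
    kwh = watts * HOURS_PER_YEAR / 1000
    print(f"{name}: ~{kwh:.0f} kWh/year, ~{kwh * RATE_PER_KWH:.0f} per year")
```

At those assumed numbers, the desktop costs roughly eight times as much to leave on all year.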

[–] [email protected] 8 points 1 year ago

I would not recommend using your primary desktop for self-hosting. If you absolutely have to, install VirtualBox or some other hypervisor and run your servers in separate VMs.

Use a dedicated host. It can be a desktop, server, Raspberry Pi, etc., depending on your needs. Sooner or later you'll find that hosting on a workstation you use for other things is horribly inconvenient. Depending on what you're self-hosting, it can consume a lot of resources. And if you become dependent on the services you're hosting, which is the point of self-hosting to begin with, even small things like rebooting your workstation become really inconvenient.

I've got an old Dell PowerEdge ticking away in my basement that runs all my VMs. I can reboot my desktop without interrupting any of my self-hosted services. It also makes it easier to back up my VMs, and I can easily spin up a new one if needed. You have to be careful if you use server hardware, though. The T430 that I have is pretty efficient, but some servers can be thirsty little space heaters.

[–] [email protected] 6 points 1 year ago

Convenience is the main issue. AFAIK, as long as you secure your device, it'll do the job.

[–] mrpibb 6 points 1 year ago

What do you want to self-host? To learn or experiment, buy a cheap old x86 box; I get mine at Goodwill auctions. Otherwise, a desktop is good if you want something that needs more compute and that you'd spin up as needed rather than leave always on.

[–] iwasgodonce 6 points 1 year ago

I do and it's fine.

I used to have a separate machine for server stuff, but it just cost more in electricity since I would leave them both on 24x7 anyway.

I've got 64 GB of RAM and I often use up to 48 GB of it with various VMs. I wouldn't get any power savings with a separate server, since I have a cron job that transcodes everything Plex recorded off TV during the day to AV1 for disk-space savings (it usually turns 3 GB of MPEG-2 into 700 MB of AV1), so I would need a server with a moderately powerful CPU anyway.
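
For reference, that kind of nightly job can be little more than a loop over the recordings folder; a rough sketch (the paths, the SVT-AV1 encoder and the ffmpeg settings are assumptions, not the commenter's actual script):

```python
# Rough sketch of a nightly cron job: re-encode MPEG-2 recordings to AV1.
# Paths and encoder settings are assumptions, not the commenter's setup.
import subprocess
from pathlib import Path

RECORDINGS = Path("/srv/plex/recordings")

for src in RECORDINGS.glob("*.ts"):
    dst = src.with_suffix(".mkv")
    if dst.exists():
        continue  # already transcoded
    subprocess.run(
        ["ffmpeg", "-i", str(src),
         "-c:v", "libsvtav1", "-crf", "35", "-preset", "6",
         "-c:a", "copy", str(dst)],
        check=True,
    )
    src.unlink()  # reclaim the disk space
```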

I have a Ryzen 3700X. I got it since it was the highest-performance part that was still 65 W TDP at the time; I didn't want to spend a ton on electricity and extra air conditioning since I would be leaving it running 24x7.

The only time I notice a performance impact during gaming is when my Windows 11 VM is running. I don't really need that one running 24x7, so I shut it down if it happens to be running at the time.

[–] Tylerdurdon 4 points 1 year ago

I would learn about MTBF (mean time between failures) if I were you. Everything has a failure point (generally buried deep in the product info), and when you start keeping it on 24/7, the hours will burn. The fans on your GPU will give out way before they normally would.

Finding a used server is not a good idea either. You aren't in a data center and servers are super loud. Also, they chew up electricity like a hungry dog and his dinner.

As others have said, find a used desktop somewhere and a cheap KVM switch so you can use the same peripherals for both. It doesn't need to be beefy by any measure (except maybe drive space), just affordable.

[–] fuckwit_mcbumcrumble 4 points 1 year ago* (last edited 1 year ago)

I use a "regular" desktop as my server. It uses much less power than most servers and still has plenty of horsepower for what I do.

Remote management and (cheap) ECC RAM are the biggest reasons to get a real server. But those usually aren't issues for most workloads, especially at home.

Shit, I used to run my stuff off a laptop with maxed-out RAM, and some people just have a Raspberry Pi and call it a day.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

My first homelab was a Synology NAS plus my gaming PC with a DIY Linux hypervisor as the main OS: a Linux VM for hosting servers, and a Windows/Mac/Linux VM trio (each with GPU passthrough) that I would switch between for my workstation. I lost performance for sure, but it taught me a lot without the need to purchase more hardware.

If you consider it temporary, it's not a bad way to learn.

[–] equidamoid 2 points 1 year ago

Whatever works for you. Just do it. It is convenient as f when you are just starting. You can always improve incrementally later on when (if) you encounter a problem.

Too much noise or power cost to run a small thing? Get a Pi and run it there. Too much impact on your desktop performance? Okay, buy a dedicated monster. Want to deep-dive into isolating things (and VMs are too much of a hassle)? Get multiple devices.

No need to spend money (and maybe sponsor more e-waste) and time until it's justified for your use cases.

[–] [email protected] 2 points 1 year ago (1 children)

I think it's kinda silly.

I see workstation graveyards in closets and garages that would make perfectly good white-box servers. Yeah, they're retail shit that mostly lacks the "always on" resilience of server-level hardware, but a dedicated box for a server process is always better than a shared user/server environment.

You'd prolly be hard pressed to find any old shitty retail box that you couldn't slap a *nix variant on.

Keep good backups. In my experience, hardware dies in this order: spinning drive, power supply, motherboard.

[–] Alami 1 points 1 year ago

I remember someone in this community self-hosting on an Android phone, a Samsung S20 I think.

[–] Alami 2 points 1 year ago* (last edited 1 year ago)

I did it the other way round. An old NUC from 2013 ran as a GUI-less Debian self-hosted server (YunoHost) for 2.5 years, until my old laptop (2008) died. Then I installed Xfce on top, plugged in a wireless keyboard/touchpad combo plus HDMI to my TV, and now use it as a desktop, mainly for web browsing, OpenOffice and some GIMP processing. But only once a week or so.