The two things I see people forget when crunching the numbers:
Cloud is a recurring cost. So you have to factor TCO over the life of the application, which realistically is going to be greater than 6 years. In those 6 years you'll usually have depreciated your on-premises hardware in 3-5, so those last few years are gravy, whereas with cloud you'll be paying the same amount year after year.
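A back-of-the-envelope sketch of that crossover, with made-up numbers (the monthly cloud bill, hardware price, and yearly run costs are hypothetical placeholders):

```python
# Rough TCO comparison over a 6-year application life.
# All dollar figures are hypothetical placeholders, not real quotes.

YEARS = 6

cloud_monthly = 10_000        # recurring cloud bill, same every year
onprem_capex = 250_000        # hardware bought up front, depreciated in years 1-5
onprem_opex_yearly = 30_000   # power, support contracts, admin time

for year in range(1, YEARS + 1):
    cloud_cumulative = cloud_monthly * 12 * year
    onprem_cumulative = onprem_capex + onprem_opex_yearly * year
    print(f"Year {year}: cloud ${cloud_cumulative:,} vs on-prem ${onprem_cumulative:,}")
```

With these numbers on-prem costs more up front but pulls ahead within a few years, and every year past the depreciation window widens the gap.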
Application sizing and density. Quite often people will break out workloads in the cloud to get better "segmentation", but you're paying for that division. On-premises you can utilize your resources better through sharing at the hypervisor level. Couple this with people not sizing their workloads correctly (or at all), and they end up wasting a lot of money.
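A toy illustration of the density point; the workload count, utilization, and headroom factor are all hypothetical:

```python
# Toy density comparison: isolated cloud instances vs a shared hypervisor pool.
# Workload count, per-workload sizing, and utilization are hypothetical.

workloads = 40
vcpus_requested = 8      # what each team asks for per workload
avg_utilization = 0.20   # what each workload actually uses on average
headroom = 1.5           # safety margin when consolidating on shared hosts

# Cloud: every segmented workload pays for its full requested size.
cloud_vcpus_paid = workloads * vcpus_requested

# On-prem: shared hypervisors let you provision for aggregate real demand.
onprem_vcpus_needed = workloads * vcpus_requested * avg_utilization * headroom

print(f"Cloud vCPUs paid for:  {cloud_vcpus_paid}")
print(f"On-prem vCPUs needed:  {onprem_vcpus_needed:.0f}")
```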
Cloud is good for rapid prototyping, or if you don't have the resources to spend yourself, but if you have the size and the means, on-premises will almost always win the cost argument.
ETA: Another factor businesses liked was that cloud is an OpEx spend vs. on-premises hardware being a CapEx spend, and that props up your EBITDA. It doesn't really matter much unless you're looking at selling your company or you're a heavily traded public company, but a lot of them got sucked in by that view too.
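The mechanics, with made-up revenue and cost figures: EBITDA adds depreciation back, so CapEx hardware never touches it, while the same spend as a cloud bill reduces it directly.

```python
# Simplified EBITDA comparison; all figures are hypothetical.
# EBITDA excludes depreciation/amortization, so depreciated CapEx hardware
# is added back, while an OpEx cloud bill is not.

revenue = 1_000_000
other_opex = 600_000
infra_spend = 120_000   # same yearly infrastructure cost either way

# Cloud: the spend is OpEx and reduces EBITDA directly.
ebitda_cloud = revenue - other_opex - infra_spend

# On-prem: the spend is CapEx; its depreciation sits below the EBITDA line.
ebitda_onprem = revenue - other_opex

print(f"EBITDA, cloud OpEx:    ${ebitda_cloud:,}")
print(f"EBITDA, on-prem CapEx: ${ebitda_onprem:,}")
```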
(Source: I did these calculations and pointed it out to my leadership years ago, and now I'm finally getting the "I told you so".)