It's the metric system and it's the standard now. 1 kilobyte is 1000 bytes, just like 1 kilometer is 1000 meters. It's much easier to convert 20,415,823 bytes into megabytes: 20.4 MB.
Only Windows insists on mislabeling the base-1024 kibibyte as a kilobyte. The metric unit is much easier to use.
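To make the conversion concrete, here is a minimal sketch of both conventions (the helper names are just for illustration, not from any comment above):

```python
# SI (metric) megabytes: 1 MB = 1000 * 1000 bytes.
def to_megabytes(n_bytes: int) -> float:
    return n_bytes / 1000**2

# IEC (binary) mebibytes: 1 MiB = 1024 * 1024 bytes.
def to_mebibytes(n_bytes: int) -> float:
    return n_bytes / 1024**2

size = 20_415_823
print(f"{to_megabytes(size):.1f} MB")   # 20.4 MB
print(f"{to_mebibytes(size):.1f} MiB")  # 19.5 MiB
```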
What? Every BIOS in the world still uses the same system. Same thing for me on Linux.
Only hard drive manufacturers used a different system to inflate their numbers and pushed a marketing campaign for it, and a lot of people who didn't even use computers said "oh, that makes sense" and approved it.
People who actually work with computers, memory, CPUs, and other components in base 2 just ignore this "×1000" nonsense.
I never knew the whole thing was considered part of the metric system; makes sense, though.
I love the metric system to death because it's so simple and easy, and it links different measurements together (1 L of water = 1 kg, etc.).
That said, a computer works differently: because we work in powers of 2, 1000 bytes being a kilobyte makes no sense once you start working with bits and low-level stuff. Other than that, I can see why the units were redefined.
Also, I think Linux also works in factors of 1024, but I'd need to check.
There is nothing to keep you from using factors of 1024 (except the slightly ludicrous prefixes "kibi" and "mebi"), but outside of low-level stuff like disk sectors or the BIOS, where you might want to use bit logic instead of division, it's rather rare. I, too, started in the time when a division op was more costly than bit-level logic.
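A quick sketch of what "bit logic instead of division" looks like (just an illustration of the point, using an arbitrary byte count):

```python
n_bytes = 20_415_823

# Base 1024: dividing by 1024 is a 10-bit right shift; the remainder is a bitmask.
kib = n_bytes >> 10          # same as n_bytes // 1024 -> 19937
rem = n_bytes & 0x3FF        # same as n_bytes % 1024  -> 335

# Base 1000: no bit trick, you need an actual division.
kb = n_bytes // 1000         # 20415
```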
I'd argue that any user-facing application is better off with base 1000, except where convention dictates otherwise. The majority of users don't know, don't care, and don't need to care what bits or bytes do. It's programmers who like the beauty of the bit logic, not users.
I agree with what you said, and it's imo why the discussion of a factor of 1000 vs. 1024 will always rage on. I'm a developer and do embedded stuff in my free time. Everything around me is factor 1024 because of it, and I hate the factor of 1000. But from a generic user's standpoint, I agree it's a lot more user-friendly, since they're used to the metric system's factors of 10.
It is user-friendly, and technically incorrect, since nothing ever lines up with reality when you use 1000, because the underlying system is base 2.
Or you get the weird nonsense all around: "my computer has 18.8 GB of memory"...
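A worked example of that mismatch (the 16 GiB figure is just a hypothetical machine, not a number from the thread):

```python
ram_bytes = 16 * 2**30        # 16 GiB of RAM, a typical power-of-two size
print(ram_bytes / 1000**3)    # 17.179869184 -> shows up as ~17.2 "GB" in SI units
print(ram_bytes / 2**30)      # 16.0 GiB in binary units
```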