I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.
Feedback is very much welcome. Thank you.
You don’t have to know. It doesn’t matter, because your 8GB stick can’t fit sixteen 512MB files anyway. Funnily enough, it might fit sixteen 500MB files if it’s FAT32.
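A quick back-of-the-envelope check in Python, if anyone wants the numbers (the exact usable capacity depends on the filesystem, so treat the overhead-free figures as an illustration):

```python
# Does a marketed "8 GB" stick (decimal prefixes) hold 16 files of 512 MB?
stick_bytes = 8 * 10**9              # 8,000,000,000 bytes as sold

# A file the OS reports as "512 MB" is typically 512 MiB (binary prefix).
needed_binary = 16 * 512 * 2**20     # 8,589,934,592 bytes
print(needed_binary <= stick_bytes)  # False: overflows the stick

# Sixteen 500 MB files, read with decimal prefixes:
needed_decimal = 16 * 500 * 10**6    # exactly 8,000,000,000 bytes
print(needed_decimal <= stick_bytes) # True on paper; filesystem overhead decides
```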
Being consistent with base-10 systems does not matter in real-world usage. Literally nobody cared before the asshats changed it.
Edit: I also understand SI, down to its history. I don’t live in an inch country. Computing is different from physical measurements; in computing, 1024 is more “correct”.
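For anyone who wants to see the gap the two conventions create, here’s a small sketch (just the standard prefix definitions, nothing vendor-specific):

```python
# How far binary (1024^n) and decimal (1000^n) prefixes drift apart per scale.
for i, name in enumerate(["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"], 1):
    drift = (1024**i / 1000**i - 1) * 100
    print(f"{name}: binary unit is {drift:.1f}% larger")
# kilo/kibi: 2.4%, mega/mebi: 4.9%, giga/gibi: 7.4%, tera/tebi: 10.0%
```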