
4 Comments

  1. You might want to fix the typo, since the purpose of this page is to help clear up confusion around these terms. Saying that 1 KB = 100 bytes will only create unnecessary confusion due to the missing zero, since one KB = 1000 bytes, not 100. Thanks for the useful explanation nonetheless.

  2. So Microsoft engineers were taught wrong, and nobody corrected them in 35 years. Is that what you’re implying?

    The decimal prefix “mega-” was introduced in Resolution 12 of the 11th CGPM (1960), and it applied to any unit within the scope of that document. The byte is neither an SI unit nor an SI-derived unit. Thus, the `M = 10^6` definition from the SI does not have to apply to it.

    The prefix “mega-” does not always mean “a million”; it can mean different things in different contexts. One “megaproject” isn’t equal to a million projects (and one “kitten” isn’t equal to a thousand “ittens”). If you ever see kilofeet or centipounds being thrown around, you’d be wise to check exactly what the prefix means on a case-by-case basis. Why? Because those are not SI units.

    Computers operate on bits. A transistor has two states, not ten. Information and entropy are defined in bits (there’s a log2 in there, not a log10). Thus, in computer science, decimal SI prefixes are not used in relation to bits or bytes. The reason is straightforward: 1000 is not a round number in binary. If you have 10 bits to address a memory buffer, its size will be 1024 cells; no one is going to call it 1000 and have 24 left over.
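
    A minimal Python sketch of that arithmetic (illustrative only; the variable names are made up):

    ```python
    # With n address bits you can distinguish exactly 2**n memory cells.
    n_bits = 10
    cells = 2 ** n_bits
    print(cells)        # 1024 -- the natural "round" size in binary, not 1000

    # The same pattern gives the binary prefixes:
    print(2 ** 10)      # 1,024       -> "K" in the binary sense
    print(2 ** 20)      # 1,048,576   -> "M" in the binary sense
    ```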

    So what exactly does “megabyte” mean? The commonly cited source for this is JEDEC’s JESD100B.01, which defines mega (M), as a prefix to units of semiconductor storage capacity, to be a multiplier of 1,048,576 (2^20, or K^2 where K = 1024), while acknowledging its ambiguity.

    In other words, this is the current status quo (the gap between the two readings is shown in the sketch after this list):
    – `MiB = 1,048,576 B` – recommended usage
    – `MB = 1,048,576 B` – acceptable usage
    – `MB = 1,000,000 B` – plain wrong
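
    The ~5% gap between those two readings is easy to check (a quick Python sketch; the input size is arbitrary):

    ```python
    size_bytes = 1_048_576          # one mebibyte

    in_mib = size_bytes / 2 ** 20   # binary reading:  1.0 MiB
    in_mb  = size_bytes / 10 ** 6   # decimal reading: 1.048576 MB

    print(in_mib, in_mb)            # the gap (~4.9% here) widens at giga and tera
    ```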

    As of now, the decimal megabyte (the metric megabyte, if you will) is only used by hard disk manufacturers acting in bad faith. I’d much prefer not to see it anywhere else.

  3. Hahaha

    Taught wrong?

    This change occurred when the Europeans (SI) decided to create this new standard, where base-2 sizes are now expressed in terms of base-10 sizes (which makes no sense). See ISO 1000:1992/AMD 1:1998.

    So prior to that, a MB was universally accepted as 1024 × 1024 bytes. Microsoft brought the changes over in the early 2000s, and UNIX never did.

    This means YOU were taught wrong, and the Europeans just wanted to be different, or to conform to the metric system (base 10), which doesn’t apply to units of computing.

    This change caused havoc for consumers purchasing RAM, hard drives, and other peripherals.
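
    A concrete illustration of that havoc (a minimal sketch; the 500 GB drive is just an example):

    ```python
    advertised_gb = 500                    # what the hard drive box says (decimal GB)
    size_bytes = advertised_gb * 10 ** 9

    reported_gib = size_bytes / 2 ** 30    # what a binary-units OS reports
    print(f"{reported_gib:.1f} GiB")       # ~465.7 -- the "missing" ~34 GB
    ```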
