Arabic notation / base ten seems to work really well with our brains. Having that many fingers probably helps. I couldn't imagine a modern world running on Roman Numerals.
I don't think the choice of base is really dependent on finger encoding efficiency. Going 0-5 on a hand has the advantage that we only really care about the "last" finger.
Anyway, if we really want to go crazy, the knuckle and first joint on (at least my) fingers can be controlled somewhat independently, so I think we can fit four states per finger.
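For scale, assuming those four joint states really are independent on every finger, each finger becomes one base-4 digit and ten fingers give 4^10 distinct values. A throwaway C sketch of the arithmetic:

    #include <stdio.h>

    int main(void) {
        /* Assume each finger has 4 distinguishable joint states,
           i.e. one base-4 digit per finger. */
        long states = 1;
        for (int finger = 0; finger < 10; finger++)
            states *= 4;
        printf("%ld\n", states);  /* 1048576, over a million */
        return 0;
    }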
In a base-10 system the explanation is as simple as holding up both hands, with each hand showing one of the numbers you want them to add. The kid can count and intuitively learn how to add two numbers. It's really simple and really efficient.
The idea of counting to 12 by pointing to knuckles (three segments on each of four fingers, with the thumb as the pointer) is something I hadn't encountered before, and it's really neat.
Kinda surprised base-12 didn't end up dominating. Having so many factors would have made division much more convenient for mental math, and it's only two extra digits to remember.
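Concretely, the factor count is the whole argument: 12 divides evenly by 1, 2, 3, 4, and 6, while 10 only manages 1, 2, and 5. A quick check in C, just to make the comparison explicit:

    #include <stdio.h>

    /* Print the divisors of each candidate base. */
    int main(void) {
        int bases[] = {10, 12};
        for (int b = 0; b < 2; b++) {
            printf("divisors of %d:", bases[b]);
            for (int d = 1; d <= bases[b]; d++)
                if (bases[b] % d == 0)
                    printf(" %d", d);
            printf("\n");
        }
        return 0;
    }
    /* divisors of 10: 1 2 5 10
       divisors of 12: 1 2 3 4 6 12 */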
It occurred to me once that the big/little endian issues people have are really due to the conflict between Arabic numbers going in the opposite direction from English letters.
If both ran right to left, or left to right, then there would be no difference.
Most of us here are programmers; you're just as used to hex.
Also, technically we don't have that many fingers; we have one more. There's no single digit for ten, right? Digits go zero to nine, but our fingers go zero to ten.
If our numeric base matched our fingers, we should've used base eleven. Not many people think this through :)
I think the real breakthrough is having "order of magnitude" in numbers. So indeed the Roman Numerals suck and probably wouldn't last regardless.
Right. It lets you see the bits more easily: 0-F is a good representation of 4 bits.
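A small illustration of that nibble correspondence, sketched in C (the byte value is arbitrary, just for the example):

    #include <stdio.h>

    int main(void) {
        /* Each hex digit maps to exactly one nibble (4 bits):
           0xB7 is high nibble 0xB (1011) and low nibble 0x7 (0111). */
        unsigned char byte = 0xB7;
        unsigned high = (byte >> 4) & 0xF;  /* 0xB */
        unsigned low  = byte & 0xF;         /* 0x7 */
        printf("high=%X low=%X\n", high, low);
        return 0;
    }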
If I were to name a random hex value like #$9C right now, though, it would take me a few seconds to convert that to decimal in my head... 156 took me a few seconds to sort out. I don't have to think about what 156 means in decimal, because I just know what it is.
I'm not quite there, but close to being bilingual (binumeral?) between decimal and hex, and I think it's all about developing better intuition for each digit and their relationships.
For instance, you say 0x9C... that's just over half (0x80) of 0x100, and close to 2/3rds (0xAA). Given that in embedded we're often using a byte to represent a quantity, that gives enough of a feel.
I should practice multiplying hex by hand, I reckon that would assist in getting there.
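If it helps anyone following along, here's the place-value arithmetic behind those landmarks, in a few lines of C:

    #include <stdio.h>

    int main(void) {
        /* 0x9C digit by digit: 9*16 + 12 = 156. */
        printf("0x9C = %d\n", 9 * 16 + 12);                  /* 156 */
        /* The byte-sized landmarks from the comment above: */
        printf("0x80 = %d (half of 0x100)\n", 0x80);         /* 128 */
        printf("0xAA = %d (roughly 2/3 of 0x100)\n", 0xAA);  /* 170 */
        return 0;
    }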
Replying to sibling since we've maxed out comment depth...
> It was $62 degrees Fahrenheit yesterday. I can't just go displaying that in a program. Nor is it meaningful to me without a decimal conversion.
It's just as meaningless to me even if you do the conversion to base10 for display... I don't do deg F intuitively and would have to convert to Celsius in my head. It's all about what we are familiar with.
Right, but all I'm trying to say is that the world runs on base 10. It's needlessly difficult to use anything else (aside from hex or binary in very specific situations). In some college sophomore philosophy class you could argue for base 27, but it doesn't make your system usable or intelligible.
> It's needlessly difficult to use anything else (aside from hex or binary in very specific situations)
Totally agree. I'm a programmer, so I do need to know those, and as an embedded developer, even more. The average person not so much. I thought that's what this particular thread was all about.
Sure, I can work it out, and I do use F when talking to friends from the USA. My only point was that I deal with hex numbers all day, so they're more intuitive to me than Fahrenheit is.
I think that under certain circumstances the reply link doesn't show past a certain depth, but you can still (unless the comment is dead) click on the time link to get the page for the comment and reply there.
It's interesting that octal used to be popular and isn't any more. I'm not familiar with how that culture shift happened, but I remember learning C in the 80s and thinking it was odd that it supported octal when I'd never seen it anywhere else.
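For anyone who never ran into it: the one place octal really survives is Unix file permissions, and C's leading-zero literal syntax is still a classic gotcha. A tiny demonstration:

    #include <stdio.h>

    int main(void) {
        /* A leading 0 makes a C integer literal octal. */
        int perms  = 0755;  /* rwxr-xr-x: 7*64 + 5*8 + 5 = 493 */
        int gotcha = 0100;  /* octal 100 = decimal 64, not 100! */
        printf("%d %d\n", perms, gotcha);
        return 0;
    }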
I think their point is that we use our 10 fingers to count to a value of 10. We can use our 10 fingers to represent 0 (no fingers) through 10 (all 10 fingers). This is essentially base 11 if you try to assign a specific digit to each finger.
While I agree with that viewpoint I think it's missing the point. As humans with 10 fingers it's easy for us to group things into increments of 10, so base 10 comes naturally. Think about how you count a quantity over 10: once you run out of fingers you mark down (or remember) that you've already counted one quantity of 10, now you're counting the next quantity of 10, etc...
It's more like a shifted base 10 where we represent digits 1-10 instead of 0-9.
Everyone is thinking about a "shifted" base 10, yes.
But every base starts with 0; there's no such thing as a "shifted base," because then you literally can't represent 0.
Also "zero fingers" is still a thing that exists in this shifted base 10. So it remains base 11.
This is like the classic "0-based indexing" vs. "1-based indexing" dilemma. The "first" thing is represented by 1, we think.
But the "first" year of our life, we're zero years old. The "first" hour after midnight is 0 o'clock. Building your "first" million as a business is the period before you have 1 million. And so on.
A base 10 with symbols only for 1 through 10, where zero is represented by the empty string, is bijective base-10 numeration. The columns in Excel are bijective base 26.
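To make that concrete, here's a sketch of bijective base 26 the way spreadsheet columns do it (1 -> A, 26 -> Z, 27 -> AA); zero is just the empty string, which is exactly why the "shifted base" upthread has no zero digit. The function name is mine, purely for illustration:

    #include <stdio.h>

    /* Convert a 1-based column number to a spreadsheet-style
       bijective base-26 label (1 -> A, 26 -> Z, 27 -> AA, ...).
       `buf` must be large enough to hold the label. */
    void column_label(unsigned n, char *buf) {
        char tmp[16];
        int i = 0;
        while (n > 0) {
            n--;                       /* the bijective shift: digits run 1..26 */
            tmp[i++] = 'A' + (n % 26);
            n /= 26;
        }
        /* Digits were produced least-significant first; reverse them. */
        for (int j = 0; j < i; j++)
            buf[j] = tmp[i - 1 - j];
        buf[i] = '\0';
    }

    int main(void) {
        char buf[16];
        unsigned tests[] = {1, 26, 27, 52, 703};
        for (int k = 0; k < 5; k++) {
            column_label(tests[k], buf);
            printf("%u -> %s\n", tests[k], buf);  /* A, Z, AA, AZ, AAA */
        }
        return 0;
    }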