
20 percent means, literally, 20 per hundred; it's equivalent to 0.2 or 2/10 or 1/5 or whatever, of course, but if `percentage==0.2` then that fairly clearly, on the face of it, should mean "0.2 per hundred", ie 0.2% or 0.002.


It really shouldn't. 20% means _literally_ 20 / 100, so if you need to express that numerically (as you do in code, since % is reserved for modulo), you write it as 0.2. That is still a percentage, just in decimal form instead of as a fraction; the value is exactly the same, and it didn't stop being a percentage.

If I write 0.2 on a piece of paper, give it to someone, and tell them that's a percentage, it should be pretty obvious that it means 20%. If you do the same but you write 0.2%, then of course it's 0.2%.

If they really wanted to, they could've written the comparison using the numbers as fractions, such as percentage < 10/100, which would be perfectly reasonable. But again, that resolves to 0.1, so you might as well write it in decimal form already. A sketch of both forms is below.
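Here's a minimal sketch of the convention being argued for: store a percentage as its decimal-fraction value, so 20% is 0.2. The variable and function names (`apply_discount`, `percentage`) are hypothetical, chosen just for illustration.

    # A percentage stored in decimal form: 0.2 means 20%.
    def apply_discount(price: float, percentage: float) -> float:
        """Apply a discount where `percentage` is the decimal form of the rate."""
        return price * (1 - percentage)

    # 20% off 50.00 -> 40.00
    print(apply_discount(50.00, 0.2))

    # Writing the threshold as a fraction, as suggested above;
    # 10 / 100 evaluates to 0.1, identical to the decimal literal.
    percentage = 0.2
    if percentage < 10 / 100:
        print("below 10%")
    else:
        print("10% or more")

Either spelling compiles down to the same comparison; the fraction form just makes the "per hundred" reading explicit at the cost of some noise.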



