Greater-than and less-than coerce null to a number, so it becomes 0 > 0, which is false, of course. 0 >= 0 is true.
Now == follows its own rules: null is only loosely equal to undefined, never to a number, so null == 0 is false. (It's not just "different types means false" — "25" == 25 has different types and is still true.)
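For illustration, here's a quick sketch of the behavior described above, as you'd see it in a Node or browser console:

```javascript
// Relational operators coerce null to 0; == does not.
console.log(null > 0);   // false — null coerces to 0, and 0 > 0 is false
console.log(null >= 0);  // true  — 0 >= 0 after the same coercion
console.log(null == 0);  // false — == never treats null as equal to a number
```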
Also I'm a JavaScript dev, and if I ever see someone I work with use this kind of hack, I'm never working with them again unless they apologize a lot and wash their dirty typing hands with.. acid? :-)
Not a JavaScript dev here, but I work with it.
Doesn't "==" do type coercion, though? Isn't that why "===" exists?
As far as I know the operators ">=" and "<=" are implemented as the negations of "<" and ">" respectively. Why: because on totally ordered sets, like the natural numbers, that equivalence holds.
I know it’s a joke, but it’s an old one and it doesn’t make a lot of sense in this day and age.
Why are you comparing null to numbers? Shouldn’t you be assuring your values are valid first? Why are you using the “cast everything to the type you see fit and compare” operator?
Other languages would simply fail. Once more, JavaScript's greatest sin is not throwing an exception when you ask it to do things that don't make sense.
parseInt is meant for strings, so it converts the number to a string first. Once numbers get small enough, that string uses scientific notation. So 0.0000001 converts to "1e-7", and parseInt then ignores the "e-7" part because that's not part of a valid integer, so it is left with 1.
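A small sketch of that parseInt trap — note how the stringification flips to exponent form below 1e-6:

```javascript
// Number-to-string conversion switches to scientific notation
// for magnitudes smaller than 1e-6:
console.log(String(0.000001));    // "0.000001" — still plain decimal
console.log(String(0.0000001));   // "1e-7"     — exponent form

// parseInt stringifies its argument, then stops at the first
// character that can't be part of an integer:
console.log(parseInt(0.000001));  // 0 — parsed from "0.000001"
console.log(parseInt(0.0000001)); // 1 — parsed from "1e-7", stops at the "e"
```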
In JS, equality with == is usually checked by coercing one operand to the type of the other. "25" == 25 evaluates to true because the string converted to a number is equal to the number, and the other way around.
You can however check whether the things are identical using "25" === 25, which skips type conversion and evaluates to false.
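The loose vs. strict distinction described above, in console form:

```javascript
// == coerces operands to a common type; === does not.
console.log("25" == 25);   // true  — the string is coerced to a number first
console.log("25" === 25);  // false — different types, no coercion, not identical
```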
I assume the same thing happens here: for the relational comparison, null is coerced to a number, which gives 0.
My only thought here is that >= is usually the same as !(<), and maybe that's how it is defined in JavaScript; since null < 0 is false, null >= 0 becomes !false, which is true.
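That guess matches what you can observe directly — a quick sketch of the !(<) reasoning:

```javascript
// >= behaves like the negation of < after coercion:
console.log(null < 0);     // false — null coerces to 0, and 0 < 0 is false
console.log(!(null < 0));  // true  — the negation
console.log(null >= 0);    // true  — same result as the negation
```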