And then JSON doesn’t restrict numbers to any range or precision; at least when I deal with JSON values, I feel the need to represent them as a BigDecimal or a similar arbitrary-precision type to ensure I’m not losing information.
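Python at least makes that easy: json.loads takes a parse_float hook that receives the raw digit string before any binary rounding happens. A minimal sketch (the payload here is made up):

    import json
    from decimal import Decimal

    # parse_float is handed the raw text of every JSON real number,
    # so no binary-float rounding ever occurs.
    doc = json.loads('{"price": 0.99999999999999}', parse_float=Decimal)
    print(doc["price"])        # 0.99999999999999, exactly
    print(type(doc["price"]))  # <class 'decimal.Decimal'>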
That’s because, once a literal carries enough nines (17, for an IEEE 754 double), the nearest representable value is exactly 1.0 - not because Python is handling rationals correctly. It’s a float-imprecision issue that just happens to work out in this case.
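You can verify this from Python itself, since Decimal(float) exposes the exact binary64 value a literal is stored as (a minimal sketch; the literals are mine):

    from decimal import Decimal

    # Decimal(float) prints the exact value the double actually stores.
    print(Decimal(0.9999999999999999))   # 0.99999999999999988897769753748... (16 nines)
    print(0.99999999999999999 == 1.0)    # True: at 17 nines the nearest double is 1.0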
It’s worth wondering why, if Python is OK with “/” producing a result of a different type than its arguments, it doesn’t return a ratio type instead, e.g. https://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node18.html#SECTION00612000000000000000
How would you implement this in code?
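For what it’s worth, Python already ships an exact ratio type in the standard library, fractions.Fraction - it just isn’t what “/” returns. A minimal sketch:

    from fractions import Fraction

    # Exact rational arithmetic, Common Lisp ratio-style.
    third = Fraction(1, 3)
    print(third + third + third)                                  # 1
    print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))   # True; floats get this wrong
    print(float(third))                                           # 0.3333333333333333, lossy only here

Presumably the gcd normalization on every operation is why it never became the default result of “/”.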
JavaScript is truly a bizarre language - we don’t need to go as far as arbitrary-precision decimal; it doesn’t even have an integer type (Number is just an IEEE 754 double). I have to wonder how it ever made the cut as a backend language.
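The ceiling is easy to demonstrate - in Python here, but the arithmetic is identical for any IEEE 754 double, which is all a JS Number is:

    # 2**53 is the last point at which a double can represent every integer.
    big = float(2**53)
    print(big == big + 1)     # True: 2**53 + 1 has no double representation
    print(float(2**53 + 1))   # 9007199254740992.0, silently rounded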
You have to explicitly check if the return value is an error and propagate it. You write the same boilerplate
if err != nil { return err }
over and over again, which just litters your code.
That’s only true in crappy languages that have no concept of async workflows, monads, effects systems, etc.
Sad to see that an intentionally weak/limited language like Go is now the counterargument for good modeling of errors.
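For the skeptical: the monadic version is tiny even in a language that doesn’t ship one. A minimal sketch in Python - Result, Ok, Err, and and_then are my own names, not any particular library’s:

    from dataclasses import dataclass
    from typing import Callable, Generic, TypeVar, Union

    T = TypeVar("T")

    @dataclass
    class Ok(Generic[T]):
        value: T

    @dataclass
    class Err:
        error: str

    Result = Union[Ok[T], Err]

    def and_then(r: Result, f: Callable[[T], Result]) -> Result:
        # The single place the "if err != nil" check lives; callers just chain.
        return f(r.value) if isinstance(r, Ok) else r

    def parse(s: str) -> Result:
        return Ok(int(s)) if s.isdigit() else Err(f"not a number: {s!r}")

    def halve(n: int) -> Result:
        return Ok(n // 2) if n % 2 == 0 else Err(f"{n} is odd")

    print(and_then(parse("42"), halve))  # Ok(value=21)
    print(and_then(parse("41"), halve))  # Err(error='41 is odd')

One check, written once; everything downstream composes.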
Yes, it is a huge pain, especially if you want to have round-trip interoperability with humans using markup. Wikipedia had a major challenge with this when they decided to add a rich text editor alongside wiki markup.
It was a PowerPC OS way before it was ever an (Intel x86) PC OS. First on dedicated hardware, later adding support for PowerPC Macs.
Could be a crypto key, a randomly distributed 64-bit database row ID, or a memory offset in a stack dump of a 64-bit program.