• 37 Posts
  • 74 Comments
Joined 1 year ago
Cake day: July 29th, 2023



  • Why restrict to 54-bit signed integers?

    Because number is a double, and IEEE 754 specifies the mantissa of double-precision numbers as 53 bits plus a sign bit.

    Meaning, it’s the highest integer precision that a double-precision value can represent exactly.

    I suppose that makes sense for maximum compatibility, but feels gross if we’re already identifying value types.

    It’s not about compatibility. It’s because JSON only has a number type, which covers both floating point and integers, and number is implemented as a double-precision value. If you have to express integers with a double-precision type, once you go beyond 53 bits you will start to experience loss of precision, which goes completely against the notion of an integer.
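
    As a quick illustration, here is a TypeScript sketch; TypeScript's number is also an IEEE 754 double, just like the number most JSON parsers hand back:

        // The largest integer a double can represent exactly is 2**53 - 1.
        const maxSafe = Number.MAX_SAFE_INTEGER;      // 9007199254740991

        console.log(maxSafe + 1);                     // 9007199254740992
        console.log(maxSafe + 2);                     // 9007199254740992  <- same value, precision lost

        // Round-tripping a larger integer through JSON silently corrupts it:
        console.log(JSON.parse("9007199254740993"));  // 9007199254740992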











  • That way we’ll just find that maintainers have gone nearly extinct over time, just like COBOL developers, who are as rare as they are expensive.

    Care to take a shot at figuring out why COBOL is still used today?

    I mean, feel free to waste your time arguing for rewrites in your flavor of the month. That’s how many failed projects start, too, so you can have your shot at proving them wrong.

    But in the meantime you can try to think about the problem, because “rewrite it in Rust” is only reasonable for the types who are completely oblivious to the realities of professional software development.



  • Yeah, because the new tools are never actually better, right?

    Well, yes. How many fads have come and gone? How many next best things have already died off? How many times have we seen the next best thing replaced by yet another next best thing?

    And yet, most of the world still runs on the same five languages: C, Java, C++, C#, JavaScript.

    How do you explain that, with so many new tools being so much better than everything that came before?

    Might it be because fanboys tend to inflate their own definition of “actually better”, while turning a blind eye to all the tradeoffs they need to pretend aren’t there?






  • The whole idea of checking the donations came from stumbling upon this post, which discussed costs per user.

    Things should be put into perspective. The cost per user is actually the fixed monthly cost of operating an instance divided by the average number of active users.

    In the discussion you linked to, there’s a post on how Lemmy.ml costs $80/month + domain name to serve ~2.4k users. If we went by the opex-per-user metric, needlessly expensive setups with low participation would be a justification to ask for more donations (a quick back-of-the-envelope calculation is sketched below).

    Regardless, this is a good reminder that anyone can self-host their own Lemmy instance. Some self-hosting posts go as far as claiming that a Lemmy instance can run on a $5/month virtual private server from the likes of Scaleway.
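
    For perspective, the back-of-the-envelope calculation mentioned above, using the figures from that post (a TypeScript sketch; treating the ~2.4k figure as monthly active users is an assumption):

        const monthlyOpexUsd = 80;    // fixed monthly cost quoted for Lemmy.ml, excluding the domain
        const activeUsers = 2400;     // ~2.4k users from the same post

        const costPerUser = monthlyOpexUsd / activeUsers;
        console.log(costPerUser.toFixed(3));   // "0.033" -> roughly 3 cents per user per month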





  • lysdexic@programming.dev (OP) to Programming@programming.dev · Code Smells Catalog

    With what in mind? Evading NULL?

    Depends on your perspective. It’s convenient to lean on type checking to avoid a whole class of bugs. You can see this either as avoiding NULL or as using your type system to flag misuses.

    Languages that use references rather than pointers don’t have this dualism. C# has nullable references and nullability analysis, and null as a keyword.

    C#'s null keyword matches the monadic approach I mentioned earlier. Nullable types work like a Maybe monad. It’s the same concept, shoehorned in differently due to the different paths these languages took.
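
    Not C#, but the same idea can be sketched in TypeScript with strictNullChecks turned on; the User type and findUser function are just illustrative names:

        interface User {
          name: string;
        }

        // The return type admits "no result" explicitly, much like a Maybe/Option.
        function findUser(id: number): User | null {
          return id === 1 ? { name: "alice" } : null;
        }

        const user = findUser(42);

        // console.log(user.name);   // compile error: 'user' is possibly 'null'

        if (user !== null) {
          console.log(user.name);    // fine: the check narrows User | null to User
        }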