Why restrict to 54-bit signed integers?
Because `number` is a double, and IEEE 754 specifies the mantissa of a double-precision number as 53 bits plus a sign bit. Meaning, it’s the highest integer precision that a double-precision value can express.
I suppose that makes sense for maximum compatibility, but feels gross if we’re already identifying value types.
It’s not about compatibility. It’s because JSON only has a `number` type, which covers both floating point and integers, and `number` is implemented as a double-precision value. If you have to express integers with a double-precision type, once you go beyond 53 bits you start to lose precision, which goes completely against the notion of an integer.
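A quick way to see this in practice, sketched in TypeScript (plain JavaScript behaves identically, since both sit on IEEE 754 doubles):

```typescript
// The largest integer a double can represent exactly is 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

// One step past 2^53, adjacent integers collapse into the same double:
console.log(2 ** 53 === 2 ** 53 + 1); // true: the precision is already gone

// BigInt exists precisely because integers beyond 53 bits need exact arithmetic:
console.log(2n ** 53n === 2n ** 53n + 1n); // false: BigInts stay exact
```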
Ok.
I think I could have stated my opinion better. I think LLMs’ total value remains to be seen. They allow totally incompetent developers to occasionally pass as below-average developers.
This is a baseless assertion from your end, and a purely personal one.
My anecdotal evidence is that the best software engineers I know use these tools extensively to get rid of churn and drudge work, and they apply it anywhere and everywhere they can.
They existed before LLMs were spitting out code like they do today, and this will undoubtedly lower the bar for bad developers to enter the field.
If LLMs allow bad programmers to deliver work with good enough quality to pass themselves off as good programmers, this means LLMs are fantastic value for money.
Also worth noting: programmers do learn by analysing the output of LLMs, just as the programmers of old learned by reading someone else’s code.
Claude is laughably hypersensitive and self-censoring around certain words independently of context (…)
That’s not really a problem, and certainly not Claude’s main problem.
Claude’s main problem is that it is frequently down, unreliable, and extremely buggy. Overall I think it might be better than ChatGPT and Copilot, but it’s simply so unstable it becomes unusable.
Have you ever worked at a large old corporation?
I’m not sure you understand that it’s way more than “large old corporations” that use it. Everyone uses it, from large multinationals to small one-taxi shops, and even guys like you and me in personal projects. This has been going on for years. I really don’t know what led you to talk about large old corporations, seriously.
That’s only true in crappy languages that have no concept of async workflows, monads, effect systems, etc.
You don’t even need to sit on your ass and wait for these data types to be added to standard libraries. There are countless libraries that support those, and even if that is somehow not an option it’s trivial to roll your own.
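To put “trivial” in perspective, a bare-bones Maybe fits in a dozen lines of TypeScript. A minimal sketch (every name in it, `Maybe`, `some`, `none`, `parsePort`, is mine and purely illustrative):

```typescript
// A minimal Maybe: a value is either present ("some") or explicitly absent ("none").
type Maybe<T> = { kind: "some"; value: T } | { kind: "none" };

const some = <T>(value: T): Maybe<T> => ({ kind: "some", value });
const none: Maybe<never> = { kind: "none" };

// map applies a function only when a value is present, propagating absence otherwise.
const map = <T, U>(m: Maybe<T>, f: (t: T) => U): Maybe<U> =>
  m.kind === "some" ? some(f(m.value)) : none;

// Example: parsing that can fail, with no null in sight.
const parsePort = (s: string): Maybe<number> => {
  const n = Number(s);
  return Number.isInteger(n) && n > 0 && n < 65536 ? some(n) : none;
};

console.log(map(parsePort("8080"), (p) => p + 1)); // { kind: "some", value: 8081 }
console.log(map(parsePort("oops"), (p) => p + 1)); // { kind: "none" }
```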
It’s used because the ones who use it have enough money to pay for any problems that may arise from its use, (…)
That’s laughable. Literally the whole world uses it. Are you telling me that everyone in the world just loves to waste money? Unbelievable.
That way we’ll just find that maintainers went nearly extinct over time, just like COBOL developers, who are as rare as they are expensive.
Care to take a shot at figuring out why COBOL is still used today?
I mean, feel free to waste your time arguing for rewrites in your flavor of the month. That’s how many failed projects start, too, so you can have your shot at proving them wrong.
But in the meantime you can try to think about the problem, because “rewrite it in Rust” is only reasonable for the types who are completely oblivious to the realities of professional software development.
You’d have had me ignore them all and keep using C for everything.
Please tell me which language other than C is widely adopted to develop firmware.
You’re talking about so many up-and-comers across all these decades. Name one language other than C that ever came close to becoming a standard in firmware and embedded development.
Right.
Yeah, because the new tools are never actually better, right?
Well, yes. How many fads have come and gone? How many next best things have already died off? How many times have we seen the next best thing being replaced by the next best thing?
And yet, most of the world still runs on the same five languages: C, Java, C++, C#, JavaScript.
How do you explain that, with so many new tools being so much better than everything?
Might it be because fanboys tend to inflate their own definition of “actually better”, while turning a blind eye to all the tradeoffs they need to pretend aren’t there?
If you had a grasp on the subject you’d understand that it takes more than mindlessly chanting “tools” to actually get tangible improvements, and even in that scenario they often come with critical tradeoffs.
It takes more than peer pressure to make a case for a tool.
That seems like a poor attitude imo.
Why do you believe that forcing something onto everyone around you is justifiable? I mean, if what you’re pushing is half as good as what you’re claiming it to be, wouldn’t you be seeing people lining up to jump on the bandwagon?
It’s strange how people push tools not based on technical merits and technological traits, but on fads and peer pressure.
Clearly Rust is a conspiracy.
Anyone in software development who was not born yesterday is already well aware of the whole FOMO cycle.
They’re a member because they find Rust useful. This is just them saying, once again, that they find Rust useful.
Fans of a programming language stating they like the programming language is hardly thought-provoking stuff. There are also apps written in Brainfuck, and that means nothing either.
The whole idea of checking the donations came from stumbling upon this post, which discussed costs per user.
Things should be put into perspective. The cost per user is actually the fixed monthly cost of operating an instance divided by the average number of active users.
In the discussion you linked to, there’s a post on how Lemmy.ml costs $80/month plus a domain name to serve ~2.4k users. If we went by the opex/users metric, needlessly expensive setups with low participation would become a justification to ask for more donations.
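Running the numbers on that example: $80/month divided by ~2,400 users works out to roughly $0.03 per user per month, or about 40 cents per user per year.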
Regardless, this is a good reminder that anyone can self-host their own Lemmy instance. Some Lemmy self-hosting posts go as far as to claim a Lemmy instance can run on a $5/month virtual private server from the likes of Scaleway.
Is there something else I’m not seeing?
Possibly payment processing fees. Some banks/payment institutions charge you a fee per payment.
I think you’re trying too hard to confuse yourself.
With what in mind? Evading NULL?
Depends on your perspective. It’s convenient to lean on type checking to avoid a whole class of bugs. You can see this either as avoiding NULL or as using your type system to flag misuses.
Languages that make use of references rather than pointers don’t have this dualism. C# has nullable references and nullability analysis, and `null` as a keyword.
C#’s `null` keyword matches the monadic approach I mentioned earlier. Nullable types work as a Maybe monad. It’s the same concept shoehorned in differently due to the different paths taken by these languages.
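For what it’s worth, TypeScript under strictNullChecks shows the same concept in yet another coat of paint; a minimal sketch, where `findUser` is a hypothetical stand-in for a real lookup:

```typescript
// With strictNullChecks, `string` and `string | null` are distinct types, so the
// compiler forces the "no value" case to be handled before use, which is the same
// guarantee a Maybe/Option type gives by construction.
function findUser(id: number): string | null {
  return id === 1 ? "alice" : null; // hypothetical lookup, for illustration only
}

const name = findUser(2);
// name.toUpperCase();              // compile error: 'name' is possibly 'null'
if (name !== null) {
  console.log(name.toUpperCase()); // fine: the type system narrowed null away
}
```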
OP is right. For web development with JavaScript frameworks (React, Angular, etc.) with Node and even TypeScript, you either use VS Code or you haven’t discovered VS Code yet.