• 0 Posts
  • 16 Comments
Joined 2 years ago
Cake day: July 6th, 2023



  • Listed salaries are almost always what the employee is paid, not what it costs the company. In the US, the company’s cost also includes payroll taxes and the cost of “benefits” like healthcare and unemployment insurance, and the total is referred to as the burdened rate. This is separate from the income tax the employee has to pay to the government, mind you.

    The burdened rate for most employees at the companies I’ve worked for in the US is something like 20-50% higher than the salary paid. I’m not sure exactly how it works in France, but I do know there’s a pretty complex payroll tax companies have to pay; I think it’s something like 40% on the salary you quoted.
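
    Rough math with a made-up burden rate (the 30% below is purely illustrative, not a figure for any particular employer or country):

    ```python
    # Hypothetical burdened-cost calculation: base salary plus employer-side
    # payroll taxes and benefits. The 30% burden rate is an illustrative assumption.
    salary = 70_000
    burden_rate = 0.30  # employer payroll taxes, healthcare, unemployment insurance, etc.

    cost_to_company = salary * (1 + burden_rate)
    print(f"Listed salary:   ${salary:>10,.0f}")
    print(f"Cost to company: ${cost_to_company:>10,.0f}")  # $91,000 under these assumptions
    ```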


  • I’ll stop here because your position is incredibly privileged and you refuse to see that. The minimum wage is too low, but that’s not the point. 70k a year is absolutely a comfortable wage for a single person to live on in almost every place in the US, except the biggest of the major cities.

    You may not get everything you want, but you should be able to cover everything you need, including an emergency fund, and still have enough to put aside 5-10% for savings most years on 70k. If you really don’t believe that, you live in a bubble.


  • You’re not going to get any argument from me that shit is fucked. Everyone should have guaranteed access to housing, food, and healthcare, and we don’t. A lot of kids were set up for failure by their parents insisting they take out college loans. But your standard for a minimum cost of living is basically the minimum to live like a boomer in the 70s.

    The average white male boomer in the US lived like a king compared to everyone else around them, even at the time. The descendants of those people tend to think that the fact that their parents or grandparents had this means they should too. In reality, those boomers were incredibly lucky to be born into a privileged class during an economic golden age.

    We don’t get that, we get the world they fucked up. Rich dickheads hogging all the wealth and stealing wages is nothing new, it’s been the standard for all of human history. What is new is that you can see clearly how well the privileged live compared to you. Maybe that will cause things to change, idk.

    In the meantime, we need to make do. An emergency fund is intended to be used for emergencies, which are things that threaten your ability to acquire basic needs (food, housing, health). You keep it funded at 6 months of expenses (i.e., the minimum you need to meet your financial obligations plus food and rent). When it’s full, you don’t keep adding to it. When you use money from the fund, you replenish it as quickly as you can. Everyone should have one.
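
    As a quick sketch with made-up numbers (the expense figures are hypothetical, not a recommendation):

    ```python
    # Size an emergency fund at 6 months of essential expenses, per the rule above.
    # All amounts are hypothetical placeholders.
    monthly_essentials = {
        "rent": 1_400,
        "food": 450,
        "utilities_insurance": 300,
        "debt_minimums_transport": 350,
    }

    monthly_total = sum(monthly_essentials.values())
    target = 6 * monthly_total

    print(f"Essential monthly spend: ${monthly_total:,}")  # $2,500
    print(f"Emergency fund target:   ${target:,}")         # $15,000; stop adding once you hit it
    ```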

    You shouldn’t be having an emergency every single year though. If you are, it’s not an emergency, it’s an extra expense you need to plan for. If you are spending double-digit percentages of your income on debt (car loans, credit cards, etc.), you need to stop spending money on anything but basic needs until you pay it off. Or start a revolution, but we’re arguing on the Internet so I don’t think the odds of that happening are high.

    The world sucks. It’s not fair. You can still live a good life in it though, even if it’s not as good as it used to be.


  • Saving 20% of your income is way beyond emergency funds and what is needed for retirement. Typical guidelines for emergency funds are to set aside at least 6 months’ worth of living expenses; you don’t need to save 20% forever. If you saved even 10-15% of your income for retirement your entire life, you’d have a very comfortable retirement (assuming the world doesn’t burn down before then).
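
    To put rough numbers on that (all of these are assumptions for illustration: a $70k income, a 12% savings rate, a 5% real return, 40 working years):

    ```python
    # Toy retirement projection: save a fixed share of income each year and
    # compound it at an assumed real (inflation-adjusted) return.
    # Every number here is a hypothetical assumption, not advice.
    income = 70_000
    savings_rate = 0.12   # within the 10-15% range mentioned above
    real_return = 0.05    # assumed average annual return after inflation
    years = 40

    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + real_return) + income * savings_rate

    print(f"Projected nest egg: ${balance:,.0f}")  # roughly $1M in today's dollars
    ```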

    SmartAsset is a financial advisor service, and these numbers seem to be guidelines for middle-class earners. That’s pretty far beyond a minimum cost of living, so I’d say this title is misleading at best.


  • I don’t think everyone is entitled to wealth accumulation. Housing/health/food security, absolutely, but being able to build wealth by making enough to save 20% of your earnings is beyond a basic entitlement. I doubt most people would agree with you on that.

    You could more accurately title this as “Minimum wage needed to live like an average boomer in 1975”. Still fucked up, and not misleading.

    Edit: you added a bunch to your reply. I think the framing of this is just wrong, frankly. Wages absolutely have not kept up with the cost of living, but you’re going beyond that and saying everyone is entitled to the financial security of a middle-class earner. It’s a good goal, but not an entitlement, and no reasonable person would frame that as a minimum cost of living.



  • I’m not sure if you know this, but…that doesn’t fix most of the security issues in the linked list. All the reverse proxy does is route requests by hostname and terminate TLS (if you are using TLS). If the application being proxied still has an unauthenticated API, anyone can access it. If there’s an RCE vulnerability in any of those apps, you might get hacked.

    I run Jellyfin publicly, but I do it behind a separate, locked-down reverse proxy (e.g., it explicitly hangs up on any request with a Host header other than Jellyfin’s) in a Kubernetes cluster, and I keep its pod isolated in its own namespace with restricted access to everything local except my library, which it mounts via read-only NFS volumes hosted on a separate TrueNAS box. If there is any hack, all they get access to is a container that can read my media files. Even that kind of bothers me, honestly.
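
    My actual setup does this in the reverse proxy’s own config, but here’s a toy sketch of the Host-header check in Python, just to show the idea (the hostname is a made-up placeholder):

    ```python
    # Toy illustration of "hang up on anything that isn't asking for the expected host".
    # A real deployment would do this in the reverse proxy config, not in Python.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    ALLOWED_HOST = "jellyfin.example.com"  # hypothetical hostname

    class HostFilter(BaseHTTPRequestHandler):
        def do_GET(self):
            host = (self.headers.get("Host") or "").split(":")[0]
            if host != ALLOWED_HOST:
                # Drop the connection without answering scanners probing by IP
                # or guessing other hostnames.
                self.close_connection = True
                return
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"request would be forwarded to Jellyfin here\n")

    HTTPServer(("0.0.0.0", 8080), HostFilter).serve_forever()
    ```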

    The overwhelming majority of Jellyfin users do not take precautions like this and are likely pretty vulnerable. Plex has a security team to address vulnerabilities when they happen, so those users would likely be a lot safer. I appreciate the love for FOSS on Lemmy, but it is scary how little most folks here acknowledge the tradeoffs they are making.



  • Maybe this comment will age poorly, but I think AGI is a long way off. LLMs are a dead-end, IMO. They are easy to improve with the tech we have today and they can be very useful, so there’s a ton of hype around them. They’re also easy to build tools around, so everyone in tech is trying to get their piece of AI now.

    However, LLMs are chat interfaces to searching a large dataset, and that’s about it. Even the image generators are doing this, the dataset just happens to be visual. All of the results you get from a prompt are just queries into that data, even when you get a result that makes it seem intelligent. The model is finding a best-fit response based on billions of parameters, like a hyperdimensional regression analysis. In other words, it’s pattern-matching.

    A lot of people will say that’s intelligence, but it’s different; the LLM isn’t capable of understanding anything new, it can only generate a response from something in its training set. More parameters, better training, and larger context windows just refine the search results; they don’t make the LLM smarter.
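
    To be clear about the analogy I’m drawing (and it is only an analogy; real models interpolate over learned parameters rather than doing a literal lookup), the shape of it is something like this toy “chatbot”, where every answer is just the best match against what’s already stored:

    ```python
    # Toy illustration of the pattern-matching analogy above -- NOT how an LLM
    # actually works internally. The "training set" here is a made-up lookup table.
    from difflib import SequenceMatcher

    training_set = {
        "how do i reverse a list in python": "Use reversed(my_list) or my_list[::-1].",
        "what is the capital of france": "The capital of France is Paris.",
        "explain tcp vs udp": "TCP is connection-oriented and reliable; UDP is connectionless.",
    }

    def respond(prompt: str) -> str:
        # Return whichever stored answer best fits the prompt -- pure best-match retrieval.
        best = max(training_set, key=lambda p: SequenceMatcher(None, p, prompt.lower()).ratio())
        return training_set[best]

    print(respond("How do I reverse a list in Python?"))
    ```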

    AGI needs something new, we aren’t going to get there with any of the approaches used today. RemindMe! 5 years to see if this aged like wine or milk.


  • Hyperfixating on producing performant code by using Rust (which only pays off when you code in a very particular way) makes applications worse. Good API and system design are a lot easier when you aren’t constantly having to think about memory allocations and reference counting. Rust puts that dead-center of the developer experience with pointers/ownership/Arcs/Mutexes/etc., and for most webapps it just doesn’t matter how memory is allocated. It’s cognitive load for no reason.

    The actual code running for the majority of webapps (including Lemmy) is not that complicated; you’re just applying some business logic and doing CRUD operations against datastores. It’s a lot more important to consider how your app interacts with its dependencies than how to get your business logic to be hyper-efficient. Your code is going to be waiting on network I/O and DB operations most of the time anyway.
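
    A back-of-the-envelope request budget makes the point (the latencies here are assumptions for illustration, not measurements from Lemmy):

    ```python
    # Where the time goes in a typical CRUD request handler.
    # All latencies are hypothetical, order-of-magnitude assumptions.
    db_query_ms = 5.0         # assumed round trip to the database
    external_io_ms = 20.0     # assumed call to another service / object storage
    business_logic_ms = 0.2   # validating input, mapping structs, serializing JSON

    total_ms = db_query_ms + external_io_ms + business_logic_ms
    io_share = (db_query_ms + external_io_ms) / total_ms
    print(f"Share of the request spent waiting on I/O: {io_share:.1%}")
    # ~99%. Making the 0.2 ms faster doesn't move the needle, whatever the language.
    ```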

    Hindsight is 20/20 and I’m not faulting anyone for not thinking through a personal project, but I don’t think Rust did Lemmy any favors. At the end of the day, it doesn’t matter how performant your code is if you make bad design and dependency choices. Rust makes it harder to see these bad choices because you have to spend so much time in the weeds.

    To be clear, I’m not shitting on Rust. I’ve used it for a few projects and it’s great for apps where processing performance is important. It’s just not a good choice for most webapps; you’d be far better off in a higher-level language.


  • I wouldn’t shortchange how much lowering the barrier to entry can help. You have to fight Rust a lot to build anything complex, and that can have a chilling effect on contributions. This is not a dig at Rust; it forces you to build things in a particular way because it has to guarantee memory safety at compile time. That isn’t to say Rust’s approach is the only way to be sure your code is safe, mind you, just that its insistence on memory safety at compile time is constraining.

    To be frank, this isn’t necessary most of the time, and Rust will force you to spend ages worrying about problems that may not apply to your project. Java gets a bad rap but it’s second only to Python in ease-of-use. When you’re working on an API-driven webapp, you really don’t need Rust’s efficiency as much as you need a well-defined architecture that people can easily contribute to.

    I doubt it’ll magically fix everything on its own, but a combo of good contribution policies and a more approachable codebase might.



  • In reading this thread, I get the sense that some people don’t (or can’t) separate gameplay and story. To me, saying “this is a great game” has nothing to do with the story; the way a game plays can exist entirely outside a story. The two can work together well and create a fantastic experience, but “game” seems like it ought to refer to the thing you do since, you know, you’re playing it.

    My personal favorite example of this is Outer Wilds. The thing you played was a platformer puzzle game and it was executed very well. The story drove the gameplay perfectly and was a fantastic mystery you solved as you played. As an experience, it was about perfect to me; the gameplay was fun and the story made everything you did meaningful.

    I loved the story of TLoU and was thrilled when HBO adapted it. Honestly, it’s hard to imagine anyone enjoying the thing TLoU had you do separately from the story it was telling. It was basically “walk here, press X” most of the time with some brief interludes of clunky shooting and quicktime events.

    I get the gameplay making the story more immersive, but there’s no reason the gameplay shouldn’t be judged on its own merit separately from the story.