• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 15th, 2023


  • I don’t disagree with there being tradeoffs in terms of speed, like function calls vs. network requests. But eventually your whole monolith gets so fuckin damn big that everything else slows down.

    The whole stack sits in a huge expensive VM, attached to maybe 3 or 4 large database instances, and dev changes take forever to merge in or back out.

    Every time a dev wants to locally test their build, they type a command and wait 15-30 minutes. Then troubleshoot any conflicts. Then run over 1000 unit tests. Then check that they didn’t break coverage requirements. Then make a PR, which triggers the whole damn process all over again, except now it has to redownload the docker images, reinstall dependencies, rerun 1000+ unit tests, run 1000+ integration tests, and rebuild the frontend before it can even run the end-to-end UI tests. Pray nothing breaks, merge to main, do it ALL OVER AGAIN FOR THE STAGING ENVIRONMENT, then QA has to plan for and execute hundreds of manual tests, and we’re not even at prod yet. All the while you’re begging for approvals from whoever gets impacted, whether the change is one line or thousands.
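    For the flavor of it, the gauntlet above could be sketched as a CI pipeline config. This is purely illustrative: the stage names and timings are assumptions pulled from the description, not from any real repo.

    ```yaml
    # Hypothetical monolith pipeline -- stage names and comments are illustrative only
    stages:
      - build               # 15-30 minutes, every time
      - unit_tests          # 1000+ unit tests
      - coverage_check      # fail the build if coverage drops
      - integration_tests   # 1000+ integration tests
      - frontend_build      # must finish before UI tests can start
      - e2e_ui_tests
      - deploy_staging      # then repeat most of the above against staging
      - manual_qa           # hundreds of manual test cases
      - deploy_prod         # gated on approvals at every step
    ```

    Every stage in a chain like this is serial, so the minimum wall-clock time for even a one-line change is the sum of all of them.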

    When this process gets so large that any change takes hours to days, no matter how small the change is, then you’re fucked. Because unfucking this once it gets too big becomes such a monstrous effort that it’s equivalent to rebuilding the whole thing from scratch.

    I’ve done this song and dance so many times. If you want your shit to be speedy on request, great, just expect literally everything else to drag down. When companies were still releasing software like once a quarter this made sense. It doesn’t anymore.



  • Guys… This is not a complicated discussion. I’m a trans woman. I’ve been the man. And now I’ve been the woman. I’m telling you without question I’m picking the fucking bear. Men are scary motherfuckers. A sizeable number of you are cruel, calculating, and downright uncaring. If you’re debating women about why they’d pick a potentially dangerous animal to be alone with in the woods instead of you, you have entirely missed the point.

    Go talk to every woman you know in your social circles and in your family, and ask them if they have been assaulted or sexually assaulted by men. The number of them that say yes to that question is going to be depressing. Some of them might even confide in you that they’ve been raped. My own sister didn’t tell me until I asked her why she was so upset with my brother one time. She had recently been raped by a boyfriend, and when men got angry around her she’d flip out. Those acts, when inflicted on you, poison your default view of your fellow man. If you can’t imagine a man being more dangerous than a bear, then you’ve never had to.

    A bear can’t break my trust. A bear can’t gaslight me into thinking all the shitty things he does are because he loves me. And if I told someone I got attacked by a bear, at least they’d believe me. They wouldn’t need to bring out a bear assault kit to prove it. The bear is predictable. Men are not.



  • I approve of this expanded answer. I may have been too ELI5 in my post.

    If the OP has read this far, I’m not telling you to use Docker, but you could consider it if you want to store all of your services and their configurations in a backup somewhere on your network. That way, if you have to set up a new Raspberry Pi for any reason, it’s a simple sequence of docker commands (or one docker-compose command) to get back up and running. You won’t need to remember how to reinstall all of the dependencies.
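    As a sketch of what that backup could look like: a single compose file holding the whole service set, with config mounted from the backup directory. The service names, images, and paths here are examples, not a recommendation for any particular setup.

    ```yaml
    # Hypothetical docker-compose.yml -- services, images, and paths are examples only
    services:
      pihole:
        image: pihole/pihole:latest
        volumes:
          - ./pihole/etc:/etc/pihole      # config lives in the backup, not on the Pi
        restart: unless-stopped
      homeassistant:
        image: ghcr.io/home-assistant/home-assistant:stable
        volumes:
          - ./homeassistant/config:/config
        restart: unless-stopped
    ```

    Restoring onto a fresh Pi is then roughly: copy the backup directory over and run `docker compose up -d` from inside it. The images and dependencies get pulled automatically.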


  • BellyPurpledGerbil@sh.itjust.works to Selfhosted@lemmy.world: What’s the deal with Docker?
    8 months ago

    It’s virtual machines but faster, more configurable, with a considerably larger set of automation, and it consumes fewer system resources than a traditional VM. Additionally, in software development it helps solve a problem summarized as “works on my machine.” A lot of traditional server creation and management relied on systems being set up perfectly identically on every deployment, to prevent dumb defects caused by whose machine the code was written on. With Docker, it’s stupid easy to copy the automated configuration from “my machine” to “your machine.” Now everyone, including the production systems, is running from “my machine.” That’s kind of a big deal, even if it could be done in other ways natively on Linux operating systems; those just don’t have the same ease of use or shareability.
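    The “my machine to your machine” part boils down to a Dockerfile: one file that pins the OS, dependencies, and startup command so every machine builds the same thing. This is a minimal hypothetical sketch for a small Python service; the base image and filenames are assumptions for illustration.

    ```dockerfile
    # Hypothetical Dockerfile -- base image and filenames are examples only
    FROM python:3.12-slim
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt   # identical deps on every machine
    COPY . .
    CMD ["python", "app.py"]
    ```

    Anyone with Docker installed can then run something like `docker build -t myapp . && docker run myapp` and get the same environment the author had, regardless of what’s installed on the host.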

    What you’re doing is perfectly expected. That’s a great way of getting around using Docker. You aren’t forced into using it. It’s just easier for most people.