• Thorry84@feddit.nl · 125 points · 9 months ago

    It’s really simple, it’s a container containing a virtual OS, which runs a browser and a webserver to run the app. The app connects to several external API services to do its thing.

    It’s like, really simple!

      • MotoAsh@lemmy.world · 5 points · 9 months ago

        If they’re running a virtual OS in a container, they’re doing it very wrong. Containers and VMs are quite different, even on a Windows host.

        • haui@lemmy.giftedmc.com · 3 points · 9 months ago

          I’m not sure I understand. At least Docker containers have their own OS, mostly Alpine Linux. Dunno if that applies to other applications.

          • MotoAsh@lemmy.world · 13 points · 9 months ago (edited)

            Nah, a container isn’t running nearly as much as an entire OS. Not by a long shot. The kernel isn’t there at all and the entire device stack is gone. Most don’t even have an init system like systemd running. They’re closer to a chroot in a single terminal than to running an entire OS.

            The OS flavor in a container is mostly about which flavor of supporting tools is available inside the container. Almost everything else is a thin wrapper making calls into your host OS or container services.
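
            To see it concretely (rough sketch, assuming Docker is installed and can pull the stock alpine image):

                # the "Alpine" container reports the host's kernel, because it has none of its own
                docker run --rm alpine uname -r

                # and there's no init system or device stack inside, just the process you asked for
                docker run --rm alpine ps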

    • PatMustard@feddit.uk · 10 points · 9 months ago

      It probably was very simple for the kid who wrote it, just import everything and write a couple of lines to use all this stuff that already exists!

    • Dasnap@lemmy.world · +8 / −1 · 9 months ago

      Gotta love using a base container image that’s massive overkill for what you’re trying to run.

      • MotoAsh@lemmy.world · +1 / −2 · 9 months ago

        I mean, isn’t the entire point of a container largely non-functional compared to good deploy/install scripts? Both are perfectly capable of guaranteeing a predictable functional environment for the app. The container is just easier to use, harder to accidentally render insecure, and easier to clean up.

        All of their benefits are NOT for the app itself.

        • jj4211@lemmy.world · +2 / −1 · 9 months ago

          Harder to accidentally render insecure? My experience is the opposite: Docker-style containers frequently fail to update vulnerable dependencies.

          Also, depending on context, the container is often harder to use. Snap is probably the easiest of the solutions, Flatpak makes CLI invocation a pain, and the Docker style sucks entirely for interaction, but it’s fine if your primary interaction is via a web service once you set it up (but oh boy, adding a web UI package means you get to mess with nginx or Apache ProxyPass by hand, and each app may require subtly different proxy parameters).
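
          That update problem is structural (rough sketch; “myapp” is a made-up image name): the dependencies baked into an image only change when someone rebuilds and republishes it.

              # the only way base-image security fixes get in is a rebuild against a fresh base
              docker build --pull --no-cache -t myapp:latest .

              # restarting or re-running the old image changes nothing
              docker run -d myapp:latest

          If the image maintainer never does that rebuild, you keep running the vulnerable versions.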

          • MotoAsh@lemmy.world · 1 point · 9 months ago

            Docker is not a competitor to Snap and Flatpak. They are tackling very different kinds of installations.

            • jj4211@lemmy.world · 1 point · 9 months ago

              The person said “containers” so I was responding to both.

              However, Docker containers could stand to learn a thing or two from how Flatpak and Snap compose a runtime. Applications can say “allow x, y, and z dependency layers to update independently of the application container”, versus the Docker style, where the app developer must own maintenance of the entire image.

              There may be reasonable differences with respect to how much of a user’s “real” files and environment is presented to the container in those scenarios, and functional differences like GUI and networking may suggest different defaults, but image composition doesn’t need to be differentiated between their use cases.
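
              Roughly the difference, as a sketch (assuming a typical Flatpak setup; “myapp” is a made-up image name):

                  # Flatpak runtimes are separate objects that update on their own schedule
                  flatpak list --runtime
                  flatpak update     # pulls runtime fixes even if the app never ships a new version

                  # a Docker image's layers are frozen until the image author rebuilds
                  docker history myapp:latest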

    • jj4211@lemmy.world · 3 points · 9 months ago (edited)

      I get to witness the enterprise-services flavor of that, where the company pays software architects who aren’t actually coding and coders who aren’t allowed to make architectural decisions.

      You have software that takes HTTP? You need to rewrite it so that you only speak RabbitMQ, and use it for every HTTP request or WebSocket message. Don’t worry, we have a team that specializes in translating HTTP to RabbitMQ, so you only have to rewrite the server code; another team will handle the HTTP listener that translates for you.

      What’s that, you have a non-HTTP protocol? Well, the other team isn’t scoped to handle that, so you’ll need to convert your listener to RabbitMQ and create a whole separate container just to receive the UDP packets and translate them to RabbitMQ. No “processing” software is allowed to speak anything but RabbitMQ, and network-listener containers are only allowed to dumbly receive and forward.

    • lurch (he/him)@sh.itjust.works · 3 points · 9 months ago (edited)

      It also sends your IP and location to Firebase, because the response comes as a push.

      Every third time it outputs an ad text and plays an audio ad in the background.

    • oce 🐆@jlai.lu · 9 points · 9 months ago

      Some of those can be good if you want a single command to install on any OS.

        • jj4211@lemmy.world · 1 point · 9 months ago (edited)

          You’re thinking of distribution-packaged RPMs/debs, but the software developer can self-publish, and you’ll see plenty of self-published packages in PPAs, COPR, Flathub, and even just loose websites, because it’s not rocket science to make an apt or yum repository. The distribution versions may take a little more time, but they’re more likely to work together as a cooperative whole. Flathub has a decent shot by allowing concurrent versions of dependencies to be installed, while preserving the concept of updating dependencies independently of the package maintainers.
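
          For the curious, a bare-bones sketch of “not rocket science” (assuming dpkg-dev or createrepo_c is installed; serve the directory over HTTP and point a sources.list / .repo entry at it):

              # apt: drop your .deb files in a directory and generate the index
              dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

              # yum/dnf: drop your .rpm files in a directory and generate the metadata
              createrepo_c .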

          However, as you go down his chart, it’s less likely that you’ll reasonably update after install. You may get the latest at the second you install, but six months later you’ll likely be stale. You may neglect to update npm dependencies in each and every project, or they may be automatically locked to old versions (because the self-published nature means developers have to vet dependency updates themselves, and devs are lazy about that).
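
          Concretely, the per-project tedium looks something like this (standard npm commands, repeated inside every single project directory):

              npm outdated    # list dependencies with newer releases
              npm update      # only bumps within the semver ranges declared in package.json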

        • jj4211@lemmy.world · 3 points · 9 months ago (edited)

          I assume he means npm/pip/cargo as the multiple-OS option, not that the last one is obviously better for multiple OSes. It pretty much has to be, because that’s the only option that is OS-independent.

          Of course it sucks, because the essentially uncurated dependency trees result in either instability on updates or missed updates. And the OS’s native updater won’t help you out with pip/cargo/npm, but it will help with apt, yum, snap, and flatpak.
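
          In practice that difference is one command versus a per-project scavenger hunt (rough sketch):

              # the distro/flatpak/snap side: one sweep covers everything they manage
              sudo apt upgrade && flatpak update && sudo snap refresh

              # the language-package-manager side: checked per environment / per project
              pip list --outdated
              cargo update    # run inside each project directory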

        • oce 🐆@jlai.lu · 1 point · 9 months ago (edited)

          I was talking about the other ones, but since you mention it, yeah, many people use Bash on Windows via Git Bash, which is part of Git for Windows, which pretty much any developer forced to use Windows will install in order to use Git.
          Developers often prefer to have fewer interfaces to maintain when possible.

      • smileyhead@discuss.tchncs.de · 1 point · 9 months ago

        Gets the job done, but shouldn’t be and isn’t intended for non-programmer end users.
        I’m not mad at small programs or developers without much time to set up a distribution pipeline; they should be praised for their work on the program itself. But different OSes have different places to unpack a program, and that enables simple updates; we should respect that for consistency on the user’s end. Except on Windows, which is an unspecified mess anyway, so let’s just unpack everything raw into C:\ or the user directory.

  • Heavybell@lemmy.world · 48 points · 9 months ago

    How much you wanna bet the “dev” doesn’t realise Chromium is a dependency in this scenario?

  • whaleross@lemmy.world · 32 points · 9 months ago

    What do you mean you don’t have to restart your terminal software every afternoon when the four windows consume six gigabytes of RAM?

    • MotoAsh@lemmy.world · +34 / −1 · 9 months ago

      Genuinely, fuzzy search and autocomplete are a great application of “AI” (machine learning algorithms).

      They just need to stop branding it as AI and selling everything they feed the models…

      • thegreekgeek@midwest.social · 4 points · 9 months ago

        Hopefully that day is soon what with those 1-bit models I’ve been hearing about. I’d be all for that, but I’ll be damned if I’ll be putting an OpenAI key into my terminal lol.

      • oce 🐆@jlai.lu · +3 / −1 · 9 months ago

        Do those really require ML? For an e-commerce site with millions of entries, maybe, but for a CLI I don’t see it.

        • MotoAsh@lemmy.world · +6 / −1 · 9 months ago

          “Need”? Of course not. Though I do see it being capable of much more sophisticated autocomplete, like a tab-complete that is aware of what you’ve already typed in the command and gives you only the compatible remaining flags, or that can tab-complete information available in the environment, like recognizing it’s running in Kubernetes and letting you tab through running hosts or commands that’d make sense from ‘here’, etc, etc.

          Sure, those are all things a very nice and complicated algorithm could do, but… that’s all “AI” is thus far. There have been zero actual artificial intelligences created.
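
          (The Kubernetes bit is roughly what plain programmable completion already does. Rough sketch; “kpods” is a made-up wrapper command, and it assumes kubectl is configured:)

              # complete the first argument of `kpods` with pod names from the current namespace
              _kpods_complete() {
                  local pods
                  pods=$(kubectl get pods -o name 2>/dev/null | sed 's|^pod/||')
                  COMPREPLY=($(compgen -W "$pods" -- "${COMP_WORDS[COMP_CWORD]}"))
              }
              complete -F _kpods_complete kpods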

        • jj4211@lemmy.world · 3 points · 9 months ago

          So I didn’t do it directly in the CLI, but I had a whole lot of files in a collection that obviously had duplicates.

          So I first used fdupes, which got a lot of them, but plenty of duplicates remained. I noted a bunch were identifiable by having identical file sizes but slightly different metadata, so I made quick work of presenting only files with identical sizes and went about reviewing and deleting.

          Then I still saw a lot of duplicates, because the metadata might be slightly different. Sizes were close, but non-dupes were also close. I might have proceeded to write a little something to strip the metadata and normalize them, but decided to feed the list to an LLM and ask it to identify likely duplicates. It failed to find them all, and erroneously declared some duplicates, but it did make the work go faster. Of course, in this scenario a missed duplicate isn’t a huge deal, so I had to double-check its results and might have missed some things, but it was good enough for the effort.
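
          (Roughly the commands behind those first two passes, for anyone curious; run from the collection’s directory:)

              # pass 1: byte-identical duplicates
              fdupes -r .

              # pass 2: sort by size so identical sizes land next to each other for manual review
              find . -type f -printf '%s\t%p\n' | sort -n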

          Sometimes my recall isn’t quite good enough for ctrl-r, but maybe an LLM could do better. Of course, a better “search engine” could also do well. A mind-numbingly obvious snippet could also be generated without the tedium. Again, you have to be careful to review, because LLMs are useful but unreliable.
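
          (For the ctrl-r case, the non-LLM “better search engine” already exists, for what it’s worth: fzf’s fuzzy history search. Sketch; the exact path varies by distro:)

              # in ~/.bashrc: make ctrl-r an interactive fuzzy history search
              source /usr/share/doc/fzf/examples/key-bindings.bash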

        • MotoAsh@lemmy.world · +2 / −1 · 9 months ago (edited)

          Not sure what you mean. It’s already a completely superfluous, additional feature. It “should” execute completely separately from everything, regardless of what integrations it has.

          Though if it doesn’t yet exist as a separate thing to hook into (and it doesn’t), it’s got to execute somewhere. Makes sense it’d show up as a canned extension or addition to something before it’d show up as a perfectly, logically integrated tool.

          • smileyhead@discuss.tchncs.de · 2 points · 9 months ago

            The terminal emulator is the window, the tabs, the integration with your desktop, etc.
            The shell is more complicated, but the TL;DR is that it’s everything showing in your terminal window by default, the base program you use that runs other programs: the prompt showing the current user, saving history, coloring the input, the basic editing keyboard shortcuts, etc.

            By putting these AI integrations in a terminal emulator we are very much limiting ourselves. It might look fancier in popup windows, but it won’t work over remote connections and won’t be as portable.
            Usually when we do smart functions like autocomplete, fuzzy search, or integrations like that, we do them as a shell (fish, bash, zsh) extension; then they work in any emulator and even without a GUI.
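
            To make that concrete: a shell-level hook is just a few lines of bashrc, so it travels with you over SSH. Sketch; “suggest-command” is a stand-in for whatever completion/AI helper you’d call:

                # ctrl-g hands the half-typed command to the helper and puts the result back on the line
                my_suggest() {
                    READLINE_LINE=$(suggest-command "$READLINE_LINE")   # "suggest-command" is hypothetical
                    READLINE_POINT=${#READLINE_LINE}
                }
                bind -x '"\C-g": my_suggest'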

            • MotoAsh@lemmy.world · 2 points · 9 months ago (edited)

              Yea, I agree it ‘should’ be integrated in a more general way. Though my point is from the dev’s perspective: why go through the extra effort to do it ‘properly’ if it’s an unproven tool many people don’t want?

              Not saying it should stay there, just saying it makes sense that it showed up somewhere less sensible than the ideal implementation.

    • Z4rK@lemmy.world · 5 points · 9 months ago

      Warp.dev! It’s the best terminal I’ve used so far, and the best use of AI as well! It’s extremely useful to have some AI help for the thousands of small commands you know exist but rarely use. And it’s very well implemented.

      • TopRamenBinLaden@sh.itjust.works · 8 points · 9 months ago (edited)

        I don’t understand what the benefit is here over a terminal with good non-LLM-based autocomplete. I understand that, theoretically, LLMs can produce better autocomplete, but idk if it really makes that big of a difference with terminal commands. I guess it’s a small shortcut to have the AI there to ask questions, too. It’s good to hear it’s well implemented, though.

        • Z4rK@lemmy.world · 4 points · 9 months ago (edited)

          There are two modes of AI integration. The first is a standard LLM in a side panel. It’s search and learning directly in the terminal, with the commands I need directly available to run where I need them. What you get is the same as if you used ChatGPT to answer your questions, then copied the part of the answer you needed into your terminal and ran it.

          There is also AI Command Suggestion, where you start typing a command or search prefixed by # and get commands directly back to run. It’s quite different from auto-complete (there is very good auto-complete and command suggestion as well; I’m just talking about the AI-specific features here).

          https://www.warp.dev/warp-ai

          It’s just a convenient placement of AI at your fingertips when working in the terminal.

        • Z4rK@lemmy.world · 2 points · 9 months ago

          Alas. They have said they plan to open some of the source and potentially all of it, but there’s been little progress.

          They recently ported to Linux, which I think will get them much more negative feedback here, so hopefully with more pressure they’ll find the right copyleft license and open up their source to build trust.