Debian has way less overhead out of the box, so in theory it should save a company a decent amount of money. I’m trying to calculate actual numbers and I’m curious if any of you have done a similar calculation.
Define what you mean by “overhead”
Mostly RAM usage
OK, and compared to what? “Less” is a comparison, but you didn’t specify what you’re comparing Debian to.
Out-of-the-box RAM usage is a pretty specious metric, because you’re not installing Debian (or any other OS) just to have it sit there in its out-of-the-box condition. Do you think a Debian server running Apache with 1000 vhosts will use less RAM than a RHEL server running nginx with 10 vhosts?
Debian uses something like 200 MB of RAM for a basic fresh install. That’s negligible.
Unless you’re deploying 500 virtual machines on a single server that each run a single simple task, the base RAM usage of the OS shouldn’t even be a factor.
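If you want an actual number instead of guessing, the kernel already does the accounting for you in /proc/meminfo. Here’s a minimal sketch (Linux-only; the path and the helper names are just my own choices):

```python
# Measure how much RAM is actually in use on a host.
# "Used" here is MemTotal - MemAvailable, which is the kernel's own
# estimate of committed memory. This is more honest than
# MemTotal - MemFree, which would count reclaimable page cache as used.

def parse_meminfo(text):
    """Return a dict of field name -> value in kB from /proc/meminfo text."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, rest = line.partition(":")
            fields[key.strip()] = int(rest.split()[0])  # values are in kB
    return fields

def used_ram_mb(fields):
    """Base memory in use, in MB."""
    return (fields["MemTotal"] - fields["MemAvailable"]) // 1024

def read_used_ram_mb(path="/proc/meminfo"):
    with open(path) as f:
        return used_ram_mb(parse_meminfo(f.read()))
```

Run `print(read_used_ram_mb())` on a fresh install before and after putting your actual workload on it, and you’ll see quickly whether the base OS footprint matters at all.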
Computing resource usage of your OS should be indistinguishable from $0 almost everywhere.
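You can sanity-check that with napkin math: multiply the base footprint by what you pay for RAM and by fleet size. A toy sketch; the 200 MB figure is the fresh-install number mentioned above, and the $/GB-month price and VM count are made-up assumptions you’d swap for your provider’s real numbers:

```python
def os_overhead_cost(base_ram_mb, ram_price_per_gb_month, vm_count):
    """Monthly cost of the OS's idle RAM footprint across a fleet of VMs."""
    return (base_ram_mb / 1024) * ram_price_per_gb_month * vm_count

# Assumed numbers: 200 MB base install, $3/GB-month RAM (hypothetical), 100 VMs.
monthly = os_overhead_cost(200, 3.0, 100)
print(f"${monthly:.2f}/month across the fleet")
```

Even at 100 VMs that lands well under $100/month, so the delta between two Linux distros’ base installs is rounding error next to what the workloads themselves cost.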
from which OS? Ubuntu? Rocky/RHEL? Windows Server?
Mostly Ubuntu. It comes with a ton of extras installed, which add storage and RAM usage along with additional complexity.
Compared to Arch Linux then yeah you’ll save a ton of money almost guaranteed. But something like Windows? Good luck trying to calculate that.
I wouldn’t even deploy Arch in production, as it’s not designed to be stable.
I mean you’d have to be pretty insane to use Arch on an actual server.
That or a masochist.
I don’t really subscribe to Arch or Debian being better or worse than each other; I encounter issues just as frequently on both. Maybe it’s a little harder to do things on Debian because the repositories don’t update as often, but a lot of the important stuff lives in the AUR, and that’s a pain to deal with too.
Either way it’s better than using Windows.