8GB RAM on M3 MacBook Pro ‘Analogous to 16GB’ on PCs, Claims Apple::Following the unveiling of new MacBook Pro models last week, Apple surprised some with the introduction of a base 14-inch MacBook Pro with M3 chip,…
Don’t all systems compress RAM these days? It’s an obvious win.
If you compress the RAM, you need to push data through the CPU, which would tank performance. The whole idea of RAM is to have a bit of fast space to store stuff, so it can be used for things that need to run fast and also need a lot of data. To get performance as good as it is, the CPU is often bypassed entirely to prevent bottlenecks. If you don’t need it fast, just park it on the hard drive, there is plenty of space there. The memory manager takes care of that for you, for example with a technique called swapping.
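To put a rough number on that CPU cost, here’s a toy Python benchmark (purely illustrative, nothing like what an OS memory manager actually does): even at zlib’s fastest setting, compressing a buffer takes far longer than simply copying it.

```python
import time
import zlib

# ~40 MiB of highly repetitive data standing in for memory pages.
# Real RAM contents compress much worse than this, so treat the ratio
# below as a best case; the point is the relative timing.
data = b"some fairly repetitive application data " * (1024 * 1024)

start = time.perf_counter()
copied = bytes(data)                       # plain memory copy
copy_time = time.perf_counter() - start

start = time.perf_counter()
compressed = zlib.compress(data, level=1)  # fastest zlib level
compress_time = time.perf_counter() - start

print(f"copy:     {copy_time * 1000:7.1f} ms")
print(f"compress: {compress_time * 1000:7.1f} ms "
      f"(output is {len(compressed) / len(data):.1%} of original)")
```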
These days there are lots of complicated technologies that achieve this goal, but it all started with DMA, or Direct Memory Access, back in the day.
Fun fact: in the ’80s and ’90s, memory was so expensive, and modern techniques like swapping were so much in their infancy, that it could sometimes be useful to compress data in memory. I used QEMM back in the day, which had that feature, even though I never used that specific feature myself.
In principle someone could make a memory chip capable of on-the-fly compression. But that would cause wildly inconsistent memory timings, which isn’t ideal, and I don’t think the cost would work out versus simply getting more memory.
ZRAM/ZSWAP is a thing on Linux at least. But that doesn’t compress actively used RAM. That’s just compressed swap space in memory.
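If it helps to picture it, here’s a toy model of that idea in Python (hypothetical names, not the real zram/zswap mechanics): pages that get “swapped out” are compressed and kept in RAM instead of being written to disk, and a “swap in” just decompresses them again.

```python
import zlib

PAGE_SIZE = 4096  # typical page size in bytes


class ToyCompressedSwap:
    """Toy stand-in for zram-style swap: evicted pages stay in RAM, compressed."""

    def __init__(self) -> None:
        self._store: dict[int, bytes] = {}  # page number -> compressed page

    def swap_out(self, page_no: int, page: bytes) -> None:
        assert len(page) == PAGE_SIZE
        # Costs CPU time, but the page now occupies far less RAM.
        self._store[page_no] = zlib.compress(page)

    def swap_in(self, page_no: int) -> bytes:
        # The "page fault" path: decompress instead of reading from disk.
        return zlib.decompress(self._store.pop(page_no))

    def stored_bytes(self) -> int:
        return sum(len(blob) for blob in self._store.values())


swap = ToyCompressedSwap()
swap.swap_out(7, bytes(PAGE_SIZE))  # an all-zero page compresses to almost nothing
print(f"{PAGE_SIZE} byte page held in {swap.stored_bytes()} bytes")
assert swap.swap_in(7) == bytes(PAGE_SIZE)
```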
Which is basically how compressed memory works anyway.
https://lemmy.world/comment/5163708