Hello. I've noticed that all my Linux guests seem to consume 3-4 times as much memory on my Mac as the guest itself reports. For example, if I run a vanilla Debian 11 XFCE guest for 15 minutes, htop inside it shows 500-600 MB of RAM in use, while the Parallels Resource Monitor shows 3 GB at the same time. If I then run a single-tab instance of Firefox in the guest, htop shows 750 MB of RAM used, but Parallels shows almost 4 GB used on my Mac. Why is there such a vast difference between the guest and the host in terms of RAM usage, compared to other virtualization software such as VMware or even VirtualBox running the same configuration?