• 15 GB Commit and 8.5 GB Working Set.
Rust blows up the Commit to 15 GB, and this gives me out-of-memory errors when, say, trying to open a web page. Same results on modded and unmodded servers.
There's a reason Rust's minimum and recommended system requirements are 8 GB and 16 GB of RAM respectively, especially on busier servers full of player-built structures. It's a sandbox game, and some things you can't just optimize away, like giant collections of objects taking up memory. Close any web browsers, because they'll compete with Rust for RAM. Also make sure your virtual memory settings allow the system room to expand (I recommend setting the maximum page file size to at least 1.5x your system memory). P.S. I am not a Rust dev or a mod.
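To make that 1.5x rule of thumb concrete, here's a trivial sketch. This is just the poster's suggestion expressed as arithmetic, not an official Rust or Windows requirement, and the function name is made up for illustration:

```python
# Hypothetical helper illustrating the "max page file = 1.5x RAM" rule of thumb.
def recommended_pagefile_max_gb(ram_gb: float, factor: float = 1.5) -> float:
    """Return a suggested maximum page file size in GB."""
    return ram_gb * factor

print(recommended_pagefile_max_gb(16))  # 24.0 for a 16 GB system
print(recommended_pagefile_max_gb(8))   # 12.0 for a minimum-spec system
```

This is where the 24 GB figure in the reply below comes from: 16 GB of RAM times 1.5.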
I have 16 GB of RAM. My virtual memory is at 7 GB; setting it to 24 GB seems a bit absurd, and in all my searching not one person has had to go to that extreme. The Working Set at ~8 GB is fine, so why is the Commit going so high? I'm not well versed in the workings of RAM and virtual memory, which is why I ask.
Because Rust is a memory-hungry program, especially on servers with lots of objects. It's not a slim, tightly-optimized arena shooter like Unreal Tournament '99; it's a sandbox game still in active development. It needs lots of memory, and you should close other memory-hogging programs like browsers if you're having memory problems. Most games don't include tens or hundreds of thousands of destructible, player-placed objects on any given server.
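As for the Commit vs Working Set gap asked about above: commit is address space the OS has promised to back with RAM or page file, while the working set is only the pages actually resident in physical RAM right now. The gap can be demonstrated in miniature with an anonymous memory mapping. A hedged sketch (the exact accounting differs between Windows and Linux, and the sizes here are arbitrary):

```python
import mmap

# Anonymous mapping of 256 MiB: this reserves address space (and, on
# Windows, raises the process's commit charge) without immediately
# consuming that much physical RAM.
SIZE = 256 * 1024 * 1024
buf = mmap.mmap(-1, SIZE)

# The working set only grows as pages are actually touched.
# Writing one 4 KiB page faults just that page into RAM:
buf[0:4096] = b"\x00" * 4096

print(len(buf))  # 268435456 bytes committed, far more than is resident
```

In the same way, a game can have 15 GB committed (everything it has asked the OS to promise) while only ~8 GB is actually resident as the working set, which is why the two counters disagree.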
Ah, the good ole days. I put several thousand hours into that game. Good times. At the title screen, Rust uses about 800 MB. Once I load into a server, what is using the 8 GB of RAM, and what is using the 15 GB commit? Just trying to wrap my head around this and learn.
The entire map and everything every player built on it. https://i.redd.it/6266ch9zcic21.jpg (Not my screenshot.)

These structures don't exist for free. Your game client has to load the whole map and everything on it. I don't know the intimate details of Rust, but it seems like this is the case.

Star Citizen had a comparable problem until Alpha 3.3.0 released several months ago. Prior to that patch, which completely rewrote how asset loading worked, the client loaded the entire star system all at once and had to keep it all in RAM: in excess of 70,000 entities, a lot of them with AI or other scripting. Everything had to exist locally on your computer, all of it. Because of that, SC would max out the RAM on a 16 GB system and force the system to use over 20 GB of virtual memory, and performance would also be garbage because your computer constantly had to update all those entities even if they were two million kilometers away. With 3.3.0, your client only loads the stuff close around it and streams in new stuff when it needs it, so RAM usage is far more reasonable, maxing out around 12 GB in the busiest, most object-filled place in the game.

It's very difficult to let players arbitrarily build and destroy as much as they want while also making things run hyperefficiently the way a pre-baked map can.
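The "only load what's nearby" approach described above can be sketched in a few lines. This is a toy illustration of distance-based entity streaming in general, not how either game actually implements it; the names and the 1 km radius are invented for the example:

```python
import math

STREAM_RADIUS = 1000.0  # metres; illustrative cutoff, not a real game value

def update_loaded(player_pos, entities, loaded):
    """Load entities within STREAM_RADIUS of the player, unload the rest.

    entities: dict of entity id -> (x, y) position
    loaded:   set of currently-loaded entity ids (mutated and returned)
    """
    for eid, pos in entities.items():
        if math.dist(player_pos, pos) <= STREAM_RADIUS:
            loaded.add(eid)       # stream in: nearby, must be in RAM
        else:
            loaded.discard(eid)   # stream out: its memory can be freed
    return loaded

entities = {"base": (100.0, 0.0), "far_outpost": (5000.0, 0.0)}
loaded = update_loaded((0.0, 0.0), entities, set())
print(loaded)  # {'base'}
```

With pre-baked maps the engine can decide all of this offline; with player-built worlds the set of entities changes constantly, which is part of why sandbox games are so much harder to keep memory-efficient.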