I was looking through the compile log of my map and noticed that the entdata memory is almost full.
I searched for some answers and didn't find exactly what I was looking for, but I did read that the limit can be exceeded. So I put tons of random prop_dynamics in my map and managed to reach about 1700 point entities and 500 brush entities.
The map compiled without any problems, with the exception of the entdata line being this:
The map ran fine in Garry's Mod, albeit a bit laggy (no wonder, 900 plants in one room). Breaking props that spit out gibs worked correctly, and there were no errors in the console.
So what's the deal here?
Does exceeding the limit (say, to 125%) have any impact on the map?
The 100% 'limit' is the point at which you go from the game being able to run on a potato to needing a toaster to run the map. The Source engine has many arbitrary limitations like this, there to uphold Valve's standard of games being able to run on a potato running GLaDOS. Pretty much the only problems you're going to have are FPS drops and possibly poor optimization, but you should be able to fix those with good mapping.
I could be wrong, but I think it's safe to say 'no'. Please don't stone me if I'm wrong.
It can go much higher than that; Source just allocates more memory as required, and the 100% mark is a suggested limit. I have a map that's closer to 300% and it works fine:
entdata [variable] 1166339/393216 (296.6%) VERY FULL!
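For context on where that percentage comes from: the entdata number is just the size of the BSP's entity lump (lump 0, the plain-text keyvalue block) measured against vbsp's 393216-byte budget shown in the log. Below is a minimal sketch of reading that size straight from a map file, assuming the standard Source BSP header layout (4-byte `VBSP` ident, int32 version, then 64 sixteen-byte lump directory entries); the `entdata_usage` helper name is mine, not an engine API.

```python
import struct

# Budget vbsp reports against in the compile log (384 KiB).
ENTDATA_BUDGET = 393216

def entdata_usage(bsp_bytes: bytes):
    """Return (entity lump size in bytes, percentage of the vbsp budget).

    Assumed Source BSP header layout: 4-byte 'VBSP' ident, int32 version,
    then 64 lump directory entries of 16 bytes each
    (int32 fileofs, int32 filelen, int32 version, 4-byte fourCC).
    Lump 0 is LUMP_ENTITIES, the plain-text keyvalue data.
    """
    if bsp_bytes[:4] != b'VBSP':
        raise ValueError('not a Source BSP file')
    # Lump directory starts at offset 8; entry 0 is the entity lump.
    _fileofs, filelen, _lumpver = struct.unpack_from('<iii', bsp_bytes, 8)
    return filelen, filelen / ENTDATA_BUDGET * 100.0

# Usage with a tiny synthetic header (a real .bsp parses the same way;
# the 1166339 figure is the one from the compile log quoted above):
header = b'VBSP' + struct.pack('<i', 20)
header += struct.pack('<iii4s', 1040, 1166339, 0, b'\x00' * 4)  # lump 0 entry
header += b'\x00' * (16 * 63)  # remaining 63 directory entries, zeroed
size, pct = entdata_usage(header)
print(f'entdata {size}/{ENTDATA_BUDGET} ({pct:.1f}%)')  # → entdata 1166339/393216 (296.6%)
```

Note the computed 296.6% matches the compile-log line above, which is just `filelen / 393216`.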
More entdata in a map means more memory used (albeit not a lot), but the main issue is networked entities.
The entdata limit applies to both networked and non-networked entities, and naturally, the more networked entities you have, the more data gets pushed down to clients in multiplayer. I believe this was the cause of the good old 'Reliable snapshot overflow' error, but citation needed on that.
Here are some useful wiki pages that I'd recommend reading on this matter, as it is a deep subject.
The articles say Source has a 2048 entity limit; in GMod this is raised to 8192 entities.