gmsv_fps - get real server FPS

A DLL (server module) that lets you get the server's FPS.

You might ask, “Why do you need this?” There is a reason: on a dedicated server the Think hook is called exactly at tickrate, the Tick hook is called at tickrate, and FrameTime returns the delay between ticks. But what about FPS? It drops when players spawn lagshit, and that is a good signal for scripts to do something about low FPS. It also doesn't map directly onto CPU usage: a server may run 300 FPS at 99% CPU or 10 FPS at 100%.

Usage is simple:


require"fps"
print(game.GetFPS()) --prints FPS based on real frame time
print(game.GetLastFPS()) --prints FPS in last second
print(game.GetFPSA()) --prints total number of frames since server start
print(game.FrameTimeReal()) --prints real frame time - not tick time
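
To see the difference between the engine's tick-locked FrameTime() and the module's real frame time, here is a minimal sketch (assuming the module is installed as above; the timer name is made up) that prints both once per second:

require"fps"

timer.Create("CompareFrameTimes",1,0,function()
	--on a dedicated server FrameTime() sits at the tick interval, FrameTimeReal() is the actual frame time
	MsgN("FrameTime(): ",FrameTime()," game.FrameTimeReal(): ",game.FrameTimeReal())
end)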


Example script that adjusts phys_timescale to keep FPS high:



require"fps" --loading module...

local pfps = 150
local tn = 1
hook.Add("Think","Unlag",function() 
	pfps = pfps + (math.Clamp(game.GetFPS()-pfps,-100,1)) --smooth fps making it more sensitive to drops
	if tn>=33 then --issue command each 33 ticks, otherwise possible jerky motion
		RunConsoleCommand("phys_timescale",tostring(math.min(pfps/150,1)))  --don't let it go over 1, start lowering timescale when FPS is lower than 150
		tn = 0 
	end 
	tn=tn+1 
	if pfps < 30 then 
		MsgN("Critically low FPS!!!") --debug spam that drops fps too, it is here only for demonstration
	end 
end)


That script kept my server FPS at maximum even when I spammed a lot of craneframes.

This won’t work on the client, as it uses server interfaces.

Download dll+source

How can the server have a framerate? It isn’t rendering anything.

Somehow it does. Ask Valve what the difference between tickrate and framerate is, but they are different. On a tick the engine simulates physics; on a frame it sends data to clients, I think. The fact is that low server FPS = visible lag on clients, while the tickrate somehow stays the same.


gLua->Push(gLua->GetGlobal("RunString"));
gLua->Push("local GameFPS = 0 local GamePrevFPS = 0 function game.GetLastFPS() return GameFPS end timer.Create(\"FrameRateUpdate\",1,0,function() GameFPS = game.GetFPSA()-GamePrevFPS GamePrevFPS = game.GetFPSA() end)"); //lol @ me
gLua->Call(1);
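
For readability, this is the same Lua that the string above runs, just laid out normally:

local GameFPS = 0
local GamePrevFPS = 0

function game.GetLastFPS()
	return GameFPS
end

timer.Create("FrameRateUpdate",1,0,function()
	GameFPS = game.GetFPSA()-GamePrevFPS --frames counted over the last second
	GamePrevFPS = game.GetFPSA()
end)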

You should look at ILuaInterface::RunString.

Linux version here
Thanks to Skondra for giving me hope; I don’t get why I didn’t get it working before.

I’ll stick to net_graph 4, thanks.

Also, in the 2009 engine update you can run at 1000 FPS, but it does jack shit.

FPS is now tied to the tickrate, and the max is 66.

Try it:

Keep the tickrate at 66, set fps_max 0, run the server at high priority as administrator with HPET on, and notice that you do get 1000 FPS. Then set fps_max 67 and watch how there is zero difference whatsoever in its effect on everything else.

FPS doesn’t mean much in Source anymore. Having more doesn’t mean you have more power available to run more physics or anything; it just means you’re wasting more CPU.

Umm, you know FPS isn’t tied to your CPU, it’s tied to your graphics card…

This meter is good to have in Lua though, since you can watch for degraded performance (FPS) and automate some recovery, as in the sketch below. Especially on Linux (weld lag) this comes in handy.
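
A rough sketch of what I mean (assuming the module's game.GetLastFPS(); the timer name and the 25 FPS threshold are made up), which freezes every physics object when the FPS stays low:

require"fps"

timer.Create("FreezeOnLowFPS",1,0,function()
	if game.GetLastFPS() < 25 then --threshold is arbitrary
		for _,ent in pairs(ents.GetAll()) do
			local phys = ent:GetPhysicsObject()
			if phys and phys:IsValid() then phys:EnableMotion(false) end --freeze it like the physgun does
		end
		MsgN("Low FPS: froze all physics objects")
	end
end)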

Oh and why does the server “have” FPS :3

Zing’d.

FPS isn’t tied to the graphics card.

In other news, I’ve just added the linbin to my server. Made it print a warning if the server goes below 20 FPS, roughly like the sketch below; I wish it could tell me what is actually making the FPS freak out :expressionless:
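
Something along these lines, in case anyone wants the same (the timer name is made up, the rest is the module's API from the OP; the entity count is just a crude hint at what might be causing it):

require"fps"

timer.Create("LowFPSWarning",1,0,function()
	if game.GetLastFPS() < 20 then
		MsgN("Server FPS dropped to ",game.GetLastFPS()," with ",#ents.GetAll()," entities")
	end
end)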