Frametime
There are broadly three measurements of time used in the Source Engine: real time (Plat_FloatTime and co.), ticks (usually 60 per second), and frametime (the time taken per frame). Frametime is measured as a floating-point number of seconds (e.g. 60fps corresponds to a frametime of 0.016666), and ticks are calculated from frametime using a 'remainder' bucket (see _Host_RunFrame).
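To illustrate the idea, here is a minimal sketch of how a 'remainder' bucket turns variable frametimes into a whole number of ticks. The names and structure are illustrative assumptions, not the engine's actual code (the real logic lives in _Host_RunFrame):

```cpp
#include <cmath>

// Hypothetical illustration of tick accumulation; variable names are
// not the engine's own.
const float TICK_INTERVAL = 1.0f / 60.0f; // 60 ticks per second

float g_Remainder = 0.0f; // leftover time that didn't fill a whole tick

int TicksToRun(float frametime) {
    g_Remainder += frametime;
    int ticks = static_cast<int>(g_Remainder / TICK_INTERVAL);
    g_Remainder -= ticks * TICK_INTERVAL; // keep the fraction for the next frame
    return ticks;
}
```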
For some reason, the frametime used in the engine is clamped between 0.001 (1000fps) and 0.1 (10fps). This means that if the real time between two frames is 0.5 seconds (2fps), only 0.1 seconds of 'time' is simulated in the engine, creating a host_timescale-like effect below 10fps that slows the game down. The opposite is true above 1000fps: more time is simulated than actually passed, speeding the game up (and arguably making it harder to play).
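As a rough sketch of the behaviour described above (again, an assumption for illustration, not the engine's actual code), the clamp and its timescale-like effect look something like this:

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical illustration of the frametime clamp described above.
float ClampFrametime(float realFrametime) {
    // The engine refuses to simulate less than 0.001s (1000fps)
    // or more than 0.1s (10fps) per frame.
    return std::clamp(realFrametime, 0.001f, 0.1f);
}

int main() {
    float real = 0.5f;                      // 2fps: half a second between frames
    float simulated = ClampFrametime(real); // only 0.1s of game time is simulated
    // Effective timescale: 0.1 / 0.5 = 0.2, i.e. the game runs at 20% speed
    std::printf("effective timescale: %.2f\n", simulated / real);
}
```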
Using SAR, we can remove the bounds on frametime to get rid of the timescale effect, hopefully allowing people with weaker computers to speedrun the game without accidentally cheating, and people with stronger computers to push more frames, reducing input latency. However, removing the bounds has a number of side-effects, because the game takes the limits as a given. As such, the functionality has been put behind an experimental cvar, sar_frametime_uncap, until the side-effects can be more thoroughly addressed. Hopefully at some point we can do away with the 30-999 FPS rule!