TECH: High FPS, Slow Movement?
This is the first of a series of tech blogs / articles I’m going to write covering all aspects of the technical side of game development. They’ll range from short, code-level problem/solution articles on a specific issue, to broader discussions of a technical subject at a much higher level. As this is the first one, we’ll keep it nice and simple.
Here’s a problem I came across recently in XNA / Monogame: Changing from a fixed timestep to variable timestep slows sprite delta movement significantly.
This issue actually caught my eye while I was browsing other gamedev issues, as it’s one of those problems that 1) seems completely illogical and 2) gets asked regularly but never answered.
Until fairly recently, I’ll be honest, I hadn’t really cared enough to investigate because it wasn’t something that had affected me directly. However, due to early morning coding while not entirely awake, I was recently caught out by it too and decided to investigate.
I had started a new prototype one morning and, after an hour or so of getting sprites moving around and suchlike, I realised that I was still running on the default fixed timestep, so I changed it over to a variable timestep. Then logic apparently broke down and the world fell apart beneath my caffeine-starved brain.
What was happening was that my sprites’ movement, all calculated using delta time and so supposedly independent of frame rate, was slowing down. Not just by a small amount either, but by a really, really noticeable amount which basically ruined the game. This made no sense. I could have understood the movement speed increasing with the FPS, so that the more frames ran per second the faster the sprite moved; that would have meant delta time wasn’t being used in the calculations for some reason. But slowing down? Odd. Really odd.
The solution is embarrassingly simple. It’s one of those fixes that makes you kick yourself, and then hate yourself with a passion for the rest of the morning.
Consider the following basic code from an update loop:
float _delta = (float)gameTime.ElapsedGameTime.Milliseconds * 0.001f;
_Player.Position.X += BASE_MOVE_SPEED * _delta;
It’s calculating the time delta since the last update loop using the number of elapsed milliseconds, then using that to calculate the new X position of the player. Looks fine, right? Now consider the following, almost identical, piece of code:
float _delta = (float)gameTime.ElapsedGameTime.TotalMilliseconds * 0.001f;
_Player.Position.X += BASE_MOVE_SPEED * _delta;
Can you spot the difference?
ElapsedGameTime.Milliseconds in the first example,
ElapsedGameTime.TotalMilliseconds in the second. Moving to TotalMilliseconds solves the issue. It’s that simple.
Ok, but why?
ElapsedGameTime is a TimeSpan struct. TimeSpan.Milliseconds returns the millisecond component of the timespan, that is, the part that can’t be represented as whole seconds. So, for a TimeSpan of 1.5 seconds, TimeSpan.Milliseconds returns 500. TimeSpan.TotalMilliseconds, on the other hand, returns the total number of milliseconds in the entire TimeSpan. For a TimeSpan of 1.5 seconds, TimeSpan.TotalMilliseconds returns 1500.
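If you want to see the distinction outside of .NET, Python’s timedelta can mirror it (a rough analogue for illustration; the variable names here are mine, not part of TimeSpan):

```python
from datetime import timedelta

span = timedelta(seconds=1.5)

# Analogue of TimeSpan.Milliseconds: only the component below a whole second
component_ms = span.microseconds // 1000      # 500

# Analogue of TimeSpan.TotalMilliseconds: the whole span, in milliseconds
total_ms = span.total_seconds() * 1000        # 1500.0
```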
As long as your frame time doesn’t exceed a second per frame (let’s face it, if it does you may as well scrap the project right now), that should be ok, right? Wrong, and here’s where the problem lies: TimeSpan.Milliseconds returns an int, and therefore is only capable of representing whole milliseconds. TimeSpan.TotalMilliseconds returns a double, and as such is capable of representing fractions of milliseconds.
This means that, as you get nearer and nearer to 1000 FPS and a frame time of 1 millisecond, your delta becomes more and more inaccurate: truncating to whole milliseconds throws away up to a full millisecond per frame, which is negligible at 100 ms per frame but enormous at 2 ms per frame. Get rid of the truncation error by changing to TotalMilliseconds and everything behaves as it should.
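The arithmetic behind the slowdown can be worked through in a few lines (a sketch in Python, assuming a steady frame rate; the figures are illustrative):

```python
def truncation_error(frame_ms):
    """Fraction of movement lost when the delta is built from whole
    milliseconds (as with TimeSpan.Milliseconds, an int) rather than
    the fractional value (as with TimeSpan.TotalMilliseconds)."""
    truncated_delta = int(frame_ms) * 0.001   # whole milliseconds only
    true_delta = frame_ms * 0.001             # fractional milliseconds
    return 1.0 - truncated_delta / true_delta

frame_ms_60 = 1000.0 / 60.0     # ~16.667 ms per frame at 60 FPS
frame_ms_144 = 1000.0 / 144.0   # ~6.944 ms per frame at 144 FPS

print(truncation_error(frame_ms_60))    # 0.04   -> ~4% slower at 60 FPS
print(truncation_error(frame_ms_144))   # ~0.136 -> ~13.6% slower at 144 FPS
```

In other words, the faster the game runs, the larger the fraction of each frame’s movement that gets silently discarded, which is exactly why the sprites slowed down rather than sped up.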