
The second part, "Game Speed dependent on Variable FPS": I'm confused about the parameter of the function "update_game()". From the code, I think the length of "curr_frame_tick - prev_frame_tick" would increase at each timestep because of the function "display_game()".



Every loop, prev_frame_tick is assigned the old value of curr_frame_tick before curr_frame_tick is updated, so their difference is the time in ticks the last loop took to execute. That difference is used to scale things like physics.

As an example, if we want to move something on screen at a constant rate in pixels per second, the distance to move per update (i.e. per frame) at 60 FPS is half of what it is at 30 FPS. The result is that the position at any point in time is the same regardless of how quickly the computer ran the simulation.
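
A minimal self-contained sketch of that idea, assuming C++ with std::chrono as a stand-in for the article's tick counter (the speed, frame limit, and variable names here are just made up for illustration):

    #include <chrono>
    #include <cstdio>

    int main() {
        using clock = std::chrono::steady_clock;

        double x = 0.0;              // position in pixels
        const double speed = 100.0;  // pixels per second (arbitrary)

        auto prev_frame_tick = clock::now();

        for (int frame = 0; frame < 10; ++frame) {  // stand-in for game_is_running
            auto curr_frame_tick = clock::now();
            double delta_seconds =
                std::chrono::duration<double>(curr_frame_tick - prev_frame_tick).count();
            prev_frame_tick = curr_frame_tick;      // old curr becomes new prev

            // update_game(delta): scale movement by how long the last loop took,
            // so x advances at the same real-time rate at 30 FPS or 60 FPS.
            x += speed * delta_seconds;

            // display_game() would render here; it only reads state and
            // doesn't touch the tick variables.
            std::printf("frame %d: delta=%.4fs x=%.2f\n", frame, delta_seconds, x);
        }
    }

Whether a frame takes 16 ms or 33 ms, x ends up in the same place for the same amount of real time, which is the whole point of passing the delta into update_game().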


display_game() doesn't change any of these values.

(curr_frame_tick - prev_frame_tick) is the delta time of each loop. So this value is the time it took to do the previous loop.

It doesn't increase each timestep because prev_frame_tick is recalculated each step (there's a short sketch reproducing these numbers after the steps below).

Step1 (initial step):

- prev_frame_tick = 0

- curr_frame_tick = 0

- delta = 0

Step2:

- prev_frame_tick = 0

- curr_frame_tick = 200

- delta = 200

Step3:

- prev_frame_tick = 200

- curr_frame_tick = 356

- delta = 156

...
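
A tiny sketch that reproduces those numbers, assuming C++ and feeding in the example tick values 0, 200, 356 in place of a real tick counter:

    #include <cstdio>

    int main() {
        int ticks[] = {0, 200, 356};  // fake tick-counter readings, in ms
        int prev_frame_tick = ticks[0];

        for (int step = 0; step < 3; ++step) {
            int curr_frame_tick = ticks[step];
            int delta = curr_frame_tick - prev_frame_tick;  // time of the last loop
            std::printf("Step%d: prev=%d curr=%d delta=%d\n",
                        step + 1, prev_frame_tick, curr_frame_tick, delta);
            prev_frame_tick = curr_frame_tick;  // recalculated every step
        }
    }

It prints delta = 0, 200, 156: the delta only ever covers the most recent loop, it never accumulates.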


Fair enough. I think I didn't understand the difference between game time and real time before. I thought the running time of "display_game()" was the game time (curr_frame_tick - prev_frame_tick). Thanks a lot for your detailed explanation.



