The second part, "Game Speed dependent on Variable FPS": I'm confused about the parameter of the function "update_game()". From the code, I think the time span "curr_frame_tick - prev_frame_tick" would keep increasing at each timestep because of the function "display_game()".
Every loop, prev_frame_tick is assigned the old value of curr_frame_tick before curr_frame_tick is updated. That means their difference is the time in ticks it took the last loop iteration to execute. That difference is used to scale things like physics.
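To make that concrete, here is a minimal sketch of that loop. get_ticks() is a hypothetical stand-in for whatever millisecond counter the platform provides (e.g. SDL_GetTicks() or GetTickCount()); update_game() and display_game() are the functions from the article:

```c
unsigned long prev_frame_tick;
unsigned long curr_frame_tick = get_ticks(); /* hypothetical ms counter */

while (game_is_running) {
    prev_frame_tick = curr_frame_tick;  /* remember when the last frame started */
    curr_frame_tick = get_ticks();      /* read the clock again */
    /* the difference is how long the previous iteration
       (update + display) actually took, in ticks */
    update_game(curr_frame_tick - prev_frame_tick);
    display_game();
}
```

So the difference doesn't grow over time; it just reflects how long each individual frame took, including the time spent in display_game().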
As an example, if we want to move something on screen at a constant rate in pixels per second, the number of pixels to move per update (i.e. per frame) at 60 FPS is half as many as when updating at 30 FPS. The result is that the position at any given time is the same regardless of how quickly the computer ran the simulation.
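For instance, here is a sketch of what update_game() might do with that delta, assuming the ticks are milliseconds; position_x and speed are illustrative variables, not from the article:

```c
float position_x = 0.0f;               /* illustrative game state */

void update_game(unsigned long delta_ticks) /* delta in milliseconds */
{
    const float speed = 120.0f;        /* desired rate: 120 pixels per second */
    /* scale the per-frame movement by the real time that elapsed */
    position_x += speed * (delta_ticks / 1000.0f);
}
```

At 60 FPS the delta is about 17 ms, so the object moves roughly 2 pixels per update; at 30 FPS it's about 33 ms and roughly 4 pixels, so it covers the same distance per second either way.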
Fair enough. I think I didn't understand the difference between game time and real time before. I thought the running time of "display_game()" was the game time (curr_frame_tick - prev_frame_tick). Thanks a lot for your detailed explanation.