32ms divided by 4 is 8ms (0.008s), which corresponds to 125Hz (the typical mouse sample rate). So my guess is that some form of mouse movement analysis (like mouse smoothing) operates on a rolling window of at least four samples, so that the result can be statistically meaningful.
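To make the arithmetic concrete, here is a minimal sketch of that hypothesis: a 4-sample rolling average over mouse deltas, which at 125Hz would need exactly 32ms of history. The window size and the averaging itself are assumptions, not anything OS X is documented to do.

```python
from collections import deque

SAMPLE_RATE_HZ = 125   # typical USB mouse report rate
WINDOW = 4             # 4 samples * 8 ms = 32 ms of history

def smooth(deltas, window=WINDOW):
    """Rolling mean of the last `window` (dx, dy) samples."""
    buf = deque(maxlen=window)
    out = []
    for dx, dy in deltas:
        buf.append((dx, dy))
        n = len(buf)
        out.append((sum(d[0] for d in buf) / n,
                    sum(d[1] for d in buf) / n))
    return out

# 1000 ms / 125 Hz = 8 ms per sample; 4 samples span 32 ms.
assert 1000 / SAMPLE_RATE_HZ * WINDOW == 32
```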
The reason might be that 32ms is about the right time to evaluate the intent of a typical human pointer movement, maybe even using some form of inverse kinematics, so that when you move your hand towards a small button, aiming is smoothly assisted and/or the click target is helpfully enlarged to improve Fitts's law compliance (like iOS keyboard keys dynamically changing size depending on the sequence of letters, and hence the word, you are typing).
Of course this bites hard outside that environment. It seems Mac OS X needs an API providing some form of raw access to pointer values, so that arbitrary software (notably games) can do its own analysis.
Anecdotally, my personal experience has been that it never bothered me, neither in games (FPS and RTS) nor in WIMP.
IMO a more likely answer: 32ms / 2 = 16ms, the typical frame time for a monitor running at 60 Hz. The post also noted that disabling Quartz Extreme (GPU-accelerated rendering) fixed the issue. I suppose the mouse cursor is drawn with GPU acceleration. To optimise rendering, the window system uses double buffering (a common technique); however, double buffering has the well-known side effect of introducing a one-frame delay. So it would normally take 16ms to see your mouse cursor update, but with double buffering it takes 32ms.
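A back-of-the-envelope check of that hypothesis (the 60 Hz refresh rate and the single extra buffered frame are the assumptions here):

```python
def cursor_latency_ms(refresh_hz=60, buffered_frames=2):
    """Frames in flight multiplied by the frame period."""
    return buffered_frames * 1000.0 / refresh_hz

# Two buffered frames at 60 Hz: ~33.3 ms, close to the observed 32 ms.
print(cursor_latency_ms())
```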
I think Windows uses a special graphics card feature (an "overlay" IIRC) to update the mouse cursor independently of the double buffered window system. I noticed moving windows became ever so slightly more laggy going from XP to Vista - probably because the window system became GPU accelerated.
When working on drivers for capacitive touchscreens (of which trackpads are a pretty close sibling), we put all kinds of IIR filtering in there to smooth the cursor, and usually a variance filter to remove noise. Since those filters (especially IIR) rely on data from previous scans, they inherently introduce a delay into the chain.
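As an illustration of why that delay is inherent, here is a minimal sketch of a single-pole IIR (exponential) filter of the kind a touch controller might apply; the alpha value is an assumption chosen for the example, not a real controller's tuning.

```python
def iir_filter(samples, alpha=0.3):
    """y[n] = alpha * x[n] + (1 - alpha) * y[n-1].

    Each output depends on previous outputs, so the response
    necessarily lags behind the input.
    """
    y = None
    out = []
    for x in samples:
        y = x if y is None else alpha * x + (1 - alpha) * y
        out.append(y)
    return out

# A step input takes several samples to converge -- that lag is the
# delay introduced into the chain.
step = [0.0] * 3 + [1.0] * 7
print(iir_filter(step))
```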
Usually this is all on the sensing side and not the host side. Perhaps there's some trackpad filtering that's being applied to the mouse when it didn't need to be?
Indeed this low-level filtering exists, but what I'm supposing happens at a higher level. Say you have a button in the top-left quadrant and your mouse near the bottom-right corner. Now you move the mouse to point at the button. Your initial move is quite broad, towards the button. From speed and direction, the OS could infer that you want to reach the button and ever so slightly steer your movement towards it. This would be re-evaluated at each 'step' of the move, giving a smooth, spline-like trajectory, and as you get closer to the button its virtual size would grow ever so slightly, so that if you rush the click and undershoot or overshoot it by a few pixels, the click still gets registered. Even when there is nothing of interest on screen, this system would still be collecting samples but doing a no-op, hence the delay persisting in a game situation.
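A toy sketch of the "virtual target growth" part of that hypothesis: the closer the cursor gets to a button, the larger its effective hit area. All names and numbers here are illustrative assumptions, not anything OS X is known to implement.

```python
import math

def effective_radius(cursor, target_center, base_radius,
                     max_bonus=4.0, reach=200.0):
    """Inflate the hit radius by up to `max_bonus` px as the cursor nears."""
    d = math.dist(cursor, target_center)
    bonus = max_bonus * max(0.0, 1.0 - d / reach)
    return base_radius + bonus

def click_hits(cursor, target_center, base_radius):
    """A click counts if the cursor is inside the inflated radius."""
    return math.dist(cursor, target_center) <= effective_radius(
        cursor, target_center, base_radius)

# Undershooting a 10 px button by 3 px still registers when close:
print(click_hits((0, 13), (0, 0), 10))
```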
Such a scenario would be impossible to implement so close to the hardware, as it would have no knowledge of the GUI. I haven't the foggiest idea whether OS X does anything even close to that, but hey, you never know. It could just as well be some overeager host-side smoothing to accommodate vintage or unknown mice.
I'm a little surprised that mouse state is something that a modern OS would be sampling (as opposed to, say, getting events 'pushed' by the device up through the driver). Or is this one of those "all event systems are really sampling once you look deep enough" things?
USB is a polled system. Doesn't mean you can't "push" data through it with some kind of supersampling, but you need to hit a rendezvous with the host controller.
32ms is FOREVER for game input. That's almost two whole frames of a 60 Hz game.
Mice don't send a continuous stream of information to the computer. Instead, they send samples at some rate. It's that rate that lloeki is referring to.
So the mouse itself is polling/sending at a particular rate? It sounded to me from the article (and the above commenter) like the host machine was doing the polling.
I was recently using an algorithm for determining the control points of a Bézier curve. A cubic Bézier needs 4 points in total to determine the curve exactly; maybe there is a similar curve that Apple is solving for?
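For reference, a cubic Bézier curve is fully determined by its 4 control points, which lines up with the 4-sample figure above. A minimal Bernstein-form evaluation (the point values below are made up for illustration):

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1] (Bernstein form)."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# The curve interpolates its endpoints exactly:
print(cubic_bezier((0, 0), (1, 2), (3, 2), (4, 0), 0.0))  # (0.0, 0.0)
print(cubic_bezier((0, 0), (1, 2), (3, 2), (4, 0), 1.0))  # (4.0, 0.0)
```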