I would really, really love to know the answer to this too.
It's hard to describe my surprise that my 5-year-old PC, with a low-power 1.6GHz CPU, can easily breach 100fps while playing JamLegend, and my 9-month-old, dual-core 2.66GHz MacBook Pro can barely squeak out 40fps.
Speaking of framerate in Flash, setting the framerate on a particular SWF does not actually make the SWF run at that framerate, even if the client machine can run at that framerate. Rather, the framerate seems to be more a measure of CPU resources dedicated to running the SWF.
For example, back to JamLegend: My MBP can run JamLegend at 40fps, but not if the framerate is set to 40fps. If it's set to 40fps, I'll get 20. If it's set to 60fps, I'll get 30. In order to get 40fps, the framerate has to be set to 100. Meanwhile, on my old XP machine, setting it to 40 gets 40, 60->60, and 100->100.
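If anyone wants to reproduce this on their own machine, here's a minimal sketch of the measurement, assuming ActionScript 3 and Flash Player 9+. The FpsProbe class name and the counting approach are just mine for illustration, not anything JamLegend actually does:

    // Hypothetical FPS probe: use as the document class of a test SWF.
    package {
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.utils.getTimer;

        public class FpsProbe extends Sprite {
            private var _frames:int = 0;
            private var _lastMark:int = getTimer();

            public function FpsProbe() {
                stage.frameRate = 40;  // the nominal framerate requested of the player
                addEventListener(Event.ENTER_FRAME, onFrame);
            }

            private function onFrame(e:Event):void {
                _frames++;
                var now:int = getTimer();  // milliseconds since the player started
                if (now - _lastMark >= 1000) {
                    // frames actually rendered in the last second vs. what was asked for
                    trace("nominal:", stage.frameRate, "actual:", _frames);
                    _frames = 0;
                    _lastMark = now;
                }
            }
        }
    }

Swapping the stage.frameRate value between 40, 60, and 100 and comparing the traced numbers on each machine should show the same gap I'm describing.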
So, very good question: why is Flash performance so different on OS X than on Windows?
True on the framerate -- it's an upper gate. Static video which overburdens a configuration can drop frames to keep up. But interactive presentations need to display each intended frame, and so will drop framerate instead. More here:
http://www.kaourantin.net/2006/05/frame-rates-in-flash-playe...
For the "why", that's Adobe Corp's tale to tell, but I haven't yet met a person inside Adobe who wouldn't be very, very happy if Mac/Win performance differences could be made to disappear.
That's a great link, I had not seen that article before. Though, in practice, the actual framerate seems to vary from the specified rate by much more than the -10 / +5 fps that he lists.
His final sentences might be relevant to this topic:
"On high CPU load we might actually cut [the max framerate] into half, e.g. 30 frames/sec. OS X already does this in certain conditions."
Wish he had expanded on that a little more. Any idea what conditions cause OS X to halve the framerate?