in which we used the iPhone JavaScript API to capture touch/drag events and sent them (via a laptop, and then a microcontroller) to a little toy car's transmitter. No SDK use or jailbreaking required at all. However, we had to rely on proprietary JavaScript events, and had to include a laptop on ad hoc Wi-Fi to relay the messages. Apple's restrictions certainly lead to major tradeoffs, but we were able to build something pretty neat and accessible regardless.
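For anyone curious, the capture side really is just a few lines of JavaScript. Here's a minimal sketch, not our exact code: it assumes the page is served from the same laptop that relays commands to the microcontroller, the "/drive" endpoint is made up, and the mapping from raw coordinates to throttle/steering is left out. The touchmove/touchend handlers are the proprietary events I mentioned.

    // Sketch only: the page is assumed to be served from the laptop, which
    // also runs the relay endpoint ("/drive" is a made-up path).
    function sendPosition(x, y) {
      var req = new XMLHttpRequest();
      req.open("GET", "/drive?x=" + x + "&y=" + y, true);
      req.send(null);
    }

    document.addEventListener("touchmove", function (e) {
      e.preventDefault();            // keep Mobile Safari from panning the page
      var touch = e.touches[0];      // first finger currently on the screen
      sendPosition(touch.pageX, touch.pageY);
    }, false);

    document.addEventListener("touchend", function (e) {
      sendPosition(0, 0);            // finger lifted: tell the car to stop
    }, false);

On the laptop side, a tiny HTTP server just parses the x/y query parameters and forwards them over serial to the microcontroller driving the toy car's transmitter.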
As a side note, Apple has locked down both the dock connector and the Bluetooth interface very tightly. Just buying the $99 dev license doesn't allow you to develop for either of those interfaces -- they have a separate "Made for iPod" program for that. The $99 isn't an insurmountable barrier for hobbyists who really want to tinker, but the "Made for iPod" certification probably is. (Some companies are using the audio input/output to hack around this restriction.)
Note that when I use the word "gesture," I am using it the way the HCI community does, to refer to strokes and combinations of strokes. iGesture is useful for implementing features like deleting items by "scrubbing" the screen, navigating by swiping, and so forth.
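To make the "strokes" idea concrete, here is roughly what the simplest case, swipe detection, looks like when done by hand. This is an illustrative sketch of the general technique, not iGesture's actual API; the 50-pixel threshold and the handleSwipe callback are made up.

    // Illustrative only -- not iGesture's API. Record where a stroke starts
    // and ends, then classify it as a swipe if it moved far enough.
    var startX, startY;
    var SWIPE_THRESHOLD = 50;  // minimum distance in pixels (arbitrary)

    document.addEventListener("touchstart", function (e) {
      startX = e.touches[0].pageX;
      startY = e.touches[0].pageY;
    }, false);

    document.addEventListener("touchend", function (e) {
      var dx = e.changedTouches[0].pageX - startX;
      var dy = e.changedTouches[0].pageY - startY;
      if (Math.abs(dx) < SWIPE_THRESHOLD && Math.abs(dy) < SWIPE_THRESHOLD) {
        return;  // too short to count as a swipe
      }
      if (Math.abs(dx) > Math.abs(dy)) {
        handleSwipe(dx > 0 ? "right" : "left");
      } else {
        handleSwipe(dy > 0 ? "down" : "up");
      }
    }, false);

    function handleSwipe(direction) {
      // e.g. "left"/"right" for navigation; a back-and-forth sequence of
      // swipes could be treated as a "scrub" to delete something
    }

A library like iGesture exists precisely so you don't have to hand-roll and tune this kind of stroke classification yourself.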