And if only Heroku still had their original product integrated (a web-based code editor), we'd be all set.
I write and run code every day, but rarely in front of the machine it executes on — even the editor isn't running on the machine I'm typing on. I live in browser tabs and terminal emulators displaying remote GNU Screen sessions.
When I have a terminal emulator compiled for Google NaCl (even if I have to port it myself), I'll be living exclusively in browser tabs.
in which we used the iPhone JavaScript API to capture touch/drag events, and sent them (via a laptop, and then a microcontroller) to a little toy car's transmitter. No SDK use or jailbreaking required at all. However, we had to rely on proprietary JavaScript events, and had to include a laptop on ad-hoc Wi-Fi to relay the messages. Apple's restrictions certainly led to major tradeoffs, but we were able to build something pretty neat and accessible regardless.
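The capture-and-relay idea can be sketched in a few lines. This is not the original project's code: the `/drive` endpoint and the `toCommand` mapping are hypothetical, and it assumes the page is served from the laptop acting as the relay. The touch events themselves, though, are exactly the (then Apple-proprietary) events Mobile Safari exposed.

```javascript
// Pure mapping from a touch position to a steer/throttle command,
// normalized against the viewport size. Left edge = full left steer,
// top edge = full throttle. (Illustrative, not the original mapping.)
function toCommand(x, y, width, height) {
  return {
    steer: Math.max(-1, Math.min(1, (x / width) * 2 - 1)),    // -1 left .. 1 right
    throttle: Math.max(-1, Math.min(1, 1 - (y / height) * 2)) // -1 reverse .. 1 forward
  };
}

// Browser-only wiring: capture touch-drag events in Mobile Safari and
// POST each command to the laptop relay (hypothetical "/drive" endpoint).
if (typeof document !== 'undefined') {
  document.addEventListener('touchmove', function (e) {
    e.preventDefault(); // keep Safari from panning the page while dragging
    var t = e.touches[0];
    var cmd = toCommand(t.clientX, t.clientY, window.innerWidth, window.innerHeight);
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/drive');
    xhr.send(JSON.stringify(cmd));
  }, false);
}
```

The laptop's job is then just to read those POSTed commands and bit-bang them out to the microcontroller driving the toy car's transmitter.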
As a side note, they've locked down both the dock connector and the Bluetooth interface very tightly. Just buying the $99 dev license doesn't allow you to develop for either of those interfaces -- they have a separate "Made for iPod" program for that. The $99 isn't an insurmountable barrier for hobbyists who really want to tinker, but the "Made for iPod" certification probably is. (Some companies are using the audio input/output to hack around this restriction.)
Note that when I use the word "gesture," I am using it the way the HCI community uses it, to refer to strokes and combinations of strokes. iGesture is useful for implementing features like deleting things by "scrubbing" the screen, navigation by swiping the screen, and so forth.
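To make the "strokes, not taps" distinction concrete, here is a toy classifier in that spirit. It is emphatically not iGesture's API, just a sketch: a swipe is a stroke with large net horizontal motion, while a scrub is a stroke with several direction reversals. The thresholds (50 pixels, 3 reversals) are made up for illustration.

```javascript
// Classify a stroke (an array of {x, y} points) as a swipe, a scrub,
// or nothing. Swipes have large net horizontal displacement; scrubs
// reverse direction repeatedly with little net motion.
function classifyStroke(points) {
  if (points.length < 2) return 'none';
  var reversals = 0, lastSign = 0, net = 0;
  for (var i = 1; i < points.length; i++) {
    var dx = points[i].x - points[i - 1].x;
    net += dx;
    var sign = dx > 0 ? 1 : dx < 0 ? -1 : 0;
    if (sign !== 0 && lastSign !== 0 && sign !== lastSign) reversals++;
    if (sign !== 0) lastSign = sign;
  }
  if (reversals >= 3) return 'scrub';  // back-and-forth: e.g. delete
  if (net > 50) return 'swipe-right';  // net motion: e.g. navigate forward
  if (net < -50) return 'swipe-left';  // net motion: e.g. navigate back
  return 'none';
}
```

A real recognizer would also consider the vertical axis, stroke speed, and multi-stroke combinations, but the core idea is the same: interpret the whole trajectory, not individual touch points.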