I don't know about you, but to me, once a code file grows beyond, say, a couple hundred lines, it becomes exponentially less readable. So any non-trivial amount of JavaScript already involves a concatenation build step anyway, even for development. As long as the TypeScript compiler is reasonably fast on larger (10k LOC) projects, this wouldn't be an issue at all.
This is a moot point. The Closure Library supports dependency resolution at development time, so you can unit test your code without building the full project. If I am testing foo.bar.Baz, I can create a file that includes only `base.js` and a goog.require('foo.bar.Baz'); call, and `base.js` will pull in all of its dependencies, in most cases fairly quickly.
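For example, a bare-bones test page might look like the sketch below. The file paths and the foo.bar.Baz namespace are placeholders for your own layout, and depending on your setup you may also need a generated deps file (e.g. from depswriter.py) so that `base.js` knows where your namespaces live:

```html
<!-- baz_test.html: hypothetical debug/test page, not a deploy artifact -->
<!DOCTYPE html>
<html>
<head>
  <!-- Adjust this path to wherever your copy of the Closure Library lives. -->
  <script src="closure-library/closure/goog/base.js"></script>
  <script>
    // In uncompiled (debug) mode, base.js resolves this at load time by
    // writing <script> tags for foo.bar.Baz and everything it depends on.
    goog.require('foo.bar.Baz');
  </script>
</head>
<body>
  <script>
    // The required scripts have been parsed by now, so the symbol is
    // usable in a *separate* script block.
    var baz = new foo.bar.Baz();
    console.log(baz);
  </script>
</body>
</html>
```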
Unless you are performing integration tests, there is no reason to compile each time. Doing that is just the legacy of limited build systems.
Realistically, you're only going to deploy a compiled script. The compiler handles goog.require()/goog.provide() on its own, without any extra JS inputs, and tools like closurebuilder.py (which has superseded calcdeps.py) build the dependency graph for you. The compiler uses the require/provide statements to verify that every necessary symbol is actually provided, and that each is included only once in the output script.
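As a rough sketch of a deploy build (the directory names, namespace, and output file are placeholders for your own project), the closurebuilder.py invocation might look something like this:

```sh
python closure-library/closure/bin/build/closurebuilder.py \
  --root=closure-library/ \
  --root=src/ \
  --namespace="foo.bar.Baz" \
  --output_mode=compiled \
  --compiler_jar=compiler.jar \
  --compiler_flags="--compilation_level=ADVANCED_OPTIMIZATIONS" \
  --output_file=baz-compiled.js
```

closurebuilder.py walks the goog.provide()/goog.require() statements under the given roots, orders the inputs, and hands only the files foo.bar.Baz actually needs to the compiler.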
For debugging purposes, `base.js` implements goog.require() in JavaScript. The implementation is naïve: it just turns foo.bar.Baz into foo/bar/Baz.js and tries to include it. This is good enough for running unit tests or fiddling around with experimental code, which are the two times you're going to use `base.js`.
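Conceptually, the debug loader boils down to something like the sketch below. This is an illustrative approximation of the idea, not the actual base.js source:

```js
// Rough approximation of the uncompiled (debug) loader -- not the real
// base.js code, just the idea it implements.
function debugRequire(namespace) {
  // 'foo.bar.Baz' -> 'foo/bar/Baz.js', relative to the script root.
  var path = namespace.replace(/\./g, '/') + '.js';
  // document.write ensures the dependency is parsed before any later
  // <script> block that actually uses the namespace.
  document.write('<script src="' + path + '"><\/script>');
}

debugRequire('foo.bar.Baz');  // pulls in foo/bar/Baz.js at load time
```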
Yes, my question is whether goog.require() executes on the client's computer (in production) or whether Closure strips out those calls when you pass a special flag (as part of your deploy build). I personally don't want dependency resolution to happen at runtime in production.