Any time I'm doing Lego-brick engineering, aka Unix-style "small tools piped together", startup time impacts me most while I'm actively developing the new system.
That process is iterative, usually on a subset of the data, and slow-starting tools make my day qualitatively worse.
Final performance matters sometimes, but more often than not it doesn't. Thirty minutes or forty-five for a batch job that runs from cron? I don't usually care.
I'm never going to favour slow-starting tools unless some aspect of their functionality pays for that cost, or I'm choice-constrained, and final peak performance is almost never going to be the deciding factor.
So to me it's not psychological bias, it's economics, and the currency is time, when it's me personally paying it. Semantics, maybe, but just writing it off as bias isn't going to help me get things done.
Dynamic systems like the ones based on JavaScript, Java, or Scheme have received A LOT of attention in the last 10-20 years. In fact, JavaScript does start fairly quickly.
The usual solution to the startup problem is to have several compilers working at once (aka tiered JIT compilation): a fast bytecode interpreter, a mid-level JIT, and an SSA-based heavyweight optimizer, all running in parallel. Obviously, this burns a lot of background compute and memory.
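To make the tier-up mechanism concrete, here's a toy Java sketch of the hotness-counter idea: calls start on a cheap "interpreter" path, and once a call counter shows the code is hot, an "optimized" path is swapped in. The class name, method names, and threshold are all invented for illustration; a real VM compiles on background threads with far more machinery.

    import java.util.function.IntUnaryOperator;

    // Toy model of counter-based tier-up: start on a slow "interpreter"
    // path, and once a call counter crosses a threshold, swap in the
    // "compiled" path. A real VM does the compile on background threads.
    public class TierUpSketch {
        private static final int HOT_THRESHOLD = 10_000; // invented number

        private int callCount = 0;
        private IntUnaryOperator body = this::interpret; // start in tier 0

        // Tier 0: stands in for a bytecode interpreter.
        private int interpret(int x) {
            if (++callCount >= HOT_THRESHOLD) {
                body = this::compiled; // "tier up" once the code is hot
            }
            return x * x + 1; // same semantics as the fast path
        }

        // Higher tier: stands in for the SSA-optimized compiled code.
        private int compiled(int x) {
            return x * x + 1;
        }

        public int apply(int x) {
            return body.applyAsInt(x);
        }

        public static void main(String[] args) {
            TierUpSketch f = new TierUpSketch();
            long total = 0;
            for (int i = 0; i < 20_000; i++) {
                total += f.apply(i); // first 10k calls interpret, the rest run "compiled"
            }
            System.out.println(total);
        }
    }

The point of the sketch is the trade-off from the paragraph above: the slow path carries bookkeeping (the counter) and the fast path only exists after extra work has been spent producing it, which is exactly the background compute and memory cost.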