> For example, if you want to specify that a sort function actually sorts the input list, you might find that the type specification ends up not much shorter than the actual code of the function. And apart from raw effort, this means that your type specifications start being large enough that they have their own bugs.
Not to mention that the tools for debugging complex type errors are generally much less mature than those for debugging runtime errors.
But even so, I think we could still benefit from moving a little further toward the "proof" end of the type-system spectrum in most cases. I don't think anyone really wants to deal with Coq and the like, but having used a language with dependent types for integers and vector lengths, it's really nice to be able to say "this integer is in the range [0, 8)" and then have the compiler catch it when you pass that value to a function that expects [0, 3) or whatever.
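A rough approximation of that kind of range check exists even in mainstream languages. This is just a sketch, not full dependent typing: TypeScript's literal union types can stand in for small integer ranges, and the compiler rejects passing a wider range where a narrower one is expected. The type and function names here are made up for illustration.

```typescript
// Approximate "integer in [0, 8)" and "integer in [0, 3)" as literal unions.
type Idx8 = 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7;
type Idx3 = 0 | 1 | 2;

// A function that only accepts the narrower range.
function pickSmall(i: Idx3): string {
  return ["a", "b", "c"][i];
}

const wide: Idx8 = 5;
// pickSmall(wide); // compile error: type 'Idx8' is not assignable to 'Idx3'

const narrow: Idx3 = 2;
console.log(pickSmall(narrow));
```

This breaks down for large or dynamic ranges (you can't enumerate [0, 2^32) as a union, and a runtime value needs an explicit narrowing check before it gets one of these types), which is exactly where proper dependent or refinement types earn their keep.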