I feel kinda alone on HN, Lobste.rs and LtU in not having in-depth knowledge or opinions on type systems. I get that these underpin the technology we as programmers use every single day, but I'm a little ashamed that I can't get excited about the subject and feel like it's too late for me to bother trying.
I believe you develop a real interest in type systems once you start using a rich one. Unfortunately, most mainstream languages have very poor type systems. The interest will come naturally over time if you start using languages like OCaml, F#, Scala, Rust, Haskell, ...
There are the technical aspects: every language has to have some notion of “type”, and seemingly interpreted languages might be JIT-compiled, etc. This is of interest if you care about the implementation of languages.
Then there are the opinions of users of languages with more elaborate, expressive type systems, like how some people really enjoy Haskell or Elm because they feel that the type system helps them express their ideas clearly, avoid errors, and refactor and maintain code more easily.
If you’re worried about this one, don’t, I guess? If you can use dynamic languages to achieve your goals, and you like them, then that’s fine! There are plenty of languages you can play with if you want to get a feel for programming with types. Even Java 8 and C++11 are decent at this point (I’m sure a Haskell programmer is fuming right now).
Then there’s like, a few thousand people in the world who have well-informed opinions on research into the theory of programming languages, the Curry-Howard correspondence between types and proofs. Also a lot of Hacker News posters who have heard these words. Some of them pretend like they know what they’re talking about.
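For context, the SBCL transcript below came from compiling a small file whose source wasn't included in the thread. A hypothetical test.lisp that would produce output like this might look as follows (the names I1, FOO, and TEST come from the transcript; the function bodies are guesses):

```lisp
;; Hypothetical reconstruction of /private/tmp/test.lisp.
;; I1 is declared as a synonym for INTEGER, FOO is declaimed to take an I1,
;; and TEST calls FOO with the float constant 311212.2 -- which SBCL
;; flags at compile time as conflicting with the asserted type.
(deftype i1 () 'integer)

(declaim (ftype (function (i1) t) foo))

(defun foo (x)
  (1+ x))

(defun test ()
  (foo 311212.2))
```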
; compiling file "/private/tmp/test.lisp" (written 08 JUL 2017 11:10:09 AM):
; compiling (DEFTYPE I1 ...)
; compiling (DECLAIM (FTYPE # ...))
; compiling (DEFUN FOO ...)
; compiling (DEFUN TEST ...)
; file: /private/tmp/test.lisp
; in: DEFUN TEST
; (FOO 311212.2)
;
; note: deleting unreachable code
;
; caught WARNING:
; Constant 311212.2 conflicts with its asserted type INTEGER.
; See also:
; The SBCL Manual, Node "Handling of Types"
;
; compilation unit finished
; caught 1 WARNING condition
; printed 1 note
As you can see, it detects the type error.
Common Lisp's type system was defined with the language's first version in 1984. It was used to let the compiler generate optimized code and to do more precise runtime checks. Early on, in the mid-80s, the CMUCL compiler then added the idea of treating these type declarations as type assertions and checking them at compile time.
If you're going to answer the question of whether a language has types, then you're already trying to see how it can be shoved into types. It's presupposed by the question itself.
You literally have to count the types by analyzing the grammar. So for the lambda calculus, you have lambda=1, halt. So does a single type mean no types or literally one type?
What's the advantage of thinking that 1 type actually equals 0 types? I don't see any, so in my mind, all languages are unityped or have a richer type structure. Whether a richer type structure is desirable is a separate question.
Easily my second favorite instructor, possibly my favorite. I felt pretty prepared to deal with analysis and design in statically typed languages just from that grounding in set theory and logic. Fond of saying things like, "We have a box. What's inside that box? Another box. What's inside that box? We don't care."
Now retired, he was an early proponent of distance learning, so surely some of his stuff is accessible still.
One of the things that's so wonderful about writing software as a profession is that there is a ridiculously huge array of use cases for different languages and styles.
There's nothing wrong with not caring about type systems if they don't make your life or your job any better.
As long as you enjoy what you are doing, everything else is optional.
I didn't start caring about type systems until I started running into cases where I really wished for a static one (when I was working on a large system in Python), and later when I was prototyping things where I had to make a lot of guesses in C#. Both situations frustrated me, and that's when I started really caring about type systems.
To a certain extent, I think it's human nature that we often don't really start caring about things that much until we experience real, personal frustration with them. Then we start caring a lot.
The reason you see so many people weighing in on this here on HN, is that many regulars are the kind of person to start feeling pain very very soon and over small inconveniences where other people will just sort of deal with the minor inconvenience and focus on other aspects.
One attitude is not better than the other, nor is one more ideologically pure or a marker of a better programmer or engineer. The only thing it implies is different pain thresholds.
Depending on your area of focus as a developer a low or a high threshold could be either a benefit or a drawback. A language designer needs to have a very low threshold. A front-end developer/designer can afford to have a very high threshold and focus on things besides being provably correct.
One of the things I like the most about software engineering is that there are opportunities for joy and discovery for everyone. And as careers progress, you can easily find yourself caring about different things at different times, and there's nothing wrong with that.
There's nothing to be ashamed of, any more than you should be ashamed of preferring strawberry to chocolate ice cream. Although, in keeping with tradition here on HN, if you say that you prefer strawberry ice cream, you are dead to me and practically Hitler. :)
I don't think it's common for programmers to have in-depth opinions about type systems. And most of the ones who do may not really know what they're talking about.
> I get that these underpin the technology we as programmers use every single day
Is this actually true? Of course Haskell, Idris, etc. leverage type theory, but how much type theory underlies the type systems of widespread practical languages like C# or Java? Can something like C++'s SFINAE be grounded in type theory, or is it just a hack?
There is an official book which is very well written. I remember that I started to read it several years ago and was surprised that I could actually understand most of it.
I feel your pain: after bouncing off Haskell many times, I found that Elm was a great entry into a more practical and narrow way of experimenting with the benefits. Now I'm reading through the new Idris book, which has the same practical approach to more complex (to me) concepts.
What languages have you tried? I think once you experience a language with types outside of the run-of-the-mill Java/C#, you will get more excited about it. OCaml, Rust, Haskell, PureScript, etc. Haskell for me was the one that got me excited about types.
ive only used vbscript and javascript extensively. when ive tried java and c# ive been annoyed by the verbosity of types. and when looking at haskell or ocaml im just confused. for me types are an optimization or extra documentation for undescriptive naming, like str x, int y, list z vs. name, age, friends.
so i want to know what im missing, will there be less bugs and regressions? will i be more productive ?
> for me types are an optimization or extra documentation for undescriptive naming,
This is one of those things where you should try to reserve judgement about it because your experience is so limited. Modern typed languages often don't even require you to write the type, because of type inference.
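For instance, in a language with inference like TypeScript (used here purely as an illustration), you rarely have to write the annotations yourself:

```typescript
// No annotations needed: the compiler infers all of these types.
const x = 42;                   // inferred as number
const names = ["ada", "grace"]; // inferred as string[]

const lengths = names.map(n => n.length); // inferred as number[]

// The inferred types are still fully checked:
// names.push(7); // compile error: number is not assignable to string
console.log(lengths); // [3, 5]
```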
> will there be less bugs and regressions? will i be more productive ?
The idea with static analysis is that you're pushing more errors into the type system so they're caught at compile time rather than at runtime. Everyone will answer this differently.
IMO a dynamic type system doesn't make you more productive, because the same invariants you'd have expressed in an explicit type still exist in the code; they just go unchecked. For example, if I write a function to add 1 to a number, then in a dynamic language, if I pass a string I'll get some output that is invalid if I'm expecting the result to be a number elsewhere. Types let you encode those invariants. But encoding simple types and primitives is really just scratching the surface. I could ramble on for ages here, but you should just dive into a language with a good type system (like Haskell or OCaml, as you mentioned) and stick with it long enough to give it a chance. It's so much more than just being able to say 'int' or 'string'.
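That add-1 example, sketched in TypeScript (standing in here for any statically typed language):

```typescript
// The invariant "input is a number" is stated once, in the signature.
function addOne(x: number): number {
  return x + 1;
}

addOne(41); // 42
// addOne("41"); // compile error: string is not assignable to number
// In untyped JavaScript the same call would "succeed" with x + 1
// producing the string "411" -- an invalid value that only blows up
// later, somewhere far from where the mistake was made.
```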
Speaking of adding to a number: in some dynamic languages, you can add 3/5 to 7/5 and the result will be 2, of type integer, indistinguishable from the object produced by a literal 2. Suppose 3/5 and 7/5 come from some run-time source; then the result type can't be statically hard-coded to integer or rational or whatever. And so now, on the static side, you're into variant types and "maybes" and other junk, creating an incomprehensible soup which basically Greenspuns dynamic typing in a way that will get your name cursed by subsequent maintainers.
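What that looks like on the static side, sketched as a tagged union in TypeScript (my own toy encoding of an int/rational tower, not any particular language's):

```typescript
// A toy numeric tower: a value is either an exact integer or a ratio.
type Num =
  | { kind: "int"; v: number }
  | { kind: "rat"; num: number; den: number };

function gcd(a: number, b: number): number {
  return b === 0 ? Math.abs(a) : gcd(b, a % b);
}

// Build a ratio, normalizing back to an int when the denominator
// divides evenly -- mirroring what the dynamic language does at runtime.
function rat(num: number, den: number): Num {
  const g = gcd(num, den);
  num /= g;
  den /= g;
  return den === 1 ? { kind: "int", v: num } : { kind: "rat", num, den };
}

function add(a: Num, b: Num): Num {
  const [an, ad] = a.kind === "int" ? [a.v, 1] : [a.num, a.den];
  const [bn, bd] = b.kind === "int" ? [b.v, 1] : [b.num, b.den];
  return rat(an * bd + bn * ad, ad * bd);
}

// 3/5 + 7/5 = 10/5, which normalizes to the integer 2.
const two = add(rat(3, 5), rat(7, 5));
console.log(two); // { kind: "int", v: 2 }
```

Every consumer of `Num` now has to pattern-match on the tag, which is exactly the "variant soup" the comment above is complaining about -- or, from the other side of the argument, exactly the checking that was always implicitly required.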
I can't make heads or tails of what your criticism is. Algebraic types are wonderful, I find it difficult to code in a language without them. I'd hardly call it 'junk'.
If. I suspect that smcl can't get excited about type systems because of not seeing the benefit.
For me, types are sets of possible values, plus sets of valid operations on those values. I don't much care where they come from. As far as I am concerned, they are an engineering construct to make programming easier and safer, and are interesting only to the degree that they accomplish those goals. Any connection to pure math is completely incidental; if there were no such connection, it would not make (programming) types any less useful.
Now, math often gives deeper insight into what's going on, and enables you to create more powerful (useful) abstractions. But if the useful abstractions don't correspond perfectly to types as used by mathematics, I don't care.
I'm very interested in theory (by the standards of non-academics). I'm very interested in pragmatic type systems. I've spent a few hundred hours on trying to learn type theory, in the mathematical sense. The only thing I have personally found useful, so far, in type theory, is the notion of sum types and product types. But that's just jargon for things I was able to deduce from a shallow study of many programming languages, so even that has not been that useful to me.
On the other hand, I entirely agree with you, that what a type is to a programmer is a set of values and a set of valid operations on those values. Exactly. That's what types mean when you're working close to the metal ("this value is meaningful as a 16-bit float; if you try to dereference it, the consequences are mightily hard to reason about"). That's what types mean when you're talking about the function signatures of higher-order functions that use generic types. That's what types mean when describing statically-typed variables, and what types mean when describing dynamically-typed values.
I see some signs that a few other people share my interest. For example, using dependent types to e.g. specify that two sequences can only be zipped if they are of the same length; that's a useful type-check, and if it can be determined statically, that's great (that's a toy example, of course). Unfortunately, most of the languages that contain these features seem remarkably impenetrable.
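You can get surprisingly close to that zip example even without full dependent types. For instance, in TypeScript (a rough approximation using mapped tuple types, not real dependent typing):

```typescript
// A zip whose signature forces the second argument to be a tuple of
// the same length as the first; mismatched lengths are a compile error.
function zip<A extends readonly unknown[]>(
  xs: readonly [...A],
  ys: { readonly [K in keyof A]: unknown }
): unknown[][] {
  return xs.map((x, i) => [x, (ys as readonly unknown[])[i]]);
}

const ok = zip([1, 2, 3], ["a", "b", "c"] as const);
// zip([1, 2], ["a", "b", "c"] as const); // compile error: length mismatch
console.log(ok); // [[1, "a"], [2, "b"], [3, "c"]]
```

The lengths are only checked when they're statically known, of course -- which is the gap that genuine dependent types (Idris, Agda) close.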
I am very interested in situations where math reveals underlying truths about the universe. Like you, I'm so-far unpersuaded that mathematical type theory is a useful avenue to learning about powerful abstractions about types in programming.
I don't think approaching type systems from the mathematical angle is much help either. When I was learning Haskell, I started to try to learn category theory to develop a better understanding of what was going on. In the end I decided my time was better spent learning Haskell, and not category theory. Where they share words I look shallowly into the mathematical concept, but the understanding I develop is how it relates to me in a programming context. That doesn't mean I don't value that there is a mathematical basis for it though.