Watching someone learn the basics of programming in Python, I realized that Python is actually not as easy for beginners to learn as I thought:
1. Types in Python are implicit. It's really hard to explain that they need to always think about what type something is, especially when they haven't grasped the concept of types yet.
2. Indentation as part of the language. I thought it was a great thing for beginners, but it actually confuses them that empty space affects the program, e.g. they assume that spaces around assignment or arithmetic operators should affect their program too, which is not the case.
3. Ranges for iteration -- they just memorize the syntax blindly :(
So I started to wonder whether Pascal is better for beginners to learn -- you have to declare all your variables with their types up front, and no implicit conversions are done.
But now I'm thinking: what about TCL? Maybe it'd be even easier for total beginners?
I've seen beginners, especially those without a very strong maths background, struggle with the concept of syntax. The idea that you need to put the thing you want the program to do into a very specific form. That any deviation is an error, even though "it's clear" to humans. I think that's something fundamental beginners need to learn, regardless of the language they use.
But once you accept that there are rules to follow, and that they are somewhat arbitrary, languages do differ wildly in the number and obscurity of syntactic forms you need to learn just to get started. Just count the number of different syntactic concepts in even a simple program.
The problem is that Python encourages this "it's all cool, man" syntactic mindset. "Just indent your code the natural way, don't worry about all that extra syntax like curly braces." And then it comes back to bite you when your "formatting" causes syntax problems.
People are used to the idea of explicit delimiters, at least to some extent, thanks to punctuation. People are not at all used to whitespace having syntactic meaning. You can put as many spaces as you want between words, or sentences, or paragraphs, and it might look weird but it still means the same thing.
Semantic whitespace isn't a reduction in syntax; it just hides the syntax to fool you into thinking it isn't there, until it goes wrong and you suddenly have to debug it.
> The problem is that Python encourages this "it's all cool, man" syntactic mindset.
It's not a problem, it's just a tradeoff. Like 97 percent of all the other differences between programming languages. As to whitespace in Python -- it was counterintuitive at first, but once I grokked the rationale behind it, I've never been tripped up by it (to an extent that it would slow my thinking for more than a second or two).
Every language has random $foo that you have to get used to and step around once in a while. If you don't like Python, fine, go back to C and bash. They'll be happy to welcome you back with open arms.
it doesn't seem simpler, and any argument that it's faster or easier to type seems to fall apart the first time you need to paste code across different indentation levels, right? what's the upside?
> You can put as many spaces as you want between words, or sentences, or paragraphs, and it might look weird but it still means the same thing.
No -- whitespace definitely has meaning in natural languages. It doesn't usually change the meaning much -- but it does affect it (where meaning of course includes tone and consideration for the reader's experience). Especially when it comes to the use of space between sentences and paragraphs.
When my mom was learning to program, she was annoyed that the computer did not recognize that "O" means zero, and "l" means one. Her old manual typewriter didn't have 0 and 1 keys. So her typing habit resulted in a lot of errors.
Later she taught programming. The first thing she told students was: "Computers are dumb. They will only do exactly what you tell them."
I was lucky and that was also the first thing I learned about programming (with the amazing 80's "The Home Computer Course" magazine). I still remember the first program I read:
10 REM Computers are never wrong
20 PRINT "Please type a number"
30 INPUT A
40 LET A = A + 1
50 PRINT "The number you typed was"
60 PRINT A
After a few paragraphs of explanation I learned that computers do what you tell them to do, not what you want them to do.
Some details are hidden by the compiler, but Python isn’t exactly pure machine language either.
Semicolons and brackets are conceptually equivalent to significant whitespace (but more obviously explicit), and main is also a concept in Python (one that is much more syntactically brutal than even its verbose C equivalent) that is hidden by the interpreter -- which you also have to explain to beginners.
Only because the default these days is to treat warnings as errors. That code compiles fine with even modern gcc and clang with the -w flag which ignores warnings.
I agree, and I think the problem you mention is a special case of the difficulty some people have grasping the concept of definition in the mathematical sense.
Unlike "dictionary" definitions, they can be arbitrary: if I say "let S be the set of prime numbers greater than 13" you pretty much have to accept it, as long as it's well-formed and consistent.
On the other hand definitions need to be very precise. Natural language relies on "you know what I mean", which is a feature for most human-to-human communication, but it's not how mathematics and computer science work!
The differences do start to diminish if you add some requirements like "print out the name of the script and the passed arguments then some message to stderr".
I had the pleasure of actually learning Pascal (Delphi, actually) at school. I think it is an awesome teaching language.
Though it also showed the problem with teaching languages: the students were not that eager about it. They would have preferred a language with better job prospects.
As motivation is the key to success in learning, yeah, sometimes the language that offers the quickest benefits is a good choice. The strength of Python is that it has great libraries in very trendy fields that let you get stuff done with very little knowledge. That is why both Python and JS (easiest to get a job with) are good beginner languages. (It really hurts me to write that, as learning should not be focused solely on external rewards, but such is the world currently.)
That said, if I were to choose a language to teach programming, then Lua would be the perfect candidate. It avoids the syntax pitfalls of Python and has much lower complexity while still being very powerful. Plus it's really great for gamedev, so there are plenty of motivating examples.
Same here; I'm looking for a good explanation of why it's not that important which language you learn first.
E.g. suppose they learn carpentry, and the classes teach them how to make dining chairs (proper tools, wood characteristics, drilling and gluing techniques), but in the real world they'll have to make tables. Does it really matter that they learned to make chairs, not tables?
You could try something with a regular syntax. It doesn't work for everyone, but when it works, it works really well. Racket comes to mind: it has a progressive suite of languages from neophyte to full-blown, in addition to a community focused on pedagogy and CS fundamentals.
I'm glad you had that experience, but as a "2nd" language it was rough for me, because it was the uncanny valley of programming languages: things _almost_ worked like in other languages but were not syntactically distinct enough to easily keep them separate in my mind. It'd be like if some new regex flavor decided "{}" was range and "[]" was repetition.
And we shall not discuss the horrors of its build and dependency system, which I appreciate isn't the "language" but what good is a programming language if one cannot compile it?
I second that. Go instantly came to mind as a fantastic beginner language.
- Go is also inspired by Pascal, but it has more of a foothold today; it shares the simplicity, and the syntax is more natural and convenient than C and similar languages.
- It's a light language. You can approach it in terms of just functions (procedures) and data, as opposed to, say, Java (which unfortunately is one of the most-used languages for teaching programming).
- The tooling is approachable, it compiles fast, there is no friction when it comes to automated testing, you get a static binary out of it etc.
- The std library is comprehensive and very easy to use. You can teach a ton of concepts with just that without ever having to worry about third party dependencies. A full web server, a CLI etc.
- It's low level enough so you can teach a bunch of important concepts such as pointers and constructs around pointers, allocation etc.
Good idea, my project is in Go right now. Will try teaching it and come back...
Nitpicks:
> compiles fast
That is totally irrelevant for beginners, as their programs are <100 lines long.
The best would be not having to compile at all.
> as opposed to say Java
Not really a big problem, actually. I watched how people learn Java in college (I was a pro at the time, so I helped them) -- they usually get the main() definition from the professor and then just copy that file around. They write everything right in main() or in other static methods, so it's not a biggie that it's all wrapped in some mystic "class Main {...}". But I think it's much better that:
1. They have to write types for all variables, and no implicit conversions are made.
2. If it's not in autocomplete -- it doesn't exist. Compare that to Python's "len()", or Go's "append()" (and explain why the result of append() needs to be assigned if it seems to work without it).
The traditional answer was to use a BASIC dialect. Minimal syntax, minimal structure, simple imperative commands. A lot simpler than Pascal. I don’t know TCL very well, but its scope semantics may be confusing. I’d take a look at LOGO also.
I think the beauty of BASIC for teaching is that it's easy to "think like the computer," because the computer holds very little information other than the values of the variables that you used. For this reason things like loops are quite concrete.
Those who break past "the wall" and achieve any fluency at programming, can and will learn to start thinking in layered abstractions. And their programs will start to get bigger. Then it's time to move away from BASIC.
I moved to Pascal. Today, Python. But I agree with the above about Python: you are forced to handle abstractions almost from the get-go. Perhaps what saves it is the ease of trying things, so if the functioning of a loop like "for i in range(10)" is confusing, you can just try it with a print() statement in there.
BASIC was also preferable on a timeshare machine. Waiting 10 seconds per line for a response seemed more tolerable than waiting 10 minutes for an entire program to compile and link.
Having learned Basic (Commodore as well as Qbasic), Pascal (not Delphi) and TCL in my youth, TCL is the only one I used in production. TCL is fairly simple to learn, it only takes a few concepts really, and it is quite a joy, if memory serves right. Of those I mentioned, it is the only one that I would touch again, for any reason, whether for fun or profit.
The only problem with BASIC is that it's kind of gross scaled up for medium-sized applications. It's downright charming at the Tiny-BASIC, VIC-20 BASIC, or even BASIC-52 level though.
LOGO has real depth, an elegant syntax, and there's an alternate universe where it has Python's niche as everyone's favorite scripting language. But it's effectively a toy language so there's nowhere to go from "learning".
Here's my hot take: before you teach anyone "the basics of programming", teach them how to enter formulas in a spreadsheet.
Tcl forces you to create new data, because you have to think and work harder to introduce effects on variables in upper scopes, through the use of uplevel and upvar. This reduces side effects significantly and makes data transformation explicit, which, in turn, reduces cognitive load and increases productivity.
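To make that concrete (a rough sketch of my own, not from the article): the path of least resistance is to return new data, while reaching into the caller's scope has to be spelled out with upvar.

    # Returning a new value is the default, low-friction style:
    proc double_all {xs} {
        set out {}
        foreach x $xs { lappend out [expr {$x * 2}] }
        return $out
    }
    set nums {1 2 3}
    set nums [double_all $nums]    ;# the caller explicitly rebinds

    # Mutating the caller's variable requires naming it and linking via upvar:
    proc double_in_place {varName} {
        upvar 1 $varName xs
        set new {}
        foreach x $xs { lappend new [expr {$x * 2}] }
        set xs $new
    }
    double_in_place nums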
Tcl also does not have a distinguished NULL value like Python, C, C++, C#, and many other languages do. You do not have to account for that.
The reduction of side effects and the absence of a universal no-value value put Tcl in my hierarchy of programming languages right below Haskell and above pretty much everything else. Tcl is very easy to use when you know how to structure the program.
I still use it for various prototyping when I do not want strict type discipline.
Doesn't the non-existence of a no-value just mean that when there is no result value, another value (like false, or an empty string, or an empty list, or whatever) is used to express that fact, or that an exception is raised? One would have to check for those in some cases as well, and they might not always be good alternatives that reduce the required checks. How does it work?
It is easy to use and write. You can follow an algebraic-data-types convention, prefix values with tags, and use something like the control structure drafted above. I did that; it is relatively comfortable.
Note that the special case of an empty list is also handled as if it were Nothing. So it is more useful than it would appear. ;)
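A hypothetical sketch of that tag-prefix convention (the names here are mine):

    # Wrap results as {Just <value>} or {Nothing} and dispatch on the tag.
    proc safe_div {a b} {
        if {$b == 0} { return [list Nothing] }
        return [list Just [expr {double($a) / $b}]]
    }
    set r [safe_div 10 0]
    switch -- [lindex $r 0] {
        Just    { puts "result: [lindex $r 1]" }
        Nothing { puts "no result" }
    }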
One fundamental principle in tcl is that everything is a string (EIAS) [1]. Therefore, an empty string "" is the same thing as a "null value" or 0-length list, but generally doesn't require any special treatment (your function that takes a list as input generally should know how to handle empty lists).
[1] In practice things are typed in the interpreter internals for performance reasons, but the language behaves as if EIAS.
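A quick illustration of how the empty string doubles as the empty list (rough sketch):

    set x ""
    puts [llength $x]              ;# 0 -- the empty string is a zero-length list
    puts [lindex $x 0]             ;# out-of-range indexing just yields "" again
    if {$x eq ""} { puts "nothing here" }
    foreach item $x { puts $item } ;# the body simply never runs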
I think it's worth pointing out that in the years since this article was written (2006), quite a few new features have been added to Tcl. A few of the more significant include coroutines, a robust object-oriented system, tailcalls, apply (lambdas), namespace ensembles, et al.
Of course some of these features were already present in other languages. However, steady improvements to Tcl have pushed the language well beyond how it was described in the article. Definitely not a "toy" but overall as capable as the other scripting tools out there.
What I like most about TCL is that a TCL REPL feels a lot more like a normal command prompt than Lua or Python or whatever. If I'm going to write a complex program, TCL wouldn't be my first choice, but it's a great fit if a program I'm using has a command prompt (think AutoCAD or the debug console a lot of games have): plenty easy for one-off commands, but with a simple and powerful programming language there if you ever need it.
Put differently: The command line ergonomics of POSIX shell, with the scripting ergonomics of Lisp
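For instance, a tclsh session reads almost like a shell session (made-up transcript):

    % pwd
    /home/me
    % set width 640
    640
    % expr {$width * 3 / 4}
    480
    % puts "hello from the prompt"
    hello from the prompt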
Yeah, that may be why it became popular in CAE/EDA tools, one of the few places it remains used these days. IMO Tcl works well for that use case, but it often catches shit from people who've had to use it. I think it's either because they misunderstand the language (as in the article; note also that people who use CAE or EDA tools may have very little programming background) or because the vendor APIs provided to interact with the tool are very poorly designed (which I have definitely experienced).
Yes. I've been kind of wondering about the apparent promotion of Tcl here recently - is someone pushing it for some reason, or is it just a passing fad in a slow (tech) news time?
Years back, for a lot of us, Tcl and Tk were the bee's knees. It was what we use Qt for now, but much easier in some regards. I think sometimes that as we reach peak saturation in other languages, people branch back out into areas we enjoyed in the past.
My favourite thing about Tcl is (was? This was 25 years ago) that adding commands just requires a couple of Tcl things, then an argc/argv, so it was relatively simple to turn an existing C program into a Tcl command:
int my_command (ClientData clientData, Tcl_Interp *interp, int argc, const char *argv[]) { ... }
However, that might have been the only thing I liked about Tcl. For some binary data import tasks I ended up writing a Perl program that translated it into Tcl.
The thing that I most appreciate about Tcl is that it feels like Unix shell "done right". It is still a bit weird because everything is a string, but it doesn't have all those footguns around variable expansion, such as word splitting. It also has decent support for subroutines and local variables, no weirdness with subshells and so on.
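A small example of the footgun that goes away (a sketch; the sh behaviour is only described in the comments):

    # A substituted variable is always exactly one word in Tcl:
    set name "file with spaces.txt"
    puts [llength [list $name]]    ;# 1 -- it stays a single argument
    file exists $name              ;# no quoting dance needed
    # In POSIX sh, an unquoted $name would be split into three words
    # (and possibly glob-expanded); Tcl does neither.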
> Every string is internally encoded in utf-8, all the string operations are Unicode-safe
This seems very slightly disingenuous, if my memory is correct. I don't remember all the details, but I ran into a Unicode-support issue with a Tcl application a while back. Digging into it, I recall that the Tcl interpreter actually represents every character as a predefined number of bytes, set by a preprocessor definition. The default was 2, which cut off any Unicode character that needed more than 2 bytes to encode, unless you were willing to recompile the interpreter to use 4 bytes, at the cost of doubling memory consumption for every string.
Very nitpicky, but I think it's important to point out it's not "quite" utf-8, because Tcl needs each codepoint to be O(1) indexable in an array, something normal utf-8 can't do.
> Digging into it, I recall that the Tcl interpreter actually represents every character as a predefined number of bytes, set by a preprocessor definition.
Ah, that's exactly what old Python did. Wonder if it was inspired by the Tcl solution.
Fwiw, because they rejected giving up indexing, recent CPython uses a variable encoding chosen based on contents (the possibilities are iso-8859-1, ucs2, or ucs4). That does mean adding an astral codepoint to an ASCII string quadruples its size.
I assume it's because the original Unicode only supported 2-byte characters, and even after astral characters became a thing, it took a while for them to be used for non-CJK things.
> That's pretty much what every older language did.
The fixed size, but I’m referring to the compile-time width switch.
I may be wrong, but I was under the impression most older languages simply remained on their original character width (or left the issue ill-defined and/or added an ancillary type, like C).
This is nonsense. I have had the misfortune of having to learn TCL (it's standard in the EDA world unfortunately) and it is most assuredly a toy language. That doesn't mean you can't write big complicated programs with it. You can do that with any language. But stuff like this is an outright lie:
> There are no types, and you don't need to perform conversions, however, you aren't likely to introduce bugs because the checks on the format of the strings are very strict.
You aren't likely to introduce bugs when literally every type is a string. Even lists. Lists are strings. Lists of strings are strings. Maps are strings.
It's pretty insane, and the interpreter does some horrible magic so that if you never access a list as a string it can store it internally as a real list (to avoid insanely bad performance), but to the programmer it's still a string, and nothing prevents you from treating it as such.
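For example (a rough sketch of what that looks like in practice):

    set l [list a b "c d"]
    puts $l                    ;# a b {c d} -- the list's string representation
    # Treating the list as a string forces it to "shimmer" to a string rep:
    puts [string length $l]    ;# 9
    lappend l e                ;# ...and it converts back to an internal list
    puts [lindex $l 3]         ;# e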
I actually made a semi-functional WASM-to-TCL transpiler to avoid having to write TCL. Worked kind of well! Unfortunately WASM is quite large and I lost interest before completing it.
I've managed to work on two TCL codebases in my career, one in 1992ish and one in 2011ish. I was very surprised to be working on it the second time! I've said it before, but working on TCL is interesting, kinda fun, and ultimately a lesson in frustration once you get to a certain-sized codebase. It really hammered home the need for static typing as your repo gets over 50k lines, because TCL is so "type free". However, it is a cool thought experiment for its premise, which is more or less "what if everything is a string". It has a lot of benefits, and a lot of nasty side effects. I'm glad I spent time doing it, and I think it has good ideas for the use case for which it was made ("Tool Control", roughly speaking). TK was fun as well; I miss it in some ways. But ultimately, TCL is a dead end for a programming language -- the negatives accumulate too quickly.
The idea that Tcl was a toy came from the 90s and the Tcl Wars Stallman was involved with. But to Stallman, anything other than Lisp is pretty much a toy. Tcl is indeed a deep, deep rabbithole -- again, deeper than our dimension allows and lined with toothed cilia... but I can totally see where a person might look in Tcl's deadlights and want to be there.
Ironically Tcl is very Lisp-like in certain respects, though also quite different. Rabbitholes notwithstanding Tcl can be enormously useful and productive. In particular I think the built-in Tcl object-oriented features are a real asset and not nearly the same sort of rabbithole as you encountered in the past.
It does have very flexible metaprogramming capability and the syntax of the language can be extended in the language itself, very much like Lisp, and like Lisp it has a high skill ceiling and a huge possibility space to explore.
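For instance (a minimal sketch), a new control structure can be added as an ordinary proc:

    # A user-defined looping construct, written in pure Tcl:
    proc repeat {n body} {
        for {set i 0} {$i < $n} {incr i} {
            uplevel 1 $body
        }
    }
    repeat 3 { puts "hi" }     ;# prints "hi" three times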
It's the ways in which it is not like Lisp, the features that Lisp hackers discovered decades ago were nests of bugs and confusion that... arghhh.
My overall impression of Tcl, however, is quite positive -- and Tcl/Tk is a JATO bottle for rapid GUI development that no Lisp implementation has yet been able to match -- not without embedding or incorporating Tcl itself (e.g., PS/Tk: https://wiki.tcl-lang.org/page/PS%2FTk).
It might be powerful and Lisp-like; however, in any large application, the amount of C code in extensions needed to make it perform eventually outgrows the TCL code.
And then one starts wondering if it isn't better to migrate to something else.
This article fails to note that Tcl is very closely related to shell script. A lot of the concepts listed already exist in Shell, and nobody complains that Shell is only a toy language.
Fond memories of tcl inside ModelSim -- it was not the Lua I was enjoying at the time, but, like using emacs, it made me love automation and coding a little bit more.
TCL was my entry into coding back when I was a teenager. I haven't touched it in over 12 years or so, but I'll always fondly remember it. The title of this post is very familiar to me, and I'm sure I read it back when it was published, but it's cool to now know it was written by the creator of Redis, which I now use extensively.
> On-Topic: Anything that good hackers would find interesting. That includes more than hacking and startups. If you had to reduce it to a sentence, the answer might be: anything that gratifies one's intellectual curiosity.
From the HN Guidelines.
Note the lack of the pattern "new" anywhere in there.
>Note the lack of the pattern "new" anywhere in there.
Er... it's literally the name of the site, 'Hacker NEWS'. A 16-year-old article is not news, no matter how interesting it might be. The custom here is to put the year in brackets after the title when posting an old article.
Nonetheless, the guidelines allow for such things - and I'm grateful they do, as I enjoyed the time I spent reading this article and am smarter now than I was beforehand.
I do agree that it should've had the year added when it was originally posted.
HN gets numerous articles submitted multiple times, as people discover them for the first time. It's frowned upon to repost too frequently but this was last posted to HN on Oct 28, 2019 so I see no problem with it.
An overwhelming number of Tcl-positive comments here. Must be selection bias. Tcl is ridiculous. I propose that anyone who prefers Tcl over other top languages in 2022 simply doesn't have productivity or teamwork as their top goal.
Do you want your language to help you solve real problems, or do you just enjoy your weird toy from another era? Everything is a string..? Please.