Maybe the reason why this was the number one criticism was that it was so prominently mentioned in the introduction of Arc.
One complete paragraph on how hard it was to fix this in Python, and that it seemed to take a whole year. And another large paragraph basically saying that it's trivial and could be done in a few days, but who cares anyway.
This is simply the sort of writing where pg had to expect attacks.
But from following the discussion on reddit, it seems that the most biting criticisms of Arc are not related to Unicode or W3C specs, but to the fact that Arc doesn't have any enhancements over Scheme besides shortening some keywords.
Can someone with more knowledge than me weigh in on what Arc gives us that scheme doesn't?
I think it's more like going back to the initial spirit of Scheme and making it right: "Programming languages should be designed not by piling feature on top of feature, but by removing the weaknesses and restrictions that make additional features appear necessary."
So criticizing a lack of features is meaningless for Arc. Valid criticism would be to point out weaknesses and restrictions.
One of the more frustrating aspects of Scheme, to me, is that most implementations layer a de facto strong type system on a weakly-typed base (something CL does not suffer from, to a certain extent). "What does this mean," you ask? Here we go!
  (define (dumb-fun s)
    (let ((front (substring s 0 3))
          (back  (substring s (- (string-length s) 4)
                              (string-length s))))
      (or (string-ci=? front "was")
          (string-ci=? back "good")
          (string-ci=? s "Yesterday"))))
Notice how, even though we didn't declare "s" to be a string when binding it, we
end up declaring its type in every single function call involving it.
Basically, most Scheme implementations end up with some kind of bizarre
inverse-type system, where programmers perform all of the work (extensive type declarations) and receive none of the
reward (compile-time type checking).
(This is all an artifact of how the rXrs's have traditionally been written--lacking standard support for generic functions or type introspection, they've essentially set the precedent by "typing" the functions provided in the standard. For example, straight from r5rs, we have (string-length string), (vector-length vector), and (length list), each defined for exactly one type. Is it any surprise implementations end up with functions like 'bytes-length', 'hash-table-length', etc.?)
It seems that pg is vaguely cognizant of this deficiency, given that Arc overloads the common operators; however, unlike CL (or even Haskell, for that matter), Arc doesn't yet provide a way of extending those functions beyond the predefined types.
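To make the extensibility point concrete, here is a hedged sketch, again in plain Scheme (the names 'len' and 'extend-len!' are made up for illustration and aren't anything Arc or CL actually provides), of what user-extensible overloading could look like: a predicate/handler table that the built-in types populate and that user code can extend with its own types later:

  (define *len-handlers* '())          ; list of (predicate . handler) pairs

  (define (extend-len! pred handler)
    (set! *len-handlers* (cons (cons pred handler) *len-handlers*)))

  (define (len x)
    (let loop ((hs *len-handlers*))
      (cond ((null? hs) (error "len: unsupported type" x))  ; 'error' is implementation-provided
            (((caar hs) x) ((cdar hs) x))                   ; predicate matched: call its handler
            (else (loop (cdr hs))))))

  ;; the built-in types register themselves...
  (extend-len! string? string-length)
  (extend-len! vector? vector-length)
  ;; ...and user code could register handlers for its own types the same way.

CL's generic functions and Haskell's type classes give you essentially this, except the dispatch is handled by the language rather than a hand-rolled table.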
Anything released as a version 1.0 (which I don't think Arc even is yet) will have flaws and shortcomings. The value of the feedback far exceeds the cost of having users look at a product that isn't flawless.
This is especially true for something as complicated as a programming language.
The last paragraph is great advice for all software development, not just languages.
When you're writing software for a larger company, it's difficult to justify redesigning to create a simpler, more elegant codebase because it adds zero value (from a customer's perspective).
When you're working on your own project, it's possible and also eventually pays off, yielding more maintainable code.
It's funny that most people asked about Unicode support rather than performance charts, but this is due to the general nature of the audience here on Hacker News, who are mostly web application developers; one of their priorities is multi-language support, so there's no need to blame anyone.
One more reason Arc is not for everyone is due to its functional syntax.
I think the challenge is how to make functional syntax accessible to the masses. Arc seems to be one step forward, although functional programming will never be for everyone.
One thing I think Arc does (or intends to do) right is to be as cruft-free as possible. That is one thing you need to get functional programming to the masses. If learning FP means also having to master dotted pairs, associative tables, 13 different object comparison procedures, cadaddadr, the LOOP syntax... then you can count "the masses" right out.
Add a good introductory book (the current tutorial looks good; I haven't finished it yet) and you may be onto something.