These new features fix nothing; they just add more things that weren't technically needed in the first place.
There is no way to "fix" Javascript without breaking compatibility. It will always suffer from its poor initial design decisions, or it will stop actually being Javascript.
I disagree, based on the changes I have seen in C++ between C++03 and C++11. Compatibility was not broken there, and writing with the new features is a breeze, but the option to drop down to the old stuff when required still exists (even though it is slow and buggy).
I am not a JavaScript expert, but it seems that the choices in the newer versions of JavaScript are similar to the new changes in C++. No language can make bad decisions impossible, but the newer versions of these languages can make good decisions easier.
Was "fun" (tm) (C++11 adds a 2-argument keyword, C++17 adds a 1-argument overload that finally undoes the need for the rewrite.)
I'll also note we have vastly different standards as to what it would take to "fix" C++. atoi("9999999999") can still launch nethack (undefined behavior!), and C++11's response is to add a new function, atoll. C++17 still doesn't have modules (TRs aside), still has a grammar that's so obscene to try and parse that compilers still disagree on some of the finer points, and still thinks "undefined behavior" is a hip new metal band to name drop for cred, rather than a last resort.
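For reference, a minimal sketch of the atoi trap next to the defined-behavior alternative (strtol reports overflow through errno):

    #include <cerrno>
    #include <cstdio>
    #include <cstdlib>

    int main() {
        // Undefined behavior: the result of atoi("9999999999") does not
        // fit in an int, and the standard permits literally anything here.
        int bad = std::atoi("9999999999");

        // Defined behavior: on overflow strtol returns LONG_MAX/LONG_MIN
        // and sets errno to ERANGE, so the caller can actually check.
        errno = 0;
        long ok = std::strtol("99999999999999999999", nullptr, 10);
        if (errno == ERANGE)
            std::puts("out of range, handled");

        std::printf("%d %ld\n", bad, ok);
    }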
This is not to say we shouldn't try to improve the language anyway, but compatibility was and will continue to be broken (hopefully in small ways that generate compile errors), and C++ will continue to suffer from its initial design for as long as it remains C++.
Fixing the static_assert should have been trivial; it should have been just a find and replace, like any other new keyword in any other update to a language. The only exception might be if you used string concatenation during macro expansion, and if that is the case it shouldn't have worked in the first place.
atoi causing undefined behavior is not a real complaint, because it has always caused undefined behavior. You can't blame changing that on the standard or on the new version of C++, because the result was already allowed to change with every execution of the program. Trying to fix non-determinism is a good thing; well-formed code should already be avoiding these undefined behaviors.
Why do you and others complain when you use undefined behavior and are then surprised when it changes in undefined ways? You specifically asked not to be able to know the answer.
> Fixing the static_assert should have been trivial; it should have been just a find and replace.
Merge conflicts are the gift that just keeps on giving. Every refactor on another branch touching the same code is now a conflict. Any branch doing the search and replace slightly differently (say, different whitespace) is now a conflict. Have you ever had to resolve so many of these conflicts that you started accidentally misresolving even the trivial ones? I have. The nontrivial ones can become much harder to code review (e.g. two branches refactored code in the same file, possibly already causing a merge conflict, now complicated by 'spurious' conflicts unrelated to that refactoring).
Guess how I found out that P4Diff's UI was buggy and could corrupt your local changes? ...a similarly 'trivial' search and replace for logging functions. The static_assert one actually wasn't really that bad ;). Still not 'trivial' though.
> [...] atoi [...] You can't blame changing that on the standard
That wasn't my intent. My intent was to point out broken behavior that the standard decided to double down on (by adding atoll) instead of fixing (by, say, deprecating atoi, like your compiler or coding standard should). atoi causing UB is still a real complaint - just not a broken-compatibility complaint.
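The deprecation half is at least doable per project today; GCC (and Clang, which accepts the same pragma) can poison an identifier outright. A sketch, with a hypothetical banned.h header:

    // banned.h - pulled in via a forced include in the build system.
    // Any later mention of atoi becomes a hard compile error.
    #pragma GCC poison atoi

    // int n = atoi("42");   // error: attempt to use poisoned "atoi"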
> Why do you and others complain when you use undefined behavior and are then surprised when it changes in undefined ways? You specifically asked not to be able to know the answer.
I don't?
You're perhaps thinking of unspecified behavior. Undefined behavior is a damn sight worse than "asking not to know the answer": it's accidentally telling the compiler it may generate a broken program that can do anything, and nobody does that intentionally.
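To make that distinction concrete, a minimal sketch (the function names are just for illustration):

    #include <climits>
    #include <cstdio>

    int f() { std::puts("f"); return 1; }
    int g() { std::puts("g"); return 2; }

    int main() {
        // Unspecified behavior: f() and g() may run in either order, but
        // the program is still well formed and sum is always 3.
        int sum = f() + g();

        // Undefined behavior: signed overflow. The compiler may assume
        // this never happens, so the whole program's meaning is gone.
        int big = INT_MAX;
        int boom = big + 1;

        std::printf("%d %d\n", sum, boom);
    }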
On the subject of things I do ask: I spend a lot of time on enabling more warnings as errors, using more static analysis tools, new compilers, etc. to try and find every instance of accidental undefined behavior so we can fix it. I'm specifically asking to know about all instances of undefined behavior so we can fix them to not invoke undefined behavior. If I could get a comprehensive answer to this, I'd give C++ a lot less shit.
But I cannot get a comprehensive answer to this. There will always be something that slips through the cracks. And other programming languages keep taunting me with their lack of undefined behavior. Hell, I'd even settle for a comprehensive answer to "where's all the undefined behavior in my code", and settle for my existing debugging loop when dealing with undefined behavior in third party and system libraries. But I can't even have that on C++ codebases.
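For the curious, a rough sketch of the kind of build I mean (GCC/Clang flags; the sanitizers only catch what your tests actually execute):

$ g++ -std=c++17 -Wall -Wextra -Werror -fsanitize=undefined,address -g main.cpp
$ ./a.out
# UBSan reports e.g. "runtime error: signed integer overflow: ..." at the faulting line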
> Merge conflicts are the gift that just keeps on giving.
I deeply sympathize with you on this.
I see now what you meant about atoi; I suppose I agree. I prefer boost::lexical_cast, or rolling my own when performance matters and type safety is in question.
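For reference, a minimal sketch of the boost::lexical_cast route; unlike atoi, out-of-range input throws instead of invoking undefined behavior:

    #include <boost/lexical_cast.hpp>
    #include <iostream>

    int main() {
        try {
            // Throws boost::bad_lexical_cast instead of silently
            // overflowing or invoking undefined behavior.
            int n = boost::lexical_cast<int>("9999999999");
            std::cout << n << '\n';
        } catch (const boost::bad_lexical_cast& e) {
            std::cerr << e.what() << '\n';
        }
    }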
I was too harsh in my earlier comment, and I see I wrote at least some of it in anger and misunderstanding. The fact that it is accidental is bad; in my opinion, more compiler warnings that catch this kind of nonsense should be enabled by default. I agree it shouldn't be possible to do accidentally.
I suppose I have had this argument with too many people who see C and C++ as slightly more portable assemblers and grow to rely on specific results of undefined behavior. These people try to defend their right to rely on old and specific undefined behavior. I mistakenly lumped you in with them; please forgive me.
For me, the extremely weak type system is one thing. In general the language is too accepting of bad code. It's kind of like HTML: "if you can guess at what it's supposed to do, do it." Personally, I prefer a stricter interpretation over a more lenient one.
Another big one for me is "this", which has significantly different semantics from nearly any other OO language out there. I personally find it confusing, and in many cases it requires you to understand the invocation point of a function. I cannot reason about the data in scope within a function, from inside the function, without first taking into consideration all of its callers.
To be fair though, JS is not the only language that features these flaws. The biggest problem with it is that I'm not really given a choice to use other languages. Yes, there are transpilers, but even with those I almost always need to use JS at some point for integration with other code.
- Bad Performance
- Extremely weak type system
- No generic language
- Too many abstractions
- Double behavior of types
- Bad scaling
- Extremely bad OOP Implementation (Just 1% of OOP)
- the Debugging approach doesn't make any sense
- The Community (1000+ Frameworks with almost the same features for what existing standards?????)
It's one of the fastest dynamic languages in existence, considerably faster than e.g. Ruby and Python
> - Extremely weak type system
Compared to what? It's a dynamic language.
> - No generic language - Too many abstractions - Double behavior of types - Bad scaling
Vague. What do these even mean? Compared to what?
> Extremely bad OOP Implementation (Just 1% of OOP)
Again vague, what is it lacking? Perhaps it just implements the bits people actually use?
> the Debugging approach doesn't make any sense
No idea what this means. Chrome devtools debugger is excellent.
> The Community (1000+ Frameworks with almost the same features for what existing standards?????)
It's a huge community; there are bound to be lots of camps and lots of people trying to crack the same nut in different ways. What's wrong with competition? I personally think it's a fantastic community, and find others lacking by comparison.
> It's one of the fastest dynamic languages in existence, considerably faster than e.g. Ruby and Python
I think this is kind of a question of what it means for a language to be fast. The big browser companies have poured a mind-blowing amount of resources into making their JavaScript engines fast. This has made these JavaScript engines faster than other implementations of slow languages that have not had similar resources devoted to performance. But this does not mean "JavaScript is fast" in the sense that the design of JavaScript readily enables good performance, and it doesn't make JavaScript fast relative to actually fast languages.
> Compared to what? It's a dynamic language.
JavaScript's type system is extraordinarily weak even compared to most popular dynamic languages. For example:
$ python -c "print(1 + '1')"
Traceback (most recent call last):
File "<string>", line 1, in <module>
TypeError: unsupported operand type(s) for +: 'int' and 'str'
$ ruby -e "puts(1 + '1')"
-e:1:in `+': String can't be coerced into Fixnum (TypeError)
from -e:1:in `<main>'
$ node -e "console.log(1 + '1')"
11
> I think this is kind of a question of what it means for a language to be fast [...]
It's significantly faster than most other dynamic languages. It's fast as a compilation target to the extent we can run Unreal Engine in the browser. So I consider the original comment I replied to that simply stated 'Bad performance' incorrect or at least lazy/contextless criticism. If you want to reframe, fine, but I'm not going there :)
Accidental string coercion is a valid point. It also hasn't bitten me in ~10 years of building large JS apps. And JS has many advantages over e.g. Python these days that, to me, vastly overshadow that downside (better support for FP, for one).
(And if you really want that type safety you can use TypeScript: just another great thing to come out of the JS community, you know, the one that the GPP criticised for daring to provide choice.)
> It's significantly faster than most other dynamic languages. It's fast as a compilation target to the extent we can run Unreal Engine in the browser. So I consider the original comment I replied to that simply stated 'Bad performance' incorrect or at least lazy/contextless criticism.
The entire point of my comment is that the statement "JavaScript is faster than other dynamic languages" is lazy/contextless praise.
V8 is fast. Rhino is slow. JScript 5 is really slow. They are all JavaScript. V8 is fast because people really wanted it to be fast and did some impressive cutting-edge work to make it happen, not because the language lends itself well to speed. It is a credit to the skill of the people working on the JavaScript engines that JavaScript programmers nowadays can enjoy decent speed, and AFAIK it's not particularly attributable to any specific features of the language.
I might want to read an article about a language that is fast in principle due to its specification. But the question that actually matters to me is: do actual implementations of a language's specification exist, that I can use, that are fast? The answer is yes for JS.
Whatever flaws JS clearly has, speed is not one of them. All the widely used implementations of the language are fast compared to similar languages.
I'm unsure why you think it matters that this speed came about due to hundreds of thousands of hours of work spent optimizing various compilers and interpreters. I never see anyone saying "Boy, SQL sure is fast, but only because so many intelligent people spent their careers making it fast, so it doesn't count."
Remember, this little argument started because someone described JavaScript as slow. That claim is incredibly disingenuous. No, JavaScript's speed isn't directly linked to the language spec but...who cares?
"It's a huge community, there're bound to be lots of camps and lots of people trying to crack the same nut different ways. What's wrong with competition? I personally think it's a fantastic community, and find others lacking by comparison."
One reason is to establish a good standard for everyone, so that you can run your code everywhere without effort.
History has shown many times that it is better to create one standard instead of many non-standards, because the latter creates unnecessary complexity in our world. Look at the pipeline concept from Unix: more and more companies steal that concept to build real-time web applications without holding so much state (and the concept is pretty old). Our predecessors created all of these wonderful things, like trees, pipes, functional programming, etc. We are just sometimes too stupid to reuse them.
But it's worse than just being a dynamic language. The biggest problem is the implicit type conversions. If it at least threw runtime errors on a type mismatch, instead of silently converting to another type, that would help a lot.