Overview: C++ Gets an Overhaul (C++0x) (devx.com)
31 points by Anon84 on Aug 20, 2008 | 79 comments



I am so glad I don't have to use C++ any more. I spent years in video games perfecting my programming skills in C++. What a waste of time. Now that I'm in Python, I feel like I can program again.

Don't get me wrong. C++ obviously has its place, but I'm glad I'm in a different place right now.


I agree... I just wish Python's performance was closer to C++'s. I use Python every day for data analysis, etc., but I'm pretty much stuck with C++ for numerical simulations. Hopefully the gap will become manageably small sometime soon.


I used to wish the same until I became a C# fan. If you ignore for a moment its platform specificity, it's a great compromise between Python (with regard to ease of use and productivity) and C++ (in terms of performance and some other things).


The good news is that it's easy to write extension modules for Python when you need performance.


The bad news is, soon enough you'll realize that your entire codebase has been switched over to the extensions and Python isn't really doing anything anymore :)


People used to say something similar in games: 'Why write in C when you know you are going to have to move to asm anyway, just write it all in asm!'.

I've never seen more than 5% of C code usefully move to asm, and I've never seen more than a few percent of Python code usefully move to a module. I'm not saying it's always a waste of time; it's just that I haven't seen it.


What kind of applications do you have experience developing?


Mainly in and around video games on PC/PS2/XBox/PS1. C/C++/ASM for the runtime (sometimes with python for scripting) and Python for level editing and build processes.

I'm not discounting Python's inadequacies, especially when churning through vast amounts of data. Video game levels are large, and Python was too slow to process collision maps, shadow maps, etc., but we added C modules to do the heavy work, and it wasn't a lot of code to write in C (it was a pain to debug them, though).

PS I don't work in video games any more so I've not experienced the delights of the PS3/XBox2 first hand. I do websites now... it's a lot easier :)


But on the other hand, I guess that this "different place" means that you are now no longer programming serious video games. I know what it feels like - I'm looking around for other languages all the time and have even done several smaller test games in other languages. But for the real jobs I always fall back to C++. Mostly because of the 3D libraries, in my case.


Oh boy. Let the ignorant C++ bashing begin...

Before you chime in with some snarky comment about some superficial aspect of the language (like, say, the name), please try to realize that C++ is still one of the most widely-used languages in the world -- for a reason. And if you find yourself about to post something of the form "why would they add [feature X]...python/ruby/haskell/blub has had that for years", please stop and take a deep breath. Realize that it takes a long time to add features to a language that has to be standardized by committee and deployed to as many platforms as C++ has today.

You may like ruby/python/blub better, but that's a bit like saying "I like my swiss army knife better than my chainsaw". Your swiss army knife will do fine in a lot of different situations, but you're not going to cut down many trees with it. Nor would you try to open a package with a chainsaw.


When you say "C++ is still one of the most widely-used languages in the world", you're implicitly lumping C in with it. C/C++ is certainly one of the most widely-used languages, but there are whole huge categories of products that are almost never written in C++.

Some of us appreciate bare-metal systems programming, but loathe C++.


No I'm not. C++ is widely used in part because it's backwards-compatible with C, but it's not even remotely a stretch to say that C++ is one of the most widely-used languages in the world, on its own.


It's not a stretch to say that because you aren't actually saying anything. Leaving the distribution undefined as you did, we could also claim that Haskell is one of the most widely-used languages in the world: it's certainly used more than CASL, the scripting language I wrote at McAfee in 1998.

I'm not arguing that C++ isn't widely used. I'm taking exception to the "us and them" sentiment I picked up off your comment, that the world is divided between those who appreciate C++ and those who can't hack it outside of Python and Perl. There are plenty of systems programmers who hate C++.


Where did I say such a thing? My whole point was that C++ is just another tool, like a chainsaw.

You seem to be taking this very personally.


I'm sorry, I can see why you'd think I personalized it. I stand by the point I'm making, but apologize for the tone.


Let the ignorant C++ bashing begin...

Speaking as someone who's spent the last 6 years maintaining large C++ applications, I'd like to think that any C++ bashing I indulged in was at least informed C++ bashing... ;)


Similar story (10 years, many gripes). But I think I'd still be in grad school if I had to rely on the blazing execution speed of Python and Perl to do all of my work. I guess I have a soft spot for the language....


I'm not sure why I would ever dick around with C++ lambdas when I could instead just write in C and a "real" high level language, like Python, Lisp, or Ruby.


It really isn't an official YC C++ thread until tptacek posts a couple bitter anti-C++ comments. ;)


Do I sound bitter?


Comparing C plus Python to C++ doesn't make any sense. The comparison would be C plus Python vs C++ plus Python.

The C++ solution would most likely be shorter, more flexible, and more secure. Personally, I fail to understand why anyone would choose to write new C over C++.


The C++ version will absolutely not be more secure; we find just as many vulnerabilities in "modern" template-y C++ code as we do in straight up C. We know how to secure C code, even if we are bad at it. Nobody knows anything about C++ security.

http://www.matasano.com/log/914/c-a-cautionary-tale-or-1-hou...


(Speaking as one of those people who did choose to write C over C++...)

People choose C over C++ because C is a simpler language than C++, something which this new standard does nothing to address.


I have difficulty handling the complexity of typical C code due to the lack of abstraction. C always winds up with huge functions and lots of hard to grasp state up in your face. C++ gives you way more opportunities to hide away complexity.


C always winds up with huge functions and lots of hard to grasp state up in your face.

That's something I've never found to be the case. Most of my C functions are 5-15 lines long. Anything more than 20 lines long is a candidate for refactoring - the unit tests will tell you if you've broken anything.


The solution to complexity is to eliminate it, not to hide it. You've nailed the big problem with C++ right there.


I really don't know what that means...


Think of it like this: Would you rather have an operating system that has a bunch of little ads in the corner by the clock that are periodically "hidden" under a chevron, or would you rather that the crap wasn't there in the first place?


Some complexity is inherent to the problem and you can't get rid of it. You can only manage it through abstractions.


Any good programmer can write good abstraction in any language--including C.

If your functions are huge, you should rethink your design, regardless of the language.


Is this really true? I've never really felt that C++ was buying me a lot of brevity with its features. It was more a subject of manageability.


Modern C++ almost eliminates manual memory management. STL along with tr1::shared_ptr makes memory management almost as easy as with scripting languages. This alone trounces C.


Pass a shared pointer to a library that doesn't know how to handle it (read: almost any third-party library). Now you've got two pointers to the same block of memory under two different memory management regimes. What could possibly go wrong?


You can't accidentally pass a shared pointer to a library function in C++. You'd get a compiler error. Then you'd pass the pointer within the smart pointer and make sure the case was handled appropriately. It's not hard and it's definitely still worth it to liberally use shared_ptr.


You can't "accidentally" do it; it's just that the only way to actually do it in practice is fundamentally unsafe. "It's not hard"? You know what else isn't hard? Making sure you don't treat an integer as signed in one place and unsigned in another. There's $1.5Bn USD of "not hard", right there.


>You can't accidentally pass a shared pointer to a library function in C++.

Spoken like someone who has never used a template function.


There's no law that states that you have to litter your code with calls to malloc()/free() when writing C. If reference counting won't do, there's more than one garbage collector freely available, and they're not all that hard to implement from first principles if the mood takes you.


Okay, so now you're writing your own memory management. How much work are you willing to do to avoid a language which gives you the feature for free?


If you're peppering your code with malloc/free, it has nothing to do with not "writing your own memory management"--your code is just badly designed to begin with.


It was an interesting exercise. (It gratified my intellectual curiosity, you might say...)

How do you suppose memory management works in high-level languages? Magic?


No. Everyone knows that there are little memory gnomes that run around your system, sweeping up bits.


This is exactly what you really do get in garbage-collected programs.


Dude...seriously: there are garbage collectors for C++. They're not exclusive to C!

Moreover, the C++ garbage collectors are pretty darned easy to use, given that the language builds in support for custom allocators, overriding new/delete, etc.


The fact that it is trivially simple to garbage collect both C and C++ code neutralizes the "automatic memory management" advantage, dubious as it was, from C++.


Not really. One reason that you infrequently see C++ programs using garbage collection, is because the language includes some nice tools that mitigate many of the pain points that drive people to use GC in the first place.

If you're using C++ as its own language, you get many of the nice things that come with memory management in dynamic languages, but you don't have to pay for it at runtime (the inclusion of shared_ptr in C++0x is a great example of this, actually). If you merely use C++ as "C with classes" (I'm not saying that you do this...just generally), then you don't see this benefit of the language.


It's weird that you think you don't pay for the C++ Standard Library's implementation of reference counting at runtime.


What are you talking about? auto_ptr? shared_ptr? You don't pay for these if you don't use them; the language gives you the option of trading safety for speed. That's what I meant when I said that you don't have to pay for the features.

In any case, the reference-counting used by the safe pointers is probably an order of magnitude less heavy than a garbage collector. Even if you use them, you aren't paying that much...


You don't think it's nitpicky to suggest that C++ gives you automatic memory management "for free", as long as you don't ever use it?

Garbage collection is not an "order of magnitude" more heavy than reference counting, and there's more than one axis to measure memory management on --- in fact, there are several axes for allocation transactions alone.

Just as importantly, if you go to Google Code Search and randomly select ten C++ projects that (a) have more than 10,000 lines of ".cpp" and ".i" file code and (b) use shared_ptr, I'm betting you're going to find that 8 of them rewrote some of the memory allocator for performance reasons. All the fancy template BS in the world doesn't save you from the fact that "new" is just "malloc", and there's no one allocator design that works for every program.


I don't know, but when I want nice memory management in C code (and usually, if I'm writing something in C, it's at least partly because I care a lot about how memory is managed), I just use Boehm GC.


Okay, so you've introduce a third-party dependency. Again, how much work are you willing to do to duplicate a language feature that C++ gives you for free?


Oh give me a break. Until the new standard hit, practically everything that made C++ bearable was a third-party dep on Boost. And the difference between Boost and Boehm GC is, I don't have to read documentation or even change my code to use Boehm.

"Avoiding third party deps". The dev team that embraced that standard sure sounds like a blast to work on. The wheel I can't wait to reinvent? Zlib.


> practically everything that made C++ bearable was a third-party dep on Boost

For what it's worth, pretty much all good C++ devs I know are getting nauseous at any mentioning of Boost .. with "passionate hatred" being a more accurate description. Very professional (older) crowd, responsible for some very notable and large-scale projects.

In general, C++ is too feature-full to ensure any sort of consistency of coding and design styles between any two C++ developers. Some use it as beefed-up C, others strictly as an OO language. The former will never be pleased with the code produced by the latter, and vice versa. Boost only amplifies these differences, so it's really not a big surprise that it is despised in certain C++ circles.


Comparing a third party GC library to TR1 stuff supported by many compiler vendors and vetted by the committee is a bit of a stretch.


Comparing the most famous garbage collection library of all time to the output of a standards body is not a stretch, and now we're into "some third party deps are good --- the ones that support my argument --- and some are bad --- the ones that support yours". I'm happy to consider us stalemated and move on.


> I'm happy to consider us stalemated

Somehow I doubt it.


One word:

"Greenspun"


The new standard basically has nice improvements over the current one. I use C++ occasionally, when I need to write low-level code and when I think the STL will save me time over plain C. Unfortunately, the new standard won't be too useful for a while.

When the working group finalizes the standard, we get to wait for the compiler implementations to catch up. For the first five years, C++ programmers will enjoy buggy, incomplete, and mutually incompatible implementations of half of the features in the new standard. During the following three years, compilers will even out in quality, and many of them will finally implement most of the features in the 2003 modifications of the 1998 standard. It'll take over a decade for C++ lambdas to become semi-portable and acceptable in many environments.

I really hope that, in 2018, I'll be able to write low-level code in a new and cooler language. It'll probably be a Lisp. :)


I really hope that, in 2018, I'll be able to write low-level code in a new and cooler language. It'll probably be a Lisp. :)

What you really meant to say is that you hope not to be writing in any language in 2018.


Could they have come up with a worse name than "C++0x" ?


It's a placeholder. The previous standard was C++98. The last C standard was C99. C++0x means "the C++ standard that will be finalized between 2000 and 2009."


"C++" isn't a particularly great name itself, apparently they wanted to continue the tradition


Can someone actually explain what it means? I'm not a professional programmer, but I haven't seen notation like that.


Unfortunately, C++0xDeadBeef didn't work for vegetarian members of the committee.


Simple solution: Use C++3735928559 instead.


Is it just me, or do a couple of these features (automatic type deduction, lambda expressions) seem to be borrowed from C#? And similar things are in languages like Ruby and Python.

I'm not saying that's a bad thing (or the reverse), and not saying that C# invented these ideas: I'm fairly sure that the C# designers only let in features that had already been road-tested in other, more experimental languages.

Just interesting which way around the ideas are flowing.


I had heard about automatic type deduction in C++ even before it was implemented in C#. The problem is that it takes several years to get a C++ standard approved, while the C# team can just dream up a feature and start implementing it the next day.


Writing C++, I've found myself reimplementing C# classes all the time. When doing Win32 development, C++ looks like a waste of time. If I want something to run on Windows only, I can produce the same result with both languages. If I want to create a multi-platform application, I'll use Java or Python.

C++ isn't going to die - you've got too much code out there - but its place is starting to fade away.


I'd disagree with "the next day" part. C#3 (with automatic type deduction and lambdas) had a beta period of about a year, to make sure that all the new features played nicely with each other.


They've been standardizing C++0x for at least three years now.


As per the Bagley Shootout (http://dada.perl.it/shootout/craps.html):

C++ score = 27.63, Python score = 27.0


All hail the bloat-lords.

PS: Why are the additions of nullptr and strongly typed enums a good thing? And doesn't C++ already have NULL from C?


C's NULL sucks because, in C++, it's typically just an integer constant (usually 0), and can thus be "lost" or confused with other integer values.

Having a distinct null which can only be arrived at by typing the characters "nullptr" is a good thing, imo.


Can you tell a story about how NULL bit you in the ass in real code?


I remember one early version of a C++ compiler which had NULL #defined as ((void *)0), which caused all manner of fun trying to compile some legacy C code.

This was in 1999, I think.


You mean like this?

    #define NULL __DARWIN_NULL
    ...
    #define __DARWIN_NULL ((void *)0)


Something like that. It makes the following invalid in C++ but valid in C:

  int *ptr = NULL;


Except for the fact that the real definition is actually more like:

    #ifdef __cplusplus
    #define __DARWIN_NULL __null
    #else /* ! __cplusplus */
    #define __DARWIN_NULL ((void *)0)
    #endif /* __cplusplus */

So the code you mentioned works just fine in C++.


Except that the actual compiler I was using was for the PlayStation 2, so I very much doubt it had that precise code you just outlined, and it really did have NULL defined as ((void *)0), and it really did break. I saw it happen.


Isn't the right solution to make C NULLs a unique value, or at least get rid of them now that a better replacement is here? Seriously, having 5 different ways of doing the same thing is how Perl became a hellhole for anything more than 10 lines.



