Additional C/C++ Tooling (nickdesaulniers.github.io)
98 points by ScottWRobinson on July 24, 2015 | 67 comments



Calling Valgrind a "memory leak detector" totally underplays its usefulness. But for some reason I've never worked out, a lot of people think this is all it does.

The main things that Memcheck, the default Valgrind tool, does are:

1. Detect accesses to inaccessible memory.

2. Detect dangerous uses of undefined values.

3. Detect memory leaks.

Memory leaks are arguably the least interesting of these three things.

ASAN can do 1 and 3, but cannot do 2. That's why Mozilla runs both Valgrind and ASAN on Firefox test automation.

There's a tool related to ASAN called MSAN that attempts to do 2, but 2 is really hard to do with static instrumentation so MSAN doesn't get much real-world use.
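
A minimal sketch of a category-2 bug, assuming a hypothetical test.c: Memcheck flags the branch ("Conditional jump or move depends on uninitialised value(s)") while ASAN stays silent:

    #include <stdio.h>

    int main(void) {
        int x;          /* never initialized */
        if (x > 0)      /* branch depends on an undefined value */
            puts("positive");
        return 0;
    }

    $ cc -g test.c -o test && valgrind ./test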


And then there are the valgrind tools other than memcheck: cachegrind (cache and branch-prediction profiler), callgrind (call-graph profiler), helgrind (thread error detector), massif (heap profiler), and probably more I forget.
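
For reference, each of those is selected with --tool= (assuming some binary ./myprog):

    $ valgrind --tool=cachegrind ./myprog   # cache/branch stats -> cachegrind.out.<pid>
    $ cg_annotate cachegrind.out.<pid>      # per-line annotation
    $ valgrind --tool=callgrind ./myprog    # call-graph profile
    $ valgrind --tool=helgrind ./myprog     # races, lock-order violations
    $ valgrind --tool=massif ./myprog       # heap profile, view with ms_print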


Some projects use custom allocators which detect memory leaks, so it's even less useful there.


Our team has seen a decent compile-time improvement by switching from gmake to ninja [1].

[1] https://martine.github.io/ninja/manual.html
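
A hand-written build.ninja for two hypothetical sources gives the flavor, though in practice most people generate one (e.g. with CMake's -G Ninja):

    rule cc
      command = cc -MMD -MF $out.d -c $in -o $out
      depfile = $out.d

    rule link
      command = cc $in -o $out

    build foo.o: cc foo.c
    build bar.o: cc bar.c
    build app: link foo.o bar.o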


My experience has been very positive for incremental builds (you have built a large piece of software once, change one file, and need to rebuild) but shows no difference on a fresh build.


After a certain point, you can't get around compiler processing time.


Ninja's magic happens before the compiler runs: it decides what needs to be rebuilt more quickly than gmake does.


That's what I meant. If you have to rebuild everything, then neither ninja nor gmake makes anything faster, since you're bound by the compiler's speed at that point.


Well, there is still the amount of parallelism that can be exposed. It just seems that CMake exposes it as well for make as for ninja, I guess.
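
One concrete difference, for what it's worth: gmake only parallelizes when asked, while ninja picks a parallel job count (roughly cores + 2) by default:

    $ make -j8    # parallelism must be requested explicitly
    $ ninja       # parallel by default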


Yup, exactly what we use it for. Combined with Jenkins, it makes it easy to make sure your master branch is in a good state.


If you mean as a backend to CMake, then it's entirely the fault of CMake's braindead-ness.


Explain. You are saying CMake is bad with gmake but good with ninja?


No build system thus far reaches the elegance of redo, though.


While simple, OS X's "leaks" is truly formidable because you can call it while a program that was compiled without any special flags is running. For example, the Redis testing framework runs "leaks" every time a unit finishes executing, warning the user if a leak is found. When this happens I resort to valgrind to find the cause, but being able to call "leaks" whenever you want during development and have it run in a few seconds is a killer feature. I miss it when I develop on Linux.
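
For those who haven't used it, leaks takes a pid or a partial process name and needs no instrumentation, e.g. against a running redis-server:

    $ ./redis-server &
    $ leaks redis-server    # or: leaks <pid>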


Here you are: http://www.opensource.apple.com/source/SecurityTool/Security... http://www.opensource.apple.com/source/SecurityTool/Security...

After reading the source, I think that you must put the binary in `/usr/bin/leaks`.


If you work at a company that can afford Coverity (static analyzer), it is great. Super simple to integrate into a standard gcc build.


I have not tried this yet, but Coverity provides a free online tool for open source projects. It will be a nice complement to travis/coveralls and other CI tools available for free to open source projects.

I'll definitely try it out once I get some time to work on infrastructure things on my own open source projects.

https://scan.coverity.com/


I'd love to see a comparison of Coverity vs. Clang sanitizers.

I wish we could buy just a couple Coverity licenses, but they want to license the entire organization for something like $200,000.


When Daniel Haxx ran Coverity and clang-analyzer against the curl source code, he found that each "tool detected several issues the other didn't spot."

Coverity – very accurate reports and few false positives

clang-analyzer – awesome reports, missed slightly too many issues and reported slightly too many false positives [1]

You'll find similar results in PVS-Studio vs clang [2] and PVS-Studio vs Coverity [3]. If code quality is important to your organization, then one tool is probably not enough; see the sketch after the links for the kind of bug these tools catch. If it's vital, you may want to consider another language such as Haskell or OCaml.

[1] http://daniel.haxx.se/blog/2012/07/12/three-static-code-anal...

[2] http://www.viva64.com/en/b/0272/

[3] http://www.viva64.com/en/a/0076/
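
For a flavor of the bugs these tools catch, here is the classic unchecked-malloc pattern, with clang-analyzer's scan-build wrapper run over a hypothetical make-based project:

    #include <stdlib.h>

    int main(void) {
        int *p = malloc(sizeof *p);
        *p = 42;            /* analyzer: dereference of possibly-NULL 'p' */
        free(p);
        return 0;
    }

    $ scan-build make       # wraps the compiler, emits an HTML report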


Just out of curiosity, how big is your org? I've been tempted to suggest Coverity a few times but I always balk at buying tools where the price is listed as $CallSalesAndNegotiate.


You lost me when you suggested cmake as a build tool. Its principal killer feature seems to be that it abstracts cross platform building. Unfortunately the price you pay for that is eventual insanity, an unreasoning desire to DESTROY EVERYTHING.

Stay away from cmake.


Seven years ago I helped move a fairly complicated commercial codebase from automake to cmake. In the process, I wrote thousands of lines of complicated cmake macros to interface with the packaging system, etc.

The end result was a love/hate relationship with cmake.

Clearly, the weakness is its hokey scripting language. The shame is that if they'd just started with something halfway reasonable like Lua, it would have been easier to build the tool and it would be far more powerful. Instead, they couldn't resist writing their own DSL where everything you need to do is possible, but just barely. I've done so many zany hacks over the years to force it to do what I needed.

Granted, I did the major porting work to cmake 2.4. The newer versions are certainly better (no more duplicating the expression in IF()/ELSE()/ENDIF() statements!) but it's still pretty painful to write something non-trivial compared to any modern scripting language.

However, the results of cmake are fantastic. Once you've got your CMakeLists.txt's working you can largely forget about it -- it'll just work, completely cross platform. If you need to target both UNIX compilers and the Visual Studio world it's probably the best solution.


Having spent moderate to significant time with bjam, waf, qmake, make and cmake, I more or less settled on cmake because:

A. It's what everyone else is using.

B. It's simple to do common things.

C. It's pretty darn fast with the Ninja backend.

The only downside seems to be that the cmake language is ugly and limited. Fortunately, I only need to write a tiny amount of cmake code to build most of my projects, since find_package is so good; see the sketch after this comment for how little is needed. (Though, it probably sucks to be the guys making that work.)

To be honest, every tool I've tried has kind of sucked. cmake just sucks the least. One day I'd like to investigate tup and shake, though.
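
To the parent's points B and C, a sketch of how little CMake a simple project needs (zlib as a stand-in dependency):

    cmake_minimum_required(VERSION 2.8)
    project(myapp C)

    find_package(ZLIB REQUIRED)

    add_executable(myapp main.c)
    include_directories(${ZLIB_INCLUDE_DIRS})
    target_link_libraries(myapp ${ZLIB_LIBRARIES})

    $ mkdir build && cd build
    $ cmake -G Ninja .. && ninja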


No.

cmake has won.

The other build tools tried (scons, premake, autotools), and they failed.

You don't like the syntax? Cry me a river.

It does the job. None of the other ones do.

Pragmatism > idealism. I don't care how pretty your build files are, if they don't actually work.

/shrug.


I have to agree honestly. I've tried many of the build systems out there, and I have settled on tup[1] as being one of the greatest creations ever. The Tupfiles are incredibly simple, the execution time is bloody fast, and it is more capable of creating easily-reproducible builds than anything else I've seen so far. I'd recommend you give it a look if you're not familiar with it.

[1] http://gittup.org/tup/
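
For comparison, a complete Tupfile for a small C program is a couple of rules (%f/%o are tup's input/output placeholders, %B the input basename):

    : foreach *.c |> cc -c %f -o %o |> %B.o
    : *.o |> cc %f -o %o |> app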


CMake works well if you're building typical user-space applications or libraries with dependencies on popular libraries that CMake can find or that follow typical conventions. It completely falls apart if you need to do something unusual or automate non-build tasks with it. E.g. building bootable kernel images (where you need a linker script, etc.) with CMake is quite painful (I've done it, will not do it again).

The best feature of CMake for me has been easy cross compiling so I can develop and remote debug Windows apps on my Linux box.


The thing I hate about CMake is that it can't generate relative directory structure projects. That's stupid and no other build system of significance has this limitation.


I don't know what you mean - can you explain this further? Because I've seen it do what I think you mean, but I guess I don't understand.


I will pray for your soul. :P


People often take it as common knowledge that Visual Studio barely supports C. As of VS2015 almost all of the C99 standard is implemented, with a few corners missing, like tgmath. I am interested to see if this allows more C libraries to be built with VS rather than relying on MinGW and other Unix based alternatives.


Indeed, on my post about "Designated Initialization With Compound Literals in C" [0], someone commented "cool, but I use MSVC so..." [1]. Note: that was two years ago; I'm sure that's no longer the case.

I'm sure IDEs have great features; I wanted to avoid them in the article; they are simply not for me.

[0] http://nickdesaulniers.github.io/blog/2013/07/25/designated-... [1] http://nickdesaulniers.github.io/blog/2013/07/25/designated-...
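
For context, the C99 feature in question, which MSVC long rejected (a minimal sketch):

    struct point { int x, y; };

    void move(struct point p);

    void demo(void) {
        /* compound literal with designated initializers (C99) */
        move((struct point){ .x = 1, .y = 2 });
    }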


It may increase a bit. In fact, the recently started C-based project Handmade Hero is done within Visual Studio, so maybe that will spark some interest. The real advantage of MinGW and other open-source tools is that you can use the same compiler on both Windows and Linux. With something like MSYS you can even get away with using a single Makefile. Also, because C is a small language, many professional C developers are unlikely to use an IDE and so would more likely use the Windows SDK (assuming that still ships with the command-line compiler, which I'm not sure about).


Doesn't Visual Studio support Clang as an official alternative to MS' compilers, now?


I think the answer is yes and no. You can get clang to compile standard C/C++, but not any Windows code, as Microsoft has many compiler extensions and quirks that Windows code depends on.


If not now, when? I suspect this is a goal of Google's.


Mostly because that is required by the C++11/14 standard revisions.


> What did I miss?

Perhaps a tool to generate header files, so we don't have to write every function declaration twice?


I use makeheaders: http://www.hwaci.com/sw/mkhdr/

It's pretty amazing.

If you can't use makeheaders, I prefer the Plan 9 include style where header files are not allowed to include header files: http://doc.cat-v.org/bell_labs/pikestyle
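
For the curious: makeheaders scans a .c file and emits the matching header, so declarations are never written by hand. A sketch with a hypothetical foo.c:

    /* foo.c */
    int add(int a, int b) { return a + b; }

    $ makeheaders foo.c    # writes foo.h declaring: int add(int a,int b);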


Interesting. Do you know if it works with C++14?


My educated guess: not a chance. It's probably not going to grok C++ member declaration syntax, templates, namespaces, and so on.


I don't use C++, so I have no idea.


As annoying as it is, especially when updating function signatures, I kinda enjoy having separate header files. It clearly defines the public vs. private interface(s), it's a place to put your docstring comments, and it's a single file that can provide a quick glance at how a module works.

Although it is much less practical in C++ than in plain C, since you have to put whole class definitions (including private member functions) in a header.

A good editor/tags system/ide should support jumping between definition and declaration quickly, this definitely makes it a bit less painful (e.g. vim + ctags + cscope work fine).
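
The public/private split described above, sketched as a hypothetical stack module:

    /* stack.h -- public interface; docstring comments live here */
    typedef struct stack stack;    /* opaque: layout stays private */
    stack *stack_new(void);
    void   stack_push(stack *s, int v);

    /* stack.c -- private details hidden from users of the header */
    struct stack { int *data; int len, cap; };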


Such as? Take my money!


I design my programs by writing the header files first.


This approach makes me happy. Interface is key.


Someone should make a cargo clone for building C/C++ code. I don't care if it makes everything statically linked, that is good for me. :)


I was playing around with this idea [0], much to the dismay of colleagues, until I discovered that CMake's ExternalProject module could do everything my "package manager" could do. I was pretty close to making the package manager self hosting, too...

I think it doesn't actually matter whether dependencies are statically linked or dynamically linked, just as long as they are not installed globally and shared between non-homogeneous processes.

[0] https://github.com/nickdesaulniers/picodeps
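
For anyone curious what that looks like, a minimal ExternalProject sketch (hypothetical dependency URL):

    include(ExternalProject)
    ExternalProject_Add(mylib
      GIT_REPOSITORY https://example.com/mylib.git
      PREFIX ${CMAKE_BINARY_DIR}/deps
      CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=<INSTALL_DIR>)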


Apparently, biicode is exactly that.

I only played around a little with the examples. So far, so good.


Note that valgrind trunk now supports OS X 10.10 and 10.11 and it's completely trivial to build.


Right, should have tried that! I bent over backwards building some of the other tools from source!

See also: https://www.reddit.com/r/programming/comments/3egclc/additio...


Death to cmake. Long live premake (5)


Death to cmake. Long live make.


You're kidding me. Have you tried to manage a cross platform build using make?


Tell me more.


Apparently it's a build tool that uses Lua rather than some bastardized config language (like CMake's), and is otherwise somewhat inspired by CMake:

https://github.com/premake/premake-core/wiki/What_Is_Premake

https://github.com/premake/premake-core

There is apparently some alpha-level work on supporting ninja as a backend: https://github.com/jimon/premake-ninja
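
A minimal premake5.lua sketch shows the appeal; it's plain Lua:

    workspace "HelloWorld"
       configurations { "Debug", "Release" }

    project "HelloWorld"
       kind "ConsoleApp"
       language "C"
       files { "**.h", "**.c" }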


tup looks even better.


Death to cmake. Long live automake.


Gradle also supports C/C++


Getting your code to pass valgrind's exam is pretty essential. The only problem I have with it is that it doesn't support all syscalls.


Are there any comparisons of:

1) bazel VS. gradle

2) cmake+ninja VS. tup VS. redo


The term "C/C++" is a Tool of the Devil. Each has its many endearing qualities but it is inappropriate to put them together like that.

I often see job board posts and get mail from recruiters seeking coders with experience in "C/C++". Consider that Linus Torvalds and Richard Stallman would be unqualified for C++ work. I myself have done so much C++ that I am not particularly good at C anymore.

If you use C++ as "A Better C" or "C With Objects" you will never get the bugs out of your code. C++ wants to be written a certain way which is quite alien to most C practices.


>The term "C/C++" is a Tool of the Devil. Each has its many endearing qualities but it is inappropriate to put them together like that.

I do understand the common complaint about that but in the author's type of article, it's fine to lump them together. He's talking about "tools" for runtime analysis, formatters, etc. The tools he's talking about handle both languages.

His current url for the article is:

.../blog/2015/07/23/additional-c-slash-c-plus-plus-tooling/

What would be a solution to satisfy purists and always separate C from C++? Should the author create 2 separate urls?

C tooling url would be:

.../blog/2015/07/23/additional-c-tooling/

C++ tooling url would be:

.../blog/2015/07/23/additional-c-plus-plus-tooling/

And both webpages would be 99% the same and running a diff only shows that string "C language" is replaced with "C++ language". Instead of that convoluted redundancy, it's quite reasonable in this specific type of article to lump "C/C++" together.


But the tools _do_ change. E.g.

- Development Environments

High-level C++ is rather verbose, so it is often developed in connection with an IDE, e.g. to jump to a class definition. C works much better with established tools like grep and co., since it doesn't have function overloading.

- Debugging

C++ is for me the hardest language to debug, as there is so much you have to keep in your head. Due to name mangling, symbols also get weird names in the debugger; see the nm sketch after this comment.

- FFI

C is the established high-quality FFI approach and popular languages have very good _C_ FFI support: Python, Haskell, etc. In particular it is simpler for language designers to support C.

It is similar to the Java/JavaScript debate, where one might argue that both are interpreted languages.
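
To make the name-mangling point concrete, the same function exported from a C++ build vs. a C build (hypothetical libraries, output abridged):

    $ nm libfoo.so | grep frob    # C++
    0000000000001129 T _Z4frobi   # frob(int), Itanium-ABI mangled
    $ nm libbar.so | grep frob    # C
    0000000000001129 T frob       # plain symbol: easy to grep, easy to FFI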


>But the tools _do_ change.

Yes but that's not relevant in this thread. Please try to follow the limited context of this discussion. We are not talking about the set of ALL tools of which some might be specific for C and others might be specific to C++.

Instead, we were discussing Nick Desaulniers' specific article and the particular tools he's describing in his article.


The OP is about tools to build, develop and debug C and C++. None of the mentioned tools apply to one and not the other. Why would you want the title not to mention both?


I agree: C and C++, despite many similarities and close ties in their family trees, are quite different. I did not make that point in the article, as it's irrelevant to a piece on tooling that works for both.


You can use a C++ compiler to do "C with objects" fine. Lots of people successfully do that. There's a specification for doing this, in fact, which is adhered to by embedded developers. It's better to use C++ more fully, of course.



