While I mostly agree with your primary point --- that C and C++ are extremely prone to memory corruption vulnerabilities --- I think there's an important distinction you're glossing over here, between C and C++/ObjC.
Both C++ and ObjC have a string class and containers in the standard library, and support for some automatic memory management in the language. This turns out to make a big difference in practice in reducing those vulnerabilities. There are people in this thread claiming that they do as good a job of reducing them as Java or Ruby or Python. I can't really evaluate that claim, but it seems plausible to me. Barely.
* std::string (or NSMutableString) eliminates the libc strxxx() vulnerabilities --- iff you use them exclusively. But lots of C++ code (and, especially, ObjC code) drops to char-star strings routinely (the first sketch after this list shows what happens at that boundary).
* Most C++ code still uses u_char-star for binary blobs. ObjC has (to its credit) NSMutableData, but there's still u_char-star handling code there too (I also feel like --- but can't back up with evidence --- ObjC code is more likely to call out to C libraries like zlib). The second sketch below shows the out-of-band-length problem that comes with raw blobs.
* Both C++ and ObjC have error-prone "automatic" memory management: shared_ptr and retain/release, respectively. shared_ptr is risky because every place it comes into contact with uncounted pointers has to be accounted for; retain/release is risky because it's "manu-matic" and easy to make mistakes with. In both cases you can end up with memory that has been released while pointers to it are still held, which is morally equivalent to a heap overflow. The third sketch below shows the shared_ptr version of that mistake.
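To make the first bullet concrete, here's a minimal C++ sketch (the function and buffer size are invented for illustration) of how std::string code reintroduces the classic overflow the moment it crosses back into char-star land:

    #include <cstring>
    #include <string>

    void log_user(const std::string& name) {
        char buf[32];
        // std::string itself can't overflow, but handing its contents to a
        // fixed-size C buffer brings the old strcpy() bug straight back:
        std::strcpy(buf, name.c_str());  // overflows when name.size() >= 32
        // ... buf then gets passed to some legacy C API ...
    }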
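The u_char-star blob problem from the second bullet, again with a made-up parser: the length travels out of band from the storage, so nothing keeps the two consistent.

    #include <cstddef>
    #include <cstring>

    typedef unsigned char u_char;

    // hdr_len comes from an untrusted header in this hypothetical parser.
    void parse_record(const u_char* pkt, size_t pkt_len, size_t hdr_len) {
        u_char body[256];
        // We validate against the packet length but not against our own
        // buffer, so a large-enough hdr_len smashes the stack:
        if (hdr_len <= pkt_len) {
            std::memcpy(body, pkt, hdr_len);  // overflows body when hdr_len > 256
        }
        // NSMutableData or std::vector<u_char> keeps length and storage together.
    }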
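And the shared_ptr failure mode from the third bullet looks like this (struct and names invented): the raw pointer escapes the reference count and outlives the object.

    #include <iostream>
    #include <memory>

    struct Session { int id = 42; };

    Session* g_current = nullptr;  // uncounted pointer, invisible to shared_ptr

    void remember(const std::shared_ptr<Session>& s) {
        g_current = s.get();        // escapes the reference count
    }

    int main() {
        {
            auto s = std::make_shared<Session>();
            remember(s);
        }                           // last shared_ptr dies here, Session is freed
        std::cout << g_current->id; // use-after-free: reads freed memory
    }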
No, I don't think C++ and ObjC do an equivalent job in reducing memory corruption flaws. The MRI Ruby interpreter has had memory corruption issues (it being a big C program itself), but Ruby programs never have memory corruption issues (except in the native C code they call into). C++ and ObjC programs routinely do.