
There are a lot of objections to C++, but I've never seen any of the five he mentions. They sound more like strawmen to knock down than real objections by actual users or commentators critical of the language.



Objections from the article.

1. “To understand C++, you must first learn C”

2. “C++ is an Object-Oriented Language”

3. “For reliable software, you need Garbage Collection”

4. “For efficiency, you must write low-level code”

5. “C++ is for large, complicated, programs only”

1. Not a myth. To write your own C++ code, no. To do anything with C++ other people wrote... Yes, you must first learn C.

2. Weird one, I don't remember ever hearing this one. But it'd be a myth on many levels. C++ has some object-oriented features, but it's indeed not an object-oriented language in the original Smalltalk sense of the concept. If you put in the effort, though, it can be used as one. It can also be used as a procedural language, or even as a functional one to some extent.

3. Real myth detected! GC doesn't affect reliability. Manual memory management reduces reliability. In C++, you can do manual or RAII-style memory management.

4. This one is not a myth. To get the best possible efficiency, you do need to use assembler or compiler intrinsics. The gain can be up to 40x (in my personal experience; that case was about vectorization and branch elimination), typically 2-3x, so it's not insignificant either.

5. Myth. You can use C++ for programs of any size. You can write simple C++ programs, but the language tries very hard to make sure you can't understand a piece of code out of context. You can't even know what your arithmetic operators do without knowing the includes and other context.


> the language tries very hard to make sure you can't understand a piece of code out of context

The language goes out of its way to make it possible to write code that is understandable with as little context as possible: type deduction, lambdas, and range-based for loops are all tools for reducing the amount of context, or the number of contexts, necessary to understand a portion of code.

The language also allows you to write code that requires an inordinate amount of context to understand, yes. So does C. Hell, so does every other programming language worth using. It's practically impossible to prevent such code from being written without totally crippling the language.

> You can't even know what your arithmetic operators do without knowing the includes and other context.

If you can see that the arguments to your arithmetic operators are numeric (and you should be able to tell without looking outside of function scope), you need no more context. If the arguments are of user-defined types, you need no more context than if a function named "plus" had been used instead of the function named "operator+." If that is not enough context to understand the code, you have a problem with someone choosing poor names for functions, which goes far beyond operator overloading. For example, sure, operator overloading allows for code like

    auto operator+ (string_t a, string_t b) {
        return a.length() - b.length();
    }
but so what? User-defined functions allow for code like

    void uppercaseify (char* str) {
        unlink (str);
    }
but it would be patently absurd to blame the language for code like that.


I guess you don't maintain [legacy] C++ code. Occasionally I have to.


I have maintained C++ code that was poorly written, if that's what you're getting at, but I have yet to find myself maintaining C++ code that has been poorly written as a result of the design of the language.


That's unfortunately something you can say about any language.


I've encountered each of these myths "in the wild," and I dare say you'll see them thrown up repeatedly here on HN in various contexts.

The third item on that list especially is something you can almost certainly see every day in a random HN thread about programming languages (along with counterpoints).

In fact, 'safety', of which GC is a subset, is a huge objection, and one of the first, most frequently raised in my experience. People are afraid of pointers, some rightly (e.g. because they've personally experienced bad memory management code, or they've never worked outside of a dynamic language, etc.), but most not.


Note that one can be against a programming language feature without being afraid of it. I think pointers are a bad feature in a programming language because they're not necessary. It's rather unfortunate that Go, for example, supports pointers (which is of course not surprising when you consider who designed it).


It's hard to deal with low-level code without resorting to pointers in some way, though. I can't think of a low-level language that doesn't use them, actually.


That's true, although you can always use pointers without pointers being a language-level feature. That said, if the programming language is designed for low-level code, then pointers are beneficial in my opinion. But is Go really designed to be a low-level language? I don't think so, and if it is, I will stay away from it if I can.


Go is middle-level: lower-level than scripting languages, higher-level than C++. You have control over memory layout, which lets you use less memory than a language that boxes everything (like Java or Python). You can use linear structs, or references to data, whatever fits your problem best. But to do that, you need pointers.

However, Go pointers can't be cast to another type and you can't do pointer arithmetic on them. In this way, they are very close to Java or Python references, except they're made explicit. How can these kinds of pointers be any worse than references?


> How can these kinds of pointers be any worse than references?

Because references are easier to use.


I get your point, although I think references are not that much easier; if one doesn't think about what's happening underneath when using references, nasty bugs can occur.


You're correct, of course, and I should probably have not used that loaded term.


The article does not claim to be refuting objections, but debunking myths.


Congratulations! Your accusation that Stroustrup is attacking strawmen is itself a strawman.



