Five Popular Myths about C++: Postscript (isocpp.org)
47 points by ingve on Dec 30, 2014 | hide | past | favorite | 38 comments



There are a lot of objections to C++, but I've never seen any of the five he mentions. They sound more like strawmen to knock down than real objections by actual users or commentators critical of the language.


Objections from the article.

1. “To understand C++, you must first learn C”

2. “C++ is an Object-Oriented Language”

3. “For reliable software, you need Garbage Collection”

4. “For efficiency, you must write low-level code”

5. “C++ is for large, complicated, programs only”

1. Not a myth. To write your own C++ code, no. But to do anything with C++ code other people wrote? Yes, you must first learn C.

2. Weird one, I don't remember ever hearing this one. But it'd be a myth on many levels. C++ has some object-oriented features, but it's indeed not an object-oriented language in the original sense of the concept from Smalltalk. But if you make the effort, it can be used as one. It can also be used as a procedural language, or even as a functional one to some extent.

3. Real myth detected! GC doesn't affect reliability. Manual memory management reduces reliability. In C++, you can do manual or RAII-style memory management.

4. This one is not a myth. To get the best possible efficiency, you do need to use assembler or compiler intrinsics. The difference you can gain is up to 40x (in my personal experience, in a case involving vectorization and branch elimination), and typically 2-3x, so it's not insignificant either.

5. Myth. You can use C++ for complicated programs of any size. You can write simple C++ programs, but the language tries very hard to make sure you can't understand a piece of code out of context. You can't even know what your arithmetic operators do without knowing the includes and other context.
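The RAII style mentioned in point 3 can be sketched minimally as follows (`FileHandle` is a hypothetical wrapper, not a standard type): the resource is acquired in the constructor and released in the destructor, so cleanup happens on every exit path, including exceptions.

```cpp
#include <cstdio>
#include <stdexcept>
#include <string>

// Hypothetical RAII wrapper around a C FILE*.
class FileHandle {
public:
    explicit FileHandle(const std::string& path)
        : f_(std::fopen(path.c_str(), "w")) {
        if (!f_) throw std::runtime_error("cannot open " + path);
    }
    // Destructor releases the resource automatically at scope exit.
    ~FileHandle() { if (f_) std::fclose(f_); }

    // Non-copyable: exactly one owner for the resource.
    FileHandle(const FileHandle&) = delete;
    FileHandle& operator=(const FileHandle&) = delete;

    std::FILE* get() const { return f_; }
private:
    std::FILE* f_;
};
```

No delete/fclose calls appear at the use site; the compiler inserts the cleanup for you.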


> the language tries very hard to make sure you can't understand a piece of code out of context

The language goes out of its way to make it possible to write code that is understandable with as little context as possible; type deduction, lambdas, range-based for loops— all of these are tools for reducing the amount of context or number of contexts necessary to understand a portion of code.
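A small sketch of what those features look like in practice (`sum_of_squares` is just an illustrative name):

```cpp
#include <vector>

// `auto` deduces types, the lambda keeps the operation next to its
// use, and the range-based for avoids iterator boilerplate -- all of
// which reduce the context needed to read the function.
int sum_of_squares(const std::vector<int>& xs) {
    auto square = [](int x) { return x * x; };  // local, self-contained
    int total = 0;
    for (auto x : xs)   // no iterator types to track
        total += square(x);
    return total;
}
```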

The language also allows you to write code that requires an inordinate amount of context to understand, yes. So does C. Hell, so does every other programming language worth using. It's practically impossible to prevent such code from being written without totally crippling the language.

> You can't even know what your arithmetic operators do without knowing the includes and other context.

If you can see that the arguments to your arithmetic operators are numeric (and you should be able to tell without looking outside of function scope), you need no more context. If the arguments are of user-defined types, you need no more context than if a function named "plus" had been used instead of the function named "operator+." If that is not enough context to understand the code, you have a problem with someone choosing poor names for functions, which goes far beyond operator overloading. For example, sure, operator overloading allows for code like

    auto operator+ (string_t a, string_t b) {
        return a.length () - b.length ();
    }
but so what? User-defined functions allow for code like

    void uppercaseify (char* str) {
        unlink (str);
    }
but it would be patently absurd to blame the language for code like that.


I guess you don't maintain [legacy] C++ code. Occasionally I have to.


I have maintained C++ code that was poorly written, if that's what you're getting at, but I have yet to find myself maintaining C++ code that has been poorly written as a result of the design of the language.


That's unfortunately something you can say about any language.


I've encountered each of these myths "in the wild," and I dare say you'll see them thrown up repeatedly here on HN in various contexts.

The third item on that list especially is something you can almost certainly see every day in a random HN thread about programming languages (along with counterpoints).

In fact, 'safety', of which GC is a subset, is a huge objection, and one of the first, most frequently raised in my experience. People are afraid of pointers, some rightly (e.g. because they've personally experienced bad memory management code, or they've never worked outside of a dynamic language, etc.), but most not.


Note that one can be against a programming language feature without being afraid of it. I think pointers are a bad feature in a programming language because they're not necessary. It's rather unfortunate that Go, for example, supports pointers (which is of course not surprising when you consider who designed it).


It's hard to deal with low-level code without resorting to pointers in some way, though. I can't think of a low-level language that doesn't use them, actually.


That's true, although you can always use pointers without pointers being a language-level feature. That said, if the programming language is designed for low-level code, then pointers are beneficial in my opinion. But is Go really designed to be a low-level language? I don't think so, and if it is, I will stay away from it if I can.


Go is middle-level, lower than scripting languages, higher than C++. You have control over memory layout, which lets you use less memory than a language that boxes everything (like Java or Python). You can use linear structs, or references to data, whatever fits your problem best. But, to do that, you need pointers.

However, Go pointers can't be cast to another type and you can't do pointer arithmetic. In this way, they are very close to Java or Python references, except they're made explicit. How can these kinds of pointers be any worse than references?


> How can these kinds of pointers be any worse than references?

Because references are easier to use.


I get your point, although I think references are not that much easier; if one doesn't think about what's happening underneath when using references, nasty bugs can occur.


You're correct, of course, and I should probably have not used that loaded term.


The article does not claim to be refuting objections, but debunking myths.


Congratulations! Your accusation that Stroustrup is attacking strawmen is itself a strawman.


> "Unfortunately, there is not a single place to go to look for C++ Libraries."

I don't understand this. I code in C++ every day at the moment (that's a story for another day), so I know the ecosystem, at least for scientific computing, pretty well. With all the changes and improvements being introduced, what's stopping a package manager from being added? The bane of my existence at the moment is the fact that I have this HUGE Boost dependency in my code, even if I'm only using a few packages. I know the Boost project has now finally migrated to GitHub and provides individual repositories for libraries. It just startles me that there isn't a single, consolidated package manager to have rolled out of the millions of work-hours that have gone into C++ projects.

It's possibly the single biggest criticism of the language that I felt wasn't tackled at all. There is a complete lack of consensus. In trying to accommodate everything and anything, C++ has in my experience overshot the mark.


Because a package manager for C++ is infinitely more complex than one for a bytecode language like Java or .NET, where there's either a well-defined spec (.NET) or a reference implementation (Java, Python) against which everyone measures compatibility?

The first obvious problem is that there is no ABI for the C++ standard library. This makes sharing any kind of binaries plain impossible, because exchanging C++ standard library objects between your project and a binary from the package manager will very likely just blow up. So the package manager, right off the bat, has to build every library and every (recursive) dependency of that library on demand, on your computer, using your C++ development environment. That alone is a herculean task if you consider the number of compilers (GCC (MinGW), Clang, VC, ICC), their respective versions, and the possible C++ standard library implementations used. And remember: you can't mix and match!

There is a reason even a C++ library such as ZMQ exposes only a C interface, and it's because things become extremely messy very quickly when you are juggling around native code.
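The usual shape of such a C interface can be sketched like this (`Counter` and the `counter_*` functions are hypothetical, not ZMQ's actual API): only plain C types and opaque handles cross the boundary, so the caller never touches the library's std::string ABI.

```cpp
#include <string>

// C++ implementation detail: never crosses the library boundary.
struct Counter {
    std::string name;
    int value;
};

// extern "C" surface: unmangled names, plain types, opaque pointers.
extern "C" {
    void* counter_new(const char* name) { return new Counter{name, 0}; }
    void  counter_inc(void* h)  { ++static_cast<Counter*>(h)->value; }
    int   counter_get(void* h)  { return static_cast<Counter*>(h)->value; }
    void  counter_free(void* h) { delete static_cast<Counter*>(h); }
}
```

The cost is exactly the messiness described above: every C++ type needs a hand-written flat wrapper, which is why so few libraries bother.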


  The bane of my existence at the moment is the fact that I
  have this HUGE Boost dependency in my code, even if I'm
  only using a few packages... It just startles me that
  there isn't a single, consolidated package manager...
Sincerely, I have never understood this common complaint about C++ (applies to C as well, actually). My OS comes with a huge repo of C++ libraries. For example, there is a separate package for each of the Boost libraries you used as your example. Why do I need a C++-specific package manager? What's the use case? Windows?


Windows. OSX (to some degree; Homebrew and MacPorts mitigate this). Linux if you want library versions newer than those supplied by the distro, if you want something that's not supplied by the distro, or if you want to be able to handle your build in a distro-independent manner (those "separate packages for each of the Boost libraries" you reference are provided by your distribution; Boost themselves only distribute one monolithic package from which you can, with some massaging, pull out individual parts if you really need to).

The language-specific package managers used with (for example) Python, Ruby, or Node give developers an OS-agnostic way to pull in dependencies and, since dependencies are packaged uniformly, avoid the conflicts between build systems that can make it hard for C++ libraries to coexist (this thing needs CMake, this other guy used Autoconf/Automake, and I've got Boost, which built with Bjam, and now I need to make it all work on Windows and they're all providing binaries built with different MSVC versions).


Do you really believe that a C++-specific package manager (PM) repo would somehow be much more up to date than an OS one? Why?

Any PM repo will eventually be missing something you need, and then you will have to compile it yourself, and if you've done that a few times you should have an idea of the difficulty involved in providing a set of compiled dependencies for one system, nevermind multiple versions of multiple systems.

I don't doubt the usefulness of the Python, Ruby, and Node PMs, but those libraries mostly depend only on the interpreter and other interpreted code. It's just so much simpler. The fact that it's all going through an interpreter (compiled specifically for that system by the system PM in most cases!) solves so many of the problems.


This.

I've been playing around with Cargo and the Julia package manager, and it's so nice to have a consolidated way of dealing with dependencies, including their build systems.

I've now resorted to setting up a template project for myself [1] that uses CMake's ExternalProject_Add to pull in dependencies, then configure and build them. ExternalProject_Add is so finicky though. How I'd love a packages.json solution for C++, where I can specify standard libraries or GitHub/Bitbucket repositories not provided by the distro/package manager.

I'd love to see a Homebrew fork morph into a cross-platform package manager for C++. Would make my life so much easier.

[1] https://github.com/kartikkumar/cpp-project

EDIT: Added missing reference


Pretty much. In particular, Boost is already packaged and available for clean dependency management on all Linux distros. And on Windows, you're almost always looking at shipping a statically built binary (or one with the DLLs packaged) anyway, so an external package manager would provide little value.

Really the existence of language-specific package management might be considered the bug, and not the feature. In all cases I've dealt with, the OS packaging is superior anyway.


I think it's fascinating how wildly differing peoples' experiences are with this. I find system package managers nice for system administrators and applications, but for development, every one I've used is far less convenient than a purpose-built package ecosystem.


Do you think every Node.js package should be released as an OS level package?


Using the packaged versions that come with the OS is very convenient if you can live with the versions you get. This means that you might have to update your application code when updating the operating system.

This can be fine for open source development, but it has never been acceptable at any closed source shop where I've worked. We wanted the freedom of not updating things in lockstep.


This is exactly why 'stable' or 'long-term-support' versions of OSes exist. They go out of their way not to break binary compatibility (by not upgrading libraries all the time) so that user code need not be recompiled.


That just means that forced updates are less frequent. The flipside of it is that you are now stuck with, potentially, ancient packages. No matter how you slice it, being tied to the release cycle of the OS has problems.

Don't get me wrong, there are definite advantages to sticking with OS-provided packages. I do it for all my personal projects, but it's not a panacea.


That is a tradeoff every software repo is faced with (stability vs. new features). Why would a C++-specific PM be able to do it better? Especially considering that it would have to provide compiled binaries for N versions of M OSes/distros (not to mention multiple CPU architectures).


A package manager is generally not part of the language per se and should not be part of the language, but rather part of the tooling ecosystem around it. Don't get me wrong: a de facto package manager is extremely important for any major language, and it's high time someone in the C++ community built and promoted one. It just doesn't belong in the spec.


I do know of at least one: Microsoft's NuGet now supports C++ packages.

Something tells me that isn't the answer folks are looking for.


I think that and the previous issue go hand in hand, contributing to each other:

> Unfortunately, C++ Libraries are often not designed for interoperability with other libraries

And I think it's not just that the libraries don't work well with each other, but they also only work with a subset of C++ projects since any given group of collaborating C++ programmers seems to settle on their own subset of the language to allow in their coding standards. Perhaps I am falling into the trap of thinking about old C++ now -- modern C++ does seem to have more of a broadly understood idiomatic style.


> what's stopping a package manager from being added? ... It just startles me that there isn't a single, consolidated package manager to have rolled out of the millions of work-hours that have gone into C++ projects.

> It's possibly the single biggest criticism of the language that I felt wasn't tackled at all.

As far as the standards committee goes: "creating tools" is outside the scope of "defining a language and its standard library." As far as the "millions of work-hours that have gone into C++ projects," it's probably a combination of nobody having come up with a package manager that satisfies enough programmers and nobody being willing to spend the extra time to create such a tool. It's hard enough to ship code on time.

> There is a complete lack of consensus. In trying to accommodate everything and anything, C++ has in my experience overshot the mark.

It's certainly true that projects that try to take full advantage of C++'s flexibility end up collapsing. Projects generally shouldn't allow "everything and anything" in the code base. And many of the design principles that the committee follows should probably not be applied to software projects in general.

But the language has to satisfy more than your project, or more than the average project that programmers believe exists but can't agree on what it is. I've been writing C++ code professionally for years, and while I know how to, I haven't shipped any GUI code. I did write a gtkmm-based stopwatch to help my son with an elementary school science project (because I thought it was more fun to do that than to buy a stopwatch at the local sporting goods store). But professionally I've only written code for back end web services. I know I'm not the average C++ programmer people imagine, but I'm not sure if the average C++ programmer is writing desktop GUI software, or iPhone software, or libraries that will be called from Java, C# or some other language via SWIG. I know they aren't the average C++ programmers, but I do know some people write C++ to run on supercomputers (I used to work with one of them).

Regarding a package manager specifically: I personally have been happy to use RPM on my personal Linux system to manage packages. I wouldn't expect people to use RPM on a BSD or Apple system. I'm not sure that we need a special package manager just for C++ libraries. I miss having anything like RPM when developing on Windows (edit: another comment mentions NuGet: I'll try it, thanks!). If you write something for Windows, I'll buy a copy.

My web services interact with efficient number crunching libraries written in C++. In my case, I don't need things to be as efficient as possible, but others do. I know Dan Bernstein believes the way to get super-efficient libraries is to use different implementations to take advantage of CPU-specific features ( http://www.tedunangst.com/flak/post/some-gripes-about-nacl ), but this makes package management much harder (at least without function multiversioning -- https://gcc.gnu.org/onlinedocs/gcc-4.9.2/gcc/Function-Multiv... -- which isn't required by the standard). Should the committee tell Bernstein he's wrong? Or encourage compiler writers to ship with tools that Bernstein and others have no use for? How should the package manager handle a case where a library could work on my platform, but nobody's compiled it for me yet? What about platforms where the ABI isn't completely nailed down yet?

You have identified a real shortcoming in C++ development. There isn't an easy answer. But, seriously, if you write one I'll be sure to send you some money.


Sometimes I just wish there were less ISO standardization process and more new features added to the language. I guess it's not happening because language extensions are evil and because maintaining a compiler is hard.

But still, I think it would be great to have modules right now as an unofficial Clang or GCC extension, even if it breaks later. I guess it has more chance of happening in the open source realm.


Manual memory management in C++ is so cumbersome that it is effectively impossible to develop programs with no memory leak bugs in real life.

Anyway, the more important fact about C++ is that it is a language complicated to no end, with an insanely bad design on purpose, so that a couple of guys can sell consulting services, write books, and the like.

The year 2015 is about to come; let's just stop filling the world with such primitive and ugly code. We have had saner alternatives to C++ forever.


    shared_ptr<int> a = make_shared<int>(10);

What's so hard about managing that? shared_ptr is basically equivalent to Python's reference-counted garbage collection.

shared_ptr was standardized in 2003, and Microsoft implemented CComPtr<> way back in the '90s for this easy-to-use reference-counted smart-pointer concept.

http://msdn.microsoft.com/en-us/library/aa266806%28v=vs.60%2...

So... yeah. Stop doing manual memory management in C++. Something better has existed... since the 90s.

So how about you criticize something in C++ that wasn't solved at least 16 years ago?


Correction: shared_ptr wasn't in C++98/03. It was originally developed in Boost, then added to TR1 (in 2005), and finally incorporated into C++11.

One of the major differences between shared_ptr and CComPtr is that shared_ptr is non-intrusive - it can manage anything, even things like ints that don't know anything about strong/weak refcounts.
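A minimal sketch of that non-intrusive design (`demo_use_count` is just an illustrative helper): the plain int below carries no refcount field of its own; the strong and weak counts live in shared_ptr's separately allocated control block.

```cpp
#include <memory>

// shared_ptr manages a type that knows nothing about refcounting.
long demo_use_count() {
    auto a = std::make_shared<int>(10); // strong count == 1
    auto b = a;                         // strong count == 2; int not copied
    std::weak_ptr<int> w = a;           // weak ref: doesn't keep it alive
    return w.expired() ? 0 : a.use_count();
}
```

An intrusive scheme like CComPtr instead requires the managed type to expose AddRef/Release itself, which is why it can't wrap an int.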


My history was off, thanks for the clarification.

Nonetheless, the concepts revolving around CComPtr and all that good stuff can easily be seen in C++98 / STL. STL strings, for instance, automatically clean themselves up under the principles of RAII.

auto_ptr<>, the first smart pointer, was also standardized in C++98. While crude and primitive by today's standards (especially compared to unique_ptr<>'s explicit rvalue semantics), auto_ptr<> shows me that even in the mid-'90s, people understood the principles of RAII.

At least, if you were a learned C++ programmer.





