Hacker News

Rust-style editions don't work with binary libraries, across compilers, or with template code shared across editions that have semantic differences.

That is why the epochs proposal was rejected.

Additionally, the main reason many of us, even passionate C++ users, reach for C++ instead of something else is backwards compatibility and the existing ecosystem.

When that is not required for the project at hand, we happily reach for C#, D, Java, Go, Rust, Zig, Swift, Odin, ... instead.



> don't work with binary libraries,

None of the edition changes that Rust has made have any effect on the ABI. Rust also has no stable ABI, so there has been no effort to formalize that, but 1) ABIs should always be versioned (to avoid getting stuck with a bad one), and 2) you can use editions for other kinds of changes in the meantime.

> across compilers,

This is almost tautological. Yes, having two C++ compilers agree on their support of editions is the same problem as having them agree on their support of concepts. I don't see how this is a critique of the strategy.

> template code across editions, with semantic differences.

Rust defines editions at the compilation-unit level: the crate. But it has macros (which are akin to templates), and editions are handled across that boundary (code in one edition can invoke macros defined in another) because the compiler tracks editions at the token level (the information is attached to each token's Span). There's no reason editions in C++ couldn't work with templates. You would have to specify the edition of the template, and given C++ semantics you might need an opt-in scope to say "use this edition here, overriding the rest of the file", but it would be possible.
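For concreteness (this example is mine, not from the thread): a crate declares its edition once in Cargo.toml (e.g. `edition = "2021"`), and the compiler applies that edition's rules only to that crate's tokens. A minimal sketch of one such edition-scoped rule, the Rust 2021 change to array IntoIterator:

```rust
fn main() {
    let arr = [1, 2, 3];
    // Under edition 2021, arrays implement IntoIterator by value, so this
    // yields i32 values. Under editions 2015/2018 the same method call
    // auto-referenced to the slice impl and yielded &i32 instead. Which
    // rule applies is decided per crate, at compile time, with no ABI
    // impact at all.
    let sum: i32 = arr.into_iter().sum();
    println!("{sum}");
}
```

Running this under edition 2021 prints 6; the point is that the resolution is fixed by the crate the call is written in, not by the crates it links against.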


Rust editions are very conservative: they expect everything to be built from source with the same compiler, and they mostly stick to grammar changes rather than semantic changes across versions.

For example, there is no story for the scenario where a callback defined in one edition is used in a crate on another edition, calling into code using yet another edition, while passing a closure with a type whose semantics change across all of those editions.
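For reference (my illustration, not from the thread), Rust does ship closure-related semantic changes scoped per edition; the 2021 "disjoint capture" rule is one. A minimal sketch of the behavior as compiled under edition 2021:

```rust
struct Pair {
    a: String,
    b: String,
}

fn main() {
    let p = Pair { a: "left".into(), b: "right".into() };
    // Edition 2021: the closure captures only the field it uses (p.a),
    // so p.b stays usable afterwards. Under edition 2018 the same
    // closure captured all of p, and the println of p.b below would
    // not compile. The rule that applies is fixed by the edition of
    // the crate the closure is *written* in, not the crate calling it.
    let take_a = move || p.a;
    println!("{}", p.b);
    println!("{}", take_a());
}
```

So Rust's answer to the cross-edition closure question is that capture semantics are resolved at the definition site; whether that scales to C++'s situation is a separate question.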

I am not yet convinced they will scale to the size of industry use cases for C and C++, with a plethora of compilers and several editions mixed in a 30-year-old codebase.


If Rust-style editions can work across modules, why couldn't they work across binary libraries and so forth? The whole point is to allow the language to progress while maintaining backward compatibility.


they don't work with a backward-compatible application binary interface

or more specifically, they only work with ABI stability if the ABI doesn't change between epochs

which isn't an issue for Rust because:

- it is cleanly modularized

- it builds a whole module "at once"

- it doesn't have a "stable" ABI (outside of extern "C"/repr(C) parts which don't contain non-repr(C) types; Rust doesn't even guarantee ABI compatibility between two builds in exactly the same context)

- it tends to build everything from source (with caching)

- a lot of internals are intentionally kept "unstable" so that they can change at any time
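The extern "C"/repr(C) escape hatch from the list above can be sketched like this (my example, not from the thread): the attribute pins the layout to the platform C ABI, so an edition bump on either side of the boundary can't change it.

```rust
// repr(C) fixes this struct's layout to the platform C ABI instead of
// rustc's unstable native layout, so it survives compiler and edition
// changes unchanged.
#[repr(C)]
pub struct Point {
    pub x: i32,
    pub y: i32,
}

// extern "C" fixes the calling convention; a C caller (or Rust code
// built by a different compiler version) can call this. In a real
// cdylib you would also pin the symbol name, e.g. with no_mangle.
pub extern "C" fn point_sum(p: Point) -> i32 {
    p.x + p.y
}

fn main() {
    println!("{}", point_sum(Point { x: 2, y: 3 }));
}
```

Everything outside such repr(C) islands has no ABI guarantee at all, which is exactly why edition changes are free to ignore the ABI question.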

on the other side, several things about C/C++ come together to make that impossible: how it builds things without clean module isolation, how it chooses build units, how all of that is combined, how it's normal to include binaries not built by your project (or even by you), how such binaries contain metadata (or don't) and how too much tooling relies on this in ways which make changes hard, how it doesn't have built-in package management, how you specify compiler options, and how compiler defaults are handled

in a certain way, how you specify that you use C++11, C++17, etc. is the closest C++ can get to Rust editions

like, initially it might seem easy to introduce syntax-breaking changes (which most Rust edition changes boil down to), but then you realize that build units using other editions have to be able to read the header file, that a header (e.g. in the context of templates) can contain any kind of code, that header includes aren't that much different from copy-pasting into the build unit, that you don't have a standard package manager which can track which edition a header uses, and that there are endless different build systems, and you kinda give up

purely technically it _is fully possible to have Rust-like editions in C++_, but practically/organizationally, in the context of e.g. backward compatibility with build systems, it's just way too disruptive to be practical


> When that is not required for the project at hand, we happily reach for C#, D, Java, Go, Rust, Zig, Swift, Odin, ... instead.

Which is all well and good for us, the application developers. But if C++ wants to exist in the future as a thriving language (as opposed to moving into the nursing home with Cobol and Fortran), then it needs to come up with some solution for removing cruft.


It has a solution: deprecating features and then removing them. For examples, see

https://en.wikipedia.org/wiki/C%2B%2B17#Removed_features

https://en.wikipedia.org/wiki/C%2B%2B20#Removed_and_deprecat...

https://en.wikipedia.org/wiki/C%2B%2B23#Removed_features_and...

Part of their ‘problem’ is that they have lots and lots of users with long-living code bases. That means that, if they move fast and break things, their users won’t move to newer versions of the language.

Another part is that they want to be able to generate the fastest code possible. That leads to such things as having all kinds of constructors (‘normal’ ones, copy constructors, move constructors), and giving developers the ability to tweak them for maximum performance.

In addition, a lot of this was invented after the language had been in use for decades. I think that makes the constructor story more complicated than needed.


Until those languages decide to be fully bootstrapped, and Khronos and the Open Group become welcoming to them in newer standards, C++ won't go away.




