AWS hires Rust compiler team co-lead Felix Klock (theregister.com)
244 points by ducaale on Nov 28, 2020 | hide | past | favorite | 168 comments



I’m sad to see Firefox lose its edge in technology. I’m no longer sure why I would use Firefox (over, say, Safari). Servo, Rust, WASM: they’ve been pioneers of many great performance- and safety-oriented technologies.

It’s great that AWS wants to have the Rust team onboard, but in my experience at least, AWS has been good at leeching off open source rather than fostering it. Open source isn’t part of their culture.


I believe part of the problem is that Firefox has been operating as if it had a large money-making product, investing in long-term, non-revenue-generating projects. While fantastic for the community, those can be expensive. Operating as if you're Amazon, Google, Facebook, etc. is expensive when you don't have the money-printing machine.

I think it'll be a long-term loss for Mozilla, but things don't seem great in the short term, so maybe priorities needed to shift.


Firefox more or less had a money printing machine. Google gave them about half a billion per year for the last half decade (the last financial statement is for 2018). They verifiably burned through billions of dollars over the last decade (the timespan over which they developed rust) and I'd be surprised if servo and rust constituted more than a tiny fraction of that. If anyone has a rough ballpark of yearly spend on rust and servo, I'd love to hear it.


What would have happened if they had invested that money exclusively in non-silicon-valley engineers?

$500 million per year can buy you 5,000 talented engineers at $100,000 almost anywhere on this planet.


Mozilla did and does employ a lot of people outside of silicon valley.


I might agree with you, except in my experience their money-making products still largely suck. The Mozilla VPN app on both my Android devices randomly logged out weeks ago and has put me in an unbreakable log-in loop since. They still don't have Mac or Linux support, despite that crowd being exactly the kind of early adopters you'd want on a VPN product. I'm paying them money, and if it weren't for my hope that maybe this is just a (way too long) blip, I'd switch over to plain Mullvad tomorrow (which they're using on the backend anyway).

It's not just a priorities shift when you're doing badly at both things and losing incredible people, it's straight-up organizational dysfunction and managerial incompetence.


Mozilla is a non-profit. Their only goal should have been the development of Firefox (and follow-up technology like Rust and Servo), not all those money-making schemes and side projects they started.


Mozilla is a for-profit owned by a non-profit. The for-profit organization develops Firefox.

(That said it’s run by the same people with the same outlook, so it feels like a non-profit even if it’s not.)


Seems like an unnecessary complexity for what should be a browser maker.


It’s a necessary complexity, particularly given that Mozilla predates the existence of Benefit Corporations by a dozen years. As a purely non-profit corporation, having most of your revenue come from business income is a big problem. As a for-profit, director’s fiduciary responsibilities to shareholders can get in the way of pursuing a public good mission. And, collaboration with for profit orgs gets a little less regulatory scrutiny if a non-profit is in control.


> As a purely non-profit corporation, having most of your revenue come from business income is a big problem.

What's the exact problem?


Pocket is profitable, but people have complained about it nonstop for years.


I am happy for people to find jobs and I am happy for ecosystems to be invested in (time, money, know-how, …). I just wish talented people would work for better employers and stop accepting the jobs from Amazon and the like.


There is a delicate balance between the idealistic employer and the employer that can actually survive in the universe. These people used to work for what was probably your idea of "the better employer," and it did not survive sufficiently to pay them. Amazon is a reasonable compromise between, say, a non-profit and Palantir. At least Amazon works people hard to provide useful services so it can pay them.


+N to that!

I don't think Amazon are the best but at least its not "we take your tax dollars and in exchange give police and government agencies information that would be illegal for them to collect themselves" Palantir.

HN seems to consider Facebook the worst and Amazon next worst, but some companies are so much worse.


If it is Facebook, and then Amazon, then where is Google? :D


setting quite a low bar there


Better AWS pays for Rust development than if no one would.

Also, some of the reason AWS has a bad rep is not giving back to the communities of the open source projects they use. Well, this would be them doing just that.


> Better AWS pays for Rust development than if no one would

Why?


There's a lot of stuff, even if all the humans are working for free, that's expensive about developing Rust. Our CI and CDN bills are massive, and someone has to pay them. Without it, we would not be able to support the platforms that we do, which is really important.


Because if people are paid to do something, they can spend their workday doing it, which equates to more man-hours put into the project.


> Better AWS pays for Rust development than if no one would.

That feels like a false choice, though. Rust is popular enough that surely another company (Fastly, for example?) would have good use for such a talent. But I doubt they can rival Amazon in pay. Such is the world.


If someone is working on a social good (the Rust compiler), is it not better that they are taking money from a large corporation rather than a smaller one that might put that money to better use?


Depends if that large corporation decides the direction of their work. Social good isn’t guaranteed in perpetuity.


> Social good isn’t guaranteed in perpetuity.

True, but employment isn't perpetual either. If they later decided that their work wasn't what they wanted to be working on then they would be free to leave at that point.


Well, the problem comes when said corporation co-opts said good thing for its own gain, to the detriment of the wider community.


Such criticism would have been more appropriate had the previous employer not literally been Mozilla, which recently laid off many Rust team members, or had the role not been about working on the Rust language.


What's wrong with Amazon?


(This is my perception of how some feel, it's not how I feel, I've no dog in the fight)

They leech from open source without giving much back.

They have a fairly interesting approach to workers' rights and unionisation, and there have been a large number of worker complaints around the world (on the other hand, they employ a large number of workers, so that might be expected).

Treatment of suppliers - they are ruthless (but that's a megacorp rule of thumb).


> They leech from open source without giving much back.

So, you’re saying an open source developer shouldn’t take a job contributing to open source because the company historically has a reputation of not contributing?

(In the language community, it is recognized that Amazon used to not contribute to Java at all and now is one of the major contributors. This probably hasn’t hurt in recruiting contributors for other languages.)


Amongst other things, they use their marketplace to collect data on which 3rd party resellers are making money, then they implement the product themselves and put the 3rd party out of business, resulting in an anti-competitive trend towards monopoly.

https://www.geekwire.com/2020/analysis-read-antitrust-case-a...


Also, the German retail chain Lidl recently opened its own cloud business so that it can stay away from AWS, for obvious reasons.


You are wrong to take a congressional Judiciary Committee report as fact.


What's wrong with Amazon?


Upcoming: AWS Oxidized Iron Code


This is very funny and very sad at the same time!


So, here it is:

Google backs Go.

Apple backs Swift.

Amazon is trying to back Rust.

Edit: What I am trying to say is that financial backing and adoption by FAANG companies can really improve a programming language's ecosystem. I am not saying that it will become the product of a certain company.


Go is a Google product that came out of Google.

Swift is an Apple product that came out of Apple.

Rust was incubated in Mozilla, and this support was essential to getting it going and for its early years (until 1.0 plus or minus a year or so, I’ll suggest), but it was never tied to any one company in the same way: it always had a distinct identity, with others welcome to join that (as distinct from joining Mozilla if they wanted to effect change). Now, various large companies are steadily working on helping support it in this sort of way—Microsoft, Amazon, Facebook, and more.


And, importantly here, including Apple and Google, though perhaps to a lesser degree.


This is such a dead horse. Exactly how is Go _tied_ to one company, Google?

Sure, Go is near-exclusively "incubated" by Google, and they employ much of the core development team, but there is nothing preventing any organization from contributing to it. Go and Rust are both open languages: they can be used with no commercial licenses and the source is available.

The differences are in their major supporting companies, and the perspective which some choose to use as a deciding factor in the platforms they choose.

Neither is a product. They may be used to create products, but they do not directly generate revenue.


A key word here is “governance”.

I searched for “rust programming language governance”, and immediately found https://www.rust-lang.org/governance. Nice and clear description of how it all works, how changes are made and who makes decisions, making it very clear that Rust is its own thing and not the property of or beholden to any one company, and that it’s very deliberately an open project, not merely open source.

I searched for “go programming language governance”, “golang governance” and a few other things, but I can’t find anything at all. When I look through their website, it’s Google this Google that, supported by Google, in order to contribute you must accept Google’s contributor license agreement, that kind of thing. I’ve found mention of “Go maintainers” in about two locations, as being those that can approve changes, but I can’t find any mention of who they might be (search terms included “{go programming language,golang} {maintainers,core team}”). Go is absolutely a Google open source project, as they say in a few places.

I imagine Go is a good deal more open to non-Google contributions than Swift is to non-Apple contributions, but it’s still very firmly Google controlling Go, setting its direction, and managing the project, in a way that was never the case for Rust.


> Sure, Go is near-exclusively "incubated" by Google, and they employ much of the core development team, but there is nothing preventing any organization from contributing to it.

Except if the majority of the expertise is held by Google employees it's extremely difficult to do anything without their support. What if Google decides to drop Go tomorrow? A lot of the expertise disappears with that decision and replacing it would require another company to step up and dump gobs of money into the project.


> What if Google decides to drop Go tomorrow?

The community can fork the code and continue.

If that did not happen, or the community does not believe that is viable, then the language is not viable.

Google’s theoretical abandonment of the language does not seem to bother the multiple successful projects based on it. Nor does the apparent lack of focus of Rust’s founder affect its growing user base.


Microsoft and Facebook are also hiring developers to work on Rust. Google, Apple, and many other big companies use Rust as well, and I wouldn't be surprised if they do the same.

This is very good imo. Rust needs dedicated developers that get paid for their work. And with the upcoming Rust Foundation it feels like Rust is about to really take off to the next level.


I haven't heard anything about Google using Rust, other than for (essentially toy) usage in Fuchsia.


Google has also been experimenting with using Rust within Chrome[0]. They're working on autocxx to make C++/Rust FFI much smoother[1].

[0] https://www.chromium.org/Home/chromium-security/memory-safet...

[1] https://github.com/google/autocxx


There's a lot of inertia at Google with regards to Rust. Things are happening slowly there, but they're happening. I'm sure it'll be more public soon, but they didn't miss out on the Rust-grab after Mozilla let the project go, even if they haven't been as public about it.

The fact that both Microsoft and Amazon, Google's major competitors, are investing in Rust means that Google would be quite foolish to not be looking as well.


To a lesser extent so far, yes. As you mentioned there's the Fuchsia stuff, and some other smaller open source repositories. But there's also other areas, for example Android.

https://www.youtube.com/watch?v=MNkFSCRUk6E&feature=youtu.be...

Google is also listed as a sponsor here: https://www.rust-lang.org/sponsors

They might not be as invested as other companies, but I bet they'll eventually ramp up their involvement.


ChromeOS uses Rust for its hypervisor stuff, this is actually where Firecracker (from Amazon) started.


ChromeOS uses Rust for crosvm, which Amazon's firecracker was derived from.


The open list:

* Google's GitHub repos (30 results for repositories written in Rust): https://github.com/google?q=&type=&language=rust


Edit: As varbhat has edited their post to clarify that they indeed did not mean to say backing in an adversarial way, my post can be ignored or just read as my understanding of the space.

Apologies if I misunderstand your post, in which case you can disregard my entire post, but -- I assume you mean as in adversarial backing?

These companies use practically all of these languages but to a different extent, for different uses, and for good reasons.

I currently get paid to write C and Go but I dabble in most things, and I'm certainly happiest when I get to write Rust and ReasonML/ReScript. Having said that, I personally think it's fair to guess that Swift will continue to shine mainly as an Apple-ecosystem language, Go will continue to shine mainly as a network-server language, and Rust (and whatever follows it) will continue to grow for network servers but will also slowly seep into our shared foundational layer of software. Slowly, slowly: libraries, kernel modules, runtimes, stdlibs for other languages, frameworks, etc. This is a good thing, because it slowly rids us of old C and C++ code bases that may not have kept up with the times.

Yes, all of these languages are being used to write excellent CLI tools, graphical applications, etc and that's wonderful (I use many of them), but none of these languages are realistically competing to kill any of the other in any definitive way.

Now if only Go would give me sum types (ADTs), my life would be even easier :)
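For readers unfamiliar with the term, here is a minimal, hedged sketch of what a sum type (ADT) buys you, using Rust's `enum` with exhaustive `match` (the `Shape` type and `area` function are made up for illustration):

```rust
// A sum type: a value is exactly one of these variants, and the
// compiler forces every `match` to handle all of them.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
        // No catch-all needed; adding a new variant later turns every
        // non-exhaustive match into a compile error, not a runtime bug.
    }
}

fn main() {
    let r = Shape::Rect { w: 2.0, h: 3.0 };
    assert!((area(&r) - 6.0).abs() < 1e-9);
    println!("area = {}", area(&r)); // prints "area = 6"
}
```

In Go the closest idiom is an interface plus a type switch, which the compiler does not check for exhaustiveness.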


> This is a good thing, because it slowly rids us of old C and C++ code bases that may not have kept up with the times

+1 to that, that is also the reason why although I am critical of Go's design approaches, I am at the same time quite supportive of projects using it.


> Amazon is trying to back Rust

I don't know. Amazon doesn't really back any languages, TBH; teams at Amazon choose their own stack if it justifies itself. Rust is apparently a great language for critical low-level programming, and AWS is crazy about safety and security.

I won't attribute too much significance to this hiring.


NVIDIA picked Ada/SPARK. I wonder why Ada is not considered by large companies when it comes to critical low-level programming. Ada/SPARK has all the constructs for that; it has been designed for very critical stuff, and low-level programming in it is pretty easy and much less error-prone than in most languages that are considered. It is actually a joy to do low-level programming in Ada. There are no performance penalties either, so that is not an issue.


Ada is unfortunately tainted by its traditional customer base (military, avionics, high integrity computing) so it tends to have little love from the traditional software houses.

Additionally, of the five existing vendors, only AdaCore bothers to keep a link with the FOSS community and somewhat approachable prices for commercial licenses, which doesn't help to move beyond the existing perceptions about Ada.


> Ada is unfortunately tainted by its traditional customer base

'Tainted' how?

> only Ada Core bothers to keep a link with the FOSS community and somehow approachable prices for commercial licenses

There's an Ada compiler as part of the GNU compiler suite.


Except, if you want to make proprietary software, you have to compile GNAT yourself and be picky about the libraries you use: AdaCore's own FOSS compiler has a GPL runtime library to prevent commercial usage.


Which is great. Want to earn money? Pay for the work of others.

Library authors keep finding out the actual meaning of going with MIT-like licenses instead of dual licensing, especially when their libraries get picked up by the likes of Amazon.


I have no issue with this as a business decision, but this particular decision is a barrier to adoption for a lot of people. Hopefully it pays off for AdaCore.


Apparently it does, given that NVIDIA preferred to go with them rather than bet on Rust for the high-security firmware in its automated vehicles.


An implementation detail that would be trivial for any large company to handle. Or even just paying for a license.


It depends on what specifically you mean by “back.” Amazon has been financially supporting the Rust project for over a year at this point, which is one common interpretation of “back.”


I thought Microsoft was also dabbling in Rust.


Amazon still has a much more significant investment in Java than Rust, just as Microsoft does with C# and TypeScript than Rust even though both companies are investing more and more in Rust for low level projects.


When I talked to some Googlers, they told me that the majority of Google's code uses Java and C++, not Go. That was a few years ago, and maybe that has changed now, but it seems to me that Java and C++ really run everything everywhere.


Nobody sensible is going to rewrite everything in Rust all at once. They'll start with new tools or things they needed to refactor anyway. Then use it where it's appropriate (it's certainly not a universally applicable language, the tradeoffs Rust makes are well known). Maybe someday AWS will have replaced all their Java with Rust, but I suspect some bits will stay Java for quite a long time, or migrate to Kotlin or some other garbage-collected language.


I do feel Google puts a lot of resources into Dart and its ecosystem (e.g. Flutter), while for Go there's not as much support from the company.


That is mostly driven by the AdWords department, given that they were the ones rescuing the Dart project.


Google probably spends more on C++ than Dart and Go combined.


And that's why languages such as C and C++ are not a reference implementation from a vendor but an international standard.

Just get a few Rust standards out of ISO, one for the core language and another for some standard libraries, and get it over with.


Rust is not ready to be specified as is required for ISO and similar standards. A few large chunks of its functionality currently don’t have any good definition written down, but instead are defined by the rustc implementation: most significantly, the ownership and borrow checking system. Research towards formalising this model in a way that a specification could use is ongoing (e.g. https://people.mpi-sws.org/~jung/thesis.html, published a couple of months back), but until then any specification document would be unreasonably complicated and verbose, containing large chunks of the current compiler rather than describing an independently-useful model.

But more generally, I think you’re approaching this from the wrong angle, perceiving a problem that doesn’t exist—Rust is already not tied to a single corporate interest. But have patience, there are rustlings of independent implementations in a few places, and their fullness will come in a few years’ time.


Most languages aren't either, they just start the process and evolve from there.

C was still mostly K&R C when the process started, and in C++ all we had was the C++ARM book.


I don't know if your wording here

> there are rustlings of independent implementations

was a deliberate choice or just fortuitous, but "rustlings" would be a fantastic term for embryonic new Rust implementations.


"rustlings" is already a well-known project within Rust world https://github.com/rust-lang/rustlings


I wrote the word down unconsciously, but then noticed the pun when I was going to reword it, and so left it in place instead.


> Rust is not ready to be specified as is required for ISO and similar standards. A few large chunks of its functionality currently don’t have any good definition written down, but instead are defined by the rustc implementation: most significantly, the ownership and borrow checking system.

That assertion makes no sense at all. If an implementation exists and is already used in production, then its behavior is already relied upon and thus can already be described. Just write down how it behaves and specify how it is expected to behave. That's it. That's how the whole software world has worked for almost half a century.

And no, it makes no difference if a reference implementation decides to implement something that breaks that behavior. That just means it doesn't comply with the standard. That's it.

Why have these basic lessons been forgotten?


The problem is that we don’t have a sound, correct model. What we have is a compiler that seems pretty decent, but rejects some things that shouldn’t really be rejected, and accepts a few things that are iffy and potentially unsound. It’s a whole lot of code crafted in a comparatively ad hoc manner, whereas a true, correct model is likely to be simpler and more beautiful. Any specification would therefore be specifying the wrong thing, essentially a form of overfitting.

Suppose you’re a cave man and you’re just figuring out how to double numbers. So far, you’ve figured out how to double each number from 0 to 100. Someone asks you to write down your fancy rule so they can do something with it. Do you really want to be teaching everyone that the way to double a number is: “if the number is zero, the answer is zero; if one, then two, 2 ⇒ 4, 3 ⇒ 6, 4 ⇒ 8, &c. ad tedium”? Even if it covers all the required cases without error, it’s a mess. It’ll do for the moment for your own personal use, but you don’t want to be teaching it to other people until you figure out that you can say “to double a number, add the number to itself”.

Until we have a sound model, any attempt at specifying Rust would, for these key areas, be simply importing the current implementation of the compiler. That’s not a good sort of a specification.


> Just write down how it behaves and specify how it is expected to behave.

I don't know how to describe it differently than the GP did, but I'll give it a try. Until a formalised model is built, there is no way to describe its behaviour that wouldn't come down to writing down the implementation that's currently in the compiler (apart from specifying it by example, which would be even more expansive).

It _might_ (don't quote me on this) be possible to more easily formalize and standardize borrow checking if you leave out non-lexical lifetimes (which were added rather recently), but that's a bad idea because you would lose a lot of functionality, and an alternative compiler with an incompatible implementation of NLLs could split the ecosystem, which obviously nobody wants.
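To make the NLL point concrete, a small sketch: under non-lexical lifetimes a borrow ends at its last use, so the following compiles, whereas the older lexical checker considered the borrow alive until the end of the scope and would have rejected the `push`:

```rust
fn main() {
    let mut v = vec![1, 2, 3];

    let first = &v[0];             // immutable borrow of `v` begins here...
    println!("first = {}", first); // ...and, under NLL, ends here (last use)

    // Accepted with non-lexical lifetimes. The pre-NLL, purely lexical
    // checker would have flagged this as a mutable borrow while an
    // immutable borrow was still live.
    v.push(4);
    assert_eq!(v, vec![1, 2, 3, 4]);
}
```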


> I don't know how to describe it differently than the GP did, but I'll give it a try. Until a formalised model is built, there is no way to describe its behaviour that wouldn't come down to writing down the implementation that's currently in the compiler (apart from specifying it by example, which would be even more expansive).

That is the point some seem to fail to understand: said model is absolutely irrelevant if we already have production releases of an implementation of Rust.

A standard is not supposed to be academically rigorous. A standard is a specification of the current behavior so that multiple implementations can meet the current requirements, and thus ensure interoperability and compatibility.

It makes absolutely no sense to talk about "a formalized model" because Rust is already used in production without said formalized model. The absence of said formalization is irrelevant. The only thing that matters is that the current behavior is specified and set in stone, so that it serves as a fixed target that anyone in the world can implement against and expect to interoperate with.

It's terribly simple: if a Rust v1.25 exists, then its behavior and interfaces can be described. Obviously. That description is what a standard is: an implementation target set in stone that others can use as a reference.


If you want to exhaustively describe the behavior of e.g. Rust 1.25.0 without a formalized model, then the only way to do so is to run it on all kinds of possible (edge) cases you encounter and document the output.

If all you have as a reference is a bunch of test cases without any underlying reasoning as to why input A leads to output B, then that's not really a useful target to implement against. At that point you can just go the direct route, not bother with any standard and just implement directly against the Rust compiler.


While I disagree with the general sentiment of “rust needs a standard to be successful generally,” you are absolutely right in this comment. There’s a number of things at play here, but there are folks trying to do exactly this. We’ll get there, in time.


Why would I then base a development project on Rust, if there's neither a spec nor an alternate implementation? This is strictly worse than what we have now.

And no, I don't trust the Rust propaganda machine. When Mozilla announced they were getting rid of Rust teams, all we heard from said Rust devs was that this doesn't spell the end for Rust. My impression was that they were mainly motivated by advancing their language rather than, you know, implementing a browser.

And to this date, I haven't really read a critical post-mortem addressing claims that Rust can actually replace C or C++ in the areas where these languages are essential. Apart from kernels and drivers (where Rust seems an outright no-go), C is traditionally used for higher-level language runtimes and compilers/interpreters/VMs. But Rust's borrow checker is a bad fit for just about any text book algorithm and technique in that space. Hell, even Rust devs themselves say implementing a DOM is about the worst use case for Rust.
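For context on the DOM point: tree structures with parent back-pointers can't be expressed with plain references, so such code typically falls back on `Rc`/`Weak`/`RefCell`. A rough sketch of the usual workaround (the `Node` shape here is hypothetical, not how Servo actually models the DOM):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// Parent/child links form a cycle, so plain `&` references won't do.
// Idiomatic workaround: strong Rc pointers downward, a Weak pointer
// upward (to avoid a reference-counting cycle), RefCell for mutation.
struct Node {
    name: String,
    parent: RefCell<Weak<Node>>,
    children: RefCell<Vec<Rc<Node>>>,
}

fn main() {
    let root = Rc::new(Node {
        name: "root".into(),
        parent: RefCell::new(Weak::new()),
        children: RefCell::new(vec![]),
    });
    let child = Rc::new(Node {
        name: "child".into(),
        parent: RefCell::new(Rc::downgrade(&root)),
        children: RefCell::new(vec![]),
    });
    root.children.borrow_mut().push(Rc::clone(&child));

    // Upgrade the weak parent pointer to read the parent's name.
    let parent_name = child.parent.borrow().upgrade().map(|p| p.name.clone());
    assert_eq!(parent_name.as_deref(), Some("root"));
    println!("child's parent: {}", parent_name.unwrap());
}
```

Whether this counts as "a bad fit" or just a different idiom is exactly the disagreement in this thread.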


> Why would I then base a development project on Rust, if there's neither a spec nor an alternate implementation? This is strictly worse than what we have now.

How exactly is having one compiler defining what is okay and what is not worse than "what we have now"? The multiple implementations of C++ are all incompatible with each other in various subtle ways, produce code which works differently and at the end of the day everyone just takes one and says "we write C++ which works with <compiler>" and that's it.

> When Mozilla announced they were getting rid of Rust teams, all we heard from said Rust devs was that this doesn't spell the end for Rust. My impression was that they were mainly motivated by advancing their language rather than, you know, implementing a browser.

Why exactly is it a problem that the core team of a language is interested in the language? Or do you want to imply that the rust devs working on Firefox weren't interested in it? All the progress Firefox made in the last few years tells a very different story (yes, not all of it is due to rust, but a relevant part is).

> And to this date, I haven't really read a critical post-mortem addressing claims that Rust can actually replace C or C++ in the areas where these languages are essential. Apart from kernels and drivers (where Rust seems an outright no-go), C is traditionally used for higher-level language runtimes and compilers/interpreters/VMs. But Rust's borrow checker is a bad fit for just about any text book algorithm and technique in that space. Hell, even Rust devs themselves say implementing a DOM is about the worst use case for Rust.

If you haven't read any critical postmortem about Rust's difficulties in these spaces, you haven't looked around very much. This is the impression I generally get from your post. You don't like Rust (for some reason; it's not very clear what it is) and seem to have an axe to grind with it. Which is... okay? No one forces you to like a language, but it doesn't make for a very interesting conversation.


Rust is a pragmatic project. They chose a compiler which is performant and widely available and bet everything on it to keep things straightforward. It looks like it's served them well.

This compiler is on the other hand naturally not available on all platforms and is not appropriate for all contexts. Alternate implementations could have filled these gaps, but they're not there.

The fact that the compiler defines the language and vice-versa enables additional flexibility in changing the language/compiler and development speed at the cost of instability. Different projects may tolerate different amounts of instability and for those that are more conservative Rust may not be an appropriate investment.

With C++ one can code for a specific compiler or in a standard way. One can pick C++98 or 17 or anything in between. All of these are supported options and they give C++ its massive breadth and offer the most flexibility to teams. There's obviously disadvantages associated with all that, but claiming that diversity is a weakness is not generally true.


Similarly you can compile Rust in "edition = 2015" or "edition = 2018" and soon "edition = 2021" to guard against changes in the language, e.g. adding keywords.

There are already alternative compilers, though most of them are WIP, e.g. mrustc[1], rust front-end for gcc[2][3], or using cranelift[4] for the back-end.

[1] https://github.com/thepowersgang/mrustc

[2] https://github.com/sapir/gcc-rust/tree/rust

[3] https://github.com/Rust-GCC/gccrs

[4] https://github.com/bjorn3/rustc_codegen_cranelift
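For reference, the edition is opted into per crate in Cargo.toml; crates on different editions link together in one build. A sketch (the crate name is hypothetical):

```toml
[package]
name = "my-crate"   # hypothetical
version = "0.1.0"
edition = "2018"    # per-crate opt-in; "2015" is assumed if omitted
```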


Technically the possibility is there. What I'm interested in is how this will be regarded by Rust programmers as time passes and how it will continue to be supported.

It would be interesting for example to see if e.g. the 2015 edition will still be usable/used 9 years later (like C++11) or even ~20 years later (like C++98) but we have some way to go before we can find that out.

On the other hand, there's something exciting about being on the bleeding edge. Right now I'm using C++17 and would probably jump on C++20 when it becomes widely available. But having the knowledge that I have the rock-solid C++11 at my disposal and knowing that it will likely be there for decades gives me a feeling of security that's hard to beat.


There's a significant difference between the edition system and the C and C++ standards, and that is that the standards are done when they're done, but the editions are not, exactly. That is, new features arrive in the largest set of editions possible, whereas C99 is frozen in time. This is not to say that you or the parent are wrong, exactly, but it doesn't really work in exactly the same way.


I still look forward to how editions will actually work in practice, when Rust gets a couple of years behind it.

Lib A compiled with edition 2018, passing callbacks to functions implemented in Lib B with edition 2020, Lib C edition 2030 importing Lib D edition 2025.

Naturally, I am also assuming that, to please the commercial users of binary libraries typical in the C and C++ universe, all those crates are only available as binary libraries being pulled into a multi-edition final executable.


Rust does not yet have a stable ABI, and there is not any such thing as a "binary crate" at the moment. So it's kind of hard to run into that problem as the entire exercise is impossible.


That exercise is possible to some extent with C and C++, so the whole point is to prove that editions aren't the golden solution to language-version switches that they are currently sold as.


I'm assuming you mean LLVM by "compiler". If platform support of Rust is your concern here due to the limited platform support of LLVM, then that might be going away. Given that Cranelift recently landed as a non-LLVM codegen backend in nightly, I assume work is already pretty far along on supporting alternative backends that could target alternative platforms.


Correct, I was not precise. Actually LLVM is not a problem for me personally, as the platforms I'm using right now are mainstream. I was merely trying to explain why some wouldn't be content just with LLVM.


> Why would I then base a development project on Rust, if there's neither a spec nor an alternate implementation? This is strictly worse than what we have now.

This is a double standard. C was created in 1972 and standardized in 1989. A major C compiler (MSVC) doesn't even fully implement C99 from 21 years ago.

C++ itself was created in 1985 and standardised in 1998. Yet it was adopted.

Yes, things have changed and the bar has been raised. But people don't seem to have followed your reasoning even for C and C++.

> Apart from kernels and drivers (where Rust seems an outright no-go)

Why? From what I've read Rust can actually help with managing hardware resources through its semantics. What blocks it from being used?


Having an international standard behind a language gives it a certain weight that can't even be compared to the ad-hoc structures that are seen in other projects.

It enables experts in different areas from different countries to come and work together under a set of specific rules that are much stronger than words written in wikis or repos. It also makes it clear that the language is designed for the long term and is not under the control of a single party (as was the concern about e.g. Mozilla, which the Rust community tried so hard to dispel).

It avoids embarrassing transitions like Python's 2->3 or one corporation taking control of a language like Oracle did with Java by simply buying it. It avoids having to watch a language go extinct because its corporate sponsor wishes it so (Objective-C, VB 6) and many other such unpleasantness.

This is partly why this AWS focus is strange to say the least. I would have thought that it's obvious for the Rust community that having one large sponsor is not healthy. This tells me that Rust was never really as independent from Mozilla as claimed and Mozilla refocusing away from it seriously hurt the language.


> one large sponsor

Where is this coming from? I’m also worried about AWS over-representation in the Rust community but they’re far from the only significant corporate sponsors at this point.


Yet Python and Java have been around for 30+/25+ years and are some of the most popular languages around.

Not many people care about standardization, it seems.


I don't care so much about standardization as about choice, which is what standards tend to create (or what the formation of standardization bodies tends to document). C is simple enough to create a compiler from scratch in less than a year (as proven by Tiny CC), but sinking your precious time into a behemoth like Rust (or any other language-to-rule-them-all with open research problems) is on another level altogether in terms of commitment. As the language hasn't been finalized and can change at any time, only people with inside knowledge of Rust will be able to work on such projects. Others will have to take on a significant risk of their Rust code base becoming obsolete, or of Rust simply imploding for lack of new devs outside the inner circle able to pick it up. Which makes Rust an irrational choice in the presence of alternatives.


I find the Rust complexity comment quite funny in a world where C++ and Perl 5 are around and they even managed to release Perl 6 :-)


Yes, that's almost 5 years ago now. But please note that since then, it has been renamed to Raku (https://raku.org using the #rakulang tag on social media). You can check out the Rakudo Weekly News (https://rakudoweekly.blog/blog-feed/) if you want to stay up-to-date on developments.


Do you have some sort of Perl 6 alert for HN? :-p

I know that it's been renamed, but I think you agree that it's still a very complex and powerful language.


It's logical to be concerned about complexity in a specific thing even if there are other complex things in the world. That should be self-evident.


Yeah, but his argument was directly coming from a C/C++ perspective, which makes it at least a bit... strange.


I was comparing directly against C. There's no such thing as "C/C++".


One doesn't need standardized tools to get things done. One doesn't even need correctness, performance, privacy, etc as can be seen from a multitude of very successful projects.

But it's there for those that really need it and personally I like it because it shows what we can achieve when a massive group of experts from all over the planet gets together and they nurture a project over many decades which has to work for such a broad set of use cases under the most strict and diverse of requirements, from avionics to mobile apps, physics simulations to cutting edge video games, medical devices to smartwatches and so on.

These two languages have contributed so much to what we have achieved and they're part of our digital heritage. Maybe that's another reason that they had to be standardized, so that they belong to everyone. :-)


Java has a standardization process; it just happens not to be ISO. It is still driven by multiple companies, and at each release written documentation comes out.

https://docs.oracle.com/javase/specs/index.html

https://openjdk.java.net/jeps/1


C only became standardized after 20 years or so, and only out of necessity, because there were different compiler toolchains moving in different directions and this ensuing chaos had to be brought under control somehow. And arguably the standard is just as much a problem as an advantage (look at all the very useful extensions in clang and gcc which will never make it into the standard, and the C standard library is still essentially a snapshot of what was already the least common denominator in 1989, and far from the "state of the art" even back then).

As long as a language only has a single implementation or "reference compiler" (like Rust), standardization only adds needless bureaucratic overhead.


> C only became standardized after 20 years or so, and only out of necessity because there were different compiler toolchains moving into different directions and this ensuing chaos had to be brought under control somehow.

The "after 20 years" remark is totally irrelevant because just because the C community was extremely slow it doesn't mean everyone is supposed to repeat the same mistakes.

The only relevant point in your remark is that a standard is required when there are at least two implementations that need to interoperate. A standard is the only way to ensure that interop is possible, and that implementations are reliable.


It takes a couple of decades before a language has stabilized enough and gets popular enough to spawn multiple competing compiler toolchains. And until all that has happened, a standard makes no sense.


> My impression was that they were mainly motivated by advancing their language rather than, you know, implementing a browser.

It is true that the folks working on Rust are focused on "advancing the language," yes. I don't see why it would be any other way?

> Apart from kernels and drivers (where Rust seems an outright no-go),

Rust works great in these scenarios. We'll see about some higher profile examples, like Linux. Work there is ongoing. Only one way to find out.

> C is traditionally used for higher-level language runtimes and compilers/interpreters/VMs. But Rust's borrow checker is a bad fit for just about any text book algorithm and technique in that space.

There are a ton of projects in this space. After all, the Rust compiler, for example, is one of the two OG massive Rust codebases. But there's also tons and tons of projects here.

> Hell, even Rust devs themselves say implementing a DOM is about the worst use case for Rust.

Sort of, but also not really. Implementing a DOM would be easier if Rust had inheritance, is usually what is claimed. Yet, Servo was still built, and nobody would suggest that the DOM was a hard blocker here.


There are fields such as automotive and aerospace where the absence of a Rust standard is a showstopper for using Rust. Those areas require that all tools being used to build software be certified (ISO 26262 for automotive, for example) to ensure that the software is safe to use and doesn't kill anybody. And nobody would do that for a language that adds new features every 6 weeks, because such certification is a long, expensive process which you have to repeat every time the tool changes.

It's not possible for an open source project to do such certifications by itself. In the C/C++ world there are companies that implement certified compilers. Usually these are pretty shitty compilers that implement only previous standards like C++14.

And here is the problem for Rust. If there is no standard for the language and library, how could such companies implement an alternative compiler? The current Rust editions are not an option because they only set a baseline and don't answer the question of to what extent the compiler has to be implemented to be called "compliant with Rust 2015".

And those fields aren't small. Hundreds of millions of lines of C/C++ typically power a modern vehicle.

I find it pretty funny that a language which positions itself as a safe replacement for C/C++ can't be used in safety-critical applications.


> There are fields such as automotive and aerospace where absence of Rust's standard is a showstopper for using Rust.

"fields" is a bit too broad. Not all applications in those fields require a standard or certification. That being said, you're absolutely right that in some cases, it is required.

> In C/C++ world there are companies that implement certified compilers.

That is true. In Rust, this is what we're seeing too; the first certification effort is being led by a company, in concert with the language team itself. This is still ongoing.

> I find it's pretty funny that language, which positions itself as safe replacement for C/C++, can't be used in safety critical applications.

This is overloading the word "safe." Rust has been about memory safety. Not about every possible meaning of the word "safe." Yes, Rust is not yet mature enough for industries with those kinds of requirements. Yes, they are not small fields. They are still smaller than the sum of all fields that do not require them. Rust doesn't have to be applicable for every possible use at the earliest stages of its life to matter.

We'll see how it goes.


> Usually hundreds of million lines of C/C++ power modern vehicle.

And I find that highly problematic. If anything, Ada would be a much better choice for these safety-critical applications.


> It is true that the folks working on Rust are focused on "advancing the language," yes. I don't see why it would be any other way?

That wasn't the argument. What's problematic is that Rust developers are focused on their language above anything else, to the detriment of the primary product they're developing, when that product - a web browser to stand against Chrome - is much more important than yet another programming language. It's about obsessing over your tools and being in love with a programming language; I call it "thinking with your fingers".


> to the detriment of the primary product they're developing

I guess this is the core of it. It is hard for me to understand why you'd think people working on a language would have "the primary product" be something other than the language. There isn't even a single coherent other project to be associated with. Why would "a browser" be special here? It’s even in the name: the Rust developers. We develop Rust.

Do you fault the C standards body for working on C? Or the Ruby core team for working on Ruby? What language do the developers of the language consider something other than the language as their primary product?

Now, of course, as a tool maker, you have to make sure the tool is fit for purpose. You want to enable the people who are working on projects like a browser to be able to do their job. But that's their job, not our job. It seems like you're suggesting that the primary job of Rust developers is to improve Firefox, and it's very confusing to me why you'd say that, when nobody would say that about any other language. Or at least, I've never come across that sentiment before.


Maybe we're really talking past one another, but for example, last week's announcement of Servo's new home reads

> The Servo Project is excited to announce that it has found a new home with the Linux Foundation. Servo was incubated inside Mozilla, and served as the proof that important web components such as CSS and rendering could be implemented in Rust, with all its safety, concurrency and speed.

This is clearly written by someone who sees Servo as a showcase for Rust above anything else, rather than what it is: a state-of-the-art rendering engine. But the developers addressed as readers, such as me, care about the state and coverage of the actual layout engine, CSS parsing/matching, rendering and rendering targets, font handling, and the like.

I'm not the only one bringing this up; when the announcement was discussed [1], there were similar sentiments. For me, a programming language is a means to an end for creating machine code, not an end in itself. This focus (and Mozilla and developers walking away from it) also hints at Servo not being all that useful (or being incomplete?) for actual rendering; we don't really know. Anyway, the takeaway for me is that diving into Servo and Rust is probably not worth my time investment.

[1]: https://news.ycombinator.com/item?id=25125325


Yeah, I think we are.

I think the issue here is that this post makes sense to me, but it's about Servo and/or Mozilla. But your initial post was complaining about "the Rust developers." These are two (or three) effectively disjoint sets of people. I think that's where we got off the rails.

If you had said "Mozilla" in your post, I would still disagree, but for completely different reasons. Complaining about a "focus" on a team of ~20ish people in an organization of 1500 people doesn't make a ton of sense to me.

And I'm still not clear why you're drawing conclusions about Rust generally from one single project, that's not even its most popular or widely used exemplar project.


> Anyway, the take away for me is that diving into Servo and Rust is probably not worth my time investment.

You're the best judge of your time, but making a claim about Servo and then, in the last sentence of your post, broadening it to all of Rust, is a bit odd. Servo's not even in production, unlike plenty of other Rust code.


Why would you say Rust is bad for kernels and drivers? Asking this because I am curious, not as an argument.


I would also be interested if there's any specific reasoning there, especially because there's an effort to get Rust supported as a way of writing kernel modules in Linux, a pretty dogmatically C (though not standard C) codebase. I am personally very excited about Rust's potential in low-level embedded applications.


My day job is doing embedded rust, with kernels and drivers. It works great.


> And that's why languages such as C and C++ are not a reference implementation from a vendor but an international standard.

I'm not a C++ programmer but when I read C++ programmers discussing their language standard committee, "free and unconstrained by corporate interests" is usually not how they characterize it.


Of course not, but at least we have several very different corporations that have to fight and agree on the final result, instead of just one that can push whatever crap suits its short-term interests.


But that’s not the case for Rust? Rust is backed by Microsoft, Amazon, Apple, Google, Facebook, Intel, and many others...

The Rust development and specification happens all in the open on GitHub.

There is no single company controlling its future or direction.


Mozilla pays the salaries of the core team members, who approve/decline pull requests and vote on RFCs. So far there haven't been any reasons for concern, and Mozilla is more of a dying husk without too many interests to affect Rust's development, but the situation could be improved with more neutrality/independence.


Mozilla used to pay a salary to a handful of the 200+ folks who make these decisions. And since we operate on consensus, every member of the team must agree. So it’s never been possible for one company to completely control things. This is by design.



Although both are standards by accident.

C, because when UNIX started to become widespread, that was the way to settle differences across multiple language implementations.

And C++, because Bjarne eventually settled on ISO as a means to settle the differences across the various C++ implementations that had sprung up.

https://www.stroustrup.com/hopl-almost-final.pdf

However, this is also a reason why nice tools like package managers and IDEs are ages behind: when one vendor comes up with nice toys (C++Builder, vcpkg, CUDA), they don't land in ISO and you are stuck in one ecosystem anyway.


What’s the advantage of a formal standard when the language itself and the tools are all open source? All I can see a formal standard adding to the picture is a lot of bureaucracy and friction and the headache of subtly incompatible implementations.


> What’s the advantage of a formal standard when the language itself and the tools are all open source?

Let a, b, c be integers. Consider this statement in a language:

  a = b % c
If b < 0, c > 0, is a <= 0 or is a >= 0? How about if b > 0, c < 0? How about if b < 0, c < 0?

There are three ways I might go about determining that.

1. I could write some test cases and run them. That will tell me what the deal is for that particular compiler on the particular OS and architecture I am compiling on/for. It doesn't tell me if I can count on that anywhere else. It could be that the compiler doesn't care, and I'm simply seeing the way the underlying machine division or mod instruction works on my particular CPU.

If I care, then, I have to specifically write code myself to make it work the way I want, or make sure to only run my code on CPUs that work the way I want.

2. I could look at the source for the compiler. Suppose I see in there that it compiles it so that it will be consistent regardless of how the underlying CPU works, and it is even doing it the way I want.

Great, now I don't need to write extra code. I can just use "a = b % c" and be done with it, right?

Nope. Maybe in the next release of the compiler they change this to just do whatever the underlying CPU does because that enables some optimization that they could not make when they forced one particular option for this. Then my code breaks.

3. I can read the language specification and see what it says about signs and modulus. If it says it will always work one specific way, I can code to that and be done with it.

Without a specification for language X, you cannot really write X code. You can just write code for the incompletely documented X-like language implemented by specific releases of specific X-like language compilers, possibly also limited to specific CPUs.
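Rust itself happens to be a handy illustration of option 3 here: the Rust reference specifies `%` as the remainder of truncating division, so the result takes the sign of the dividend (as in C99) regardless of compiler release or CPU. A quick check:

```rust
fn main() {
    // `%` is the remainder after truncating division; the sign
    // follows the left operand (the dividend).
    assert_eq!(-7 % 3, -1);  // b < 0, c > 0  =>  a <= 0
    assert_eq!(7 % -3, 1);   // b > 0, c < 0  =>  a >= 0
    assert_eq!(-7 % -3, -1); // b < 0, c < 0  =>  a <= 0

    // When a non-negative result is wanted (mathematical modulus),
    // the standard library provides rem_euclid:
    assert_eq!((-7i32).rem_euclid(3), 2);
}
```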


True, but (as you say) this merely requires a specification, not a (ISO or other) standard like the GP asked about.

In other words, it's sufficient if the semantics are publicly documented with adequate precision, and the developers of the reference implementation consider any deviation from this specification a bug.


Indeed. It was a very much different situation for, say, C or C++. Back then it was multiple proprietary implementations, and customers (if not vendors) had an interest in a specification.

For Rust, so far there is one (main) implementation, which is open source. Very different scenario.


Having a spec is enough IMO. Go has one. That enables alternative implementations. There are enough reasons to not want an implementation monoculture.


What are the reasons? I see a lot of benefits of an open source monoculture, especially if corporate backing is uncertain.


Competition is great. If there is no competition, implementation quality stagnates. It happened with GCC, for example.


It doesn't seem like a comparable situation. There were already multiple competing C & C++ compilers, including several commercial options. I don't think complacency is holding Rust back.


It would be nice to have a document we can rely on and refer to in case of subtle breakages and bugs. Alternative implementations could also be a possibility down the line. Having a formal spec would also help with code verification; you basically can't do it unless the language is specified.


A formal, machine-readable specification could be useful. I wonder whether a specification in prose form is worth the effort.


Why do we want alternate implementations?


There are some platforms where the current implementation doesn't run.


> And that's why languages such as C and C++ are not a reference implementation from a vendor but an international standard.

Is that true, though? I mean, the spec is so "loose" that most compilers don't even accept the same code if it has any level of serious complexity. Normally that's swept under the rug of 'undefined behaviour', but imo each compiler is a "vendored version" of the language, the most common variants being MSVC, GCC and Clang, and even projects which make efforts to be agnostic or have many eyes (Linux) can't support GCC _and_ Clang easily.


If you move up from kernel level, many projects work with multiple compilers. For example most of the underlying libraries that are part of GNOME work with MSVC, GCC and CLANG. Probably also many more, but it does not tend to be tested much.


> Google backs Go. Apple backs Swift. Amazon is trying to back Rust.

Thankfully we can still use independent languages like Nim, Python, C, C++


Although Nim could really use a high profile company to become great.


Statements like this make no sense. What can I _not_ do with Rust, Go, Java that I am allowed to with Nim, Python, or C/C++? What license or restrictions on distribution of my work product is prevented with any of them?

Too many people forget or are not aware that good C/C++ compilers used to be commercial and expensive; they were independent in idea only, and only over time evolved to be independent of any commercial entity. If I wanted to develop performant executables I may have had to purchase an expensive compiler and abide by its license. If anything, languages are far more independent today than they were.


> What can I _not_ do with Rust, Go, Java ... ?

Trust the language not to be driven by corporate interests, for example by bending it out of shape in future when it suits some needs, or drop it suddenly...

...as it happened a million times with languages and other projects from Google, Oracle, Microsoft.

All evidence indicates that projects that are not tied to a single corporation are more likely to keep their soul and their direction and avoid marketing-based decisions.

The Linux kernel is a good example.


> ...as it happened a million times with languages and other projects from Google, Oracle, Microsoft.

Paranoia. Java and .NET are strong despite their corporate keepers. Open versions of both exist.

Languages are not products, and this is something people seem to have a very difficult time accepting.


I don't think it's that simple:

Here's an example of an AWS open source project, written in Go and hosted on GitHub: https://github.com/kubernetes-sigs/aws-load-balancer-control...


Is your concern that Amazon employing people to work on the open source project will dramatically alter its direction? From all I've seen, I am taking the public comments at face value: Amazon wants to use Rust long term, so they are putting resources into ensuring Rust's long-term viability. I think there exist few things that Amazon could possibly want from this effort that wouldn't also be desired by the ecosystem at large. Changing the language isn't something that is done in the dark; the RFC process is as transparent as it could be (although it is not actively promoted to the entire community, which does produce the "why wasn't I consulted?" reaction from time to time) and that process will not be changed to become more opaque.

Rust the project is resilient because although it was significantly shepherded by Mozilla, it's been years since it relied on any single company or even individual for advancing (although there are many people whose contributions are very important).

Rust the language is stable, if "incomplete", as there are many language "features" that are still needed (by "features" I don't necessarily mean "adding new things" as some might be worried about, but rather "making existing features work in more places" which although they complicate the implementation of the language simplify the usage of the language).

Rust the ecosystem has advanced by leaps and bounds in the past couple of years but is still green, particularly if you want to use anything outside of the beaten path trail blazed by hobbyists and early adopters. Writing new crates people might want or bringing existing crates up to production quality requires resources: people, money and time. Amazon, Microsoft, Facebook and Google are four large companies that are providing some of those resources. Lots of other companies and individuals are as well. I see these developments as a good thing.


Thank god, we still have C++ :)


I see that Jon Gjengset was hired into this team as well. His long-form YouTube videos about intermediate level rust are excellent and inspiring. Check them out!


Off topic - somehow I’ve never heard of theregister.com. Looks interesting, is it worth reading?


Depends on what you're out after.

The Register is basically the Buzzfeed of technology news, not unlike TechCrunch. Many articles are "outrage" articles and most are driven by emotion, not facts. Although there are some gems in there, not gonna lie.

I tend to not read The Register directly, but I'm not actively avoiding it either. It's just a lot of noise, so having HN be the filter (and recommendations from friends) seems to leave me with more quality articles. Thanks HN :)


They are often first with the news, they tend to be quite satirical, they play on it, they've been around forever.

They have the BOFH (Bastard Operator From Hell) series which is always worth a read.


They used to be more influential in the tech scene - in typical British tabloid fashion they nicknamed Intel's Itanium chip the "Itanic" which stuck and seemed to be how everyone ended up referring to that disaster. They! still! do! great! Yahoo! headlines! too!


Yes. It has a rather unique style, as you'll soon find out, and sometimes its own lingo, mimicking UK tabloids, but it's quite worth reading.

Their motto used to be "Biting the hand that feeds IT"


Still says that in the copyright footer.


They certainly have their, uh, own style; you'll usually either like it or not. They generally try to stay at least a little factual, though. ;)


Amazon has been on a hiring spree lately. https://www.nytimes.com/2020/11/27/technology/pushed-by-pand...


No point in complaining about free beer. Let's just say that this is beyond underwhelming news.


This is a good thing.

Python wasn’t backed by a vendor, and some of the decisions that worked against it might not have happened if it was. The 3.0 compatibility break, the packaging incompatibilities, etc.

And most importantly aws does not have an in-house language that competes for strategic roadmap space.


Microsoft recently hired Guido van Rossum though..

Let's hope for an awesome python4 !


Guido also worked at Dropbox from 2013-19. Python's shortcomings probably can't be attributed to its lead struggling to pay rent.


Python competes with .NET, which is strategic to the Windows platform.


Only if we are talking about learning to program or OS scripting.

ML? .NET can easily bind to the same C, C++ and Fortran libraries used by "Python" libraries.

Web development, yeah as long as one doesn't get thousands of requests per second stressing the poor CPython interpreter.

Desktop? WinRT bindings are still embryonic, and there is hardly anything comparable to Forms or WPF.


Data science. Python dominates. You would embrace and extend Python to fit it into your strategic platform if you wanted to grow in that space.


CUDA, C++ and Fortran dominate, not Python

Any language can glue to those libraries, including R and Mathematica, and anything with a proper FFI.


> CUDA, C++ and Fortran dominate, not Python

This claim is hard to defend. Search job sites for data science and $LANGUAGE, for example on indeed python is about 3x of all of those combined.



