
I don't want to be "that guy", but I think it's a valid question to ask: why didn't they write this in ... ... .... Rust?

I think Go and Rust serve two somewhat different use cases: Go for distributed computing in a nutshell, and Rust for fast-running programs. And I think Rust would have been a better choice.

Edit: looks like everybody and their dog has a Lua engine now - maybe I'll create one in Rust.




Idiomatic Go code tries to use cgo sparingly, if at all. And the Lua embedding API makes calls into C with a very high frequency. I haven't tried, but I suspect that the combination of these two factors would make embedding a Lua engine written in C (or, say, in Rust with a C ABI) into a Go program disastrous for the performance of Go's green threading runtime.

Alternatively, while it would be nice to have a Lua engine entirely in Rust, Rust goes to great lengths to make calling into C have no runtime cost, so you generally just see people writing Rust wrappers over the C API, which tend to impose varying degrees of additional safety on top (Lua internally does a ton of setjmp/longjmp, among other things, so it's not easy). A good example of "wrap and add safety" is Chucklefish's wrapper, being used for their game Witchbrook: https://github.com/kyren/rlua


The "why not Rust?" question is a valid one. It's because this was put together with the intent of being embedded in an application written in Go (https://helm.sh).

I'm a Helm maintainer so I'm familiar with the history of how this came to be.


What is the advantage of using a full-blown language for Helm configuration instead of just using something that is a bit more flexible but not Turing-complete, like https://github.com/dhall-lang/dhall-lang (or something similar)?

Helm has so far managed with a simple template language, and now it suddenly jumps to a Turing-complete configuration language - this seems like quite a large step... What is the justification?


Good question. For 90+% of things a simple template language works. But there are cases where you need something more. For example, the features that were baked into Helm v2 to support installing OpenStack in Kubernetes were complex. IIRC, no one on GitHub has used them, but they are complicated and in use privately. That complexity should be pushed to the chart rather than to Helm. Helm shouldn't need to hold complexity that sits in the direct path for everyone just to serve those complicated apps. This is where Lua comes in.

Lua provides the ability to embed the language easily (it was designed for that) for use in cross-platform situations (Windows included).
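For a sense of how small that embedding surface tends to be, here's a minimal sketch using gopher-lua (an existing pure-Go Lua 5.1 implementation, used here purely for illustration - not the engine discussed in this thread):

    package main

    import (
        "fmt"

        lua "github.com/yuin/gopher-lua"
    )

    func main() {
        L := lua.NewState()
        defer L.Close()

        // Expose a Go function that Lua scripts can call.
        L.SetGlobal("double", L.NewFunction(func(L *lua.LState) int {
            n := L.CheckInt(1)         // first argument from the Lua side
            L.Push(lua.LNumber(n * 2)) // push the result
            return 1                   // number of return values
        }))

        if err := L.DoString(`print("doubled: " .. double(21))`); err != nil {
            fmt.Println(err)
        }
    }

A pure-Go engine along these lines needs no cgo and builds anywhere Go does, Windows included.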

Does that help?


One of the motivations (linked in another comment) may reveal why:

> We also wanted clean and idiomatic Go APIs

If you want to use this as a scripting/plugin language for Go programs (say, for those distributed computing programs you mentioned), it's pretty defensible to make it integrate with Go better by... building it in Go.


> why didn't they write this in ... ... .... Rust?

Probably because they wanted to use it from Go?


Another reason might be GC. They are probably using Go's GC for Lua. In Rust you would also have to write the GC.


> They are probably using Go's GC for Lua.

That's quick and dirty. Dirty, because the way Go's GC is optimized is likely to be very unsuitable for Lua.


I am by no means an expert on that particular subject, but from what little I do know, both Go and Lua (the regular C version) have garbage collectors optimized for low latency; I may be wrong, but if I am not, it looks like a good match, doesn't it?


Likely they wanted an idiomatic Go API, and maybe had more in-house Go knowledge than Rust knowledge.

I can imagine how exposing a Rust API to a non-Rust program could be as problematic as a C++ API: your common denominator is a C API, which adds friction and removes (some of) the safety.


There is quite a big overhead calling out from Go, so folding stuff they want to use into pure Go often makes sense from a performance perspective.
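As a rough illustration of that overhead (a toy micro-timing, not a proper benchmark; exact numbers vary a lot by Go version and platform), something like the following shows the per-call gap between a plain Go call and a cgo call:

    package main

    /*
    static int c_noop(int x) { return x; }
    */
    import "C"

    import (
        "fmt"
        "time"
    )

    //go:noinline
    func goNoop(x int) int { return x }

    func main() {
        const n = 1_000_000

        start := time.Now()
        for i := 0; i < n; i++ {
            goNoop(i)
        }
        goDur := time.Since(start)

        start = time.Now()
        for i := 0; i < n; i++ {
            C.c_noop(C.int(i))
        }
        cgoDur := time.Since(start)

        // A cgo call typically costs on the order of tens of nanoseconds,
        // versus roughly a nanosecond for a plain Go call; that adds up
        // quickly when an embedded API is hit in a tight loop.
        fmt.Printf("Go calls:  %v total\n", goDur)
        fmt.Printf("cgo calls: %v total\n", cgoDur)
    }

For something like a Lua engine, where the host program crosses the boundary constantly, folding it into pure Go sidesteps that cost entirely.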


Yeah, but can't we have both?

If you already have projects in Go, then a Rust project will not be of any use, and vice versa...


The more interesting question is "why Lua?" The issue others have linked says "We need a Lua 5.3 engine", I'd be curious why. Presumably that means they want to use some existing Lua plugins? Is this something that's built into Azure?


> I don't want to be "that guy"

Then don't.


I think you got it backwards. Go is good for toy interpreters, compilers, and programming languages in general. It's pretty bad for distributed computing though; Rust is definitely better for that use case if your only choices are Rust and Go.


As someone who is usually critical of some Go design decisions, I fail to see why.

If you are going to say GC, it all comes down to how one designs the respective data structures.


There are several distributed databases and other systems being built in Go now, so it seems it works fine.


Rust's type system is derived from the ML variants, and as such is fantastic for writing compilers and interpreters. ML stands for "meta language", i.e. it's named after how good it is for writing compilers.


Kind of; I prefer the productivity I get from the GC in ML language variants.

Having done some compiler-related work back when we had Caml Light, OCaml was still called Objective Caml, and Miranda was being turned into Haskell, I don't see what I would gain from having to deal with the borrow checker.


You don't really gain anything from the borrow checker in this case, but you don't really lose anything either if you're used to it.

It's the ADTs that are killer compared to Go.
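A minimal sketch of what that difference looks like in practice (the names here are made up): in Go, an AST usually ends up as a sealed interface plus a type switch, and nothing forces you to handle every case, whereas an ML/Rust sum type with exhaustive matching catches a missing case at compile time.

    package main

    import "fmt"

    // In ML or Rust this would be one sum type with exhaustive matching.
    // The usual Go emulation: a sealed interface plus a type switch.
    type Expr interface{ isExpr() }

    type Num struct{ Value int }
    type Add struct{ Left, Right Expr }
    type Neg struct{ Operand Expr }

    func (Num) isExpr() {}
    func (Add) isExpr() {}
    func (Neg) isExpr() {}

    func eval(e Expr) int {
        switch e := e.(type) {
        case Num:
            return e.Value
        case Add:
            return eval(e.Left) + eval(e.Right)
        case Neg:
            return -eval(e.Operand)
        default:
            // If a new node type is added, the compiler won't tell us this
            // switch is incomplete; we only find out at runtime.
            panic(fmt.Sprintf("unhandled node %T", e))
        }
    }

    func main() {
        fmt.Println(eval(Neg{Add{Num{1}, Num{2}}})) // prints -3
    }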


Yeah, but then why Rust and not another ML variant?


The discussion here is between Rust and Go. My point is that swapping the GC for the borrow checker in order to get ADTs is a net win to me.

If we're veering off into hypotheticals that don't have anything to do with the question at hand: I personally write compilers and interpreters in Rust rather than another ML because my use cases tend to be sensitive enough to worst-case latency that a GC is a non-starter.


Fair enough, thanks for replying.



