If I have all the time in the world, sure. When I'm racing against a deadline, I don't want to wrestle with the borrow checker too. Sure, its objections help with the long-term quality of the code and reduce bugs, but that's hard to justify to a manager/process driven by Agile and sprints. It's quite possible that an experienced Rust dev can be very productive, but there aren't tons of those going around.
Java has the stigma of ClassFactoryGeneratorFactory sticking to it like a nasty smell, but that's not how the language makes you write things. I write Java professionally and it is as readable as any other language. You can write clean, straightforward, easy-to-reason-about code without much friction. It's a great general purpose language.
Java is incredibly productive - it's fast and has the best tooling out there IMO.
Unfortunately it's not a good gaming language. GC pauses aren't really acceptable (a problem C# also suffers from), and GPU support is limited.
Miguel de Icaza probably has more experience than anyone building game engines on GC platforms, and he is very vocally moving toward reference-counted languages [1].
Probably much better, given the improvements to the Swift optimizer, but it just goes to show that "tracing GC" bad, "reference counting GC" good isn't as straightforward as people make it out to be, even if they are renowned developers.
It's a cherry picked, out-of-date counter-example. Swift isn't designed for building drivers.
In reality, a lot of Swift apps are delegating to C code. My own app (in development) does a lot of processing, almost none of which happens in Swift, despite the fact I spend the vast majority of my time writing Swift.
Swift is an excellent C glue language, which Java isn't. This is why Swift will probably become an excellent game language eventually.
> It's a cherry picked, out-of-date counter-example. Swift isn't designed for building drivers
Do you have a counter benchmark? The burden is on you here to disprove the data presented. What that benchmark shows is that you spend A TON of time counting references, much more than a tracing GC does, unless you enter Swift's C++ networking code. I would think games don't spend most of their time calling into networking code.
> In reality, a lot of Swift apps are delegating to C code. My own app (in development) does a lot of processing, almost none of which happens in Swift, despite the fact I spend the vast majority of my time writing Swift.
So what you’re saying is that any language will be good for gaming since they all can delegate to C?
> Swift is an excellent C glue language, which Java isn't. This is why Swift will probably become an excellent game language eventually.
What makes Swift better at calling C than Java? AFAIK Java has a perfectly good and brand new foreign function interface.
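For reference, this is roughly what that looks like: a minimal sketch (assuming JDK 22+, where the Foreign Function & Memory API from Project Panama is finalized), with strlen as an arbitrary target and the class name purely illustrative:

    // Minimal sketch (assuming JDK 22+): calling the C library's strlen
    // through the Foreign Function & Memory API, no JNI glue code involved.
    import java.lang.foreign.*;
    import java.lang.invoke.MethodHandle;

    public class StrlenDemo {
        public static void main(String[] args) throws Throwable {
            Linker linker = Linker.nativeLinker();
            // strlen lives in the C library the linker already knows about.
            MemorySegment strlenAddr = linker.defaultLookup().find("strlen").orElseThrow();
            // Describe the native signature: size_t strlen(const char *s).
            MethodHandle strlen = linker.downcallHandle(
                    strlenAddr,
                    FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));

            try (Arena arena = Arena.ofConfined()) {
                // Copy the string into off-heap memory the GC will not move.
                MemorySegment cString = arena.allocateFrom("hello from java");
                long len = (long) strlen.invokeExact(cString);
                System.out.println(len); // prints 15
            }
        }
    }

The ceremony is much smaller than classic JNI; the main remaining difference from Swift's C interop is that Java objects still have to be copied into off-heap MemorySegments before C code can see them.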
> This is why Swift will probably become an excellent game language eventually.
I would take this bet that it won’t. Purely on the sheer fact that gaming occurs on Windows and Swift is barely capable there.
> I would think games don’t spend most of their time calling into networking code.
Exactly. That is why the out-of-date networking example being touted as evidence is irrelevant here.
What it boils down to is that Java and C have fundamentally incompatible memory models. Direct access to C memory is impossible because of the managed heap and GC.
> I would take this bet that it won’t. Purely on the sheer fact that gaming occurs on Windows and Swift is barely capable there.
This is a very odd comment - gaming occurs in a lot of places. Quite a lot happens on mobile these days. Turns out a lot of mobile devices run Swift, on which it appears to be reasonably capable.
It surely is, according to Apple's own documentation.
> Swift is a successor to the C, C++, and Objective-C languages. It includes low-level primitives such as types, flow control, and operators. It also provides object-oriented features such as classes, protocols, and generics.
If developers have such a big problem gluing C libraries into Java via JNI or Panama, then maybe the game industry is not where they are supposed to be, given that even Assembly comes into play.
People have 240 Hz monitors these days, so you have a bit over 4 ms to render a frame. If that 1 ms can be eliminated or amortised over a few frames, it's still a big deal, and that's assuming 1 ms is the worst-case scenario and not the best.
I don’t think you need to deal in absolutes here. There are plenty of games that do not need to render at 240 Hz and are capable of handling pauses up to 1 ms. There are tons of games currently written in languages that have larger GC pauses than that.
That has not been my experience. Sure, you don't have any control over the third-party stuff, but I haven't seen this issue being widespread in the mainstream third-party libraries I've used, e.g. logback, jackson, junit, jedis, pgJDBC, etc., which are very well known and widely used. The only place I've actually seen a proliferation of this was from a contractor who, I suspect, was trying to ensure job security through impenetrability.
In Objective-C, due to the way the language works, besides ClassFactoryGeneratorFactories, you would also need to add all the parameter names to the identifier.
I'd have said the same thing 10 years ago (or, I would have if I were comparing 10-year-old Java with modern Rust), but Java these days is actually pretty ergonomic. Rust's borrow checker balances out the ML-style niceties to bring it down to about Java's level for me, depending on the application.
PascalCase has been my favourite since MS-DOS days, I have been through most Borland products, and Microsoft ones, alongside many Pascal influenced languages, thus it feels like home. :)
But yeah, it is subjective, and I don't have many qualms with the other alternatives either.
Kotlin is nice indeed. Most of the issues I had with it were in interop with Java code (those pesky platform types that behave like non-nullable but are actually nullable, and you are back in the NPE swamp!)