> The unnecessarily-long-for-debugging compilation times certainly put me off. I think I do like the semantics of Julia better than Python's, but the compilation times are a deal-breaker for me. Perhaps I might make the switch once this gets fixed in the coming years :)
I like Julia, but this is a point of frustration for me as well. Compilation time is fine for smaller scripts. But whenever I've tried using more powerful packages such as the SciML ones (DifferentialEquations.jl, DiffEqFlux.jl, Symbolics.jl), the precompilation time during the edit/debug cycle can be excruciating. It's a shame because the SciML packages are so amazing.
I'm waiting until Julia's static compilation process improves and is standardized. On-the-fly compilation that disappears every time you close the REPL will never be fast enough to give users a good experience when working with highly complex code.
EDIT: Just to be clear, I'm overall confident that things will improve. The folks at Julia have already made good progress in reducing precompilation times from where they used to be. Looking forward to seeing further advancements in the future.
Quote: "Coming very soon: a version of DifferentialEquations.jl that fully precompiles the solvers on Vector{Float64}, virtually eliminating any JIT lag. #julialang #sciml"
It's still rough. They've been at it for years, and it's a perpetual bed of sand... Many machines will fail to install it because installation is so resource-intensive... Promising idea overall; maybe in three years or so it'll be worth using for something outside of research.
How does that even work? Even with full precompilation, the LLVM time should still be there, no? In my usual Julia workloads, LLVM time is a significant fraction of compile time, so full precompilation only takes 30-50% off latency.
You can strip it out with a system image, and you don't even need more than the basic StaticArrays and LoopVectorization stuff in the image to get almost all of it.
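For anyone curious what that looks like in practice, here's a rough sketch using PackageCompiler.jl to bake packages into a custom system image. The package list and file names here are just illustrative examples, not a prescription:

```julia
# Sketch: building a custom system image with PackageCompiler.jl, so that
# package code (including its LLVM-compiled output) is cached ahead of time
# instead of being recompiled every session.
using PackageCompiler

create_sysimage(
    [:StaticArrays, :LoopVectorization];        # packages to bake in (example choice)
    sysimage_path = "custom_sys.so",            # output image (example name)
    precompile_execution_file = "warmup.jl",    # hypothetical script of representative
                                                # calls, so their methods get cached too
)
```

Starting Julia with `julia --sysimage custom_sys.so` then loads those packages with essentially no JIT lag. The trade-off is that the image has to be rebuilt whenever any of the baked-in packages change.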
Nice, that Twitter announcement looks fantastic! Looking forward to trying out the new precompiled version.
And agreed that the folks at SciML (and the rest of Julia) have put amazing efforts into reducing the compilation lag from where it used to be :) I'm optimistic that things will improve--it'll just take some time.
They've been at it for half a decade or so. Ignoring compilation times, they shuffle the code around so frequently that its only real use, in my opinion, is for the authors to publish papers and stay three steps ahead of any of its users, who give up hoping for a stable tool after a few cycles. It's a shame, but it's academia at its finest.
This part of your criticism seems quite disingenuous. You're simultaneously criticizing them for not improving their code quickly enough, and for changing it too much.
I appreciate your view, but seeing as most Julia projects work this way, I sometimes wonder whether it's just a problem with the language itself. I'm not trying to be a troll with impossible expectations, but the code genuinely is unstable, and yes, they have been working on it for a long time.
We started working on it at JuliaCon 2021, when it was at 22 seconds. See the issue that started the work:
https://github.com/SciML/DifferentialEquations.jl/issues/786. As you can see from the tweet, it's now at 0.1 seconds. That happened within one year.
Also, if you take a look at a tutorial, say the tutorial video from 2018,
https://youtu.be/KPEqYtEd-zY, you'll see that the code is still exactly the same and unbroken over the half decade. So no: compile times have only been worked on for about a year, and code from half a decade ago still runs just fine.
I think you misunderstood me, all good. The diffeq/SciML landscape has been a work in progress for half a decade, with lots of pieces of it changing rapidly and regularly. But so has the rest of the ecosystem. I think we both know how often this code has changed, but for some reason the Julia people are always saying "oh, we have packages for that" or "oh, that's rock solid." Then you check the package, and it's a flag plant that does nothing, or it's broken from a minor version change. Then you try to use it, maybe even fix it, and it breaks Julia base... I'm not going to waste any more time digging into this to file an issue or prove a point.
I think passersby should be made aware of the state of things in the language without spin from people making a living selling it. No personal offence to you; just please consider not overselling. It's damaging to people who jump in expecting a good experience.
I linked you to a video tutorial from 2018, https://www.youtube.com/watch?v=KPEqYtEd-zY . Can you show me what code from that tutorial has broken in the SciML/DiffEq landscape? I know that A_mul_B! changed to mul! when Julia v1.0 came out in 2018, but is there anything else that changed? Let's be concrete here. That's still the tutorial we show at the front of the docs, and from what I can tell it all still works other than that piece that changed in Julia (not DifferentialEquations.jl).
> I'm not going to waste any more time digging into this to file an issue or prove a point.

> No personal offence to you; just please consider not overselling. It's damaging to people who jump in expecting a good experience.
I'm sorry, but non-concrete information isn't helpful to anyone. It's not helpful to the devs (what tutorial needs to be updated, and where?) and it's not helpful to passersby (something changed, according to somebody; what does that even mean?). I would be happy to add a backwards-compatibility patch if there were some clearer clue.
> I think passersby should be made aware of the state of things in the language without spin from people making a living selling it.
The DiffEq/SciML ecosystem is free and open source software. There is nobody making a living from selling it.
> I like Julia, but this is a point of frustration for me as well.
It's a point of frustration for most of the community, it's just something we're willing to put up with (and work around) for the other benefits of the language. If the tradeoff doesn't seem worth it to you, it's totally fine to wait it out until it's in a more acceptable place for you.
There's a lot of focus on improvements surrounding this in the recent and upcoming versions, precisely because of this frustration, but it's still going to be a gradual process.