The killer feature for me would be the ability to select an interpreter-only mode... ideally at either compile time or run time.
The products I work on sometimes need to be deployed in environments where JIT code is disallowed due to security constraints. Now, I don't mind if my code runs dog-slow there, but it does still need to run.
Some projects that use JIT techniques work well in this scenario. For example, PCRE has an opt-in JIT, so at regex-compile time I can decide to stick with the non-JIT implementation and take a speed hit. Everything still works.
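To make the opt-in concrete, here's roughly what that looks like with the PCRE2 API (a from-memory sketch, error handling trimmed; the use_jit flag is just my own toggle, not a PCRE2 option):

    /* Opt-in JIT with PCRE2: skip pcre2_jit_compile() and matching stays interpreted. */
    #define PCRE2_CODE_UNIT_WIDTH 8
    #include <pcre2.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int use_jit = 0;  /* set to 1 only where executable memory is allowed */
        int errcode;
        PCRE2_SIZE erroffset;
        PCRE2_SPTR pattern = (PCRE2_SPTR)"\\d+";
        PCRE2_SPTR subject = (PCRE2_SPTR)"abc 123";

        pcre2_code *re = pcre2_compile(pattern, PCRE2_ZERO_TERMINATED, 0,
                                       &errcode, &erroffset, NULL);
        if (!re) return 1;

        if (use_jit)
            pcre2_jit_compile(re, PCRE2_JIT_COMPLETE);  /* the opt-in step */

        pcre2_match_data *md = pcre2_match_data_create_from_pattern(re, NULL);
        int rc = pcre2_match(re, subject, strlen((const char *)subject),
                             0, 0, md, NULL);
        printf("match rc = %d\n", rc);  /* > 0 on success either way, just slower without the JIT */

        pcre2_match_data_free(md);
        pcre2_code_free(re);
        return 0;
    }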
I know when V8 was new-ish, the party line if you wanted a JITless version was to just have it target ARM and emulate the instruction stream. However, I don't know if anyone ever implemented such a beast. It would also be ugly if you wanted to decide at runtime whether to JIT, since I don't know how easily V8 can dual-target.
Anyway, if Ignition can run any V8 code maybe we can get an Ignition-only mode? It would also open up a path to running V8/node.js on CPUs that V8 can't target natively yet, albeit somewhat slowly.
An alternative would be to use a JS engine like Mozilla's (or a purely interpreted one like Duktape), but the ecosystem (node and whatnot) seems to have settled on V8, so it would definitely be nice to be able to keep using it.
Apologies if I'm talking nonsense and there's an easy way to do this now... the last time I looked at this problem in depth was pretty early in V8's life. I haven't heard any news about JITless operation, though.
IIRC when they first announced Ignition, running in non-JIT environments was an explicit "non-goal".
I don't think that they are against it, but I'm assuming nobody is really trying to get that to work, and I think some code is still JITted during Ignition's normal operation, so getting a "pure" interpreter might be much more work.
Things might have changed since then, so someone with more up-to-date information will have to chime in if I'm horribly off.
But I do have issue with this:
>An alternative would be to use a JS engine like Mozilla's (or a purely interpreted one like Duktape), but the ecosystem (node and whatnot) seems to have settled on V8, so it would definitely be nice to be able to keep using it.
The ecosystem has not settled on V8 (hell, I just started playing with Espruino, which is a JavaScript engine built to run on extremely low-power devices like the ESP8266), and IMO it shouldn't ever settle on one implementation. Alternate engines keep the JavaScript ecosystem healthy: they prevent bugs in one from becoming a de facto standard, they promote competition and specialization, and they're how JavaScript standards become standard (IIRC something like 3 separate major implementations are needed).
Chakra is really giving V8 a run for its money recently, and even JavaScriptCore is quickly catching up and is much better in areas like memory usage.
> so getting a "pure" interpreter might be much more work.
If that's the case I'd have to take their word for it... it's their code, so they know the pitfalls. It still surprises me, though: if I were developing a feature like that, it would be the first thing I'd want to get working so I could try all of my test cases in interpreter-only mode.
Maybe there are some types of code that the interpreter just can't handle and that always fall back to the JIT?
> The ecosystem has not settled on V8
Sorry, I probably didn't express myself well. I personally am excited about the development of other JS engines, and I also hope that continues.
For non-browser JS, though, node is the 800 lb gorilla. I'm hoping that the (apparently reborn?) SpiderMonkey or node-chakracore really take off to the point where they're first-class options for use with node. However, I'm not sure that will ever happen.
I personally think Duktape gets a lot of things right with its Lua-inspired API and easy portability. However, it will probably never have a JIT available.
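For anyone curious what that API feels like, a minimal embedding is roughly this (a sketch from memory against the documented Duktape C API; the printf bit is just illustrative):

    /* Minimal Duktape embedding sketch: no JIT, no executable memory needed. */
    #include <stdio.h>
    #include "duktape.h"

    int main(void) {
        duk_context *ctx = duk_create_heap_default();  /* default allocator, no extras */
        if (!ctx) return 1;

        duk_eval_string(ctx, "1 + 2");                 /* result is left on the value stack */
        printf("1 + 2 = %d\n", (int)duk_get_int(ctx, -1));

        duk_destroy_heap(ctx);
        return 0;
    }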
I'm pretty sure they have a way to run in "Ignition only" mode, but Ignition still uses executable memory for some things, so an "Ignition only" mode is not a "JIT-free" mode.
You can enable or disable SpiderMonkey's various JITs in Firefox's about:config prefs. Try disabling all the JIT prefs and running some JS benchmarks to measure interpreter-only mode. :) Each JIT level is roughly 10x faster on my machine.
javascript.options.baselinejit controls the first-pass JIT.
javascript.options.ion controls the IonMonkey optimizing JIT.
javascript.options.asmjs controls the OdinMonkey asm.js JIT.
javascript.options.wasm and javascript.options.wasm_baselinejit control the new WebAssembly JITs.
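If you want that to stick rather than toggling about:config by hand, the same prefs can go in a profile's user.js (same names as above; false turns each one off, and in the wasm case it disables wasm itself, I believe):

    // user.js in the Firefox profile directory
    user_pref("javascript.options.baselinejit", false);
    user_pref("javascript.options.ion", false);
    user_pref("javascript.options.asmjs", false);
    user_pref("javascript.options.wasm", false);
    user_pref("javascript.options.wasm_baselinejit", false);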
Yes, last I checked you could still build Firefox on platforms they didn't have a JIT for, so it certainly must be possible. Sadly, SpiderMonkey isn't as commonly used for embedding compared to V8. Maybe Spidernode will take off in the node community and make it a more popular option... we'll see.
From watching the video it seems that Ignition is ~2x slower than the JIT, which sounds like great performance for a non-JITted implementation.
The LuaJIT2 / new-V8 optimized interpreter is as fast as the LuaJIT1 / old-V8 compiler. And to make it cross-platform, it uses the LuaJIT/V8 macro assembler rather than standard platform tools. (Although LuaJIT2's macro assembler is architecture-dependent and does much less than TurboFan's bytecode/macro assembler.)
I suspect they'll realize at some point that getting the 20% or so interpretation speedup from writing crucial parts of the interpreter in pure assembly is easier than specializing TurboFan to generate good interpreter code - see Mike Pall's explanations[0]
Not just LuaJIT: most heavily worked-on VMs seem to converge on this point. It's basically the HotSpot architecture, but with the source-code compiler included and a register-based bytecode instead of a stack-based one. Oh, and two heavily optimising compilers instead of one; I guess they'll want to fix that. It was disappointing to hear that Crankshaft and TurboFan are both still in existence - I remember TurboFan being announced quite some time ago.
I really recommend the video in the article for anyone that wants to learn about some V8 internals, how they got to this point, and where it is headed.
They are replacing three JITs with an interpreter and a single JIT, which should improve speed and memory usage and allow them to add new features faster, but the climb to that point really looks like a tough one.
Watching that talk, it really sounds like they painted themselves into a corner with the complexity of full-codegen, Crankshaft, and TurboFan. And the path they are taking out is really interesting. By introducing Ignition they are adding on MORE complexity, but eventually, when TurboFan is well-rounded enough and Ignition is battle-tested and improved, they are planning to drop full-codegen and Crankshaft and hook Ignition straight into TurboFan, making everything simpler and hopefully faster!
Hm, well, as they say - they've actually now got two compilers even in the ideal case where they get rid of Crankshaft: source to bytecode, and then bytecode to machine code.
It does sound like there's a bit of a story behind V8's evolution here. I wonder when they'll try to standardise the bytecode they invented and introduce an extra protocol on top of HTTP to download it instead of source code. Seems like an obvious optimisation if compressed bytecode is more compact than compressed source (unclear if it would be), or maybe even if it's a bit larger if network speeds continue to improve and all the compile steps become the dominant factor in page load times.
Especially with WebAssembly it seems the whole web architecture is slowly winding its way back to the basic design of Java applets.
Seconded. If you've ever wondered about the internals of Chrome and where they're headed next this is a good watch. 2017 should be the year all of it comes together.