It seems to be very fast within the space of small JavaScript interpreters meant for embedding. That's a different niche with different constraints.
But "embedding" is a very vague term. You can embed V8 too. It's just a C++ library, it can be linked into other programs. It can run on relatively small devices. If you're going to say, but what about even smaller devices than that then sure, maybe this implementation can squeeze into a certain class of rare device that can't run V8. But then why would such a constrained device be running JavaScript at all. That would seem to be the issue there!
You certainly can embed V8, but it's a more involved affair. It's not the use case that's prioritized.
It's not at all my field of expertise, but my guess is that the drawbacks of V8 are that it's an order of magnitude larger, that it's written in C++ rather than C89, that it's more likely to make large changes to the way it works, and that it uses more memory.
All of those are good decisions for a component of Chrome whose job is running webpages. But sometimes you only want to let users of a modestly sized technical application script its behavior in a language they may already know, and then those properties are undesirable.
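For a sense of scale, embedding a small interpreter for that kind of scripting is only a handful of calls. A rough sketch, assuming the library under discussion is QuickJS and using functions from its public quickjs.h header (the "<embed>" filename label and the sample script are just for illustration):

    #include <string.h>
    #include "quickjs.h"

    int main(void) {
        /* one runtime, one context: that's the whole setup */
        JSRuntime *rt = JS_NewRuntime();
        JSContext *ctx = JS_NewContext(rt);

        const char *script = "1 + 2 * 3";
        JSValue result = JS_Eval(ctx, script, strlen(script),
                                 "<embed>", JS_EVAL_TYPE_GLOBAL);

        if (!JS_IsException(result)) {
            int32_t n;
            JS_ToInt32(ctx, &n, result);  /* n == 7 */
        }

        /* values are reference-counted; free what you took */
        JS_FreeValue(ctx, result);
        JS_FreeContext(ctx);
        JS_FreeRuntime(rt);
        return 0;
    }

Compare that with V8, where even a minimal embedder has to set up an Isolate, HandleScopes, and a Context before evaluating anything, and has to build and link a much larger C++ codebase to get there.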
"Quick" calls to mind something that's not just fast, but nimble.