What I always admired about the ZX Spectrum's BASIC was that it skipped the entire idea of a lexer, because, well, the keyboard let you input tokens directly.
I think it was closer to the other way around. The keyboard let you input tokens because it didn't have a lexer.
I suspect this had more to do with the memory cost of storing a literal line of code than with the ROM or CPU cost of a lexer. Going back to the ZX80, even storing the current line being typed as characters was probably a burden.
That can't be it. The BASIC interpreter could run the lexer on each line as the user typed it and store the tokens. You would lose the ability to print lines back exactly the way the user typed them, but that isn't a big blocker for the technology of the time.
I can confirm that AppleSoft ][ BASIC stored its programs in memory in tokenized form.
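To make the memory argument concrete, here's a rough Python sketch of tokenize-on-entry storage. The keyword byte codes are invented for illustration (every real BASIC had its own token table, and a real tokenizer would also skip over string literals):

```python
# Hypothetical keyword table: each keyword maps to a single byte code.
KEYWORDS = {"PRINT": 0xF5, "GOTO": 0xEC, "LET": 0xF1, "IF": 0xFA, "THEN": 0xDE}

def tokenize(line: str) -> bytes:
    """Replace each keyword with its one-byte code; keep everything else as ASCII."""
    out = bytearray()
    for word in line.split(" "):
        if word.upper() in KEYWORDS:
            out.append(KEYWORDS[word.upper()])
        else:
            out.extend(word.encode("ascii"))
        out.append(ord(" "))
    return bytes(out[:-1])  # drop the trailing separator

line = "10 IF X THEN GOTO 100"
print(len(line), "bytes as plain text vs", len(tokenize(line)), "bytes tokenized")
```

On a typical line you only save a handful of bytes, but on a machine like the ZX80 with 1 KB of RAM those bytes add up quickly.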
On another note, I went ahead and wrote a code emitter, because I couldn't wait for part three (though I noticed it's in the GitHub repo). Once you get this compiler up and running, the sane thing to do is write a Makefile so you can treat Teeny as a first-class programming language. Just add the following rule:
%.c : %.tiny
<TAB>python3 ./teenytiny.py $< > $@
And you'll be able to type `make hello` and have hello.tiny compiled to hello.c and then to a hello executable, automagically (Make's built-in rules take care of the C-to-executable step). I can never pass up an opportunity to remind people that Make is wonderful.