Neural nets often fail with (repetitive) gibberish output when the input is too different from the training data. This model appears to take in the entire text input at once, or at least to look ahead at the next input letters, so the unusual "bla bla" at the end could mess up outputs near the beginning.
The "bla bla" actually doesn't do much, that's the "My first" that triggers it most of the time. I only added the "bla bla" in the end to make the line longer because it looks better that way, but just writing "My first" or even "My f" is enough.
It is described as a "Realistic handwriting generator. Convert text to handwriting using an in-browser recurrent neural network", so unlike GPT it is not a transformer, and it is small, so it most likely doesn't take the entire text input at once. Most likely it simply overshoots the previous stroke, decides that a loop is the most appropriate way to continue, then overshoots that loop, and again, and again, until by chance it stops overshooting and proceeds to the rest of the text. A cursive style like #2, the need for precise strokes (high legibility), and specific letter transitions seem to exacerbate the problem.
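For intuition, here's a minimal toy sketch of how a Graves-style handwriting RNN generates strokes (the site doesn't publish its internals, so this is an assumption about the architecture; the weights are random and names like `attention_window` are made up, not the site's actual code). The model emits one pen offset per step, conditioned only on past states and a soft window over the characters, so a single overshoot feeds back into every later step:

```python
import numpy as np

rng = np.random.default_rng(0)
H = 16                                        # toy hidden size

text = "My first bla bla"
chars = np.eye(128)[[ord(c) for c in text]]   # one-hot character encodings

W_hh = rng.normal(0.0, 0.3, (H, H))           # made-up random weights;
W_xh = rng.normal(0.0, 0.3, (2 + 128, H))     # a real model is trained

def attention_window(kappa):
    # Soft Gaussian window over character positions, as in Graves (2013).
    # The center only creeps forward, so the net reads the text left to
    # right and never sees the whole string at once.
    kappa += 0.1
    pos = np.arange(len(text))
    w = np.exp(-0.5 * (pos - kappa) ** 2)
    return w / w.sum(), kappa

h = np.zeros(H)
offset = np.zeros(2)                          # previous (dx, dy) pen offset
kappa = 0.0
pen, strokes = np.zeros(2), []
for _ in range(300):                          # one pen offset per step
    w, kappa = attention_window(kappa)
    window = w @ chars                        # soft "current character"
    x = np.concatenate([offset, window])
    h = np.tanh(h @ W_hh + x @ W_xh)          # toy RNN update (real: LSTMs)
    offset = h[:2] + 0.05 * rng.standard_normal(2)  # sample the next stroke
    pen = pen + offset                        # an overshoot here feeds back
    strokes.append(pen.copy())                # into every later step
```

The point of the sketch is the feedback loop in the last three lines: unlike a transformer, which attends over the full prompt, this kind of model commits to each stroke before seeing what comes next, which fits the overshoot-and-loop explanation above.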