Just tried it out a few times. It seems that the old gpt4 model strongly prefers telling a story about "Elara" - but only if asked in English to "tell me a story".
Prompting gpt4 in German or the current gpt4o in English leads to stories with many different protagonists.
$ ollama run gemma2
>>> Tell me a story.
The old lighthouse keeper, Silas, squinted at the horizon. […]
>>> /clear
Cleared session context
>>> Tell me a story.
The old woman, Elara, […]
Hmm. Hmm.
Tried llama3.2 too. Gave me a Luna in the mountains twice (almost identical), then a Madame Dupont, then a different Luna in Tuscany twice (almost identical), then Pierre the old watchmaker. llama3.2:1b branched out a little further, to Alessandro in France and Emrys in a far-off land, but then looped back to Luna in the mountains.
GPT has some self-understanding. When asked why it uses that name, it at least described the right type of qualities.
> It sounds like you're referring to a story or narrative that I've generated or discussed involving a character named Aldric. If this is the case, Aldric would likely be used as a character who embodies leadership, wisdom, or noble traits due to the name's meaning and historical connotations. Characters named Aldric might be portrayed as experienced leaders, wise sages, or key figures in a fantasy or historical context.
Response #1: Once upon a time, in a quiet little town nestled between rolling hills and thick forests, there was a boy named Leo who loved to explore. ...
Response #2: Once upon a time, in a quiet village at the edge of an ancient forest, there lived a girl named Lyra. Lyra loved exploring, but the village elders. ...
Response #3: In a small village nestled in a valley between misty mountains, there lived a young woman named Lira. She was known for her curious spirit, always venturing deeper into the woods, ...
Response #4: Once upon a time, in a quiet village nestled between towering mountains and lush, green forests, there was a young girl named Lira. She was an ordinary girl, with a bit of an extraordinary heart. ...
Doesn't seem to be true per se, but it definitely shows that low-temperature LLM tendency to produce stories that follow a pretty common pattern. Not once did I get a story about aliens, post-apocalypse, civilizations under the surface of Mars or about how the Moon is made of cheese. It depends on what the model is trained for and how all of the samplers and whatnot are set up.
Edit: now here's something more interesting when you crank up the temperature on your typical Llama 3 based model:
Why don't people ride zebras to the 2056 Winter Olympics? They were declared a domestic species in 2172 by The United Galactic Federation who thought riding was inhumane for a zebra. This event brought tremendous scrutiny from the galactic community as riding unpopular species was becoming increasingly commonplace in several neighborhoods and high schools on alien planets.
I love how it makes no sense, but it should be obvious why ChatGPT spewing out stuff like that wouldn't be super useful, especially for regular conversations and questions.
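For anyone curious what "cranking up the temperature" actually does: here's a minimal sketch of temperature-scaled sampling, not any particular model's implementation. Logits get divided by the temperature before the softmax, so low temperatures sharpen the distribution (the model keeps landing on its favorite continuation, hence all the Elaras and Lunas) while high temperatures flatten it and let the weird tail tokens through.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Sample an index from logits after temperature scaling.

    Low temperature -> near-greedy (the top logit wins almost always).
    High temperature -> near-uniform (tail tokens get picked often).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling from the resulting distribution
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Toy vocabulary of three tokens with logits [2.0, 1.0, 0.1]:
# at temperature 0.1 the first token wins essentially every time,
# while at temperature 50 all three come up about equally often.
```

That's why the zebra story only appears at high temperature: the sampler is finally allowed to wander off the highest-probability path.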