Hacker News

I see something similar with ChatGPT too... it seems to forget previous constraints and jumps back to the generic answer. In my example, I was trying to generate an ffmpeg command line that, given a source file, would produce stereo audio during transcode even if the source file had no audio (so basically output a blank audio stream). It kept using 0:a in the complex filter, and I would ask it what would happen if there was no audio stream in the input file. It would actually correct itself, but something else would break in the fix. I would point out the new problem and it would fix that but reintroduce 0:a.
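For what it's worth, one way to sidestep the 0:a problem entirely is to probe the file first and only inject a silent anullsrc track when no audio stream exists, rather than asking one filtergraph to handle both cases. A minimal sketch in Python (the file names, the 48 kHz sample rate, and the helper names are my own assumptions, not anything ffmpeg requires):

```python
import json
import subprocess

def has_audio(path):
    """Ask ffprobe whether the file contains any audio stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "a",
         "-show_entries", "stream=index", "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    return bool(json.loads(out).get("streams"))

def build_cmd(src, dst, audio_present):
    """Build an ffmpeg argv: copy the real audio, or synthesize silent stereo."""
    if audio_present:
        return ["ffmpeg", "-i", src,
                "-map", "0:v", "-map", "0:a",
                "-c:v", "copy", "-c:a", "aac", dst]
    # No audio in the source: feed a silent stereo lavfi source as a
    # second input and stop encoding when the video ends (-shortest).
    return ["ffmpeg", "-i", src,
            "-f", "lavfi", "-i",
            "anullsrc=channel_layout=stereo:sample_rate=48000",
            "-map", "0:v", "-map", "1:a",
            "-c:v", "copy", "-c:a", "aac", "-shortest", dst]
```

Usage would be something like `subprocess.run(build_cmd("input.mp4", "output.mp4", has_audio("input.mp4")), check=True)`. Doing the branch in the wrapper script keeps each ffmpeg invocation simple enough that there is no 0:a reference to break.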



Maybe the output was long enough that your previous prompts fell out of the context window?


I'm late, but the model can remember 4000 tokens, which according to OpenAI is about 3000 words.
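The 4000-token / 3000-word figure follows from OpenAI's usual rule of thumb that one token is roughly three-quarters of an English word. A quick sketch (the 0.75 ratio is the only assumption; real token counts vary by text):

```python
# Rule of thumb: 1 token is roughly 0.75 English words,
# so a 4000-token context holds about 3000 words.
def tokens_to_words(tokens, words_per_token=0.75):
    return int(tokens * words_per_token)

print(tokens_to_words(4000))  # → 3000
```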


