Hacker News
Llama 32K Context Released by Together AI (together.ai)
84 points by averylamp on July 29, 2023 | hide | past | favorite | 9 comments



IIRC, there was a paper which showed GPT models pay most attention to the beginning and end of context window, and much less attention to what's in the middle. In that regard, they behave like human brains. But I'm wondering if these efforts to increase context window actually make the models pay almost uniform attention to all the prompt?


You are correct. The paper is called "Lost in the Middle" [0] and it is probably one of the worst drawbacks of this technology. It biases a lot of use cases (think of law).

[0] https://arxiv.org/pdf/2307.03172.pdf
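For what it's worth, the paper probes this with a synthetic key-value retrieval task: the model gets a JSON object of random key-value pairs and must return the value for one key, with the target pair placed at different positions in the context. A rough sketch of that setup (the exact prompt format and pair format here are my approximation, not the paper's verbatim template):

```python
import json
import random
import string

def kv_probe(n_pairs, target_index, seed=0):
    # Build a context of random key/value pairs, then ask for the value
    # of the pair at `target_index`. Sweeping target_index from the start
    # to the end of the context reveals the U-shaped accuracy curve.
    rng = random.Random(seed)

    def uid():
        return "".join(rng.choices(string.ascii_lowercase + string.digits, k=16))

    pairs = [(uid(), uid()) for _ in range(n_pairs)]
    key, value = pairs[target_index]
    context = json.dumps(dict(pairs), indent=1)
    prompt = (
        f"{context}\n\n"
        f'What is the value associated with key "{key}"? '
        "Answer with the value only."
    )
    return prompt, value

# Example: target buried in the middle of 50 pairs.
prompt, expected = kv_probe(n_pairs=50, target_index=25)
```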


The research is slightly misleading... the models they experimented with all had an original pretrained context length significantly shorter than the fine-tuned context length they tested, e.g. they used MPT-30B-Instruct, which was pretrained at a 2k sequence length and then fine-tuned at 8k. A real test of whether current self-attention has this issue would be natively training a model at the extended sequence length.


> It makes a lot of use cases biased (think of law).

Yes, it's unfortunate. I wonder if GPT-4 with a 32k ctx window is in a sense "smarter" than GPT-4 with 8k ctx.


I have seen this also, but attention is far better for GPT-4: it seems to follow a system prompt for, say, outputting JSON, uniformly compared to GPT-3.5. I have also found that GPT-3.5 follows such a system prompt for only 2 successive outputs; you have to repeat the prompt every single time. So I think increasing context windows may not make it follow the system prompt uniformly.


My understanding is that with NTK-aware RoPE scaling, the model does pay roughly uniform attention. With older methods, not as much.
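For reference, NTK-aware scaling extends RoPE not by linearly interpolating positions but by raising the rotary base, so high-frequency dimensions stay nearly intact while low-frequency ones are stretched. A minimal sketch of the frequency computation, using the commonly cited base-adjustment formula (names like `ntk_alpha` are my own, and the exact exponent is an assumption based on community write-ups of the method):

```python
import numpy as np

def rope_inv_freqs(dim, base=10000.0, ntk_alpha=1.0):
    # Standard RoPE inverse frequencies: 1 / base^(2i/dim).
    # NTK-aware scaling replaces `base` with base * alpha^(dim/(dim-2)),
    # which leaves the fastest-rotating dimensions almost unchanged
    # and slows the slowest ones, stretching the effective context.
    scaled_base = base * ntk_alpha ** (dim / (dim - 2))
    return 1.0 / (scaled_base ** (np.arange(0, dim, 2) / dim))

vanilla = rope_inv_freqs(128)                  # plain RoPE (alpha = 1)
extended = rope_inv_freqs(128, ntk_alpha=8.0)  # e.g. targeting ~8x context
```

With `ntk_alpha=1` this reduces exactly to vanilla RoPE, which is why the method can be applied to an already-trained checkpoint.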


The red flag is when they don’t compare it to GPT-3.5.


It’s a 7B model; it’s not supposed to, nor going to, compete with GPT-3.5.


The article's original title does a better job of conveying the current state as early exploration.



