inciampati | 39 days ago | on: Does current AI represent a dead end?
Want reliable AI? Stop approximating memory with attention and build reliable memory into the model directly.
logicchains | 39 days ago
Standard LLM quadratic attention isn't an approximation; it's perfect recall. Approaches that compress that memory down into a fixed-size state are the approximation, and they generally perform worse, which is why linear transformers aren't widely used.
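To make the contrast concrete, here is a minimal NumPy sketch (dimensions and the ReLU feature map are illustrative assumptions, not any particular model's design). Softmax attention keeps every past key/value pair, so its cache grows with the sequence; a linear-attention variant folds each token into a fixed-size state, so old tokens get blended together and exact recall is lost.

```python
import numpy as np

d = 4                                # head dimension (illustrative)
phi = lambda x: np.maximum(x, 0.0)   # feature map (simple ReLU choice, an assumption)

rng = np.random.default_rng(0)
keys, values = [], []                # softmax attention: KV cache grows per token
S = np.zeros((d, d))                 # linear attention: fixed-size state
z = np.zeros(d)                      # running normalizer for the linear state

for _ in range(64):
    k, v = rng.standard_normal((2, d))
    keys.append(k); values.append(v)  # quadratic path stores everything
    S += np.outer(phi(k), v)          # linear path compresses the token into S
    z += phi(k)

q = rng.standard_normal(d)

# exact recall: every stored (k, v) pair is re-scored against the query
w = np.exp(np.stack(keys) @ q)
exact = (w / w.sum()) @ np.stack(values)

# approximate recall: history only survives as the compressed state (S, z)
approx = (phi(q) @ S) / (phi(q) @ z + 1e-6)

print(len(keys), S.shape)  # 64 (4, 4): cache is O(n) tokens, state is O(1)
```

The memory cost difference is the whole trade: the exact path must store and re-score n key/value pairs per query, while the linear path answers from a d-by-d matrix regardless of sequence length.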