
> uses mathematics that's fundamentally half a century old

I have news for you: the most common operations in AlphaGo, GPT-3, or any state-of-the-art AI are multiplication, addition, max, log, exp, sin, cos, and random sampling, all functions known for centuries.
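To make this concrete, here is a minimal sketch (my own toy illustration, not AlphaGo or GPT-3) of a transformer-style layer built from exactly those primitives: multiply, add, max, exp, log, sin/cos, and randomness. All names and dimensions are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)  # the "random" primitive

def positional_encoding(seq_len, dim):
    # sin/cos primitives, in the style of transformer positional encodings
    pos = np.arange(seq_len)[:, None]
    i = np.arange(dim)[None, :]
    angles = pos / 10000 ** (2 * (i // 2) / dim)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def softmax(x):
    # exp primitive, with a max subtracted for numerical stability
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def relu(x):
    return np.maximum(x, 0)  # max primitive

seq_len, dim = 4, 8
x = rng.normal(size=(seq_len, dim)) + positional_encoding(seq_len, dim)  # add

# Self-attention is just matrix multiplies plus a softmax
Wq, Wk, Wv = (rng.normal(size=(dim, dim)) for _ in range(3))
q, k, v = x @ Wq, x @ Wk, x @ Wv          # multiplication
attn = softmax(q @ k.T / np.sqrt(dim)) @ v

out = relu(attn @ rng.normal(size=(dim, dim)))
loss = -np.log(softmax(out)[0, 0])  # log primitive, as in a cross-entropy loss
print(out.shape)  # (4, 8)
```

Nothing here is exotic mathematics; the novelty lies entirely in how the pieces are arranged and scaled.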

It's the architecture, the data, and the compute that make the difference, not the math, and all three are recent accomplishments.



Yes, that's my point: neural networks were conceptualized roughly half a century ago. Obviously there have been a lot of advancements since, like convolution, dropout, attention, and deep learning. But fundamentally this is old mathematics, and while it's yielding good results at solving specific problems, it's not the answer for AGI. For AGI we will need new breakthroughs.



