
> since NVIDIA is going head on with Apple

I think this is a race that Apple doesn't know it's part of. Apple has something that happens to work well for AI, as a side effect of having a nice GPU with lots of fast shared memory. It's not marketed for inference.

Apple is both well aware of this and actively marketing it, as seen at https://www.apple.com/my/newsroom/2024/10/apples-new-macbook...

Quote:

"It also supports up to 128GB of unified memory, so developers can easily interact with LLMs that have nearly 200 billion parameters."
