Hacker News

This is true, but that is only an advantage when running a model larger than the VRAM. If your models are smaller, you'll get substantially better performance on a 4090. So it all comes down to which models you want to run.
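The fit-in-VRAM question above can be sanity-checked with a back-of-envelope estimate: weights take roughly (parameters × bits per weight / 8) bytes, plus some overhead for the KV cache and activations. A minimal sketch, where the `overhead` factor and the assumption that weights dominate memory are both rough guesses:

```python
def fits_in_vram(n_params_billions: float, bits_per_weight: int,
                 vram_gb: float, overhead: float = 1.2) -> bool:
    """Rough estimate of whether a model's weights (plus a fudge factor
    for KV cache and activations) fit in a GPU's VRAM.
    The 1.2x overhead is an assumption, not a measured value."""
    # billions of params * bytes per param ~= gigabytes of weights
    weight_gb = n_params_billions * bits_per_weight / 8
    return weight_gb * overhead <= vram_gb

# A 13B model quantized to 4 bits on a 24 GB RTX 4090: ~6.5 GB of weights, fits.
print(fits_in_vram(13, 4, 24))   # True
# A 70B model at 4 bits needs ~35 GB of weights alone, so it spills out of VRAM.
print(fits_in_vram(70, 4, 24))   # False
```

This is why the 13B models mentioned below run fine on a 24 GB card while larger ones slow to a crawl once layers spill into system RAM.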


It seems like 13B models were running fine on the 4090, but all the more fun or intelligent ones I tried became very slow and would have performed better on an M3.



