
The article mentions that GPUs are on average 50-200 times faster for deep learning; I'm curious how he arrived at that number. It depends heavily on the code and the frameworks used. I haven't come across a good comparison, and most figures seem to be pulled out of thin air.


In my experience the speedup is more like 20x-50x.
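For a rough point of comparison, here is a minimal sketch of how one could measure such a ratio themselves, assuming PyTorch and a CUDA-capable GPU (the matrix size, iteration count, and single large matmul workload are illustrative choices, not what the article used; real training speedups depend on the model, batch size, precision, and framework):

    # Rough CPU vs GPU timing of a large matrix multiply (assumes PyTorch + CUDA).
    # The resulting ratio is workload-dependent and only a ballpark figure.
    import time
    import torch

    N = 4096
    a_cpu = torch.randn(N, N)
    b_cpu = torch.randn(N, N)

    # CPU timing, averaged over a few runs
    start = time.perf_counter()
    for _ in range(10):
        a_cpu @ b_cpu
    cpu_time = (time.perf_counter() - start) / 10

    # GPU timing; synchronize so we measure actual kernel completion
    a_gpu = a_cpu.cuda()
    b_gpu = b_cpu.cuda()
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(10):
        a_gpu @ b_gpu
    torch.cuda.synchronize()
    gpu_time = (time.perf_counter() - start) / 10

    print(f"CPU: {cpu_time*1000:.1f} ms  GPU: {gpu_time*1000:.1f} ms  "
          f"speedup: {cpu_time / gpu_time:.0f}x")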



