The article mentions that GPUs are on average 50-200 times faster for deep learning; I'm curious how he came to that number. It has a lot to do with the code and the frameworks used. I haven't come across a good comparison; most figures seem to be pulled out of thin air.
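For what it's worth, a rough way to see how much the ratio depends on the workload is to time the same operation on both devices. Here's a minimal sketch using PyTorch (my choice of framework, not something from the article), assuming a CUDA-capable GPU is available:

```python
# Minimal sketch: timing the same matrix multiply on CPU and GPU with PyTorch.
# The measured ratio varies widely with matrix size, precision, and hardware,
# which is exactly why a single "50-200x" figure is hard to pin down.
import time
import torch

def time_matmul(device, n=4096, reps=10):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)  # warm-up to exclude one-time initialization costs
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(reps):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work to finish
    return (time.perf_counter() - start) / reps

cpu_t = time_matmul("cpu")
gpu_t = time_matmul("cuda") if torch.cuda.is_available() else None
print(f"CPU: {cpu_t:.4f}s per matmul")
if gpu_t:
    print(f"GPU: {gpu_t:.4f}s per matmul ({cpu_t / gpu_t:.1f}x faster)")
```

Even on this one kernel the number swings a lot depending on the matrix size and whether you use float32 or float16, so end-to-end training speedups are even harder to compare fairly.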