
Neural nets, at the same time, require multiple passes through the data (epochs). If we can train a model in one epoch instead of 10000 epochs, that's a breakthrough!



Epochs are more about the training data than the model... If you've got a big enough dataset, one epoch or less is fine!


True, but it sounds like you're just shifting computation from training to inference, and I'm not sure that's a very good trade-off to make: you're likely to predict on much more data than you trained on (e.g. ranking models at Google, FB, etc.).


Not sure I get your point: both DNNs and SVMs require one forward pass for inference, so there's no difference there. If an SVM can converge in one epoch, how is that not more efficient than the status quo with DNNs?


For kernel SVMs, one needs to keep around part of the training data (the support vectors), right? With DNNs, after training, all you need are the model parameters. For very large datasets, keeping around even a small part of your training data may not be feasible.
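
Rough sketch of what I mean (numpy; the function and variable names here are just illustrative, not any particular library's API):

    import numpy as np

    def svm_predict(x, support_vectors, dual_coefs, b, gamma=0.1):
        # RBF kernel against every stored support vector:
        # k_i = exp(-gamma * ||x - x_i||^2)
        k = np.exp(-gamma * np.sum((support_vectors - x) ** 2, axis=1))
        # prediction needs the support vectors themselves, not just coefficients
        return np.sign(dual_coefs @ k + b)

    def mlp_predict(x, W1, b1, W2, b2):
        # forward pass only touches the learned parameters
        h = np.maximum(0.0, W1 @ x + b1)
        return np.sign(W2 @ h + b2)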

Furthermore, the number of parameters does not (necessarily) grow with the size of the training data; the parameters can be reused if you get more data, and they can be quantized/pruned/etc. There's not really an easy way to do these things with SVMs, as far as I understand.
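
As a toy illustration of the quantization point (just a sketch, assuming simple symmetric int8 quantization of one layer's weights):

    import numpy as np

    # one layer's weights: the shape is fixed by the architecture,
    # not by how many training examples were used
    W = np.random.randn(256, 64).astype(np.float32)

    # symmetric int8 quantization: ~4x smaller to store/ship
    scale = np.abs(W).max() / 127.0
    W_int8 = np.round(W / scale).astype(np.int8)

    # dequantize (approximately) at inference time
    W_approx = W_int8.astype(np.float32) * scale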



