
Back in 2006, in high school, I was investigating multilayer feed-forward NNs. I found them magical. I implemented a solution to the XOR problem, etc. etc.

What always confounded me was the choice of the number and width of hidden layers. This is even more confusing now with the advent of deep and recurrent networks. We need empirical work on this that can be taught in much the same way that gravity is taught as an apple falling from a tree.

We need a way to determine the entropy of a network, and how to route and exploit that entropy. Specific scenarios are not adequate.
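The puzzle above can be made concrete with a toy version of the experiment: a 2-2-1 feed-forward net trained on XOR with vanilla backprop. This is a minimal sketch in NumPy; the hidden width, learning rate, and epoch count are arbitrary illustrative choices, and width 2 is exactly the kind of hand-picked hyperparameter the comment is complaining about.

```python
import numpy as np

# Tiny 2-2-1 multilayer perceptron trained on XOR with plain batch
# gradient descent. All hyperparameters here are illustrative.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of width 2 -- the smallest that can separate XOR.
W1 = rng.normal(size=(2, 2)); b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 0.5, []
for _ in range(5000):
    # forward pass: tanh hidden layer, sigmoid output
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((p - y) ** 2)))
    # backward pass for the squared-error loss
    g = (p - y) * p * (1 - p)       # gradient at the output pre-activation
    dW2 = h.T @ g
    db2 = g.sum(axis=0, keepdims=True)
    dh = g @ W2.T * (1 - h ** 2)    # back through tanh
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0, keepdims=True)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

preds = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int).ravel()
print("loss:", round(losses[0], 4), "->", round(losses[-1], 4), "preds:", preds)
```

Whether this converges depends on the seed and the width, which is the point: there is no principled rule telling you that width 2 suffices, only the empirical fact that XOR is not linearly separable and so needs at least one hidden layer.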




> gravity is taught as an apple falling from a tree.

Isn't this advocating for a theory of neural networks rather than more empirical evidence?



