Predictive Coding Can Do Exact Backpropagation on Any Neural Network (arxiv.org)
113 points by MindGods on June 3, 2021 | 9 comments




Different paper but yeah it's basically the same thing.


That’s what I thought; it sounds much like the great stuff coming out of Chris Buckley’s group.

The authors even refer to that work in the abstract. The “breakthrough” seems to consist of showing that the method produces the same results as vanilla backprop.


I'm not sure this is that important for understanding biological neural systems. But it might enable massive parallelism!


Is it cool to call your own work a "breakthrough", or does that just engender skepticism?


Unfortunately, that is what is often recommended to win grants and get considered for the best journals. But yes, it should draw scepticism. Then again, scepticism has always been part of the game in science.


What does this suggest about biological neural systems?


Predictive coding is based on a theory of brain operation grounded in the ubiquity of recurrent connections between functional brain regions, where the forward pass transmits the prediction and the backward pass transmits the error. Unlike backprop, this scheme uses only local information, and is therefore somewhat more compatible with how the brain works.

There is a huge body of work about this. The current paper seems to show that the result of predictive coding is equivalent to the result one gets with backprop.

I think it says more about predictive coding as a way to train neural networks rather than biological systems.
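
To make the "only local information" point concrete, here's a minimal NumPy sketch of plain predictive coding on a toy two-layer network (my own toy code, not the paper's; the sizes, step size, and iteration count are arbitrary). Value nodes relax against purely local prediction errors, and the resulting weight updates are then compared with ordinary backprop:

    import numpy as np

    rng = np.random.default_rng(0)
    f = np.tanh
    df = lambda z: 1.0 - np.tanh(z) ** 2

    # toy network: input(3) -> hidden(4) -> output(2)
    sizes = [3, 4, 2]
    W = [rng.normal(scale=0.5, size=(sizes[i + 1], sizes[i])) for i in range(2)]

    x_in = rng.normal(size=3)
    target = rng.normal(size=2)

    # feedforward pass initialises the value nodes
    x = [x_in]
    for Wl in W:
        x.append(Wl @ f(x[-1]))
    x[-1] = target.copy()  # clamp the output layer to the target

    # inference: relax the hidden value nodes by descending the energy
    # F = sum_l ||e[l]||^2 / 2; every quantity used below lives at the
    # layer itself or its immediate neighbours
    for _ in range(500):
        e = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
        x[1] = x[1] + 0.1 * (-e[0] + df(x[1]) * (W[1].T @ e[1]))

    # local, Hebbian-style weight updates at equilibrium
    e = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
    dW = [np.outer(e[l], f(x[l])) for l in range(2)]

    # ordinary backprop on the same weights, for comparison
    h = W[0] @ f(x_in)
    out = W[1] @ f(h)
    g_out = target - out            # -dL/d(out) for L = ||target - out||^2 / 2
    g_h = df(h) * (W[1].T @ g_out)  # -dL/dh
    bp = [np.outer(g_h, f(x_in)), np.outer(g_out, f(h))]

    print([float(np.max(np.abs(dW[l] - bp[l]))) for l in range(2)])

Note the comparison prints small but nonzero differences: vanilla predictive coding only approximates backprop's gradients, because clamping the output shifts the hidden activities away from their feedforward values. As I understand it, the paper's contribution is a modified update schedule (generalising the earlier Z-IL result) under which the two become exactly equal on arbitrary computation graphs.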


There's been an argument that schemes relying on backpropagation can't provide insight into biological neural systems, but this argument is weakened by the existence of predictive coding equivalents.



