
The authors should have referred to this Allen Institute paper in Nature from 2021, which got good results with just one-bit neurons. They also published earlier work in this area going back to 2016 or 2017, but this more recent, refereed paper is what I quickly found in a web search: https://www.nature.com/articles/s41586-021-03705-x

Also, if you wanted, you could get more resolution by just using the mantissa, not that any hardware supports that these days. I love the 1-bit work, but I suspect the future is a 4- or 8-bit mantissa between 0 and 1. Not sure you'd even need a GPU at that point, just a vector machine with a small lookup table in L1 cache.
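The mantissa-only idea above amounts to storing each value as an 8-bit fixed-point fraction in [0, 1), so that multiplication reduces to an integer multiply and shift, no floating-point unit needed. A minimal sketch (all names here are illustrative, not from any real library):

```python
# 8-bit fixed-point fractions in [0, 1): a stored byte b represents b / 256.
SCALE = 256

def to_fixed(x: float) -> int:
    """Quantize a float in [0, 1) to an 8-bit fixed-point fraction."""
    assert 0.0 <= x < 1.0
    return int(x * SCALE)

def fixed_mul(a: int, b: int) -> int:
    """Multiply two 8-bit fractions; result is again an 8-bit fraction.
    (a/256) * (b/256) = (a*b/256) / 256, so we just shift right by 8."""
    return (a * b) >> 8

def to_float(a: int) -> float:
    """Recover the approximate float value."""
    return a / SCALE

a = to_fixed(0.5)   # 128
b = to_fixed(0.25)  # 64
print(to_float(fixed_mul(a, b)))  # 0.125
```

With only 256 possible values per operand, the full 256x256 product table fits in 64 KiB (or 32 KiB exploiting symmetry), which is the "small lookup table in L1 cache" point.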



If you just want the mantissa, why not use integers?


Yes, saturated fixed-point or integer math. And I may have been hasty in saying you can't do this on modern commodity hardware (here's a paper: https://proceedings.mlsys.org/paper/2022/file/14bfa6bb14875e...)
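"Saturated" here means that an overflowing result clamps to the representable extreme instead of wrapping around, which is how SIMD saturating-arithmetic instructions behave. A rough sketch of a saturated int8 dot product (function names are made up for illustration; real hardware typically accumulates in a wider register and saturates on the final narrowing store):

```python
# Saturating int8 arithmetic: clamp to [-128, 127] instead of wrapping.
INT8_MIN, INT8_MAX = -128, 127

def saturate(x: int) -> int:
    """Clamp an integer to the int8 range."""
    return max(INT8_MIN, min(INT8_MAX, x))

def sat_dot(xs, ws):
    """Dot product of two int8 vectors, accumulated exactly,
    then saturated down to an int8 result."""
    acc = sum(x * w for x, w in zip(xs, ws))
    return saturate(acc)

print(sat_dot([100, 100], [2, 2]))  # 400 clamps to 127
```

The appeal for neural nets is that saturation behaves like a hard clipping nonlinearity rather than the catastrophic sign flips you get from wrap-around overflow.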





