
If a deep learning network has enough layers, can't it start incorporating "abstract" ideas common to any learning task? E.g. could we reuse some layers across image/speech recognition and NLP?


This is exactly what happens in transfer learning. A recent paper by Google (https://research.googleblog.com/2017/07/revisiting-unreasona...) shows that pre-training on a very large image database leads to improvements in the state of the art for several different image problems. This is because the weights required for one image problem are not necessarily all that different from those for another image problem, especially in the early layers. There may not be as much common ground between images and e.g. NLP. Perhaps at much higher abstraction levels, but we aren't there yet.


Transfer learning has also been shown to improve training times across modalities (such as using an image classification model to initialize an NLP model) compared to randomly initialized weights.
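A minimal sketch of how that reuse typically looks in practice, using torchvision's pretrained ResNet-18; the specific model and the 10-class head are illustrative assumptions, not from the comments above:

    # Transfer-learning sketch: reuse the pretrained early layers, retrain only the head.
    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(pretrained=True)         # weights learned on ImageNet
    for param in model.parameters():
        param.requires_grad = False                   # freeze the shared early layers
    model.fc = nn.Linear(model.fc.in_features, 10)    # new task-specific head (10 classes)
    # Training on the new dataset now only updates model.fc's parameters.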


When an implementation of AGI comes around (yes, it will come around), it will inevitably involve a number of different neural nets working in concert as separate subsystems. That's what makes these "Neural Nets Will Never Become Conscious!" articles so hilarious.

But yeah, I could see feeding the output of an array of sub-networks into a parent network. So think one NN for vision, one for hearing, etc.; all of those outputs feed into a parent-level network that serves as your abstraction network and handles executive-level decisions.
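A rough PyTorch sketch of that wiring; the module names, feature sizes, and output dimension are all made up for illustration:

    # Hypothetical architecture: modality-specific sub-networks feeding a parent network.
    import torch
    import torch.nn as nn

    class MultiModalNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.vision = nn.Sequential(nn.Linear(1024, 128), nn.ReLU())  # stand-in vision net
            self.hearing = nn.Sequential(nn.Linear(256, 128), nn.ReLU())  # stand-in audio net
            self.parent = nn.Sequential(nn.Linear(256, 64), nn.ReLU(),
                                        nn.Linear(64, 10))                # "executive" network

        def forward(self, image_feats, audio_feats):
            v = self.vision(image_feats)
            h = self.hearing(audio_feats)
            # The parent network only ever sees the sub-networks' outputs, concatenated.
            return self.parent(torch.cat([v, h], dim=1))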


State of visualization in Python by Jake Vanderplas:

https://speakerdeck.com/jakevdp/pythons-visualization-landsc...




I don't think pylatex does what lax does, though I'm not sure. Try getting LaTeX output from pylatex for formulas like x * 3 * y * (x/(x-y)). Lax's goal is to be handy for quickly writing down common math formulas.
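For comparison, sympy can already turn that kind of expression into LaTeX; this is sympy's own latex() helper, not lax or pylatex:

    # LaTeX output for the example expression via sympy (shown only as a point of comparison).
    from sympy import symbols, latex

    x, y = symbols('x y')
    expr = x * 3 * y * (x / (x - y))
    print(latex(expr))  # e.g. \frac{3 x^{2} y}{x - y}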


You can also use .NET assemblies from CPython using pythonnet, so it is a bidirectional interop bridge!
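A minimal example of the CPython-to-.NET direction with pythonnet; the System assembly and DateTime type here are just convenient stand-ins:

    # Calling a .NET assembly from CPython via pythonnet.
    import clr                    # the 'clr' module comes with the pythonnet package
    clr.AddReference("System")    # load a .NET assembly by name
    from System import DateTime   # import a .NET type as if it were a Python module
    print(DateTime.Now.ToString())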


Has anyone used gnocchi outside of OpenStack?

It can use postgresql as an index store.

https://docs.openstack.org/developer/gnocchi/install.html


zunzun uses this library for equation regression:

https://github.com/zunzun/pyeq2


A subset of sympy can run on top of symengine for a speedup.
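A quick illustration of that swap, assuming symengine's Python bindings are installed; the expression is arbitrary:

    # Symbolic expansion through symengine, which mirrors a subset of sympy's API.
    from symengine import symbols, expand

    x, y = symbols('x y')
    expr = (x + y) ** 10
    print(expand(expr))  # symengine's C++ core typically does this much faster than sympy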



