It's common in machine learning to define a parameter space $\Theta$ that is a subset of Euclidean space carved out by a number of constraints. For a simple example, $\Theta$ could be an ellipse embedded in $\mathbb{R}^2$. In existing Haskell, it is easy to make $\mathbb{R}^2$ correspond to a type, and then do automatic differentiation (i.e. backpropagation) over the space to learn the model. If, however, I want to learn over $\Theta$ instead, I need to completely rewrite all my code. In my ideal language, it would be easy to define complex types like $\Theta$ that are subtypes of $\mathbb{R}^2$, and have all my existing code automatically work on this constrained parameter space.
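
In today's Haskell a common workaround is to reparameterize: pick a smooth map from an unconstrained space onto $\Theta$ and differentiate through the composition. A minimal sketch with the ad library, where the ellipse, the loss, and the step size are all made-up illustrations, not anything from the comment:

    import Numeric.AD (diff)

    -- Hypothetical loss on R^2; any differentiable function would do.
    loss :: Floating a => (a, a) -> a
    loss (x, y) = (x - 1) ^ 2 + (y - 2) ^ 2

    -- Chart mapping an unconstrained angle t onto the ellipse
    -- {(3 cos t, 2 sin t)}, so every iterate stays on Theta.
    embed :: Floating a => a -> (a, a)
    embed t = (3 * cos t, 2 * sin t)

    -- Derivative of the composed loss w.r.t. the chart parameter.
    dLoss :: Double -> Double
    dLoss = diff (loss . embed)

    -- A few steps of plain gradient descent along the ellipse.
    main :: IO ()
    main = print (take 5 (iterate (\t -> t - 0.1 * dLoss t) 0.0))

This keeps the optimizer on the constraint set, but it is exactly the kind of manual rewrite being complained about: the chart has to be threaded through every call site by hand instead of living in the type of $\Theta$.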