@jph00 where do I find LSTM/GRU/seq2seq layers for time-series sequence prediction (not text)? I'm also interested in autoencoder implementations; the fast.ai docs search doesn't really work for this. What do you think about other notable APIs built on top of PyTorch, such as Pyro and AllenNLP?
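For context on what I mean: since fast.ai sits on top of PyTorch, the recurrent layers themselves are already available as torch.nn.LSTM and torch.nn.GRU, so a basic time-series forecaster can be assembled directly. A minimal sketch, assuming a (batch, seq_len, features) input layout; the class name and hyperparameters are illustrative, not part of any fast.ai API:

    # Minimal sketch: plain PyTorch LSTM for sequence-to-one forecasting.
    # Not fast.ai-specific; names and sizes are illustrative assumptions.
    import torch
    import torch.nn as nn

    class TimeSeriesLSTM(nn.Module):
        def __init__(self, n_features, hidden_size=64, horizon=1):
            super().__init__()
            self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                                num_layers=2, batch_first=True)
            self.head = nn.Linear(hidden_size, horizon)  # predict `horizon` future steps

        def forward(self, x):                 # x: (batch, seq_len, n_features)
            out, _ = self.lstm(x)             # out: (batch, seq_len, hidden_size)
            return self.head(out[:, -1, :])   # use the last time step's hidden state

    model = TimeSeriesLSTM(n_features=3)
    dummy = torch.randn(8, 50, 3)             # 8 series, 50 steps, 3 features
    print(model(dummy).shape)                 # torch.Size([8, 1])

Swapping nn.LSTM for nn.GRU works the same way; what I'm really after is whether fast.ai wraps this pattern somewhere for non-text data.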
AGPL would be a restriction if you need to deploy this model on top of PlaidML in production. It is still very useful during training, after which the trained network can be offloaded to a production framework such as TensorFlow.
Wow, Hacker News is incredible sometimes :) I was looking over your notebook just a week ago, and still am. Vacation starts in a couple of days and I was going to try to work out whether infinite play is possible in Tetris with just perfect clears. People have been at the problem for a few years now, and your notebook is pretty cool :D http://harddrop.com/forums/index.php?showtopic=7792&hl=perfe...
I was really surprised at how efficient numpy arrays were at finding all combinations of the pieces. The backtracking doesn't use recursion; it uses an explicit stack instead. That makes the code a bit more verbose to write and read, but easier to follow. I would like to make the notebook more interactive with ipywidgets and Unicode characters, so stay tuned ;)
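A rough, illustrative sketch of that pattern (not the notebook's actual code): the explicit stack holds the partial states a recursive call would otherwise keep on the call stack, and the `moves`/`is_goal` callables below are stand-ins for the real piece-placement logic.

    # Illustrative sketch of iterative backtracking with an explicit stack.
    def solve(start, moves, is_goal):
        stack = [start]                      # explicit stack replaces the call stack
        while stack:
            state = stack.pop()
            if is_goal(state):
                return state
            stack.extend(moves(state))       # push every legal continuation
        return None                          # search exhausted, no solution

    # Toy usage: pick pieces whose widths exactly fill a row of width 10.
    pieces = [4, 4, 2, 3, 3]
    def moves(state):
        i, filled, chosen = state            # (next piece index, filled width, picks)
        if i == len(pieces):
            return []
        p = pieces[i]
        options = [(i + 1, filled, chosen)]                     # skip piece i
        if filled + p <= 10:
            options.append((i + 1, filled + p, chosen + (p,)))  # take piece i
        return options

    print(solve((0, 0, ()), moves, lambda s: s[1] == 10))       # -> (3, 10, (4, 4, 2))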