
Whilst you are correct that the number of allowed computation steps is limited, LSTMs have still been used with success for such tasks. LSTMs have been used to perform addition on two 9-digit numbers with 99% accuracy^[1]. The paper even shows that simple Python programs can be evaluated with some degree of accuracy.

Remember that even if the number of computation steps is limited, there can be multiple layers ([1] uses 2), and each unit can perform a computation ([1] uses 400 cells per layer). The network only needs to learn how to act as an ALU. The work is in fact co-authored by Sutskever, one of the people who established the sequence-to-sequence framework, and is referenced in the Neural Conversational Model paper.
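For concreteness, the addition task in [1] is posed as character-level sequence-to-sequence prediction. A minimal sketch of that data encoding (the vocabulary, helper names, and end-of-sequence marker here are illustrative assumptions, not taken from the paper's code):

```python
# Sketch of a character-level encoding for the addition task, as in [1].
# VOCAB, CHAR2IDX, make_example and the '.' end marker are hypothetical names.
import random

VOCAB = list("0123456789+.")          # digits, operator, '.' as end-of-sequence
CHAR2IDX = {c: i for i, c in enumerate(VOCAB)}

def make_example(max_digits=9):
    # One training pair: source "a+b." -> target "a+b's sum followed by '.'"
    a = random.randint(0, 10**max_digits - 1)
    b = random.randint(0, 10**max_digits - 1)
    src = f"{a}+{b}."
    tgt = f"{a + b}."
    return [CHAR2IDX[c] for c in src], [CHAR2IDX[c] for c in tgt]
```

The LSTM reads the source indices one character at a time and must emit the target indices, so all the "carrying" arithmetic has to live in its hidden state.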

^ They use "teacher forcing" for evaluation, which inflates the accuracy to some degree, but it's still quite impressive.
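To see why teacher forcing inflates per-character accuracy: the decoder always conditions on the *true* previous character, so a single mistake cannot cascade. A toy illustration with a deliberately flawed dummy "model" (the model and its single error are made up for the demonstration):

```python
# Teacher forcing vs. free-running decoding with a hypothetical dummy model
# that makes exactly one mistake: after a true '2' it predicts '9' not '3'.
target = "12345"

def dummy_predict(prev_char):
    nxt = {"<s>": "1", "1": "2", "2": "9", "3": "4", "4": "5"}
    return nxt.get(prev_char, "?")   # unknown context -> garbage

def decode(teacher_forcing):
    prev, out = "<s>", ""
    for t in range(len(target)):
        pred = dummy_predict(prev)
        out += pred
        # Teacher forcing feeds the ground truth back in; free-running
        # decoding feeds the model's own (possibly wrong) prediction.
        prev = target[t] if teacher_forcing else pred
    return out

acc = lambda s: sum(a == b for a, b in zip(s, target)) / len(target)
tf_out = decode(True)     # one wrong character, then recovers
free_out = decode(False)  # the error cascades to every later step
```

Under teacher forcing the single error costs one character; free-running, it corrupts everything after it. That gap is what the footnote means by "inflates the accuracy".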

[1]: http://arxiv.org/pdf/1410.4615v3.pdf



Now for the kicker: recurrent networks can implement arbitrary algorithms, including a type of GOFAI. It's quite possible that such an algorithm isn't learnable without supervision; that it was discovered by sheer luck at the dawn of humanity and has been passed down by language ever since.
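As a toy illustration of the claim that a recurrent net can implement an algorithm outright, here is a hand-wired (not learned) threshold-unit RNN whose hidden state computes the running parity of a bit stream, i.e. h_t = XOR(h_{t-1}, x_t). The weights and thresholds are chosen by hand for the demonstration:

```python
# Hand-wired recurrent threshold network computing parity of a bit stream.
# XOR is built from two hidden threshold units: an OR-like unit, an AND
# unit, and an output unit computing (OR and not AND).
def step(z):
    # Heaviside threshold activation
    return 1.0 if z > 0 else 0.0

def parity_rnn(bits):
    h = 0.0                          # hidden state: parity so far
    for x in bits:
        a = step(h + x - 0.5)        # fires if at least one of h, x is 1
        b = step(h + x - 1.5)        # fires only if both h and x are 1
        h = step(a - b - 0.5)        # a AND NOT b  ==  XOR(h, x)
    return int(h)
```

Parity is a classic example precisely because no feedforward net of fixed depth over the whole string handles arbitrary lengths, while three threshold units plus recurrence do it exactly.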



