
As a developer with an interest in neural-net-based ML, my eyes start to glaze over a bit when I see so many crazy-looking numpy operations even just to get a trivial representation of data.

I guess it's just something I have to get used to, but I wish there were an interface for the same nd-array logic designed more for developers like me than for data scientists who perform surgery on 50-dimensional arrays all day long.



I agree; check out NamedTensor for an attempt to fix this: http://nlp.seas.harvard.edu/NamedTensor
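The gist, roughly (a sketch from memory, using PyTorch's named-tensors feature, which picked up the same idea; NamedTensor itself wraps things in its own ntorch API):

    import torch

    # Name the dimensions up front instead of tracking axis positions.
    imgs = torch.randn(32, 3, 64, 64,
                       names=('batch', 'channel', 'height', 'width'))

    # Reduce by name rather than by a magic axis number.
    channel_means = imgs.mean('channel')
    print(channel_means.names)  # ('batch', 'height', 'width')

    # Rearranging is explicit, too.
    chw = imgs.align_to('channel', 'batch', 'height', 'width')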


Don’t look at code, because it bakes in so many derived values and simplifications that it’s hard to recover the original ideas. If you can find the original math, and walk through the derivatives (seeing zeros crop up and get simplified away), it starts to make a lot more sense.
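The classic example is sigmoid plus cross-entropy: the derivative looks horrible term by term, but the logs and exps cancel and it collapses to p - y. You can even check that with sympy (a toy illustration of the point, not anything from the article):

    import sympy as sp

    z, y = sp.symbols('z y')
    p = 1 / (1 + sp.exp(-z))                           # sigmoid
    loss = -(y * sp.log(p) + (1 - y) * sp.log(1 - p))  # cross-entropy

    grad = sp.diff(loss, z)
    # The messy derivative is exactly p - y:
    print(sp.simplify(grad - (p - y)))  # 0

None of that cancellation is visible in shipped code; it has already been folded into `p - y`.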


Have you checked out Numba? You can write regular Python for loops and numpy calls and it will compile them, instead of the numpy vectorization mess.
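Something like this works fine (a rough sketch; I haven't benchmarked it):

    import numpy as np
    from numba import njit

    @njit
    def pairwise_l2(a, b):
        # Plain nested loops; numba compiles them to machine code.
        out = np.empty((a.shape[0], b.shape[0]))
        for i in range(a.shape[0]):
            for j in range(b.shape[0]):
                acc = 0.0
                for k in range(a.shape[1]):
                    d = a[i, k] - b[j, k]
                    acc += d * d
                out[i, j] = np.sqrt(acc)
        return out

    dists = pairwise_l2(np.random.rand(100, 3), np.random.rand(50, 3))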


Good luck with anything involving non-trivial types. Numba is awesome but also quite finicky.


Sorry if this sounds condescending, but just learn linear algebra and numpy. Linear algebra really is a minimum requirement if you want to get anywhere near ML. College freshmen can learn linear algebra and numpy, and so can you. You'll only get more lost and frustrated if you go into it thinking in terms of for loops.
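The mental shift is mostly about replacing loops over elements with operations over whole arrays. A made-up example:

    import numpy as np

    X = np.random.rand(1000, 64)  # 1000 samples, 64 features
    w = np.random.rand(64)

    # For-loop thinking: one dot product per sample.
    scores_loop = np.array([sum(X[i, k] * w[k] for k in range(64))
                            for i in range(1000)])

    # Linear-algebra thinking: it's a single matrix-vector product.
    scores = X @ w

    assert np.allclose(scores_loop, scores)

Once you see X @ w as the whole computation, the numpy stops looking like incantations.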



