Hacker News

Note that they compared their solution only against Eigen's tensor-transposition functionality; HPTT is not a drop-in replacement for Eigen.


Does anyone know how much time a typical TensorFlow model spends in transposition routines?


No hard numbers to present, but it should help long-sequence LSTM networks, because TensorFlow has to transpose between time-major and batch-major layouts between steps.
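For anyone unfamiliar with the terminology: a sketch of what that relayout does, in plain Python (shapes and values here are made up for illustration; a real framework does this as one big memory-bound copy, which is exactly the operation HPTT accelerates).

```python
# Time-major <-> batch-major relayout, pure-Python sketch.
# Hypothetical sizes: 3 time steps, batch of 2, 4 features per entry.
T, B, F = 3, 2, 4

# time_major[t][b] is the feature vector of batch entry b at time step t.
time_major = [[[t * 100 + b * 10 + f for f in range(F)] for b in range(B)]
              for t in range(T)]

# Swapping the two leading axes yields the batch-major layout,
# where batch_major[b][t] indexes the same feature vector.
batch_major = [[time_major[t][b] for t in range(T)] for b in range(B)]

# Same data, transposed leading indices.
assert batch_major[1][2] == time_major[2][1]
```

In a tensor library this is a permutation of the two leading axes of a rank-3 tensor; the data movement, not the arithmetic, is the cost.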



