
"The selected model reaches 1.12 bits per byte." (https://arxiv.org/pdf/1704.01444.pdf)

For context, Claude Shannon's prediction experiments estimated the entropy of printed English at 0.6 to 1.3 bits per character, based on how well human subjects could guess the next letter (http://languagelog.ldc.upenn.edu/myl/Shannon1950.pdf). Since the review text is largely ASCII, one byte is one character, so the two figures are roughly comparable.
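
For anyone wanting to map framework loss values onto that figure: bits per byte is just a byte-level model's average negative log-likelihood per byte, converted from nats (what most frameworks report) to bits. A minimal sketch; the numbers below are illustrative, not taken from the paper:

    import math

    def bits_per_byte(total_nll_nats: float, num_bytes: int) -> float:
        # Convert total negative log-likelihood in nats over a test set
        # of num_bytes bytes into bits per byte (divide by ln 2).
        return total_nll_nats / (num_bytes * math.log(2))

    # Illustrative: an average NLL of ~0.776 nats per byte
    # works out to roughly 1.12 bits per byte.
    print(bits_per_byte(total_nll_nats=0.776 * 1_000_000, num_bytes=1_000_000))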


