For context, Claude Shannon estimated that humans can model English text with an entropy of 0.6 to 1.3 bits per character (http://languagelog.ldc.upenn.edu/myl/Shannon1950.pdf).