
I don’t think that’s right. What Shannon’s channel capacity theorem says is that a channel with a given bandwidth and SNR can carry information at up to a certain rate, regardless of how the source information is encoded. Actual modern signals on a transmission line, like gigabit Ethernet or your cable modem, are not just on-and-off pulses; they are complex analog waveforms.
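A quick sketch of that bandwidth/SNR point, using the Shannon–Hartley form of the theorem, C = B·log2(1 + S/N); the function name and the 3 kHz / 30 dB figures below are just illustrative:

```python
from math import log2

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity in bits per second.

    C = B * log2(1 + S/N), where S/N is the linear (not dB) power ratio.
    Note that the formula depends only on bandwidth and SNR, not on how
    the signal is modulated or encoded.
    """
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear ratio
    return bandwidth_hz * log2(1 + snr_linear)

# A ~3 kHz voice channel at ~30 dB SNR supports roughly 30 kbit/s,
# which is about where analog dial-up modems topped out.
print(f"{shannon_capacity(3_000, 30):.0f} bit/s")
```

Doubling either the bandwidth or (less directly) the SNR raises the ceiling, but no choice of waveform can exceed it.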

The reason bits are the normal representation of information is that they’re easy to do math on, and in a computer, digital logic is much more efficient in binary than in higher number bases.



Is it the source you think is incorrect, or have I misinterpreted its meaning?




