
> So baseline, a modern cell phone can connect to a tower about 45 miles away.

Isn't that primarily because of the Earth's curvature and obstructions/hills?



Good question. I've had a bear of a time sourcing good info on this and would welcome a solid analysis.


Radio horizon is somewhat complex, but 45 miles is not entirely unreasonable from an elevated position.

https://en.wikipedia.org/wiki/Line-of-sight_propagation#Radi...
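For a rough sense of scale, the standard 4/3-Earth-radius rule of thumb from that article works out to d (km) ≈ 4.12·√(h in metres) per endpoint. A quick sketch (the antenna heights here are made up for illustration, not from the thread):

```python
import math

def radio_horizon_km(h_m: float) -> float:
    # 4/3-Earth-radius approximation (accounts for typical refraction):
    # radio horizon in km ≈ 4.12 * sqrt(height in metres)
    return 4.12 * math.sqrt(h_m)

def link_horizon_km(h_tx_m: float, h_rx_m: float) -> float:
    # Max line-of-sight range is the sum of the two horizons.
    return radio_horizon_km(h_tx_m) + radio_horizon_km(h_rx_m)

# 50 m tower to a phone held at 1.5 m: ~34 km (~21 miles)
print(link_horizon_km(50, 1.5))

# Same tower sited on a ~300 m hill: ~76 km (~47 miles)
print(link_horizon_km(300, 1.5))
```

So 45 miles is out of reach for a tower on flat ground, but plausible once the tower (or the phone) has a few hundred metres of elevation.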

There could also be timing-related constraints that limit max range: radio waves propagate at the speed of light, which is fast but not instant, and the air interface only budgets for so much round-trip delay.
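GSM's timing advance is the classic example of this (these are the well-known GSM figures, not something from the thread): round-trip delay is quantised into steps of one bit period (48/13 µs), and the field only goes up to 63, which caps ordinary cell range at roughly 35 km.

```python
# Sketch of the GSM timing-advance range limit.
C = 299_792_458          # speed of light, m/s
bit_period_s = 48 / 13 * 1e-6   # one GSM bit period ≈ 3.69 µs

# Each timing-advance step absorbs one bit period of ROUND-TRIP delay,
# so the one-way distance per step is half of c * bit_period.
step_m = C * bit_period_s / 2   # ≈ 553 m per step

# The timing-advance field is 6 bits, so it maxes out at 63 steps.
max_range_km = 63 * step_m / 1000   # ≈ 35 km

print(step_m, max_range_km)
```

That's about 22 miles, well short of 45, which is why long-range GSM deployments needed special extended-range tricks rather than just more transmit power.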


Microwaves travel about 10% beyond the geometric horizon, barring any tropospheric ducting, which cell towers shouldn't see much of in general.

The issue you have at long distances is multipath, which can throw off the complex timing required to serve multiple users in real time on a single radio.


I've heard that US spy satellites used to eavesdrop on Soviet telephone calls transmitted between line-of-sight ground-based microwave relay towers. The relay towers used directional microwave antennas aimed at each other, but satellites in space could still pick up those signals.

I can't find a direct reference to it on wikipedia, but I suspect this is what the Vortex satellites were doing: https://en.wikipedia.org/wiki/Vortex_(satellite)


This [1] may help, though I'm wholly unfamiliar with the field. A guy at DEF CON did a talk about how ham radio operators in Florida allocate their bandwidth, and it goes into the nitty-gritty details of propagation.

[1] https://youtu.be/fH-yyTZffAk



