Hacker News

> And, by various universality theorems, a sufficiently large AGI could approximate any sequence of human neuron firings to an arbitrary precision.

Wouldn't it become harder to simulate a human brain the larger a machine is? I don't know much, but I'd think that pesky speed-of-light thing might pose a challenge.



simulate ≠ simulate-in-real-time


All simulation is realtime to the brain being simulated.


Sure, but that’s not the clock that’s relevant to the question of the light speed communication limits in a large computer?
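A rough back-of-envelope sketch of the light-speed point being debated: compare the one-way signal latency across a machine of a given physical size with the timescale of a neuron firing (taken here as roughly 1 ms between spikes, an order-of-magnitude assumption). The function name and the chosen sizes are mine, purely for illustration.

```python
# Back-of-envelope: light-speed latency across a machine vs. a ~1 ms
# neuron spike interval. Numbers are order-of-magnitude only.
C = 299_792_458.0  # speed of light in vacuum, m/s

def one_way_latency_s(distance_m: float) -> float:
    """Minimum one-way signal time across a machine of the given size."""
    return distance_m / C

neuron_interval_s = 1e-3  # assumed rough spike interval, ~1 ms

for size_m in (1.0, 1_000.0, 100_000.0):
    latency = one_way_latency_s(size_m)
    print(f"{size_m:>9.0f} m machine: one-way latency "
          f"{latency * 1e6:10.3f} µs "
          f"({latency / neuron_interval_s:.2%} of a ~1 ms spike interval)")
```

On these assumptions, even a machine some tens of kilometres across could pass a signal end to end well within one spike interval, which is one way of framing why the latency ceiling matters only for real-time simulation, not simulation in general.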



