I can't say I understood and evaluated all the physics here (I skimmed parts) but I was pretty surprised by how small the estimate was. I would've assumed that, were we to have one or two thousand years more cryptographic history, we'd end up using ginormous keys (maybe on the order of 1 MiB?). But this suggests that 512 or 1024 bits might be all we need.
This is because exponential growth is counter-intuitive. A 256-bit key is not 2x more secure than a 128-bit one; against brute force it is 2^128x more secure, i.e. 340282366920938463463374607431768211456x (about 3.4 × 10^38).
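Quick sanity check in a Python REPL (the factor really is just 2^128):

    >>> 2**256 // 2**128 == 2**128
    True
    >>> 2**128
    340282366920938463463374607431768211456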
That assumes fixed computing power and a perfect algorithm. Historically, waiting has meant exponential increases in compute per dollar, and often better attacks too.
That assumes the best an attacker can do is brute force, but real encryption algorithms (even AES) fall slightly short of that ideal, so the best known attacks make the effective security a few bits lower than the key length. The gap is much larger for asymmetric encryption.
Correct me if I'm wrong, but if P is not equal to NP, then we ought to be able to derive as much key material as we need from a key that is non-brute-forceable (at least for symmetric encryption). E.g., if we believe 1024 bits of entropy cannot be brute forced, but our algorithm requires a 4096-bit key to provide at least 512 bits of security against cryptanalysis (plus a margin of safety), then we can derive the larger key from the smaller one without sacrificing any security (sketch below).
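A minimal sketch of what I mean, assuming the 1024-bit seed is genuine entropy. SHAKE256 (an extensible-output hash in Python's standard hashlib) is just a stand-in for the expansion step; any standard KDF like HKDF would do the same job:

    import hashlib
    import secrets

    # 1024 bits (128 bytes) of real entropy -- the part that must be unguessable.
    seed = secrets.token_bytes(128)

    # Stretch it into a 4096-bit (512-byte) key. The derived key carries at most
    # 1024 bits of entropy, but an attacker still has to guess the whole seed.
    big_key = hashlib.shake_256(seed).digest(512)

    assert len(big_key) == 512

The long key is only as strong as the seed, but that's the whole point: under the stated assumption, 1024 bits of entropy is already unbruteforceable.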
But there's an implicit assumption here that all keys are equally strong, so this doesn't apply to asymmetric encryption, at least not as straightforwardly: asymmetric keys have mathematical structure (an RSA modulus is a product of two large primes), so their security per bit is far lower than that of a uniformly random symmetric key. And it's possible that P is in fact equal to NP. And there are a bunch of other assumptions here too, like that we really do have a secure source of entropy and really can share keys securely.
Anyway, if we take all these assumptions as read, this suggests that symmetric key lengths will saturate at some point (and not much longer than they are today). Big if true.