It doesn't need to be close, because these changes don't happen linearly. Compare, for example, computing in 1950 vs. 1975, or 1975 vs. 2000.
With a new computing paradigm we should anticipate at least a 500X increase in compute per dollar over a period of 25 years. We also should anticipate quite a lot of progress in AI and brain-computer interfaces.
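As a quick sanity check on that 500X-over-25-years figure, the implied compound annual growth rate can be worked out directly. A small sketch (the 2-year doubling used for comparison is the commonly cited Moore's Law pace, not a figure from this comment):

```python
# Implied compound annual growth for a 500x improvement over 25 years.
total_factor = 500
years = 25

annual_rate = total_factor ** (1 / years)  # growth factor per year
print(f"~{(annual_rate - 1) * 100:.0f}% per year")  # roughly 28%

# For comparison, doubling every ~2 years (classic Moore's Law pace):
moore_rate = 2 ** (1 / 2)
print(f"Moore's Law pace: ~{(moore_rate - 1) * 100:.0f}% per year")
```

So 500X in 25 years actually implies a slower annual pace (~28%/yr) than Moore's Law at its peak (~41%/yr), which is part of why the speculation isn't outlandish.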
Obviously what I suggested is speculation, but sensible speculation in these areas based on historical trends anticipates changes that are just as radical as the ones we have seen.
"Moore's Law" came close to a wall a long time ago and has been braking hard. Compute-in-memory will allow us to speed up again.