There’s not much time left to develop reversible machines, because progress in conventional semiconductor technology could grind to a halt soon. And if it does, the industry could stagnate, making forward progress that much more difficult. So the time is indeed ripe now to pursue this technology, as it will probably take at least a decade for reversible computers to become practical.
I believe the opposite: companies should wait until performance improvements in mainstream processors have really ground to a halt before investing substantial resources in reversible computing. Otherwise it will be a financial sinkhole like many prior attempts at exotic computing hardware that couldn't make a sustainable profit.
The only path suggested in the article that could be pursued right now involves superconducting devices, requiring cryogenic cooling that will never work for phones, tablets, or laptops. A company pursuing it would be trying to overcome the tremendous head start enjoyed by CMOS, while conceding that the new hardware, even fully realized, will never address markets as broad as CMOS does.
Maybe it's just my distaste for the more whimsical topics in thermodynamics, but is this a roundabout way of saying that any circuit without entropy loss is also reversible?
As the article states, you can reverse intermediate computational values to recover energy, so it seems that reversibility isn't really a necessary feature of computing so much as a feature of waste-free circuits.
EDIT: Why the down votes? I'm asking a question here.
It seems that we need a 'Law of Anonymous Social Systems Entropy' (not to be confused with the idea of 'Social Entropy'): every social system that allows anonymity (including social media platforms) tends, over time, to be infiltrated and degraded by groups of negative human beings, by virtue of their being less constrained by civility and less tolerant of foreign ideas or groups. This is likely because anonymous social capital (e.g. this site's user reputation) is not real enough to activate normal human self-censorship. The disingenuous will also be able to game the system to gain the social capital needed to damage it. Note that the negative group need not be in the majority to do damage.
It has taken longer for HN to reflect this truth, but it appears that a significant percentage of this site's commentariat has become more and more closed to ideas -- and even questions -- any more borderline than those held by the people who control its social capital (i.e. those with the ability to downvote a post).
In theory, anonymity is great for allowing free-thinking people to express controversial views without fear of reprisal from other groups, but in reality it gives cover to people with purely destructive, anti-social, anti-progressive and intolerant tendencies. And those attitudes and their accompanying behaviors have a much greater negative effect on the system than those of well-intentioned users.
The fact that our entire world is now connected by effectively anonymous social media means that the above concept has crept into every aspect of our lives, and has actually cost lives in places such as Myanmar.
(Note that I don't have the karma to up or down vote.)
That reminds me of Eliezer's model of evaporative cooling of beliefs - and, by extension, communities. TL;DR: as trolls, or random chance, slightly lowers the standard of discussion, the first people to get fed up and leave the community are the ones with above-average quality contributions. When they leave, the average quality goes down. The process repeats until the community is garbage.
A circuit with no entropy loss is, at least theoretically, fully reversible. Of course, even if we change how we design circuits, we're still going to leak some entropy in practice, just less of it.
A circuit that reverses intermediate computational values is partially reversible, and practically that's the best we're going to get.
Yes, you're right, but it's also important to combine that with Landauer's principle, which says that entropy loss (information deletion) has a nonzero minimum energy cost. Reversible computation doesn't.
The idea is cool, but I don't know if anyone really thinks it's an issue relative to all the energy we currently use to compute (the quantities in the theory are negligible).
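For a sense of scale, here's a minimal back-of-the-envelope sketch in Python; the ~1 fJ per-switch figure for today's hardware is an assumed ballpark, not a measured spec:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # room temperature, K

    # Landauer bound: minimum heat dissipated to erase one bit of information
    landauer_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J

    # Assumed ballpark for the energy a real CMOS gate dissipates per
    # switching event today (order of a femtojoule).
    assumed_real_switch = 1e-15  # J

    print(f"Landauer bound per bit: {landauer_per_bit:.2e} J")
    print(f"Assumed real switch:    {assumed_real_switch:.2e} J")
    print(f"Ratio (real / bound):   {assumed_real_switch / landauer_per_bit:.0f}x")

Under that assumption, the theoretical minimum is hundreds of thousands of times below what our hardware dissipates per operation today.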
Interesting but it doesn't really explain how a software trick can help us break the Landauer limit...
I'd love for Isaac Arthur to tackle this; he uses the Landauer limit a lot to express things like "How many human minds can run on the energy of a black hole some 10^53 years into the future?"
So, after some reading and watching... Is this like using balls in space to do computation? It takes no energy to guide them down paths via elastic collisions, but it does take energy to move them actively and reset them to their base positions. Then again, resetting could also be done by a fully elastic collision that sends the ball back down its path.
But then, it still must take energy to move the switches that send the balls in the desired directions, right? Perhaps much less energy, but not zero.
I am not a physicist in case this post was not glaringly obvious.
One of the possibilities for adiabatic computing I was interested in some time ago was to use an oscillating power supply. You still lose some energy due to the interconnect resistance, but much less, since you're only sustaining the oscillations (the charges don't get dissipated when returning to the ground, they just move back).
Then, it's "only" a matter of deciding which part of the circuit will be powered for the next oscillation, and the switching can be done when vds=0.
I haven't had the occasion to think about this as much as I wanted, though, so the above explanation might be incorrect. Hopefully this gives you a better idea of the concept (which there are many ways to approach).
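For what it's worth, here's a minimal numeric sketch of the energy argument, comparing an abrupt ("short it to the rail") charge of a node capacitance against a slow ramp; the component values are made up purely for illustration:

    # Toy comparison: abrupt vs. adiabatic (slow-ramp) charging of a node.
    # Values are assumed for illustration; the scaling is the point.
    C = 1e-15   # node capacitance, farads (assumed)
    R = 1e3     # series resistance, ohms (assumed)
    V = 1.0     # voltage swing, volts

    # Abrupt charging from a fixed rail: half of C*V^2 is lost in R,
    # no matter how small R is.
    E_abrupt = 0.5 * C * V**2

    # Ramping the supply over a time T >> R*C: dissipation falls roughly as
    # (R*C/T) * C * V^2, so it shrinks as you ramp more slowly.
    for T in (1e-10, 1e-9, 1e-7):
        E_ramp = (R * C / T) * C * V**2
        print(f"T = {T:.0e} s   abrupt: {E_abrupt:.2e} J   ramp: {E_ramp:.2e} J")

That's the sense in which the oscillating supply "gives back" most of the charge energy instead of dumping it.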
So yeah, there are losses (mostly due to circuit resistance), but the "switches" are capacitors, which are just "springs" in the electrical sense. You can get back your energy if done in a clever manner, but we typically just short them to discharge (switch) them :)
So, compare the energy required to compress and extend a spring if:
* You oscillate with it
* You compress it, fully release it, then recompress it.
This is a very apt analogy, I think, up to the mathematical level.
Right, even in the idealized models you need to spend a little bit of energy to write down the input and to read out the output of a computation. But the size of the input/output text is typically very small compared to the number of bit operations done during a computation, so if you can "unscramble" the entire intermediate state back to a low-entropy configuration that will get rid of nearly all the heat.
In less idealized models you can't do even the intermediate computations completely losslessly, so those computers will use some energy, but it can approach zero by working more slowly.
>But then, it still must take energy to move the switches that send the balls in desired directions, right?
If we're talking about theoretical friction-less thought experiments, you could use a lifted barrier, for example, and recover the energy put in when the barrier is lowered.
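For what it's worth, the canonical gate in that billiard-ball picture is the Fredkin gate (a controlled swap). Here's a tiny Python sketch showing it's reversible and "ball-conserving" (it never creates or destroys a 1, just as elastic collisions never create or destroy balls):

    from itertools import product

    # Fredkin gate (controlled swap): if c is 1, swap a and b; otherwise pass through.
    def fredkin(c, a, b):
        return (c, b, a) if c else (c, a, b)

    states = list(product((0, 1), repeat=3))
    images = [fredkin(*s) for s in states]

    assert sorted(images) == sorted(states)                 # a bijection: reversible
    assert all(fredkin(*fredkin(*s)) == s for s in states)  # it is its own inverse
    assert all(sum(fredkin(*s)) == sum(s) for s in states)  # conserves the number of 1s ("balls")
    print("Fredkin gate: reversible and ball-conserving.")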
Wonder if progress on reversible computing tech is being stymied by the NSA's own ambitions in this area? They have an annoying habit of plucking promising talent out of non-classified R&D environments that they feel would be useful for maintaining the computational competitive edge that they need in order to keep ahead of the rest of the world. They have done this for decades in areas like VLSI design and cryptography.
Given how far we are from the Landauer limit, the comparison is apt. The amount of negentropy gained by computation even in the best case is utterly dwarfed by the increase of entropy in the form of waste heat.
Right, it's like the joke that General Motors is a pension and health care bureaucracy with a car-making division: yes, GM's core business is (rightly regarded as) car-making, but the financial flows related to that other stuff are so big as to dwarf the "car thing".
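To put rough numbers on the "dwarfed" part (the 100 W and 10^16 bit-erasures-per-second figures are assumptions, just order-of-magnitude guesses):

    import math

    k_B = 1.380649e-23     # Boltzmann constant, J/K
    T = 300.0              # ambient temperature, K
    P = 100.0              # assumed chip power, W
    erasures_per_s = 1e16  # assumed bit erasures per second

    # Entropy carried away as waste heat each second
    S_heat = P / T                                # ~0.33 J/K per second

    # Entropy attributable to the logical bit erasures themselves
    S_bits = erasures_per_s * k_B * math.log(2)   # ~1e-7 J/K per second

    print(f"waste-heat entropy:  {S_heat:.2e} J/K per second")
    print(f"bit-erasure entropy: {S_bits:.2e} J/K per second")
    print(f"ratio: {S_heat / S_bits:.1e}")

Under those assumptions the waste heat carries millions of times more entropy than the logic itself accounts for.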
I was having this conversation recently, trying to think of what about computation produces waste heat - how many watts go to computation vs how many watts are waste due to inefficiency.
But we couldn't put our finger on where this inefficiency comes from. Is it the silicon heating up due to resistance? Or is there some analogue of friction in switching the state of a flip-flop, such that heat is generated by the state change?
In our current hardware it is almost entirely electrical resistance. But even in a theoretical 100% superconducting computer there's a fundamental lower bound to the heat dissipated by every bit-erasing operation; this is the Landauer limit. Entropy increases when we forget information: we can't reverse the computation, which gives time a direction (the so-called thermodynamic arrow of time).
You can think of it as conservation of energy: if you have two electric signals and put them through a gate with only one output, some of the total energy must be dumped to balance the books. If a computation is completely reversible (if we can always get our input back given the output), the Landauer limit does not apply, but of course other hardware inefficiencies may still be present.
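If it helps make the "get our input back" condition concrete, here's a small Python sketch contrasting an ordinary AND gate with the Toffoli gate, a standard reversible gate that computes AND into a spare output bit:

    from itertools import product

    # Ordinary AND: three distinct inputs all map to 0, so the input cannot be
    # recovered from the output -- information is discarded.
    and_outputs = {(a, b): a & b for a, b in product((0, 1), repeat=2)}
    print("AND:", and_outputs)   # four inputs collapse onto two output values

    # Toffoli (controlled-controlled-NOT): (a, b, c) -> (a, b, c XOR (a AND b)).
    # With c initialized to 0, the third output holds a AND b, and the mapping
    # is a bijection on 3-bit states, so nothing is erased.
    def toffoli(a, b, c):
        return a, b, c ^ (a & b)

    states = list(product((0, 1), repeat=3))
    images = [toffoli(*s) for s in states]
    assert sorted(images) == sorted(states)                  # bijection: no information lost
    assert all(toffoli(*toffoli(*s)) == s for s in states)   # it is its own inverse
    print("Toffoli is a reversible embedding of AND.")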
Actually, CMOS input stages are high impedance, i.e. high resistance, and therefore have close to no current flowing between state changes. The output stage should be very low impedance when driving loads, or very high impedance when turned off. Indeed, switching between states (saturating the gates with enough electrons) is what generates heat, since the following node sees both a voltage and a resistance during the transition. At that scale, the small resistance of the interconnects plays a role too, sure.
With reversible computing, they're talking about being able to perform computation for nearly-zero energy. By comparison, a modern computer is essentially an expensive electric heater that performs a very tiny amount of computation with a vanishingly small fraction of said energy.
An incandescent lightbulb turns 90% of the energy consumed into heat and only 10% or so into light. It's a heating element that produces light as a side effect. Computers are a little like that, given their thermal inefficiency.
Very cool. Still, it seems like other cooling technologies (and more energy-efficient chip design) will remain lower-hanging fruit than this approach for some time. Shrinking transistor size isn't the only way to increase speed.
A bonus feature of reversible computing is that a lot of garbage collection becomes a lot easier. Do a bunch of computation, copy off the results you need long term, unwind the computation. No need to track the unwound values’ lifetimes.
... at the cost of erasing the result, and taking time proportional to the original computation? Clever optimization, but the net result seems suboptimal
The idea is that eventually we'll have to go reversible anyway: running the computation twice (forward, then unwound) will be worth it once it lets you clock more than 2x faster than running it once irreversibly. Getting easy garbage collection is a side effect, not the goal of the optimization.
And, the result will have to be copied off before rewinding.
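A toy sketch of that compute / copy-off / unwind pattern (Bennett-style uncomputation), with XOR-writes to scratch registers standing in for reversible hardware; the steps are purely illustrative:

    # Toy model of "do a bunch of computation, copy off the results, unwind it".
    # Scratch registers are written with XOR so every step has an exact inverse.

    def forward(x, scratch):
        scratch[0] ^= x * x            # step 1: intermediate value
        scratch[1] ^= scratch[0] + x   # step 2: the "result"

    def backward(x, scratch):
        # Same steps in reverse order; XOR-ing again undoes each write.
        scratch[1] ^= scratch[0] + x
        scratch[0] ^= x * x

    x = 7
    scratch = [0, 0]
    forward(x, scratch)
    result = scratch[1]      # copy off the value we want to keep long term
    backward(x, scratch)     # unwind: the scratch space is clean again
    assert scratch == [0, 0]
    print("result:", result, "scratch after unwind:", scratch)

No lifetime tracking is needed for the intermediate values; unwinding returns them all to zero by construction.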
This is a great idea but is contingent on advances in storage. Perhaps storing information in a more biological way (utilizing DNA, for example) instead of in silicon is the way forward.