Low-level Theory

Note

In this article, we describe several established laws governing the energy efficiency of computers. Because these laws rest on rigorous mathematics and theoretical physics, we will not go into much detail about why they hold. If you want to learn more, references to all the cited articles are provided.


Landauer's principle

This classical principle concerning the energy cost of computation, proposed in 1961 by physicist and IBM researcher Rolf Landauer, states the following:

...any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment.

The most confusing part of the quote, "entropy increase in non-information-bearing degrees of freedom", can be interpreted as "energy loss". "Merging of two computation paths" refers to logic gates, which produce a single output from several inputs, thus "merging" them. Hence, simplified to its extreme, Landauer's principle means that energy dissipation cannot be avoided while performing irreversible computations. It also implies that there is a minimum amount of energy needed to perform an irreversible operation on one bit of information. This limit is (5):

E = k × T × ln(2)

where k is the Boltzmann constant (about 1.38 × 10^-23 J/K) and T is the temperature of the heat sink in kelvins.

If the temperature is 20 degrees Celsius (about 293 K), this amounts to roughly 2.8 × 10^-21 joules per bit, so the absolute maximum number of bits of information that can be erased per joule of energy is approximately 3.57 × 10^20.
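As a quick sanity check, the figures above can be reproduced with a few lines of Python (a minimal sketch; the constant and temperature values are the standard physical ones, not taken from the cited paper):

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T = 293.15                  # 20 degrees Celsius in kelvins

# Landauer limit: minimum energy to erase one bit of information
energy_per_bit = K_BOLTZMANN * T * math.log(2)   # ~2.8e-21 J

# Maximum number of bits that can be erased per joule
bits_per_joule = 1 / energy_per_bit              # ~3.57e20

print(f"Energy per erased bit: {energy_per_bit:.3e} J")
print(f"Bits erased per joule: {bits_per_joule:.3e}")
```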

At the time of writing, the most energy-efficient supercomputer in the world (according to the Green500 list) has an efficiency of 17 gigaflops per watt (2). The term "flops" here refers to floating-point operations per second, and each such operation manipulates a number of bits. We can also assume that most of the computations it performs are irreversible. Under these assumptions, the most energy-efficient supercomputer in the world in 2018 operates at roughly 0.0001% of the theoretically possible maximum efficiency.
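The estimate can be reproduced as a back-of-the-envelope calculation. The article does not say how many irreversible bit operations a single flop involves, so the parameter below (on the order of 2 × 10^4, a plausible gate count for a 64-bit floating-point operation) is our assumption:

```python
import math

K_BOLTZMANN = 1.380649e-23          # Boltzmann constant, J/K
T = 293.15                          # 20 degrees Celsius in kelvins
landauer_bits_per_joule = 1 / (K_BOLTZMANN * T * math.log(2))

flops_per_joule = 17e9              # 17 gigaflops per watt (Green500, 2018)
BIT_OPS_PER_FLOP = 2e4              # assumed irreversible bit operations per flop

bit_ops_per_joule = flops_per_joule * BIT_OPS_PER_FLOP
fraction = bit_ops_per_joule / landauer_bits_per_joule
print(f"Fraction of the Landauer limit: {fraction:.1e} ({fraction * 100:.4f}%)")
```

With this assumption the result lands near 0.0001%; the exact percentage depends heavily on how many bit-level operations one counts per flop.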

Koomey's law

Perhaps the most famous law in the world of computing is Moore's law, which states that the number of transistors on a chip, and with it the processing power of computers, doubles approximately every two years. Interestingly, there is a very similar law for the energy consumption of computers, proposed by Stanford professor Jonathan Koomey in 2011. It states the following:

At a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half.

Implications of Historical Trends in the Electrical Efficiency of Computing, by Jonathan Koomey, Stephen Berard, Marla Sanchez, and Henry Wong (3)

The law does not necessarily imply that the battery life of our devices has to double every 1.5 years. Rather, the performance of those devices can be increased while battery life remains the same. Koomey backed his law with the following diagram, which shows that the number of computations per joule of energy dissipated has been doubling approximately every 1.57 years. As we can see from the diagram, this trend has held steadily since the 1950s.

Fig. 1: Computations per kWh over time (3)
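The trend in Fig. 1 can be written as a simple exponential: efficiency grows by a factor of 2^(t / 1.57) over t years. A minimal sketch of the projection (the 1.57-year doubling period is from the article; the starting efficiency of 1.0 is an arbitrary placeholder):

```python
def koomey_projection(efficiency_now, years_ahead, doubling_period=1.57):
    """Project computations-per-joule forward, assuming Koomey's trend holds."""
    return efficiency_now * 2 ** (years_ahead / doubling_period)

# Placeholder starting point: 1.0 unit of computations per joule today.
for years in (1.57, 5, 10):
    print(f"after {years:5} years: x{koomey_projection(1.0, years):.1f}")
```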

Combining the Two

In his original 1961 paper, Landauer himself questioned the possibility of practically achieving the limit he proposed. However, assuming the steady trend spotted by Koomey continues, Landauer's limit will be reached by approximately 2050. For comparison, Moore's law is predicted to no longer be applicable in the early 2020s (4). The good news about this result is that we know for sure our current computers are so incredibly far from the ideal one that we can anticipate steady growth in energy efficiency for at least another 20–30 years. At the same time, if today's rapid rate of development continues, that date is not so far off. So what's next? Landauer's principle has been proven several times, including experimentally, so we can be almost certain that it holds. Fortunately, there is an answer, and it is written in the principle itself: Landauer's limit only applies to irreversible computations, hence it is possible to bypass it by making computations reversible.
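The 2050 estimate follows directly from the two numbers above: if today's machines operate at roughly 0.0001% of the Landauer limit and efficiency doubles every 1.57 years, one can simply count the doublings required. A minimal sketch (the base year 2018 is taken from the supercomputer figure above):

```python
import math

current_fraction = 1e-6      # ~0.0001% of the Landauer limit (2018 estimate)
doubling_period = 1.57       # years per doubling (Koomey's trend)
base_year = 2018

doublings_needed = math.log2(1 / current_fraction)   # ~19.9 doublings
years_needed = doublings_needed * doubling_period    # ~31 years

print(f"Landauer's limit reached around {base_year + years_needed:.0f}")  # ~2049
```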

Margolus–Levitin theorem

A more recent study, conducted in 1998 by Norman Margolus and Lev Levitin, computer scientists from Boston University, claims that there is an even more fundamental limit, described in terms of quantum theory rather than thermodynamics. It applies to all forms of computation, i.e. not only irreversible ones. It is formulated as follows:

...adding one Joule of energy to a given computer can never increase its processing rate by more than about 3 × 10^33 operations per second.

This statement follows from the minimum time required for a quantum system to change its state so that it is perfectly distinguishable from its previous state. Here, the states of the system can be interpreted as the states of the memory, and the rate of change of these states as processing power. Of course, the theorem mainly applies to quantum computers, but as quantum (subatomic) particles are the most fundamental building blocks of the universe that we know of, this limit may be interpreted as a general bound on the energy spent on computations.
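For reference, the standard form of the theorem bounds the rate of transitions between distinguishable states by 2E / (π × ħ). A minimal sketch of that calculation follows; note that it gives roughly 6 × 10^33 per joule, so the quoted figure of about 3 × 10^33 presumably counts operations under a slightly different convention:

```python
import math

H_BAR = 1.054571817e-34   # reduced Planck constant, J*s

def max_ops_per_second(energy_joules):
    """Margolus-Levitin bound on state transitions per second: 2E / (pi * hbar)."""
    return 2 * energy_joules / (math.pi * H_BAR)

print(f"Upper bound for 1 J: {max_ops_per_second(1.0):.1e} ops/s")  # ~6e33
```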