A billion years of battery power ....

A graphic view of Koomey's law compared to Moore's law. As the graph shows, the decrease in power consumption has basically matched the increase in density (Moore's law). Credit: Wikipedia

Schematic cross-section of an indium gallium zinc oxide (IGZO) thin-film transistor [inset: schematic illustrations of atomic structures for less compensated (left) and more compensated (right) IGZO films, respectively]. “If we were to draw energy from a typical AA battery based on this design, it would last for a billion years." Credit: Sungsik Lee and Arokia Nathan/Science

Switching speeds and power consumption of future chips. Credit: Intel

Batteries have improved their performance, but that would not have been sufficient had electronics not managed, as its performance grew, to slash the power budget "per activity".

Present electronics is several orders of magnitude less "energy hungry" than the first transistors. In the 1980s, several observers predicted that Moore's law would be forced to fail by the late 1990s because the growing density of transistors on a chip would "melt" the chip itself with the heat they generated. That didn't happen, because transistors became less and less power hungry as they got smaller. This trend was captured by Koomey in what is now known as Koomey's law (see first graph).
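The trend is easy to sketch numerically. A minimal Python sketch, assuming the commonly cited doubling period of roughly 1.57 years for computations per joule (a figure from Koomey's original analysis, not stated in this article):

```python
# Rough sketch of Koomey's law: computations per joule of dissipated
# energy doubled roughly every 1.57 years over the 1946-2009 period.
# The 1.57-year doubling period is an assumed, commonly cited value.

def efficiency_gain(years: float, doubling_period: float = 1.57) -> float:
    """Multiplicative gain in computations-per-joule over `years`."""
    return 2 ** (years / doubling_period)

# Over three decades the trend compounds into an enormous gain:
gain_30y = efficiency_gain(30)
print(f"30-year efficiency gain: ~{gain_30y:.2e}x")  # several hundred thousand times
```

This compounding is why a chip from the 2010s can do the same work as one from the 1980s for a vanishingly small fraction of the energy.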

However, the decrease in power is no longer continuing at the pace of Koomey's law. The problem is that you cannot go below a certain threshold of power (voltage and current) without stepping into the quantum world and its intrinsic indeterminacy. There is still a gap between the present situation and that quantum limit. What we are actually facing is the "leakage" of electrons across the transistor junctions (the Schottky barrier): if we do not use sufficient power for the "signal", we cannot distinguish the true signal from the leakage current, much as the Shannon limit ties the signal to the noise.

Here comes the interesting innovation of a research team from the University of Cambridge. Rather than seeing the leakage current as a barrier, they have found a way to exploit it to operate the transistor. Not any transistor, of course: only those with the particular architecture (see image) that they have created.

This 3D architecture allows a much better separation of the junctions thus avoiding the problem of one electrode influencing the other.

The result, in terms of power consumption, is spectacular: a normal AA battery would be able to power a chip based on this architecture for a billion years (according to one of the team's researchers, Sungsik Lee), disregarding, of course, the chemical degradation of the battery. All in all, the team expects this architecture to become widespread in Internet of Things applications, enabling sensors to be powered by scavenging energy from the environment (this is already possible, but the energy harvested is very low, hence the need for chips with extremely low power requirements). Another very important area of application is implantable devices: pacemakers, insulin pumps, early cancer detection…
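The billion-year claim can be sanity-checked with back-of-the-envelope arithmetic. The battery figures below are typical alkaline AA values (assumptions on my part, not numbers from the paper):

```python
# Back-of-the-envelope check of the "billion years on an AA battery" claim.
# Assumed typical alkaline AA figures: ~2.5 Ah capacity, 1.5 V nominal.

CAPACITY_AH = 2.5          # assumed typical AA capacity, in amp-hours
VOLTAGE_V = 1.5            # nominal cell voltage
SECONDS_PER_YEAR = 3.156e7

energy_j = CAPACITY_AH * 3600 * VOLTAGE_V   # ~13.5 kJ of stored energy
lifetime_s = 1e9 * SECONDS_PER_YEAR         # one billion years, in seconds
avg_power_w = energy_j / lifetime_s         # average draw implied by the claim

print(f"Stored energy: {energy_j / 1e3:.1f} kJ")
print(f"Implied average draw: {avg_power_w:.2e} W")
```

The implied draw works out to a fraction of a picowatt, which gives a feel for just how little power the claim requires each transistor to consume.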

The nice thing about this architecture is that the transistors can be manufactured at low temperature and can be printed on a variety of materials, including paper, polyester fabric, plastic and glass.


The drawback is that you cannot achieve the density you get with silicon etching, nor the same performance in terms of switching speed. However, if what you are looking for is low power consumption, this is the answer. There is plenty of loose energy to harvest, and with very low power electronics we can really create an aware environment, something that was inconceivable in the past and that will change our perception of the world in the coming decades.

Author - Roberto Saracco

© 2010-2020 EIT Digital IVZW. All rights reserved.