First came the trains, then the cars. Mobility completely changed civilization, and the way energy is used. Before, there were horses; afterward came engines, oil companies, price volatility, the mechanization of war, and so on. Just as cars pushed the human race into an era of dramatic change, algorithms may be the next “cars” or “mobility” in terms of energy usage, dramatic change, and the power to shape the future. But like cars, they may be substantial energy consumers. One study claims that training a single artificial intelligence (AI) system can emit five times as much carbon as the lifetime emissions of an average vehicle.
Algorithms are no strangers to us. By definition, an algorithm is “a list of rules to follow to solve a problem.” Its roots extend back to al-Khwarizmi, a 9th-century scholar from Central Asia. Today, algorithms are closely associated with computers. From video files to messaging platforms and servers, the whole internet is powered by algorithms. It is expected that by 2025, the communication industry alone will consume 20% of the world’s electricity.
In the past, algorithms were mostly synonymous with efficiency, compression, and processing. But not all algorithms are the same. Some algorithms do not simplify life at all; they sacrifice efficiency for other goals, such as anonymity, networking, or pattern recognition. The best example is the hashing algorithm that powers Bitcoin and other blockchain protocols. Bitcoin alone is responsible for 66.7 terawatt-hours of global energy consumption, nearly 22% of Turkey’s electricity consumption.
The Bitcoin algorithm is a poster child of how algorithms can turn into energy hogs in a brief period. The core algorithm is used to solve a puzzle. Because the first solver is rewarded with newly minted coins, mining becomes a race against time. Run at scale, on millions of computers and dedicated terahash rigs, an innocent algorithm turns into an energy guzzler.
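The puzzle race described above can be sketched in a few lines: hash a block header together with an incrementing nonce until the result falls below a difficulty target. The sketch below is a deliberately simplified toy, assuming a plain string header and a small difficulty; real Bitcoin headers are 80-byte binary structures with a compactly encoded target.

```python
import hashlib

def mine(header: str, difficulty_bits: int) -> tuple[int, str]:
    """Find a nonce whose double-SHA-256 hash has `difficulty_bits` leading zero bits.

    Toy version of Bitcoin-style proof of work: the only way to win
    is brute force, so more hashing hardware means more energy spent.
    """
    target = 2 ** (256 - difficulty_bits)  # the hash, read as an integer, must be below this
    nonce = 0
    while True:
        data = f"{header}{nonce}".encode()
        digest = hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1  # every miner races through nonces like this, in parallel

nonce, digest = mine("block-42", difficulty_bits=16)
print(nonce, digest)
```

Raising `difficulty_bits` by one doubles the expected number of hashes. Since the network periodically retunes the target so a block is found roughly every ten minutes regardless of total hash power, adding more miners does not speed up the blockchain; it only multiplies the energy burned per block.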
The story is not limited to blockchains and hashers. Recent AI algorithms consume increasing amounts of energy. According to the chief executive of Applied Materials, data centers’ AI workloads could account for 10% of world electricity consumption by 2025 if no innovations or efficient designs appear. Strangely enough, AI can also cut energy usage: applied to data-center cooling, it has reduced cooling bills by as much as 40%.
Just as cars changed the way wars were fought, AI is changing not only the entertainment and financial sectors but also defense and surveillance. This part of the story is not as transparent as the others. Massive surveillance networks, automated weapons, and autonomous systems should also be counted as significant energy consumers. Face recognition systems, continually optimizing surveillance algorithms, and navigation systems are the known examples.
Algorithms do not only compete with cars and other sectors; they have become rising energy consumers in their own right. Recently published research claims that as vehicles become more autonomous, their energy consumption increases: the estimate is a 10% range reduction on the highway and a 30% range reduction in the city. As more autonomous cars appear in cities, 30% higher fuel bills are not a far-fetched prospect.
The optimistic side of the story is that there is still room for improvement. Recently, the biggest carbon nanotube chip reached 14,000 transistors. In theory, such chips can be ten times more energy efficient. There are also improvements in the way AI algorithms work.
But will efficiency actually reduce consumption? The Jevons paradox holds that as efficiency increases, consumption increases too. Mobile phones are a case in point: as algorithms and chips became more efficient, more power and features were packed into the devices. Will the same hold for algorithms at large?