Cloud computing is the buzzword in the industry, & it's not new; it has been around for quite some time now!
The reason for its widespread adoption is the value proposition it offers through its promise of -
- Extra capacity
- Resilience, and
- More efficient use of storage and processing power
So, by switching to cloud computing, we make more efficient use of the resources available to us!
However, is there a hidden cost behind this lucrative offer? Well, yes & it's our very own environment that pays the price.
Consider any computing device, from the laptop you use every day to a supercomputer: being electronic in nature, all such devices require electricity and generate considerable heat with prolonged usage!
Hence, to beat this heat, most PCs are fitted with a cooling unit (fan).
The bigger & busier the computer, the bigger the cooling unit & therefore the more power it needs. In a typical datacenter, instead of installing individual fans for each computing device, temperatures are typically controlled via central air-conditioning units.
With the ever-growing web & cloud, and the demands placed upon them, such cloud computing hubs are gigantic consumers of electricity!
Consider some of the hard facts from various sources -
The energy consumption of Information and Communication Technologies (ICT) is increasing by 9% every year.
In 2019, according to KTH (the Royal Institute of Technology in Sweden), the internet was consuming 10% of the world's electricity, an increase of 4% over 2016, reflecting the rapid growth of the network and the demands placed upon it.
According to a report by the AFP news agency, watching a 30-minute video on Netflix, Prime, or other platforms typically releases 1.6 kg of carbon dioxide, the same amount of CO2 emitted by a 6 km drive.
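To put that figure in perspective, here is a minimal back-of-the-envelope sketch in Python. Every constant in it is an illustrative assumption taken from, or derived from, the claim above (1.6 kg of CO2 per 30 minutes of streaming, equivalent to a 6 km drive), not a measured value:

```python
# Back-of-the-envelope check of the streaming-vs-driving claim above.
# All constants are assumptions derived from the figures quoted in this
# article, not measured values.

STREAMING_KG_CO2_PER_30_MIN = 1.6   # claimed emission for a 30-minute video
DRIVE_KM_EQUIVALENT = 6.0           # claimed equivalent driving distance

# Implied car emission factor: 1.6 kg over 6 km ~= 267 g CO2 per km.
IMPLIED_G_CO2_PER_KM = STREAMING_KG_CO2_PER_30_MIN * 1000 / DRIVE_KM_EQUIVALENT

def streaming_co2_kg(minutes_watched: float) -> float:
    """Estimate CO2 (kg) for a given watch time, scaling the claimed figure linearly."""
    return STREAMING_KG_CO2_PER_30_MIN * (minutes_watched / 30.0)

def equivalent_drive_km(co2_kg: float) -> float:
    """Convert a CO2 estimate back into km driven, using the implied factor."""
    return co2_kg * 1000 / IMPLIED_G_CO2_PER_KM

if __name__ == "__main__":
    for minutes in (30, 60, 120):  # a short episode, an hour, a movie
        co2 = streaming_co2_kg(minutes)
        print(f"{minutes:>3} min of streaming ~= {co2:.1f} kg CO2 "
              f"~= a {equivalent_drive_km(co2):.0f} km drive")
```

Under the article's figures, this prints that two hours of streaming would correspond to roughly 6.4 kg of CO2, or about a 24 km drive; the point is simply that the claimed equivalence scales linearly with watch time.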
The digital transition, as it is currently implemented, contributes to global warming more than it helps prevent it. If there is a right time to act on it, then it is right NOW!!!