Cloud can be a green technology, but not without significant planning and up-front work that most enterprises are reluctant to fund.

I recently participated in a documentary called “Clouded II: Does Cloud Cost the Earth?” Please watch it. It looks at the issues surrounding cloud computing, its consumption of power, and thus its potential impact on the planet. The documentary did an excellent job of balancing perspectives. I don’t think anyone is arguing that data centers should not exist, but those data centers should be efficient and minimize power consumption. That means optimizing the resources we use and moving away from the “store all the data” mentality we’ve had for the past 50 years.

We created 64 zettabytes of data globally in 2020, and the number has only grown in the years since. Can we keep that up? Should we? Most of the data we store really does not need to be kept: petabytes of images and videos for social media, a massive number of scanned documents, and data kept for quad-redundant backups. We’re all guilty. Most of us store more data than we realize on our own systems or on our cloud and social media services. It keeps growing, and all of this storage and data processing requires a huge amount of power.

As data grows like a weed, should we begin to consider its impact? More importantly, what can we do to manage it better and reduce carbon emissions, or at least slow the growth? This gets back to what I’ve been talking about here: building systems that are as optimized as possible, cost less, and require fewer resources, such as power. Fortunately, the efficient use of resources delivers business and sustainability benefits that are directly linked. What’s good for the planet is also good for business.

Is the cloud green?

Contrary to popular belief, cloud computing is not inherently green. Cloud data centers require a lot of energy to power and maintain their infrastructure. That should be news to nobody. Cloud is becoming the largest user of data center space, perhaps only to be challenged by the growth of AI data centers, which have become a developer’s dream.

But wait, don’t cloud providers use solar and wind? Although some use renewable energy, not all adopt energy-efficient practices, and many cloud services still rely on coal-fired power. Ask cloud providers which of their data centers run on renewables. Most will give you a non-answer, saying their power mix is complex and ever-changing. I’m not going too far out on a limb in stating that most use nonrenewable power and will do so for the foreseeable future.

The carbon emissions from cloud computing largely stem from the power consumed by the providers’ platforms and from the inefficiencies embedded in the applications running on those platforms. A cloud provider may do an excellent job of building a multitenant system that keeps its servers well optimized, but it has no control over how well its customers use those resources.

Improving efficiency

This is where underoptimization comes very much into play. Those of us who see the finops reports at the end of the month understand that resources are often provisioned but never used, and that poorly designed systems can eat 10 times the resources they actually require. To fix that, you have to modernize the applications and databases so they are fully optimized for the cloud platforms where they reside. That is not cheap, so most enterprises choose to run them as is, paying for the inefficiencies rather than fixing the systems in the cloud.
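To make the underoptimization problem concrete, here is a minimal sketch of the kind of check a finops review could automate: flag running instances whose average CPU utilization has been low for weeks, prime candidates for rightsizing or shutdown. It assumes AWS, the boto3 SDK, and read access to CloudWatch; the region, threshold, and lookback window are illustrative choices on my part, not recommendations from the documentary or anyone else.

```python
# Minimal sketch: flag running EC2 instances with low average CPU over the
# last two weeks. Region, threshold, and lookback are illustrative assumptions.
import boto3
from datetime import datetime, timedelta, timezone

REGION = "us-east-1"          # assumption: adjust for your environment
CPU_THRESHOLD = 10.0          # percent; assumption about what counts as "idle"
LOOKBACK = timedelta(days=14)

ec2 = boto3.client("ec2", region_name=REGION)
cloudwatch = boto3.client("cloudwatch", region_name=REGION)
now = datetime.now(timezone.utc)

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        # Pull hourly average CPU utilization for the lookback window.
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=now - LOOKBACK,
            EndTime=now,
            Period=3600,
            Statistics=["Average"],
        )
        datapoints = stats["Datapoints"]
        if not datapoints:
            continue
        avg_cpu = sum(dp["Average"] for dp in datapoints) / len(datapoints)
        if avg_cpu < CPU_THRESHOLD:
            print(f"{instance_id}: avg CPU {avg_cpu:.1f}% -- rightsizing candidate")
```

The specifics matter less than the principle: idle and oversized resources are measurable, and anything measurable can be reported alongside the monthly cloud bill.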
The irony is not lost on me when I visit an enterprise and walk past the solar panels and the electric car charger (part of its green building certification) on my way into a meeting where I see that its systems are using about 50 times the resources they need. Any good vibes from the green credentials are lost by running grossly underoptimized systems that use enough power in a week to keep a small town running for a month. Hey, as long as they get their rating…

Changing our thinking

There is an essential need for a more eco-conscious approach to cloud computing, including meticulous optimization of applications for power efficiency and devops practices that incorporate sustainability checks (a minimal sketch of such a check appears at the end of this article). Again, this supports the bottom line in the form of reduced cloud bills and systems that finally perform up to the expectations of the business.

First, we need a change in thinking. How do we build better, more efficient systems? What metrics define success? We need to fundamentally change our culture and stop fooling ourselves that buying a Tesla means we can keep running crappy, power-hungry systems that don’t return value to the business. Look inward at the real problems you’re facing and accept that fixing them means spending money, taking some risks, and admitting you made some big mistakes.

I suspect that most IT execs won’t do that, but eventually someone will figure out that their enterprise has been kicking the sustainability can down the road. If so many enterprises are doing the same thing, we’re clearly not interested in sustainability, just the perception of it. That’s not helping.
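As for the sustainability check mentioned above, here is one way it could look in a devops pipeline: a CI step that fails the build when the Kubernetes manifests in a repository request more CPU or memory than a declared budget. The manifest path, the budget values, and the idea of encoding a budget this way are assumptions for illustration only, not an established practice I am attributing to anyone.

```python
# Minimal sketch of a CI "sustainability check": fail the pipeline when the
# CPU and memory requested by Kubernetes Deployment manifests exceed a budget.
# Budget values and manifest location are illustrative assumptions.
import sys
import glob
import yaml  # pip install pyyaml

CPU_BUDGET_MILLICORES = 4000      # assumption: 4 vCPUs per service
MEMORY_BUDGET_MIB = 8192          # assumption: 8 GiB per service

def to_millicores(cpu):
    cpu = str(cpu)
    return int(cpu[:-1]) if cpu.endswith("m") else int(float(cpu) * 1000)

def to_mib(mem):
    mem = str(mem)
    units = {"Ki": 1 / 1024, "Mi": 1, "Gi": 1024}
    for suffix, factor in units.items():
        if mem.endswith(suffix):
            return int(float(mem[: -len(suffix)]) * factor)
    raise ValueError(f"unhandled memory unit: {mem}")

total_cpu, total_mem = 0, 0
for path in glob.glob("k8s/**/*.yaml", recursive=True):   # assumed manifest location
    with open(path) as f:
        for doc in yaml.safe_load_all(f):
            if not doc or doc.get("kind") != "Deployment":
                continue
            replicas = doc["spec"].get("replicas", 1)
            for container in doc["spec"]["template"]["spec"]["containers"]:
                requests = container.get("resources", {}).get("requests", {})
                total_cpu += replicas * to_millicores(requests.get("cpu", "0m"))
                total_mem += replicas * to_mib(requests.get("memory", "0Mi"))

print(f"Requested: {total_cpu}m CPU, {total_mem}Mi memory")
if total_cpu > CPU_BUDGET_MILLICORES or total_mem > MEMORY_BUDGET_MIB:
    sys.exit("Resource requests exceed the sustainability budget; rightsize before merging.")
```

Run as a required pipeline step, a check like this turns “use fewer resources” from a slogan into something a pull request can actually fail.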