Public clouds are changing the way we look at IT, whether you’re using cloud computing or not.

Remember when there was a distinct difference between public clouds and the systems you could see and touch in your data center? That is no longer the case. The lines are blurring between traditional systems, meaning hardware and software purchased or licensed for millions of dollars in sunk costs to sit in your own physical data centers, and public clouds with their flexibility, scalability, and instant provisioning.

Legacy or traditional systems are looking more like clouds these days, and what once was a clear decision is no longer clear. I call this “the cloud effect.”

Traditional software and hardware players have adopted much of what makes public cloud computing compelling. This includes pay-as-you-go pricing and agreements for hardware and software, and even public cloud–connected systems that sit within a data center and are often called edge clouds or microclouds, such as Microsoft’s Azure Stack and AWS Outposts. No longer is this a clear path.

Is this blurring a good thing? Anything that makes the use of technology more flexible and less expensive is a positive evolution, and this is no exception. You may recall that when we moved to PCs, we changed the way we leveraged mainframe and minicomputer hardware and software. The cloud effect is no different; however, it’s about 100 times greater a game changer than any technological shift I’ve seen. So there are benefits, even for those who have yet to move to a single cloud.

For sure, data centers have become “stickier,” with many enterprises opting to delay migration to the cloud or cut back on the number of systems that will migrate. They are doing this for strictly business reasons, including the fact that systems in their data centers are already becoming more cloudlike and thus more cost-effective.

The downside is that some enterprises may delay migrations for the wrong reasons. If they are seeking to support more speed and innovation, then cloud computing is typically a better fit than traditional computing approaches. The risk is that vendors that support what runs in data centers become good at retaining customers, and at times customers may make the wrong decisions for what seem to be the right reasons.

I often play devil’s advocate and take the side of staying in the data center when there is too much religion around cloud. Or I become a cloud advocate when nobody wants to take on the risk and costs of making the journey to the cloud, without considering the value left on the table. There must be a compelling reason in each case.

Neither path will be a slam dunk; it’s mostly going to be a mix of on-premises and cloud. The mix is the problem to solve.