Nothing is the same in the world of cloud technology year-to-year. Could cloud bursting make a comeback in new packaging?

About two years ago I wrote a piece here on the concept of cloud bursting, where I pointed out a few realities:

- Private clouds are no longer a thing, considering how the current state of private cloud systems compares to the features and functions of the larger hyperscalers.
- You need to maintain workloads on both private and public clouds for hybrid cloud bursting to work; in essence, you are running two different platforms.
- The bursting hybrid cloud concept simply adds too much complexity and cost for a technology stack (the cloud) to be widely adopted by companies that want to do the most with the least.

In case you missed the fervor in the tech press a few years ago, cloud bursting is the concept of leveraging public clouds only when the capacity of the on-premises cloud runs out. Somehow companies thought they could invoke a public cloud-based part of the application and still have access to the same data without latency. Mostly, it did not work.

I stand by that posting, but now I have a few additional things to say about the concept of cloud bursting in 2020 and 2021.

First, a few on-premises solutions exist today that are close analogs to public clouds, because they are sold by the public cloud providers themselves. The larger hyperscalers, including Google, Microsoft, and AWS, offer hardware and software solutions that live in traditional data centers. Simply put, these are scaled-down versions of their public cloud platforms packaged as appliances. Cloud bursting is possible with these solutions, because both the on-premises platform and the public cloud platform are purpose-built to work and play well together. The objective is to eventually move the on-premises workloads to the public clouds, using these on-premises solutions as an intermediate step.

Second, edge computing is a thing now. IoT devices connected to public clouds have been around for a while, but the formal use of edge-based systems that are both devices and legitimate servers is now part of the cloud architecture zeitgeist. This means edge computing places processing and data storage outside of the public cloud providers, on platforms that are also purpose-built to work with specific public clouds. Moreover, the public cloud providers support edge directly now. Those deploying systems that leverage edge computing infrastructure don't have to build things from scratch to use public clouds for back-end processing.

Although more and more architectural patterns look like cloud bursting, the notion is really about distributing processing and storage, which is not at all new. My purpose in pointing out what's changed is really to point out that things do indeed change. This is why I love the cloud computing business.
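Because the term gets used loosely, here is a minimal sketch of the placement logic behind cloud bursting, written in Python purely for illustration. The pool and function names are hypothetical and do not refer to any vendor's API; the point is only that workloads stay on the on-premises platform until it runs out of headroom, and only the overflow bursts to the public cloud, which is exactly where the shared-data and latency problems described above show up.

```python
# Hypothetical illustration of the cloud-bursting decision: keep work on the
# on-premises pool until it runs out of headroom, then "burst" the overflow
# to a public cloud pool. All names here are invented for this sketch.

from dataclasses import dataclass


@dataclass
class Pool:
    name: str
    capacity_units: int   # total compute units available in this pool
    used_units: int = 0   # compute units currently allocated

    def headroom(self) -> int:
        return self.capacity_units - self.used_units


def place_workload(units_needed: int, on_prem: Pool, public: Pool) -> str:
    """Return the name of the pool a workload lands in, bursting only on overflow."""
    if on_prem.headroom() >= units_needed:
        on_prem.used_units += units_needed
        return on_prem.name
    # Burst path: the on-premises cloud is full, so the overflow goes to the
    # public cloud. In practice this is where the second platform's cost and
    # the latency back to on-premises data become the problem.
    if public.headroom() >= units_needed:
        public.used_units += units_needed
        return public.name
    raise RuntimeError("no capacity available in either pool")


if __name__ == "__main__":
    on_prem = Pool("on-prem", capacity_units=100)
    public = Pool("public-cloud", capacity_units=10_000)
    # The first placements stay on premises; once the local pool is full,
    # subsequent placements burst to the public cloud.
    for i in range(12):
        print(i, place_workload(10, on_prem, public))
```

The sketch ignores everything that makes bursting hard in real life: moving or replicating the data the overflow workloads need, keeping two platforms' images and configurations in sync, and paying for both. Those are the complexity and cost issues the original post called out.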