You’ve probably heard how generative AI will solve all cloud migration problems. It’s not that simple. Generative AI could actually make migration harder and more costly.

Management was optimistic when XYZ, Inc., embarked on a journey to migrate its extensive legacy systems to the cloud using cutting-edge generative AI tools. Partnering with a leading AI solutions vendor promised efficiency and reduced costs. However, the generative AI tools struggled to handle the complexity and specificity of XYZ’s systems, forcing frequent manual interventions. Timelines were constantly revised, and the project ran over budget six months into the migration. What was supposed to be a streamlined process turned into a tangled web of unexpected expenses and delays. How could this happen?

XYZ’s experience contradicts McKinsey’s claim that “the use of generative AI is cutting down cloud migration efforts by 30% to 50% when done correctly.” Smash cut to my inbox, where enterprises that want to do cloud migrations on the cheap keep asking about generative AI-powered migration tools that shorten the process. Of course, there are legitimate benefits to using AI for migration, such as developing net-new applications and refactoring existing ones. However, the overall tone of that article and others like it reinforces the hope that generative AI will save us from talent shortages and compressed migration schedules. Spoiler alert: This is wishful thinking at its best.

The promise and perils

At first glance, using large language models (LLMs) for cloud migration sounds like a silver bullet. These models can theoretically understand a system’s infrastructure and produce the necessary scripts to facilitate migration. However, the assumption that generative AI can homogenize the diverse and intricate landscapes of enterprise IT needs to be revisited. Here’s why:

Each cloud migration project is unique. Its intricacies demand specific tools and processes tailored to particular problem domains. Generative AI rarely accounts for the bespoke nature of these requirements, which leads to half-baked solutions that require substantial human intervention to correct.

Many enterprises operate with legacy systems whose workflows are nuanced and undocumented. LLMs often misinterpret these complexities, leaving gaps in the migration process. These gaps can necessitate costly rewrites and intense debugging sessions, defeating the purpose of using AI in the first place.

Enterprises in regulated industries face stringent compliance requirements. Although generative AI can help identify potential compliance issues, final validation still requires human oversight to ensure that AI-driven recommendations meet regulatory standards, which adds complexity and cost.

No simple solution

The allure of generative AI lies in its promise of automation and efficiency. If cloud migration were a one-size-fits-all scenario, that would work. But each enterprise faces unique challenges based on its technology stack, business requirements, and regulatory environment. Expecting a generative AI model to handle all migration tasks seamlessly is unrealistic.

I suspect that by the time you set up an AI migration toolchain to assist in the migration, the time delay and the cost of that toolchain would erase any potential benefit; the back-of-the-envelope sketch below illustrates the math. If you think about it, you can find other examples in the IT industry where a technology removes value instead of adding it. In my experience, that happens about half the time.
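To make that argument concrete, here is a minimal break-even sketch in Python. Every figure in it is a hypothetical assumption for illustration, not data from any real project or benchmark; the point is simply that toolchain setup, ongoing run costs, and rework can eat the claimed savings.

```python
# Back-of-the-envelope break-even sketch for an AI migration toolchain.
# All figures are hypothetical assumptions for illustration only;
# substitute your own estimates.

toolchain_setup_cost = 250_000       # licensing, integration, pipeline engineering (assumed)
toolchain_monthly_run_cost = 20_000  # model/API usage, security, upkeep (assumed)
baseline_migration_cost = 1_500_000  # migration cost without AI assistance (assumed)
claimed_savings_rate = 0.30          # low end of the "30% to 50%" effort-reduction claim
rework_rate = 0.10                   # share of AI output needing human rework (assumed)
months = 12                          # toolchain lifetime during the migration (assumed)

gross_savings = baseline_migration_cost * claimed_savings_rate
rework_cost = baseline_migration_cost * rework_rate
toolchain_cost = toolchain_setup_cost + toolchain_monthly_run_cost * months

net_benefit = gross_savings - rework_cost - toolchain_cost
print(f"Gross savings:  ${gross_savings:,.0f}")
print(f"Rework cost:    ${rework_cost:,.0f}")
print(f"Toolchain cost: ${toolchain_cost:,.0f}")
print(f"Net benefit:    ${net_benefit:,.0f}")
```

With these illustrative numbers, the net benefit comes out negative. Swap in your own estimates; if the result is negative or marginal, the AI toolchain is costing you the very efficiency it promised.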
Successful cloud migrations rely on specialized tools and human expertise. Customized tools address specific issues encountered during migration; real-time synthetic testing and infrastructure-as-code frameworks are indispensable for handling the minutiae of migration tasks. Human oversight is still needed as well. Skilled professionals bring critical insights that AI cannot replicate, and their expertise is essential in navigating unforeseen challenges and ensuring the integrity of the migration process.

The real cost of generative AI

Beyond the initial investment in AI tools, the hidden costs of generative AI for cloud migration add up quickly. Running generative AI models often requires substantial computational resources, which can be expensive. Keeping the models updated and secure demands robust API management and cybersecurity measures. And the models need continual refinement and retraining to stay relevant, incurring ongoing costs. These factors often lead to a situation very similar to our fictional XYZ, Inc. Generative AI, while valuable in certain aspects, has yet to become the panacea for cloud migration complexities.

I know what a few of you are saying: “There goes Linthicum again, raining on the generative AI parade and killing the excitement of using AI as a force multiplier for the people doing cloud migrations.” Successful business strategy is about recognizing what works well and what needs to be improved. We all understand that AI is a powerful tool, and has been for decades, but it needs to be applied carefully, and only after you’ve identified the specific problem you’re trying to solve.

Cloud migration is a complex, multifaceted process that demands solutions tailored to unique enterprise needs. While generative AI holds promise, over-reliance on it can lead to increased costs and complexity rather than simplification. The key to a successful migration lies in a balanced approach: leverage AI where it excels while relying on specialized tools and human expertise to navigate the thorny landscape of cloud transition. By understanding the limitations and realistic applications of generative AI, enterprises can better plan their migration strategies, avoid the pitfalls of overautomation, and ensure a smoother, more cost-effective move to the cloud.