As edge computing explodes, we’re faced with complexity, security, and management challenges that don’t have easy answers.

Transparency Market Research has a new report on the global IoT (Internet of Things) and edge computing connected-machines market. The report projects that the IoT market will reach a value of $1.3 trillion by 2027. This is not at all surprising. What is a bit of a shock are the emerging ways that organizations are leveraging IoT and edge. Edge and cloud architects are building patterns that are more complex, less secure, and harder to manage than most edge computing fans anticipated.

Now we’re starting to see a few concerning patterns. We’re solving a specific business problem by putting data processing closer to where the data is consumed, which provides better and more reliable performance. However, this brings trade-offs that are becoming new problems, and they are not as easy to solve.

The core issue is that edge computing brings complexity. If the idea is to push processing and some data retention out to edge devices, sometimes numbering in the thousands, then the performance and reliability gains need to justify the additional costs of securing and managing edge-based architectures.

Compare this to more traditional, centralized systems, such as public clouds, where updates, upgrades, and fixes are easy to deploy using automated and durable processes. With edge computing, you’re pushing those updates out to thousands of remote nodes and dealing with failed updates, missing devices, and network outages that keep you from even reaching the devices.

Making the architecture more distributed also makes it more complex. It can triple the cost of securing and managing the edge-based devices, and it sometimes means paying somebody to travel to the device’s location just to kick the thing back into working order.

Security becomes a problem because it’s difficult to secure a physical device that can easily be stolen, depending on your application and where it’s installed. Of course, there is encryption, but it may not surprise you that most of the data transmitted from a sensor to the edge device for local processing is not encrypted in flight, due to performance requirements. That includes edge devices in hospitals, where PII (personally identifiable information) is processed by an edge device that may sit inside an MRI machine or a heart-lung machine.

Solving these problems is pretty easy; paying for the solutions and accepting the risk is not. There are cloud-based edge management systems that keep a digital twin of each edge device, including its operating system and installed applications. This makes updating the edge computer or device much easier, although you’ll still have to dispatch the intern to a remote location to repair or replace edge computers.

Operations is another problem; you’ll need to manage the thousand-plus edge nodes as if they were a single system. This is very different from running the same applications and data patterns on a centralized system, such as a public cloud, where the data and application processes are reachable at all times. Moreover, the cloud offers cloud-native services. For now, edge computing is pretty much DIY.

Solutions to these problems are beginning to appear. I’ve worked on a few already. However, it’s not enough to find something that “just works.” It’s finding something that won’t make edge computing unjustifiable because of cost and risk.
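To make the digital twin idea a bit more concrete, here is a minimal sketch of how a cloud-side management service might track desired versus reported state for each edge device and flag the nodes that need remediation. The class name, fields, and thresholds below are assumptions for illustration only, not any particular vendor’s API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a "digital twin" record a cloud-side edge management
# service might keep for each device, plus a simple drift check. All names
# and fields are illustrative assumptions.

@dataclass
class EdgeTwin:
    device_id: str
    desired_os: str
    desired_apps: dict[str, str]            # app name -> desired version
    reported_os: str = "unknown"
    reported_apps: dict[str, str] = field(default_factory=dict)
    last_seen_minutes: int = 9999            # minutes since last heartbeat

def drift_report(twin: EdgeTwin, offline_after: int = 60) -> list[str]:
    """Return the issues that need remediation for this device."""
    issues = []
    if twin.last_seen_minutes > offline_after:
        issues.append("unreachable: may need an on-site visit")
    if twin.reported_os != twin.desired_os:
        issues.append(f"os drift: {twin.reported_os} -> {twin.desired_os}")
    for app, version in twin.desired_apps.items():
        if twin.reported_apps.get(app) != version:
            issues.append(f"app drift: {app} should be {version}")
    return issues

# Example: one node in sync, one that has fallen behind and gone quiet.
fleet = [
    EdgeTwin("mri-edge-001", "v2.4", {"inference": "1.7"},
             reported_os="v2.4", reported_apps={"inference": "1.7"},
             last_seen_minutes=3),
    EdgeTwin("clinic-edge-042", "v2.4", {"inference": "1.7"},
             reported_os="v2.2", last_seen_minutes=480),
]

for twin in fleet:
    print(twin.device_id, drift_report(twin) or "in sync")
```

A real fleet manager would layer authentication, staged rollouts, and retry handling for nodes that come back online later on top of something like this, which is exactly where the cost and risk start to add up.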
Let’s get to work on this one because it’s an ugly problem to solve.