Fog nodes are the new ‘cool kid’ components for edge computing. However, we risk overusing them if we don’t understand their true purpose at the edge.

Credit: Aleksandar Pasaric

Think of a fog computing node as a physical server that resides between the edge devices (thermostats, robots, in-vehicle computers) and the back-end systems, typically hosted on public clouds. Fog nodes respond to an architectural problem: too much latency to pass requests all the way back to public cloud-based services, and not enough horsepower to process the data on the edge device itself.

This three-tier system adds another compute platform that is able to do some—if not most—of the back-end processing. It addresses the concern that cheaper, lower-powered edge devices lack the processing and storage capacity to deal with the incoming data natively. Now data can be sent to the fog node for processing, without the latency impact of going all the way back to the remote cloud services.

Although fog nodes are a simple solution to a simple issue, you should understand when—and when not—to use them:

- You should use fog nodes when data that is complex or arrives in large amounts needs to be processed locally and would overwhelm the edge device that is consuming it. In other words, you need something that responds in almost real time, such as a factory robot shutting down when its servos overheat. You want that to be instantaneous.
- You should use fog nodes when a human is waiting for the response to return to the edge device. Like events that can’t wait, humans need to be considered near-real-time architectural components. People are too expensive to be waiting around for responses from the remote cloud systems.
- You should use fog nodes when you’re multiplexing data from several different types of edge devices, or sending several different types of data.
The fog node manages the processing of data from several edge devices at once, dealing with things such as semantic transformation of the data before sending it to the back-end cloud servers. Or the fog node can process and respond directly to the edge device.

Basically, you should not use fog nodes unless the architecture and requirements fit the above criteria. In my book, they should be used sparingly, especially considering that they add cost and operational complexity—and another layer that can fail. Your best bet for now is to exclude them, and then figure out whether you have situations where they can be of use.
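To make the pattern concrete, here is a minimal sketch of the decision logic a fog node sits between the tiers to perform: respond locally to latency-critical events (the overheating robot), and normalize and batch everything else for the cloud back end. The class and field names (`FogNode`, `Reading`, the 90°C threshold) are illustrative assumptions, not from any particular fog platform.

```python
import json
import time
from dataclasses import dataclass, field

@dataclass
class Reading:
    """One reading from an edge device; fields are illustrative."""
    device_id: str
    kind: str        # e.g., "temperature", "vibration"
    value: float
    timestamp: float = field(default_factory=time.time)

class FogNode:
    """Sketch of the three-tier pattern: act locally on events that
    can't wait, multiplex the rest upstream to the cloud."""

    CRITICAL_TEMP_C = 90.0  # assumed shutdown threshold

    def __init__(self):
        self.cloud_batch = []

    def ingest(self, reading: Reading) -> str:
        # Near-real-time path: respond immediately, no cloud round trip.
        if reading.kind == "temperature" and reading.value >= self.CRITICAL_TEMP_C:
            return self.shut_down(reading.device_id)
        # Otherwise multiplex: apply a semantic transformation so data
        # from different device types shares one shape, then queue it.
        self.cloud_batch.append(self.normalize(reading))
        return "queued"

    def normalize(self, reading: Reading) -> dict:
        return {
            "device": reading.device_id,
            "metric": reading.kind,
            "value": reading.value,
            "ts": reading.timestamp,
        }

    def shut_down(self, device_id: str) -> str:
        # In a real deployment this would signal the device controller.
        return f"shutdown:{device_id}"

    def flush_to_cloud(self) -> str:
        # One upstream call carries many devices' data, amortizing the
        # latency of the trip back to the public cloud.
        payload = json.dumps(self.cloud_batch)
        self.cloud_batch = []
        return payload
```

The point of the sketch is the branch in `ingest`: only the shutdown path needs the fog tier's low latency, while everything else is merely buffered and transformed on its way to the back end—which is exactly why a fog node that never takes the first branch is probably an unnecessary layer.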