Microsoft and Intel are working on protecting your data while it’s being used in the cloud

Building and running modern cloud-native applications has its risks. One of the biggest is that you’re sharing computing resources with an unknown number of other users. Your memory and CPU are shared, and there’s always a possibility that data may accidentally leak across boundaries, where it can be accessed from outside your organization. A breach, even an accidental one, is still a breach, and if you’re using Azure or another cloud platform to work with personally identifiable information or even your own financial data, you could be in breach of compliance regulations. It’s not only user or financial data that could be at risk; your code is your intellectual property and could be key to future operations.

Errors happen, even on well-managed systems, and a networking problem or a container failure could expose your application’s memory to the outside world. Then there’s the risk of bad actors. Although Azure has patched its servers to deal with known CPU-level bugs that can leak data through processor caches, microcode-level issues are still being discovered, and it’s not hard to imagine nation-state attackers or organized cybercriminals using them to snoop through co-tenants’ systems.

Azure’s cybersecurity infrastructure is one of the best. It uses a wide range of signals to look for malicious activity, with machine learning-based threat detection to quickly spot possible areas for investigation. Security and encryption are built into its underlying platform. Even so, some customers want more than the defaults, as good as they may be. They’re businesses that are building cutting-edge financial technology in the cloud or using it to process and manage health data. They may even be governments or the military.

Introducing Azure confidential computing

By default, Azure ensures that data is secured when it’s at rest and in transit.
We’re familiar with using encrypted storage and network connections, but in most cases we still need to process data in the raw, decrypting it right where it’s most at risk of leaking. That’s where the concept of confidential computing comes in, building on a mix of hardware and software, along with work from Microsoft Research, to build and operate trusted execution environments (TEEs).

These TEEs are perhaps best thought of as secure containers that protect both the compute and memory resources your application needs, shielding them from other users by preventing untrusted code from running in that memory space. By protecting both CPU and memory, it’s possible to provide authorization methods that lock down compute to ensure that only your own trusted code runs, and that prevent code from crossing memory boundaries into protected space. When an application frees up a TEE, it’s flushed, ensuring that there’s no data left in processor caches or in memory. External applications can’t read that memory, and they can’t modify it either, so they’re unable to inject code across protection boundaries.

Using SGX in Azure

Azure offers two different TEE models: Virtual Secure Mode and Intel’s SGX. The first is based on Microsoft’s own Hyper-V, using a modified version to increase isolation by preventing code from crossing hypervisor boundaries. This includes code being injected into the TEE by Azure administrators, preventing insider attacks that might otherwise go undetected. Intel’s SGX security extensions add hardware protection to TEEs, and Azure offers access to SGX-enabled servers for applications that don’t trust Microsoft, or for multiparty applications where only the application is trusted and no party can have access to another’s data (for example, machine learning over health care data from multiple providers).
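Development kits for SGX, such as Intel’s SGX SDK, describe this trusted/untrusted boundary in an EDL (Enclave Definition Language) file, which declares the functions that run inside the enclave and the calls the enclave is allowed to make back out to the untrusted host. A hypothetical sketch of what such an interface looks like (the function names here are illustrative, not taken from any real sample):

```
enclave {
    trusted {
        /* Runs inside the enclave; buffers are copied across the boundary,
           with [in]/[out] annotations controlling the direction. */
        public void enclave_score([in, count=len] const double* data,
                                  size_t len,
                                  [out] double* result);
    };

    untrusted {
        /* Runs in the host; the enclave may call it to report a result. */
        void host_log_result(double result);
    };
};
```

A code generator turns this interface into the marshaling stubs on both sides, so neither the host nor the enclave ever shares raw pointers into the other’s memory.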
A preview of SGX support is available in the DCv2-series of Azure virtual machines, using generation 2 VM images, with between 1 and 8 vCPUs and up to 32GB of memory. As they’re in preview, they’re currently only available in the UK South region. Gen 2 VMs use newer hardware and have access to more hardware features than the original generation of Azure VMs, including support for Intel’s SGX. Existing VMs can be converted to gen 2, but it’s a one-way process.

Building SGX TEEs with Open Enclave

Microsoft pioneered much of its enclave technology in its Coco Framework blockchain platform, and enclaves are already available inside Azure SQL as part of its Always Encrypted tools. Building and using your own TEEs is now available in preview, with a tool for building secure, enclave-enabled VMs in the Azure Marketplace and a C/C++ SDK for developing your own code to run inside TEEs.

The Open Enclave SDK is intended to provide a single way of building code that works with all the available TEE implementations. It takes a layered approach to secure application development, mixing trusted components running inside TEEs with untrusted hosts that handle interactions with users and with other applications. Your confidential application holds all its secrets inside the TEE, passing only results to the host application. This allows your code to deliver its results securely without revealing any secrets. An application running machine learning over data would handle inference in the TEE and use the host to report results. Without a TEE, an attacker could see the raw data; with a TEE, all an attacker learns is that the host received a result. This approach is ideal for multiparty applications where the data being processed is regulated and can’t be exposed outside an organization.

What does the SDK do? It provides tools to create and manage TEEs, including managing enclave identities.
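The layered pattern described above can be sketched in plain C. This is a stand-in for illustration only: in a real Open Enclave application the trusted function would be compiled into a separate, signed enclave binary and invoked through SDK-generated call stubs, whereas here both sides run in one process purely to show the shape of the boundary. All names are hypothetical.

```c
#include <stddef.h>

/* ---- "Trusted" side: in a real deployment this code is compiled into
   the enclave, and the secret below never leaves it. ---- */

static const double secret_weight = 0.75;  /* sealed inside the TEE */

/* The only thing the enclave exposes is a result, never the secret. */
double enclave_infer(const double *data, size_t len)
{
    double sum = 0.0;
    for (size_t i = 0; i < len; i++)
        sum += data[i] * secret_weight;
    return sum;
}

/* ---- "Untrusted" host side: sees only the returned result. ---- */
double host_run(const double *data, size_t len)
{
    /* In a real app this would be an ecall through the SDK boundary. */
    return enclave_infer(data, len);
}
```

The point of the split is that an attacker who compromises the host process learns only the result of `host_run`, not `secret_weight` or the intermediate data held inside the enclave.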
Other features include standard ways of transferring results and data across TEE boundaries, with primitives to simplify building in-enclave code. Finally, it offers cryptographic tools to seal secrets inside the enclave and to verify enclave identity. The Open Enclave SDK has both Windows and Linux versions, adding SGX support to your code, and is preinstalled on DC-series Azure VMs as part of a confidential compute VM deployment.

Using the Open Enclave SDK with Windows

Working with the Open Enclave SDK is like working with any other C++ SDK. You can download it from GitHub or, if you’re using Visual Studio with Clang, it’s available via NuGet. Currently the SDK only supports single-binary enclaves, and you’ll need to be careful to configure your build correctly. Once built, an enclave needs to be signed, both for attestation and for sealing secrets. The SDK installs several sample apps so you can get started with Open Enclave and SGX TEEs quickly.

Confidential computing is still in its early days. TEEs aren’t easy to set up, and they require a lot of thought due to the limitations imposed by SGX. They’re still worth exploring, especially if you need to process regulated data in the cloud. Adding encryption for data in use, as well as at rest and in motion, is a significant upgrade to your application and data security, and an essential requirement for using public clouds for sensitive applications. With Azure vying for more fintech and government clients, we should see further upgrades to its confidential computing platform, with more support for Hyper-V security and future upgrades to Intel’s SGX instruction set.