Add Azure’s API Management tools to secure and manage access to your code for serverless and microservices applications

One of the building blocks of cloud-native application development is the API. By providing APIs to services, cloud applications can provide common backends to many applications across many platforms. Instead of having many implementations of a billing engine or an inventory, you need only one, with APIs that let it be consumed by other code.

Locking down open APIs

But open APIs can be a risk, allowing anyone access to your services and opening them up to overload and misuse. That’s where API management tools come in, giving you a façade that wraps open code and provides one place to manage access and to control API usage. Instead of having to build a separate access control system for each API, you can use a single set of keys and access tokens. Developers use a one-stop-shop developer portal to register for API usage, and you can set usage limits. Policy-based controls also simplify management, tying API access to existing directory services for role-based controls.

There’s another important aspect to API management tools: They add a new monitoring layer to your application performance management. You can see API access and response times, as well as get deeper insight into load and usage patterns. You can use output from these services to automate service deployment, adding and removing microservice instances and handling routing to support demand. The more information you have, the easier it is to scale and control microservices.

APIs are a key component of modern application development, and tools like Azure API Management are increasingly important. They don’t only control who can access your APIs; they also help you monitor your application operations. Logs from the service show which accounts have written to APIs, with both diagnostic and activity log options.
By adding a tool like this to your code, you’re not only protecting your services, you’re also making it easier for developers to use the tools you’ve given them.

Introducing Azure API Management

Azure’s API Management service is an important tool for any organization that’s building large-scale applications on Azure, especially if it’s providing core microservices that can be used by departmental applications or even by trusted third parties. Available through the Azure portal, it’s easy to quickly add Azure API Management to existing services and to use it to build and test new APIs.

Using the tool is easy enough: If you’ve already built an OpenAPI (Swagger) definition for your API, it’s simply a matter of creating a new API façade from your existing API, importing the definition from your existing service. Azure’s tools give you HTTP or HTTPS (or both) APIs; it’s a good idea to use HTTPS to improve API security. An API needs both a name and a base URL, with a unique suffix that defines the API being called. There’s also the option of versioning your APIs, so you can deprecate older versions as new APIs are released.

Bundling APIs into products

One key concept is the “product.” APIs can be grouped into products, giving you the option of controlling developer access in a granular way. Each managed API can be part of one or more products; so, for example, you can have a single shopping-cart API available to all your users, while you have separate payment and catalog APIs for different geographies. Each group of APIs can be bundled into a product, and any subscribing developer gets a key that gives access only to a specific product bundle.

Once created, managed APIs can be tested in the Azure portal. Developers see sample REST constructions based on the OpenAPI description. They can then enter any appropriate query terms, submit the request, and see a response.
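The product model is easy to picture as a pair of lookups: products bundle APIs, and each subscription key is issued against one product. This toy Python sketch (an illustration of the concept, not the Azure service itself, with made-up product and key names) shows why a key for one bundle can’t reach an API in another:

```python
# Toy model of API Management products: a key grants access
# only to the APIs bundled into its product. All names are invented.

PRODUCTS = {
    "storefront": {"shopping-cart", "catalog-us"},
    "storefront-eu": {"shopping-cart", "catalog-eu", "payments-eu"},
}

# Each subscription key is tied to exactly one product bundle.
SUBSCRIPTIONS = {
    "key-retail-team": "storefront",
    "key-eu-partner": "storefront-eu",
}

def can_call(subscription_key: str, api: str) -> bool:
    """A key is valid for an API only if that API is in the key's product."""
    product = SUBSCRIPTIONS.get(subscription_key)
    return product is not None and api in PRODUCTS[product]

print(can_call("key-retail-team", "shopping-cart"))  # True: in the bundle
print(can_call("key-retail-team", "payments-eu"))    # False: different product
```

Note that the shared shopping-cart API appears in both products, matching the article’s example of one API exposed to every user alongside geography-specific bundles.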
You can also use the API management tools to strip out information about your back-end stack by managing the HTTP headers sent from the API gateway, controlling the information sent to users and developers. APIs can also be throttled, setting rate limits on a per-subscription basis.

If you’re still defining and testing the APIs you intend to use as part of a service, you can use Azure API Management to implement a mock API. Starting with a blank API, you can add a test URL and query string and then hook up a set of predefined outputs. Once you’ve set up the appropriate responses, developers can build both client and server code in parallel, using your predefined API to control inputs and outputs.

Managing serverless APIs

Azure’s API Management tool gives you two ways of managing APIs: one with its own portal to manage and control developer access to your resources, and one with a serverless, consumption-based management model. Developers using the portal get access to an automatically generated API catalog, along with code samples and API definitions that can be brought automatically into Visual Studio or their preferred development environment. As API schemas are available as OpenAPI (Swagger) definitions, they’re easy to import into your code, and you can use them to generate methods and functions automatically.

As cloud-native applications continue to evolve, there’s a shift away from infrastructure-as-a-service-based architectures to newer serverless and containerized services. That has meant a change in how Azure offers API Management, with the launch of a consumption-based tier that’s more closely related to how serverless applications operate. Stepping outside more conventional application development models, this new tier adds support for serverless programming models like Azure Functions and Azure Logic Apps. Even the smallest microservices can be wrapped with API Management, letting you control more effectively how they’re used and how they’re scaled.
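In API Management, per-subscription throttling is applied declaratively through gateway policies rather than in your own code, but the underlying idea is simple. Here is a toy fixed-window rate limiter in Python, sketching what the gateway does conceptually for each subscription key (the class name and limits are invented for illustration):

```python
# Toy fixed-window rate limiter: at most `calls` requests per `period`
# seconds for each subscription key. A sketch of the concept, not the
# actual gateway implementation.
import time

class SubscriptionRateLimiter:
    def __init__(self, calls: int, period: float):
        self.calls, self.period = calls, period
        self.windows = {}  # subscription key -> [window_start, count]

    def allow(self, subscription_key: str) -> bool:
        now = time.monotonic()
        window = self.windows.get(subscription_key)
        if window is None or now - window[0] >= self.period:
            window = [now, 0]          # start a fresh window for this key
            self.windows[subscription_key] = window
        if window[1] >= self.calls:    # over the limit: gateway would reject
            return False
        window[1] += 1
        return True

limiter = SubscriptionRateLimiter(calls=3, period=60.0)
results = [limiter.allow("key-retail-team") for _ in range(5)]
print(results)  # [True, True, True, False, False]
```

Because the count is tracked per key, one heavy consumer being throttled never affects another subscriber’s quota.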
By automating scaling, you can ramp applications up and down with demand, all the way down to zero instances, and launch new instances when API calls are made. Usage-based pricing also simplifies things, letting you use consumption-based API Management with APIs and services that are only rarely used.

Bring your own cache

Azure API Management also adds new features that should simplify API operations, including support for bring-your-own cache, with a scalable Redis-based in-memory cache that holds cached responses to reduce the load on your services. If you’re using an API to deliver content that doesn’t change often, like a web store, you’ll find the cache a useful option. You need to set it up in the Azure portal, with a unique DNS name, and add it to your application’s resource group before adding it to your API. By setting up an expiration policy, you can reduce application load by holding API responses for a set time. The initial call to your API loads the cache, and then all following calls return the same content until the cache expires.
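That load-then-serve behavior is the familiar cache-aside pattern with a time-to-live. A small Python sketch of the idea, standing in for the gateway’s expiration policy (the TTL value, path, and catalog data here are invented):

```python
# Toy TTL cache illustrating the expiration-policy behavior:
# only the first call within the TTL window reaches the backend.
import time

class TTLCache:
    def __init__(self, ttl: float):
        self.ttl = ttl
        self.store = {}  # key -> (expires_at, cached_value)

    def get_or_load(self, key, loader):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry and entry[0] > now:   # cache hit: response still fresh
            return entry[1]
        value = loader()               # cache miss or expired: call the backend
        self.store[key] = (now + self.ttl, value)
        return value

backend_calls = 0
def fetch_catalog():
    """Stand-in for a slow backend service behind the gateway."""
    global backend_calls
    backend_calls += 1
    return {"items": ["widget", "gadget"]}

cache = TTLCache(ttl=300.0)
for _ in range(4):
    cache.get_or_load("/catalog", fetch_catalog)
print(backend_calls)  # 1: only the initial call reached the backend
```

Picking the TTL is the real design decision: long enough to absorb repeat traffic, short enough that stale catalog or pricing data can’t linger.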