AWS Lambda’s serverless functions shine for event-driven data processing and machine learning, connecting cloud services and external APIs, and even triggering builds in a CI/CD pipeline.

There are many options for deploying cloud-native applications and services. Organizations developing large numbers of applications and services on different platforms and with varying compliance requirements are likely to consider containers and CaaS (containers as a service). Other organizations with few development stacks and operational restrictions, looking for a simpler path to production, often select PaaS options because they require less configuration and technical expertise. Then there is FaaS, or functions as a service, which abstracts away the underlying infrastructure setup and configuration and provides simple mechanisms to deploy and run code. Functions are ideally suited to executing code in response to events, and they can be used as infrastructure for lightweight microservices.

In a previous article examining PaaS, CaaS, and FaaS, I shared guidelines from architects and cloud experts on some of the considerations when selecting a cloud architecture. In this article, I’ll share more specific requirements for using serverless functions and provide several example use cases. While I focus on AWS Lambda here, the same general principles apply to Microsoft Azure Functions, Google Cloud Functions, IBM Cloud Functions, Oracle Functions, and other FaaS platforms.

Technical requirements for AWS Lambda functions

AWS Lambda is one option for deploying serverless functions on public clouds. It’s important to consider the key technical requirements before implementing these services.

AWS Lambda functions can be developed in Java, Go, PowerShell, Node.js, C#, Python, and Ruby. AWS has a list of events that can trigger a Lambda function, with the simplest ones triggered as API calls through Amazon API Gateway. Events can also be triggered by code commits, CI/CD pipelines, Kinesis data streams, cloud system monitors, and IoT events. You can also schedule functions to run using CloudWatch Events. Functions operate synchronously or asynchronously, depending on the type of trigger.

AWS Lambda functions can run for up to fifteen minutes per execution. They can be configured with up to 3GB of memory and access up to 500MB of non-persistent disk space. Lambda functions must be stateless, and inbound TCP/IP connections are restricted, but they can use environment variables and create threads or processes. The AWS Lambda FAQ provides many more details on the technical capabilities and restrictions.

Developers can configure individual functions as well as serverless applications with the AWS Serverless Application Model (SAM). This option allows developers to conveniently bundle, deploy, and manage a group of services that collectively operate as an application.
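To make the programming model concrete, here is a minimal sketch of a Python handler for an API Gateway proxy trigger. The environment variable, query parameter, and response fields are illustrative assumptions for this example, not part of any particular application.

```python
import json
import os


def lambda_handler(event, context):
    """Minimal handler for an API Gateway proxy event (illustrative sketch).

    API Gateway passes the HTTP request in `event`; Lambda supplies runtime
    metadata in `context`. The function stays stateless, so configuration
    comes in through environment variables.
    """
    # Hypothetical environment variable set in the function's configuration.
    greeting = os.environ.get("GREETING", "Hello")

    # Query string parameters arrive as a dict (or None) on proxy events.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    # API Gateway proxy integrations expect this response shape.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"{greeting}, {name}!"}),
    }
```

The same handler signature works for the other trigger types; only the structure of `event` changes with the event source.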
Serverless functions and serverless applications reduce costs

I spoke to Hitesh Chitalia, an experienced cloud solutions architect, about how he uses AWS Lambda to rapidly develop and easily support microservices and applications. “I like to use Lambda because it reduces the focus on infrastructure when supporting microservices or simple operational maintenance tasks,” Chitalia said. “It has matured now to the point where I can orchestrate batch jobs through step functions, execute ETLs, and run full-featured web applications.”

Chitalia has succeeded in lowering costs by migrating applications from dedicated Amazon EC2 servers to serverless functions and applications. The cost savings can be significant, especially for applications and services that have episodic usage patterns. Chitalia shared a specific example:

The cost structure for FaaS also makes it an attractive option. Its pay-per-use approach, combined with the reduction in infrastructure overhead, allows businesses to closely track costs based on product utilization. As a basic example, an API at a previous company was migrated from two small EC2 instances to Lambda and saw a 90 percent reduction in costs. As usage grew, Lambda automatically scaled to support it; roughly 6x growth in calls resulted in a 2x increase in costs for the Lambda service.

Chitalia also shared some of the development considerations. “Because FaaS requires smaller, shorter-running functions to be effective, code will need to be rewritten to take advantage,” he said. “There is also the issue of cold starts, where the first call to a function that has been idle for a while may take seconds to respond.”

Examples of serverless functions and serverless applications

To better understand how developers are using serverless functions and applications, I looked through the AWS Serverless Application Repository and other sources to identify some common use cases. These may serve as coding examples and also illustrate areas that make for easy first projects.

“How to” functions often illustrate how to connect two or more Amazon or external web services. Python and Node.js are the two most popular development languages used in these examples. Several example services connect S3, DynamoDB, API Gateway, SNS, and CloudFront.

Alexa skill cartridges and other examples that connect to the Alexa APIs are the most frequently downloaded services in the repository.

System administrators are using serverless functions to process log files, compress files in S3 buckets, monitor web services, and handle web server redirects.

Manipulating images and other media files is a common need in web applications. Several examples use ImageMagick and other image and media utilities to convert file types, resize images, create galleries, and compress image files. A sketch of this pattern appears at the end of this section.

There are several examples of connecting to SaaS applications, vendor APIs, and other services, including Slack, Elasticsearch, Sumo Logic, and Selenium.

Data processing and machine learning services are also prevalent in the repository. There are examples of connecting Amazon Kinesis to input and output data streams, and several examples of using Amazon Lex for speech and natural language processing. Data scientists will also find services for TensorFlow and Scikit-learn.

DevOps examples include integrations with CI/CD pipelines to trigger builds or to capture event information and push it to other tools.

Microsoft does not currently offer a comparable directory of serverless functions and applications, but there are many good examples to review elsewhere. You can review a full serverless billing application, read a guide for getting started with JavaScript functions, or check out these Python examples, guides, and quickstarts.
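As a concrete sketch of the image manipulation pattern mentioned above, the following function creates thumbnails of images uploaded to an S3 bucket. It uses Pillow rather than ImageMagick to keep the example short, and the thumbnail bucket, environment variable, and size are assumptions for illustration.

```python
import io
import os
from urllib.parse import unquote_plus

import boto3
from PIL import Image  # Pillow must be packaged with the function or attached as a layer

s3 = boto3.client("s3")

# Hypothetical configuration supplied through environment variables.
THUMBNAIL_BUCKET = os.environ.get("THUMBNAIL_BUCKET", "my-thumbnails")
THUMBNAIL_SIZE = (256, 256)


def lambda_handler(event, context):
    """Resize each newly uploaded image referenced in an S3 event notification."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys in S3 events are URL-encoded.
        key = unquote_plus(record["s3"]["object"]["key"])

        # Download the original image into memory (the /tmp scratch space is limited).
        original = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        # Create a thumbnail; thumbnail() preserves the aspect ratio.
        image = Image.open(io.BytesIO(original)).convert("RGB")
        image.thumbnail(THUMBNAIL_SIZE)

        buffer = io.BytesIO()
        image.save(buffer, format="JPEG")

        # Write the resized copy to a separate bucket under the same key.
        s3.put_object(Bucket=THUMBNAIL_BUCKET, Key=key, Body=buffer.getvalue())

    return {"resized": len(event["Records"])}
```

Wiring the source bucket’s event notification to this function happens in the bucket or SAM configuration rather than in code, which is part of what keeps these functions so small.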
The promise of serverless computing

There is a school of thought that serverless computing represents a niche use case because of technical limitations or a mix of compliance and security constraints. Serverless functions are hosted in containers, and you hand infrastructure setup, configuration, management, and control over to the cloud provider. For regulated companies, serverless functions running on public clouds may not be an option. Other organizations that have already set up large-scale applications and services using containers or PaaS might debate the benefits of having a separate hosting service for functions. For these groups, it might be just as easy to use infrastructure as code and deploy functions directly to an optimally selected infrastructure.

But there is another school of thought that puts serverless functions in the mainstream of enterprise software development.

First, developers should be excited about the simplicity of deploying functions without the steps to plan and implement the infrastructure. At a minimum, serverless functions can be used during development phases and ported to PaaS or container options only when required.

Second, developers should be excited about the potential of serverless functions to vastly simplify software library selection and code reuse. Developers often scour GitHub and other open repositories for code libraries, development kits, and other code samples that can be leveraged in their applications. Once a component is found, tested for its capabilities, reviewed for performance considerations, and validated for security issues, the developer is then left with the task of incorporating the code into their application. The integration can be done in several different ways, depending on the coding platform, application architecture, philosophy on optimizing builds, and deployment considerations. But when these code libraries can be deployed as serverless functions, the developer has to do a lot less work to utilize their capabilities; a sketch of this invocation pattern appears at the end of this article. This option also enables system engineers to size and scale services based on their consumption and processing requirements. This is part of the promise of microservices.

It’s interesting to think of a future based on serverless functions. Instead of searching GitHub for code samples, developers will search directories of serverless functions. Organizations with many development teams may also create internal directories of reusable serverless functions and other microservices. As cloud providers continue to simplify infrastructure and deployment options, we can expect serverless compute offerings to create new avenues for developers to repurpose functions, deploy capabilities faster, and enable cloud operations to optimize costs.
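To illustrate the reuse model described above, here is a minimal sketch of an application consuming a shared capability by invoking a deployed function with the AWS SDK for Python. The detect-language function name and payload shape are hypothetical.

```python
import json

import boto3

# The SDK's Lambda client can invoke a deployed function by name, so a shared
# capability is consumed as a service rather than bundled as a library.
lambda_client = boto3.client("lambda")


def detect_language(text):
    """Call a hypothetical shared 'detect-language' function and return its result."""
    response = lambda_client.invoke(
        FunctionName="detect-language",    # assumed name of the shared function
        InvocationType="RequestResponse",  # synchronous call; use "Event" for async
        Payload=json.dumps({"text": text}),
    )
    return json.loads(response["Payload"].read())


if __name__ == "__main__":
    print(detect_language("Hola, mundo"))
```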