Microsoft’s open source, cross-platform microservices framework is ready for prime time at last.

Microservices are at the heart of many cloud-native architectures, using tools such as Kubernetes to manage service scaling on demand. Microsoft has been at the forefront of much of this movement, with a deep commitment to the Cloud Native Computing Foundation and with Kubernetes underpinning both its hyperscale Azure cloud and its on-premises hybrid Azure Stack. Part of that commitment comes through its tools, with a range of platforms and services to support cloud-native microservice development.

One of those tools is Dapr, the Distributed Application Runtime, an event-driven runtime that supports creating and managing service elements using best practices. It’s designed to be platform agnostic, so you can use your choice of target environment (local, Kubernetes, or any other environment with Dapr support) and your choice of languages and frameworks.

Dapr gets close to 1.0

It’s been a while since Dapr’s fall 2019 announcement, but development has continued during the past year, and with a second release candidate now available, a Dapr 1.0 release isn’t far away. That work includes development tools and a Dapr CLI to set up development environments and help with application scaffolding, ready for you to add your code. There’s going to be at least one more release candidate, but if you haven’t looked at Dapr yet, now is a good time to see whether it can help you build services faster.

A good place to start is the Dapr CLI, which works with a local Docker installation. It’s available for Linux, macOS, and Windows, with installation instructions for each platform. Windows developers can choose to install it in WSL or in Windows itself, using the Linux instructions for WSL. You can install either the current release candidate of the CLI or the last stable development release, 0.11.
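For reference, the one-liners below follow the install scripts documented in the Dapr CLI’s GitHub repository; check the current docs before running them, as the script locations may change between releases.

```shell
# Linux / WSL: install the latest Dapr CLI release
wget -q https://raw.githubusercontent.com/dapr/cli/master/install/install.sh -O - | /bin/bash

# Windows (run from PowerShell instead):
# iwr -useb https://raw.githubusercontent.com/dapr/cli/master/install/install.ps1 | iex

# Confirm the CLI is on your path and see which runtime version it targets
dapr --version
```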
Once you’ve installed the Dapr CLI and Docker Desktop, with Docker Desktop set to use Linux containers, you’re ready to get started in self-hosted mode. I’d recommend using a recent release of Docker Desktop, as it works with WSL 2 directly, making it easier to run and manage Dapr containers on your development PC. Initializing Dapr downloads and installs the core Dapr containers on your development system, ready for use. Self-hosting Dapr lets you try it out without requiring a Kubernetes install, while still using all its development tools.

One point to note: I did have some issues with port reservations on my main development PC, which stopped one of the key Dapr containers from starting. The fix was stopping the Windows NAT service, which was blocking access to the ports Dapr needed to use.

With Dapr running locally in self-hosted mode, you can check that the correct versions are installed and that the three Dapr containers are running. When you’re done, use Dapr’s uninstall command to remove the containers (add the --all option to remove everything).

If you prefer to start on Kubernetes, Dapr can be installed through Helm or via the Dapr CLI. Either approach installs the set of pods needed to build and run Dapr applications. Although the CLI approach works, it’s probably best to install Dapr using the Helm chart, as this links your installation to the Dapr Helm repository and sets up the appropriate namespaces, allowing you to automate installs and to ensure you’re always running the most current, supported releases.

Configuring Dapr components and building your first code

With the base Dapr containers in place, you next need to set up a state store in Redis and a pub/sub message broker. The state store is configured using YAML, setting up keys and metadata for your store. The message broker also runs in Redis and uses the same keys in its YAML configuration.
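A minimal pair of component definitions might look like the following sketch, based on the Component schema the Dapr docs describe for Redis-backed building blocks; the names statestore and pubsub and the localhost Redis address are illustrative choices for a self-hosted setup, not fixed values.

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore           # the store name your code will reference
spec:
  type: state.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""                # acceptable locally; see the note on passwords below
---
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: pubsub               # the message broker, backed by the same Redis
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
```

In self-hosted mode these files live in Dapr’s local components directory; on Kubernetes they are applied as resources in the cluster.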
It’s not a good idea to hard-code passwords into your environment, though it can make things easier for self-hosted development. If you do hard-code passwords, be ready to remove them from configurations when you move to shared development environments or store configurations in Git or another source code management platform.

Once you’ve got a Dapr environment installed and configured, you’re ready to build a development platform. It’s clear we’re still in the early days of cloud-native software development, as there’s no single installer that brings everything you need into one place. This isn’t like working with .NET in Visual Studio, or even with the Azure platform tools for Visual Studio Code; you need to find and download SDKs and configure IDEs yourself. Dapr currently has SDKs for .NET, Java, Go, Python, JavaScript, and PHP. Each has a separate GitHub repository, so find the one you want to use and install it from there.

The SDKs provide tools to link your code to Dapr’s building blocks via their APIs. As these use either gRPC or HTTP, you’re not limited to working in languages with SDKs; you can build your own connections using familiar HTTP POST and GET constructs. If you want to take advantage of Rust from a WebAssembly front end, for example, your code just needs to work with the Dapr endpoint you plan to use.

The Dapr building blocks implement common microservice design patterns, with a focus on tasks such as managing state or handling and routing events. Instead of writing new implementations for every new application, you can use Dapr to manage those functions, allowing you to concentrate on your application logic. Dapr provides tools for managing service endpoints, as well as for managing state through its Redis cache. It will also manage routing for you, sending messages to the appropriate application endpoint. All you need to set up a Dapr call is the endpoint name.
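To make the HTTP route concrete, here is a short Python sketch that builds requests against the URL patterns Dapr’s sidecar exposes for state and service invocation. It only constructs the URLs and JSON bodies rather than sending them, since sending requires a running sidecar; the store name statestore, the app-id cart, and port 3500 (Dapr’s default HTTP port) are assumptions you would adjust for your own setup.

```python
import json

DAPR_HTTP_PORT = 3500  # Dapr's default sidecar HTTP port; assumed here
BASE = f"http://localhost:{DAPR_HTTP_PORT}/v1.0"

def save_state_request(store, key, value):
    """Build the URL and JSON body for saving a key/value pair to a state store."""
    url = f"{BASE}/state/{store}"
    # The state API takes a JSON array of {key, value} objects
    body = json.dumps([{"key": key, "value": value}]).encode("utf-8")
    return url, body

def get_state_url(store, key):
    """Build the URL that reads a key back from a state store."""
    return f"{BASE}/state/{store}/{key}"

def invoke_url(app_id, method):
    """Build the URL Dapr uses to route a call to another service's endpoint."""
    return f"{BASE}/invoke/{app_id}/method/{method}"

if __name__ == "__main__":
    url, body = save_state_request("statestore", "order-1", {"qty": 3})
    print(url)                             # http://localhost:3500/v1.0/state/statestore
    print(invoke_url("cart", "checkout"))  # http://localhost:3500/v1.0/invoke/cart/method/checkout
```

Any language that can issue a POST or GET against URLs of this shape can participate, which is why an SDK is convenient but never required.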
Working with IDEs and application frameworks

You can use your choice of frameworks for your code: a JavaScript application running on a Node.js instance can work with Express, while a .NET application could be built around ASP.NET MVC. What matters most is that Dapr provides a microservice platform that can happily coexist with your current toolset and your choice of development processes.

IDE extensions for both IntelliJ and Visual Studio Code help speed up development of Dapr applications. Both are previews, so you shouldn’t expect them to support all the building blocks yet. They do integrate with a local Dapr installation to show currently running Dapr endpoints for debugging and testing, and they provide scaffolding to quickly build your own code around Dapr building blocks. You can even use Visual Studio Code’s remote development tools to work with Dapr Docker containers from your desktop environment.

Cloud-native microservices are an increasingly important element of any modern application stack, so choosing the right development environment and tools is essential. With Dapr approaching its 1.0 release, it’s starting to live up to its initial promise, providing a set of building blocks and supporting tools that help you implement key microservice design patterns in an easy-to-deploy, repeatable fashion. Support for common languages and a framework-agnostic approach make it well worth taking a few days to evaluate the Dapr release candidates.