HashiCorp Vault Azure Secrets Engine - Secure Your Azure Resources
In this blog post, we talk about the HashiCorp Vault Azure Secrets Engine. This is the first blog post in a new blog post series called End-to-End Infrastructure and Application Deployment.
The goal of this series is to learn best practices around the automation of infrastructure provisioning and application deployment.
We cover the concepts of Infrastructure as Code, CI/CD, secrets management, dynamic secrets, the secret zero problem, service mesh, and more. Our cloud of choice for this series is Azure. Our focus in this blog post is on the first step: configuring the Vault Azure secrets engine to deliver Azure credentials dynamically for provisioning resources in Azure.
Overview of the End-to-End Infrastructure and Deployment Blog Series
Let’s take a look at what this blog series has to offer.
The Big Picture
Below is an overview diagram of this 4-part blog series.
We’ve broken up this blog series into 4 parts:
Part 1: HashiCorp Vault Azure Secrets Engine
This is the topic of this blog post, and it's the first step in securing our pipeline. The purpose here is to create dynamic, short-lived credentials for Azure. We then use these credentials to provision the Jenkins VM and the app VMs in Azure. The credentials are valid for only 1 day, after which they expire automatically.
Later in the series, we discuss the secret zero problem and how to solve it. This is often referred to as Vault secure introduction. The issue is that we need to provide the Vault authentication token to our Jenkins pipeline and to our application. Once we have the token, we can access secrets in Vault. The challenge is how to deliver this Vault token securely. We address secure introduction by using Vault AppRoles, response wrapping, and the Vault Agent.
Finally, in the last part of the series, we put everything together. Now that we have the Jenkins VM running and we've addressed the secret zero problem, we can run the pipeline to build our application. Below is the workflow:
A developer commits and pushes code into GitHub
The Jenkins pipeline automatically starts due to a webhook from GitHub to Jenkins
Terraform completes the provisioning and passes the 3 VMs' fully qualified domain names (FQDNs) to Ansible
Ansible configures these VMs to do the following:
Downloads and installs the Consul and Envoy binaries for the service mesh
Pulls the MongoDB Docker image and starts the container
Downloads the Python dependencies for the Webblog app and starts the application
Below are some tools that we use in this series along with topics to learn. The items marked with an asterisk are the ones featured in this post.
Some Tools Used in this Series
Topics to Learn in this Blog Series
1. Vault Azure Secrets Engine*
2. Packer Images in Azure
3. Terraform Building VMs in Azure based on Packer Images
4. Ansible to Configure an Azure VM
5. The Secret Zero Problem and Vault Secure Introduction
6. Vault AppRole
7. Vault Dynamic Database Secrets for MongoDB
8. Vault Transit Secrets Engine
9. Advanced CI/CD Pipeline Workflow using:
Consul Service Mesh
*Featured in this post
Vault Azure Secrets Engine Explanation
Now let’s focus on part 1 of this series and discuss how to create dynamic Azure credentials using HashiCorp Vault. Take a look at the workflow diagram below:
There are 2 personas involved in this workflow:
A Vault admin
A DevOps engineer or an app
The Vault admin is responsible for the following:
Enabling the Azure secrets engine
Configuring the engine
Creating Vault roles
The DevOps engineer or the app is the consumer of the Azure secrets returned by Vault.
Next, we’ll take a look at the configuration that the Vault admin needs to create.
Azure Secrets Engine Terraform Configuration
A Vault admin is required to run the steps below. This can be done using the Vault CLI or API; my preferred way is via the Vault provider in Terraform. We are re-using our existing Vault cluster from our previous Webblog series. The Vault admin configuration is located in the main.tf file.
We’ve included the relevant Terraform configuration for enabling the Azure secrets engine in Vault below:
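A minimal sketch of such a configuration uses the Vault Terraform provider's `vault_azure_secret_backend` and `vault_azure_secret_backend_role` resources. The variable names, role name, and resource group scope below are illustrative assumptions, not necessarily those used in this setup:

```hcl
# Enable the Azure secrets engine at the path "azure" and give Vault
# the Azure credentials it needs to manage service principals.
resource "vault_azure_secret_backend" "azure" {
  path            = "azure"
  subscription_id = var.subscription_id
  tenant_id       = var.tenant_id
  client_id       = var.client_id
  client_secret   = var.client_secret
}

# A Vault role that generates credentials scoped to a single resource
# group, valid for 1 day (TTLs are expressed in seconds).
resource "vault_azure_secret_backend_role" "jenkins" {
  backend = vault_azure_secret_backend.azure.path
  role    = "jenkins"
  ttl     = "86400"
  max_ttl = "86400"

  azure_roles {
    role_name = "Contributor"
    scope     = "/subscriptions/${var.subscription_id}/resourceGroups/webblog-rg"
  }
}
```

Scoping the role to a single resource group keeps the blast radius of a leaked credential small: anything generated from this role can only touch resources inside that group.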
Once the credentials are retrieved, you can use them as Terraform variables to provision the Jenkins VM and the application VMs that we will build later. We will see how this is done in the next 3 blog posts.
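As a sketch of how this handoff can work, the Vault provider's `vault_azure_access_credentials` data source can feed the AzureRM provider directly (the mount path `azure` and role name `jenkins` here are assumptions for illustration):

```hcl
# Read short-lived Azure credentials from Vault. A new service
# principal is created on demand and revoked when the lease expires.
data "vault_azure_access_credentials" "creds" {
  backend        = "azure"
  role           = "jenkins"
  validate_creds = true # poll Azure until the new credentials propagate
}

# Hand the dynamic credentials to the AzureRM provider; no static
# client secret ever needs to live in the Terraform configuration.
provider "azurerm" {
  features {}
  subscription_id = var.subscription_id
  tenant_id       = var.tenant_id
  client_id       = data.vault_azure_access_credentials.creds.client_id
  client_secret   = data.vault_azure_access_credentials.creds.client_secret
}
```

Setting `validate_creds = true` matters in practice because Azure AD can take some time to propagate a newly created service principal; without it, the first provisioning call may fail with an authentication error.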
A Vault policy governs the token used to run these configuration commands. We use the root token in this demo for simplicity; however, in a production setting, using the root token is not recommended.
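In production, an admin token with a narrowly scoped policy should be used instead. An illustrative sketch of such a policy, assuming the engine is mounted at `azure/`, might look roughly like this:

```hcl
# Allow mounting and unmounting the Azure secrets engine.
path "sys/mounts/azure" {
  capabilities = ["create", "read", "update", "delete"]
}

# Allow configuring the engine's Azure credentials.
path "azure/config" {
  capabilities = ["create", "read", "update", "delete"]
}

# Allow managing the roles that produce dynamic credentials.
path "azure/roles/*" {
  capabilities = ["create", "read", "update", "delete", "list"]
}
```

A separate, even narrower policy granting only `read` on `azure/creds/<role>` would then be all the DevOps engineer or app needs to consume the credentials.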
In this blog post, we introduced the new End-to-End Infrastructure and Deployment blog series. We also covered the first part of the series: creating dynamic secrets for Azure using the HashiCorp Vault Azure secrets engine. These secrets are dynamic in nature and have a TTL of 1 day. This enables a DevOps engineer to carry out the following steps:
Log into Vault each day
Retrieve new Azure credentials for the day
Use these credentials in conjunction with Terraform to provision resources in a defined resource group in Azure
Let the credentials expire automatically at the end of the day