How do you secure Terraform deployments on AWS with GitLab-CI and Vault? In previous articles, we looked at the issues with CI/CD deployments in the cloud and how to solve them by using Vault to dynamically generate secrets and authenticate GitLab-CI pipelines. In this third and final article, we will discuss secret retrieval from the application side.
In the previous part, we saw how to solve various problems encountered at the pipeline level. But our use of GitLab CI is not limited to deploying infrastructure: it also covers the application itself.
Summarizing what we have done so far at the workflow level, we have the following diagram:

As we can see, our application needs to retrieve the secrets stored in the Vault to connect to the database.
However, we want the interaction between the application and Vault to be as seamless as possible, and for that we need to reduce dependencies:
- Authentication: which method should we choose so that authentication is as transparent and secure as possible for our application?
- Secret consumption: how do we deliver dynamic secrets (short TTLs) without impacting the application code?
Application authentication
When it comes to authenticating our application to Vault, if we want it to be as seamless and secure as possible, it is important to base it on the environment where our application is deployed.
Our application is deployed on AWS, which is perfect because on the Vault side we have the AWS auth method.
This method supports two authentication types: iam and ec2.
Since our application is deployed on an EC2 instance, we will use the ec2 type.
If we take a closer look at how this method works, we come across the following:

- 0 – So far, GitLab-CI has deployed our application to an EC2 instance via Terraform and stored the database secrets in Vault.
- 1 – Once our EC2 instance is deployed, its metadata (instance ID, subnet, VPC ID where the instance is deployed, etc.) is available via the EC2 metadata service. You can find more details in the official AWS documentation.
- 2 – Our application authenticates to Vault using the AWS EC2 method and the instance's PKCS7 signature.
- 3 – Vault verifies the signature and checks that the EC2 instance hosting our application meets the authentication constraints (binding parameters): is it in the right VPC and subnet? Is it the right instance ID? etc.
- 4 – If authentication is successful, Vault returns a token.
To implement this authentication method, we must:
- Allow Vault to verify the identity and metadata of EC2 instances in the target AWS account.
- Specify in Vault the authentication constraints (binding parameters) under which we allow the application to authenticate.
Identity and metadata verification
In order for Vault to verify the information of our EC2 instance, it needs permission on the target account to describe instances: ec2:DescribeInstances.
To do this, and following the same logic as in our previous articles, we'll create an IAM role that Vault will assume.
The IAM role must contain the following policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:DescribeInstances"
      ],
      "Resource": "*"
    }
  ]
}
A trust relationship authorizes the source account (the account where Vault resides) to assume the IAM role. On the Vault side, we configure the authentication backend through Terraform:
resource "vault_auth_backend" "aws" {
  description = "Auth backend to auth project in AWS env"
  type        = "aws"
  path        = "${var.project_name}-aws"
}

resource "vault_aws_auth_backend_sts_role" "role" {
  backend    = vault_auth_backend.aws.path
  account_id = split(":", var.vault_aws_assume_role)[4]
  sts_role   = var.vault_aws_assume_role
}
As we can see, we also define the IAM role (STS) that Vault should assume.
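For reference, the trust relationship attached to this IAM role on the target account could look like the following sketch (the source account ID 111111111111 and the role name vault-server are placeholders, not values from this project):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:role/vault-server"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```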
At this point Vault can verify the information on our EC2 instance.
Set authentication constraints (binding parameters)
In terms of security, under what conditions do we allow our app to authenticate to the Vault to retrieve its secrets?
If we take a closer look at Vault's EC2 authentication type, we can base it on various criteria, for example: the AMI ID, account ID, region, VPC ID, subnet ID, IAM role ARN, IAM instance profile ARN, or EC2 instance ID.
We can specify multiple criteria, and multiple values for each criterion. Each specified criterion must be satisfied for Vault to accept the authentication.
In our case, our EC2 instances were created by Terraform. This is ideal because with Terraform we can retrieve all the instance attributes and set them on the Vault side as binding parameters, which gives us the following Terraform snippet to create a Vault role on the AWS authentication backend:
resource "vault_aws_auth_backend_role" "web" {
  backend          = local.aws_backend
  role             = var.project_name
  auth_type        = "ec2"
  bound_ami_ids    = [data.aws_ami.amazon-linux-2.id]
  bound_vpc_ids    = [data.aws_vpc.default.id]
  bound_subnet_ids = [aws_instance.web.subnet_id]
  bound_region     = var.region
  token_ttl        = var.project_token_ttl
  token_max_ttl    = var.project_token_max_ttl
}
In our case, we rely on the AMI ID, VPC ID, subnet ID, and AWS region. We could add the instance ID to strengthen the security of our authentication, but this criterion should be avoided with an Auto Scaling Group, where instances are replaced and their IDs change.
At this point, our application can authenticate and retrieve its secret from the vault.
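As an illustration, this login exchange can be exercised manually from the instance with the Vault CLI. This is only a sketch: the mount path follows the "${var.project_name}-aws" pattern defined earlier (here assumed to be web-aws), and it requires the EC2 metadata service and the Vault server to be reachable:

```shell
# Fetch the PKCS7-signed instance identity document from the metadata service
PKCS7=$(curl -s http://169.254.169.254/latest/dynamic/instance-identity/pkcs7 | tr -d '\n')

# Authenticate against the EC2 auth backend; Vault returns a token if the
# binding parameters (AMI, VPC, subnet, region) match
vault write auth/web-aws/login role=web pkcs7="$PKCS7"
```

The Vault Agent, described below, automates exactly this exchange so the application never has to perform it itself.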
Consuming secrets from our application
For the Vault integration on the application side, we will use the Vault Agent.
For those who want more details about the Vault Agent integration, you can refer to the following article.
Let's take a closer look at our application workflow using Vault:

As we can see, the Vault agent manages two phases:
- AWS authentication to Vault and Vault token renewal
- Secret retrieval and renewal
In the Vault Agent configuration, we have:
auto_auth {
  method {
    mount_path = "auth/${vault_auth_path}"
    type       = "aws"
    config = {
      type = "ec2"
      role = "web"
    }
  }
  sink {
    type = "file"
    config = {
      path = "/home/ec2-user/.vault-token"
    }
  }
}

template {
  source      = "/var/www/secrets.tpl"
  destination = "/var/www/secrets.json"
}
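Assuming this configuration is saved on the instance as /home/ec2-user/vault-agent.hcl (the path is our choice, not imposed by Vault), the agent runs as a daemon alongside the application:

```shell
vault agent -config=/home/ec2-user/vault-agent.hcl
```

In practice it would typically be wrapped in a systemd unit so it starts with the instance.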
Terraform templates this file to override certain values, e.g. the Vault role name or the mount path. As for the secret template, we want to retrieve the secrets in JSON format, which gives us the following template:
{
{{ with secret "web-db/creds/web" }}
  "username": "{{ .Data.username }}",
  "password": "{{ .Data.password }}",
  "db_host": "${db_host}",
  "db_name": "${db_name}"
{{ end }}
}
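Once the agent renders this template, the resulting /var/www/secrets.json might look like the following (all values are illustrative; the username and password are the dynamic database credentials generated by Vault):

```json
{
  "username": "v-aws-web-3rmx29",
  "password": "A1a-4mP1xQ7vZs9K",
  "db_host": "db.example.internal",
  "db_name": "webapp"
}
```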
The Vault Agent takes care of everything Vault-related (Vault token, secrets, renewals, etc.). Our application only has to retrieve its secrets from the rendered file:
if (file_exists("/var/www/secrets.json")) {
  $secrets_json = file_get_contents("/var/www/secrets.json");
  $user   = json_decode($secrets_json)->{'username'};
  $pass   = json_decode($secrets_json)->{'password'};
  $host   = json_decode($secrets_json)->{'db_host'};
  $dbname = json_decode($secrets_json)->{'db_name'};
} else {
  echo "No secrets file found.";
  exit;
}
This gives us the expected result:

Compared to where we started, we now have the following workflow:

- Terraform has a Vault provider to simplify the interaction between the two tools.
- Some pipeline branches or GitLab-CI jobs can authenticate to Vault through the JWT authentication method.
- Vault can generate cloud provider credentials used to deploy IaC through Terraform. For AWS, Vault can assume IAM roles across multiple AWS accounts, allowing our Terraform code to deploy IaC on multiple AWS accounts.
- Vault can store several types of secrets and deliver them to projects in an environment-agnostic manner, including secrets generated by the IaC.
- The Vault Agent simplifies Vault integration into the application lifecycle by handling authentication and secrets.
- The Vault Token used by the Vault Agent is short-lived and changes frequently.
- The application secrets (database credentials in our example) are short-lived and renewed frequently via the Vault Agent.
- We allow our application to authenticate to Vault based on the environment it is deployed in. For our AWS application, we rely on AWS EC2 authentication and binding parameters such as the subnet ID, VPC ID, and AWS region where our application resides.
As we saw in this article, Vault allows us to protect Terraform deployments end-to-end through GitLab-CI, whether for the IaC or for the application itself.
Additionally, the Vault Agent allows us to reduce the application's dependency on Vault.
When used correctly, this integration is transparent to operators, developers, and applications. Secrets become short-lived for everyone: why know the secret at all, when we can have transparent secrets as a service?
Author: Mehdi Laruelle, Devoteam Revolve Alumni