Copy files to EC2 and S3 bucket using Terraform
This blog post will guide you on how to use Terraform provisioners to copy/upload files to an EC2 instance as well as to an S3 bucket.
Table of Contents
- Prerequisites
- Set up AWS credentials in the Terraform file
- Set up an EC2 instance, a security group, and an SSH key pair
- Use the file provisioner to upload a file to EC2
- Upload files to the S3 bucket using Terraform
- Initialize and apply the Terraform configuration
- Conclusion
Let's take a look at the prerequisites:
1. Prerequisites
- AWS account - You must have a registered AWS account with active billing. If you are working in a corporate AWS environment, you must have the necessary permissions to create and manage EC2 instances and S3 buckets.
- Terraform installed - You should have Terraform installed on your working machine. Please refer to this guide on how to install Terraform.
- AWS CLI installed - The last thing you need is the AWS CLI installed on your working machine. Although the AWS CLI is not mandatory, it is recommended for troubleshooting as well as for using the credentials file inside the Terraform configuration.
2. Set up AWS credentials in the Terraform file
After completing the prerequisites, you must set up the AWS credentials correctly so that your Terraform code can authenticate and communicate with your AWS environment.
There are a few ways to set up your AWS credentials inside your Terraform configuration. Please choose one of the following:
- Using the credentials file - To use the AWS credentials file inside your Terraform file, you must install the AWS CLI beforehand.
Here is a Terraform code snippet:
# Note - Please replace the path with your credentials file

provider "aws" {
  region                   = "eu-central-1"
  shared_credentials_files = ["/<path>/<to-aws-credentials>/.aws/credentials"]
}
- Hard-code the access key and secret access key - The second way is to hard-code the access key and secret key, but I would not recommend this approach because your AWS credentials might end up in your version control system.
Here is the example code snippet -
# Replace the values with your AWS credentials

provider "aws" {
  region     = "eu-central-1"
  access_key = "<PLACE-YOUR-ACCESS-KEY>"
  secret_key = "<PLACE-YOUR-SECRET-KEY>"
}
- Export AWS credentials as environment variables - The third way is to export the AWS credentials as environment variables.
Use the following commands to export the AWS credentials -
# Replace the values with your AWS credentials

export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
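With the environment variables exported, the provider block needs only a region, because Terraform's AWS provider reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment automatically. A minimal sketch:

```hcl
# Credentials are picked up from the AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY environment variables, so nothing
# sensitive needs to appear in the configuration itself.
provider "aws" {
  region = "eu-central-1"
}
```

This keeps the configuration free of secrets, which is safer if the code is committed to version control.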
3. Set up an EC2 instance, a security group, and an SSH key pair
Let's set up the EC2 instance along with the security group so that the same EC2 instance can be used later to copy the files.
Here is the Terraform code stack for the same:
- Step 1 - Resource block for the EC2 instance
- Step 2 - Resource block for the security group
- Step 3 - Set up the SSH key pair
# Step 1 - Resource block for the EC2 instance
resource "aws_instance" "ec2_example" {
  ami                    = "ami-0767046d1677be5a0"
  instance_type          = "t2.micro"
  key_name               = "aws_key"
  vpc_security_group_ids = [aws_security_group.main.id]
}

# Step 2 - Resource block for the security group
resource "aws_security_group" "main" {
  egress = [
    {
      cidr_blocks      = ["0.0.0.0/0"]
      description      = ""
      from_port        = 0
      ipv6_cidr_blocks = []
      prefix_list_ids  = []
      protocol         = "-1"
      security_groups  = []
      self             = false
      to_port          = 0
    }
  ]
  ingress = [
    {
      cidr_blocks      = ["0.0.0.0/0"]
      description      = ""
      from_port        = 22
      ipv6_cidr_blocks = []
      prefix_list_ids  = []
      protocol         = "tcp"
      security_groups  = []
      self             = false
      to_port          = 22
    }
  ]
}

# Step 3 - Set up the SSH key pair
# To generate an SSH key, refer to - https://jhooq.com/terraform-generate-ssh-key
resource "aws_key_pair" "deployer" {
  key_name   = "aws_key"
  public_key = "<PLACE-YOUR-PUBLIC-KEY>"
}
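To SSH into the instance, and later to verify the copied file, it helps to know its public IP without opening the AWS console. As an optional addition, an output block could look like this:

```hcl
# Optional: prints the instance's public IP after `terraform apply`
output "instance_public_ip" {
  value = aws_instance.ec2_example.public_ip
}
```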
4. Use the file provisioner to upload a file to EC2
In the previous steps, we set up the EC2 instance. Now we need to use the file provisioner to copy/upload files to the EC2 instance.
Let's update the code snippet from Step 3 and add the file provisioner to the same Terraform stack.
Here is the code snippet:
# Step 1: Resource block for the EC2 instance
# File provisioner: added to copy the file to the instance
# To use the file provisioner, you need to specify the following:
#   source:      the file to be copied from
#   destination: where the file needs to be copied

resource "aws_instance" "ec2_example" {
  ami                    = "ami-0767046d1677be5a0"
  instance_type          = "t2.micro"
  key_name               = "aws_key"
  vpc_security_group_ids = [aws_security_group.main.id]

  # File provisioner with source and destination
  provisioner "file" {
    source      = "/home/rahul/Jhooq/keys/aws/test-file.txt"
    destination = "/home/ubuntu/test-file.txt"
  }

  # A connection block is necessary for the file provisioner to work
  connection {
    type        = "ssh"
    host        = self.public_ip
    user        = "ubuntu"
    private_key = file("/home/rahul/Jhooq/keys/aws/aws_key")
    timeout     = "4m"
  }
}
# Step 2 - Resource block for the security group
resource "aws_security_group" "main" {
  egress = [
    {
      cidr_blocks      = ["0.0.0.0/0"]
      description      = ""
      from_port        = 0
      ipv6_cidr_blocks = []
      prefix_list_ids  = []
      protocol         = "-1"
      security_groups  = []
      self             = false
      to_port          = 0
    }
  ]
  ingress = [
    {
      cidr_blocks      = ["0.0.0.0/0"]
      description      = ""
      from_port        = 22
      ipv6_cidr_blocks = []
      prefix_list_ids  = []
      protocol         = "tcp"
      security_groups  = []
      self             = false
      to_port          = 22
    }
  ]
}

# Step 3 - Set up the SSH key pair
# To generate an SSH key, refer to - https://jhooq.com/terraform-generate-ssh-key
resource "aws_key_pair" "deployer" {
  key_name   = "aws_key"
  public_key = "<PLACE-YOUR-PUBLIC-KEY>"
}
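The file provisioner is not limited to single files; it can also copy a whole directory. A sketch of such a provisioner block, which would go inside the aws_instance resource above (the local path is an assumption, adjust it to your machine); note that a trailing slash on the source copies the directory's contents rather than the directory itself:

```hcl
# Copies the *contents* of the local "uploads" directory
# into /home/ubuntu on the instance (trailing slash on source).
# Without the trailing slash, the directory itself would be
# created as /home/ubuntu/uploads.
provisioner "file" {
  source      = "/home/rahul/Jhooq/uploads/"
  destination = "/home/ubuntu"
}
```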
5. Upload files to the S3 bucket using Terraform
Uploading files to an S3 bucket is relatively easy compared to an EC2 instance. There are a couple of key parameters you need to keep in mind while uploading files to an S3 bucket:
- key - The destination path (object key) under which the file is stored in the bucket
- source - The path to the local file you want to upload
Here is the code snippet -
# Note: in AWS provider v4 and later, aws_s3_bucket_object is
# deprecated in favour of the aws_s3_object resource
resource "aws_s3_bucket_object" "example" {
  bucket       = aws_s3_bucket.example.bucket
  key          = "path/to/remote/file"
  source       = "path/to/local/file"
  etag         = filemd5("path/to/local/file")
  content_type = "text/plain"
}
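The snippet above references aws_s3_bucket.example, which is not defined in this post. A minimal bucket definition might look like the following; the bucket name is a placeholder and must be globally unique:

```hcl
# Bucket names are global across all AWS accounts -
# replace the placeholder with something unique.
resource "aws_s3_bucket" "example" {
  bucket = "<your-unique-bucket-name>"
}
```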
5.1 Uploading multiple files to an S3 bucket
Taking the previous example where we uploaded only a single file to the S3 bucket, let's modify the same code to upload multiple files.
for_each - To upload more than one file, we use for_each together with the fileset() function inside the aws_s3_bucket_object resource block.
resource "aws_s3_bucket_object" "object1" {
  for_each = fileset("uploads/", "*")
  bucket   = aws_s3_bucket.example.id
  key      = each.value
  source   = "uploads/${each.value}"
  etag     = filemd5("uploads/${each.value}")
}
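By default, fileset("uploads/", "*") matches only the files directly under uploads/. To include files in nested sub-directories as well, the "**" pattern can be used, which preserves the directory structure in the object keys - a sketch:

```hcl
# "**" matches files recursively, so the sub-directory
# layout under uploads/ is mirrored in the bucket keys
resource "aws_s3_bucket_object" "nested" {
  for_each = fileset("uploads/", "**")
  bucket   = aws_s3_bucket.example.id
  key      = each.value
  source   = "uploads/${each.value}"
  etag     = filemd5("uploads/${each.value}")
}
```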
6. Initialize and apply the Terraform configuration
Once you have completed your Terraform stack, it is time to initialize and apply the Terraform configuration.
Use the following Terraform commands from the terminal:
# Initialize Terraform
terraform init

# Plan your changes
terraform plan

# Apply the changes
terraform apply
7. Conclusion
By following the above steps, you should be able to copy and upload files to EC2 and AWS S3 buckets using Terraform. This method is particularly useful for automating the deployment of static assets and configuration files in cloud environments.