End-to-End AWS Automation Using Terraform and EFS

Anshika Sharma
5 min read · Aug 21, 2020


AWS-Terraform integration

Hello everyone, today I am going to show a great setup of automated AWS and Terraform integration using EFS.

For deeper details about Terraform and AWS, please refer to the link below:

So, let's learn something about the EFS service of AWS.

What is EFS in AWS?

Amazon Elastic File System (Amazon EFS) provides a simple, scalable, fully managed elastic NFS file system for use with AWS Cloud services and on-premises resources. It is built to scale on demand to petabytes without disrupting applications, growing and shrinking automatically as you add and remove files, eliminating the need to provision and manage capacity to accommodate growth.

How AWS and EFS are connected

Now, let's first be clear about our agenda and what we want to create:

Step 1: Create a security group that allows traffic on port 80.

Step 2: Launch an EC2 instance.

Step 3: In this EC2 instance, use an existing or provided key and the security group we created in Step 1.

Step 4: Launch a volume using the EFS service, attach it in your VPC, and then mount that volume onto /var/www/html.

Step 5: The developer has uploaded the code into a GitHub repo, which also contains some images.

Step 6: Copy the GitHub repo code into /var/www/html.

Step 7: Create an S3 bucket, copy/deploy the images from the GitHub repo into the bucket, and change their permission to public-readable.

Step 8: Create a CloudFront distribution using the S3 bucket (which contains the images) and use the CloudFront URL to update the code in /var/www/html.

Let's start the integration. We will build it in multiple steps.

Step 1. Configuring AWS:

After configuring, check whether the configuration has been done properly. For this, run the “aws configure” command.

aws configure command
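
The screenshot is not reproduced here, but the interaction looks roughly like this (the key values and region below are placeholders, not real credentials):

```
$ aws configure
AWS Access Key ID [None]: AKIA****************
AWS Secret Access Key [None]: ****************
Default region name [None]: ap-south-1
Default output format [None]: json
```

Running the command again shows the stored values as defaults, which confirms that the configuration was saved.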

Step 2. Start writing the Terraform code for our main agenda.

1. Connect to the AWS IAM user, with the code:

Connecting to AWS
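
The code screenshot is not reproduced here; a minimal sketch of the provider block might look like the following (the profile name and region are assumptions, so substitute your own):

```hcl
# Use the credentials stored by "aws configure" under a named profile
provider "aws" {
  region  = "ap-south-1"   # assumed region
  profile = "myprofile"    # assumed IAM user profile name
}
```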

2. Create the security group and add rules to it that will allow the traffic, with the code:

creating security group
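
The original code is shown only as a screenshot; a minimal sketch of such a security group could look like this (the group name is an assumption, and the NFS rule on port 2049 is included because the EFS mount will need it later):

```hcl
resource "aws_security_group" "allow_http" {
  name        = "allow_http"
  description = "Allow HTTP, SSH and NFS inbound traffic"

  # HTTP for the web server
  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # SSH for remote provisioning
  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # NFS so the instance can reach the EFS mount target
  ingress {
    from_port   = 2049
    to_port     = 2049
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow all outbound traffic
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```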

3. Create an AWS EC2 instance and provision it in order to install git and httpd, and to enable the httpd service, with the code:

creating EC2 instance
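
A sketch of this step is below. The AMI ID and key-pair name are assumptions (use an Amazon Linux 2 AMI from your own region and your own key); amazon-efs-utils is installed here as well so the EFS mount in a later step works:

```hcl
resource "aws_instance" "web01" {
  ami             = "ami-0447a12f28fddb066"   # assumed Amazon Linux 2 AMI ID
  instance_type   = "t2.micro"
  key_name        = "mykey"                   # assumed existing key pair
  security_groups = [aws_security_group.allow_http.name]

  # SSH connection used by the provisioner below
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = file("mykey.pem")           # assumed local key file
    host        = self.public_ip
  }

  # Install git and httpd, and enable the web server
  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd git amazon-efs-utils -y",
      "sudo systemctl start httpd",
      "sudo systemctl enable httpd",
    ]
  }

  tags = {
    Name = "web01"
  }
}
```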

4. Create a file system using EFS, with the Terraform code:

creating a file system
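
The file system resource itself is short; a minimal sketch (the token and tag name are assumptions) might be:

```hcl
resource "aws_efs_file_system" "myefs" {
  creation_token = "myefs"   # any unique string to make creation idempotent

  tags = {
    Name = "myefs"
  }
}
```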

Now, we will create a mount target for the file system and mount it on our EC2 instance.

Mounting the target
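
A sketch of this step, assuming the resource names used in the earlier sketches (web01 instance, allow_http security group, myefs file system) and a hypothetical GitHub repo URL, could look like this:

```hcl
# Expose the file system inside the instance's subnet
resource "aws_efs_mount_target" "efs_mount" {
  file_system_id  = aws_efs_file_system.myefs.id
  subnet_id       = aws_instance.web01.subnet_id
  security_groups = [aws_security_group.allow_http.id]
}

# Mount the file system onto /var/www/html and pull in the web content
resource "null_resource" "mount_efs" {
  depends_on = [aws_efs_mount_target.efs_mount]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = file("mykey.pem")           # assumed local key file
    host        = aws_instance.web01.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mount -t efs ${aws_efs_file_system.myefs.id}:/ /var/www/html",
      "sudo rm -rf /var/www/html/*",
      # Hypothetical repo URL; replace with the developer's actual repo
      "sudo git clone https://github.com/user/repo.git /var/www/html/",
    ]
  }
}
```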

Finally, our EFS setup for the EC2 instance named web01 is complete.

5. Create the S3 bucket and provision it by cloning the data from the Git repository, with the code:

Creating S3 bucket and cloning some data
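
A minimal sketch of this step (the bucket name, object key, and repo URL are assumptions; S3 bucket names must be globally unique):

```hcl
# Clone the repo locally so an image can be uploaded to S3
resource "null_resource" "clone_repo" {
  provisioner "local-exec" {
    command = "git clone https://github.com/user/repo.git repo_data"
  }
}

resource "aws_s3_bucket" "image_bucket" {
  bucket = "my-webserver-images-2020"   # assumed, globally unique name
  acl    = "public-read"
}

# Upload an image from the cloned repo and make it publicly readable
resource "aws_s3_bucket_object" "image" {
  depends_on = [null_resource.clone_repo]
  bucket     = aws_s3_bucket.image_bucket.id
  key        = "myimage.png"            # assumed image name
  source     = "repo_data/myimage.png"
  acl        = "public-read"
}
```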

Now that our S3 bucket has been created, we can move on to creating CloudFront.

6. Create a CloudFront distribution, with the code:

Firstly, what is CloudFront?

Amazon CloudFront is a fast content delivery network (CDN) service that securely delivers data, videos, applications, and APIs to customers globally with low latency, high transfer speeds, all within a developer-friendly environment.

creating CloudFront
creating CloudFront (continue)
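
A minimal sketch of a distribution in front of the image bucket (the origin ID is an arbitrary label, and the bucket name comes from the earlier sketch):

```hcl
resource "aws_cloudfront_distribution" "cdn" {
  enabled = true

  # Serve objects from the S3 bucket that holds the images
  origin {
    domain_name = aws_s3_bucket.image_bucket.bucket_regional_domain_name
    origin_id   = "s3-image-origin"
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "s3-image-origin"
    viewer_protocol_policy = "allow-all"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```

The distribution's domain name (aws_cloudfront_distribution.cdn.domain_name) is what gets substituted into the image URLs in /var/www/html.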

7. Finally, we will launch the web page remotely, with the code:

Launching the Web page
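
One common way to do this step, sketched here under the assumption that Firefox is installed locally and using the resource names from the earlier sketches, is a null_resource that opens the instance's public IP in a browser once provisioning finishes:

```hcl
# Open the deployed site in a local browser after everything is up
resource "null_resource" "open_site" {
  depends_on = [aws_cloudfront_distribution.cdn]

  provisioner "local-exec" {
    command = "firefox http://${aws_instance.web01.public_ip}/"
  }
}
```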

8. Now, we will initialize Terraform to apply this code, with the command “terraform init”:

Initializing terraform

9. After initializing, we will apply the code with the command “terraform apply -auto-approve”:

Applying Terraform code
Applying terraform code (continue)
Applied Terraform code

10. Look at the output: the web page has launched remotely:

Static Web Page

11. Destroy the environment we created, using the single command “terraform destroy -auto-approve”:

destroying the environment
destroying the environment (continue)
Destroy Complete

Hence, we have successfully created a great AWS-Terraform integration using EFS.

For the complete code, refer to the GitHub link:

We will create more such integrations in the future. Feel free to reach out with any queries.

Thank You!!!


Written by Anshika Sharma

I am a tech enthusiast, researcher and work for integrations. I love to explore and learn about the new technologies and their right concepts from its core.
