Deploy Your Application Using AWS CodePipeline


Introduction

Modern development environments need DevOps best practices to integrate and deploy applications faster and more securely on the cloud.

AWS DevOps combines development and operations practices, leveraging AWS services to automate infrastructure provisioning, CI/CD pipelines, monitoring, and scaling for efficient and reliable software delivery.

This blog will guide you through deploying a static website using AWS CodePipeline. By following these steps, you will set up an AWS CodePipeline that retrieves code from GitHub, builds it using AWS CodeBuild, and stores the artifacts in an Amazon S3 bucket for deployment.

This process enables continuous integration and deployment of your static website.

Step-by-Step Guide

Step 1: Creating Amazon S3 Buckets

To begin, you need to create two Amazon S3 buckets. The first bucket will store the build artifacts; it will not be publicly accessible and will be encrypted.

The second bucket will host your website. Enable versioning for this bucket, which allows you to keep track of changes to your website over time. Make this bucket public so that website visitors can access it.
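If you prefer to script this instead of clicking through the console, a minimal AWS CLI sketch follows. The bucket names and region are placeholder values, not ones from this walkthrough; adjust the public-access settings to match your own policies:

  # Artifact bucket: private and encrypted (bucket names and region are examples)
  aws s3api create-bucket --bucket my-pipeline-artifacts-example --region us-east-1
  aws s3api put-bucket-encryption --bucket my-pipeline-artifacts-example \
    --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"aws:kms"}}]}'
  aws s3api put-public-access-block --bucket my-pipeline-artifacts-example \
    --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

  # Website bucket: versioned, publicly readable, static website hosting enabled
  aws s3api create-bucket --bucket my-static-site-example --region us-east-1
  aws s3api put-bucket-versioning --bucket my-static-site-example \
    --versioning-configuration Status=Enabled
  aws s3api put-public-access-block --bucket my-static-site-example \
    --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false
  aws s3 website s3://my-static-site-example/ --index-document index.html --error-document error.html
  aws s3api put-bucket-policy --bucket my-static-site-example --policy '{
    "Version": "2012-10-17",
    "Statement": [{
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-static-site-example/*"
    }]
  }'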

Step 2: Connecting to GitHub

Connecting AWS with GitHub allows for seamless integration and automation of code deployment, enabling continuous integration and delivery pipelines to fetch code, trigger builds, and deploy applications easily.
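The blog uses the console wizard for this connection, but one way to create the link from the CLI is an AWS CodeStar connection (the connection name below is a placeholder). Note that the connection still has to be authorized against your GitHub account from the console before it becomes available:

  # Create a GitHub connection; complete the authorization handshake in the console afterwards
  aws codestar-connections create-connection \
    --provider-type GitHub \
    --connection-name example-github-connection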

Step 3: Creating AWS CodeBuild

Before creating the AWS CodePipeline itself, you will set up the build stage. A static website does not actually need compiling, but this step demonstrates how to configure an AWS CodeBuild project that the pipeline can use as its build stage.

To create the AWS CodeBuild project:

  • Enter a name for the AWS CodeBuild project.
  • Choose GitHub as the source provider. Other options include Amazon S3, AWS CodeCommit, Bitbucket, etc.
  • Provide a name for the AWS CodeBuild role.
  • In the build configuration, specify a buildspec.yml file as the build specification. This file defines the build steps and actions.
  • Enable Amazon CloudWatch logs to track the build progress and logs.
  • Create the build project.
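If you ever need to script this step, a roughly equivalent AWS CLI call is sketched below. The project name, repository URL, bucket, container image, and role ARN are placeholders rather than values from this walkthrough, and it assumes CodeBuild already has access to your GitHub account (see Step 2):

  # Create a CodeBuild project that reads from GitHub and writes artifacts to S3
  aws codebuild create-project \
    --name static-site-build \
    --source type=GITHUB,location=https://github.com/your-github-user/your-repo.git,buildspec=buildspec.yml \
    --artifacts type=S3,location=my-pipeline-artifacts-example \
    --environment type=LINUX_CONTAINER,image=aws/codebuild/standard:7.0,computeType=BUILD_GENERAL1_SMALL \
    --service-role arn:aws:iam::111111111111:role/example-codebuild-role \
    --logs-config "cloudWatchLogs={status=ENABLED}"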

The buildspec.yml file will contain the instructions for AWS CodeBuild to perform the build steps. You can define the necessary commands and actions required to build your code.
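What goes into buildspec.yml depends entirely on your site. As a minimal sketch for a plain static site with nothing to compile (the file layout is assumed, not taken from the original repository), it might simply package every file in the repository as the build artifact:

  version: 0.2

  phases:
    build:
      commands:
        # Static site - nothing to compile, just report and package the files
        - echo "Packaging static site files as build artifacts"

  artifacts:
    files:
      - '**/*'
    base-directory: .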

Step 4: Creating the AWS CodePipeline

In this step, you will create the AWS CodePipeline, which will automate fetching code from GitHub, building it, and deploying it to the destination Amazon S3 bucket. Here’s how to proceed:

  • Give the AWS CodePipeline a suitable name that identifies its purpose.
  • Create a new service role or choose an existing one with the necessary permissions to perform actions within the pipeline.
  • Specify a custom location for artifacts. This is the Amazon S3 bucket you created in Step 1, where the AWS CodeBuild artifacts will be stored.
  • Choose “GitHub” as the source provider and connect your GitHub account.
  • Select the repository that contains your website’s code.
  • Add a build stage to the pipeline and configure the build setup. This is where you will specify the AWS CodeBuild project you created in Step 3.
  • Add a deploy stage to the pipeline and specify the destination Amazon S3 bucket where you want to deploy your website.

  • Review the configuration of your AWS CodePipeline to ensure everything is set up correctly.
  • Create the AWS CodePipeline.
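The console wizard is the path this blog follows, but the same pipeline can also be described in a JSON file and created with the AWS CLI. The sketch below is illustrative only: the names, account ID, connection ARN, repository, and buckets are placeholders, and the source action assumes a GitHub (Version 2) CodeStar connection like the one from Step 2:

  {
    "pipeline": {
      "name": "static-site-pipeline",
      "roleArn": "arn:aws:iam::111111111111:role/example-codepipeline-role",
      "artifactStore": { "type": "S3", "location": "my-pipeline-artifacts-example" },
      "stages": [
        {
          "name": "Source",
          "actions": [{
            "name": "GitHubSource",
            "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "CodeStarSourceConnection", "version": "1" },
            "configuration": {
              "ConnectionArn": "arn:aws:codestar-connections:us-east-1:111111111111:connection/example-id",
              "FullRepositoryId": "your-github-user/your-repo",
              "BranchName": "main"
            },
            "outputArtifacts": [{ "name": "SourceOutput" }]
          }]
        },
        {
          "name": "Build",
          "actions": [{
            "name": "CodeBuild",
            "actionTypeId": { "category": "Build", "owner": "AWS", "provider": "CodeBuild", "version": "1" },
            "configuration": { "ProjectName": "static-site-build" },
            "inputArtifacts": [{ "name": "SourceOutput" }],
            "outputArtifacts": [{ "name": "BuildOutput" }]
          }]
        },
        {
          "name": "Deploy",
          "actions": [{
            "name": "S3Deploy",
            "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "S3", "version": "1" },
            "configuration": { "BucketName": "my-static-site-example", "Extract": "true" },
            "inputArtifacts": [{ "name": "BuildOutput" }]
          }]
        }
      ]
    }
  }

Saved as pipeline.json, this could then be created with:

  aws codepipeline create-pipeline --cli-input-json file://pipeline.json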

Once the AWS CodePipeline is created, it will start executing the pipeline stages automatically. The pipeline will fetch the code from your GitHub repository, initiate the AWS CodeBuild project to build the code, and upload the resulting artifacts to the Amazon S3 artifact bucket. Finally, the artifacts will be deployed to the specified destination Amazon S3 bucket.

This automated process allows for efficient and consistent deployment of your website whenever changes are made to the code.

Step 5: Reviewing the AWS CodePipeline Flow

The Proof of Concept (POC) flow begins by creating two Amazon S3 buckets: one for encrypted artifacts and another for hosting the website with versioning enabled. Next, AWS is connected with GitHub to facilitate seamless integration and automation of code deployment. An AWS CodeBuild project specifies GitHub as the source provider and uses a buildspec.yml file to define the build steps. Following this, an AWS CodePipeline is set up with a custom artifacts bucket, connecting to the GitHub repository, enabling webhooks for automated triggering, and configuring build and deploy stages. The pipeline is reviewed and created. Once created, the AWS CodePipeline initiates the process by fetching code from GitHub, triggering AWS CodeBuild to build the code, and uploading resulting artifacts to the artifact bucket. Finally, the artifacts are deployed to the specified destination Amazon S3 bucket, ensuring efficient and consistent website deployment whenever there are code changes.

Conclusion

This blog provides a comprehensive guide on deploying a static website using AWS CodePipeline. The process involves creating Amazon S3 buckets for artifacts and hosting, connecting to GitHub for seamless integration, setting up AWS CodeBuild with a buildspec.yml file, creating the AWS CodePipeline with build and deploy stages, and reviewing the configuration. The AWS CodePipeline automates code fetching, building with AWS CodeBuild, and deploying artifacts to the specified Amazon S3 bucket, leveraging DevOps best practices for secure and rapid application delivery in the cloud.

Drop a query if you have any questions regarding AWS CodePipeline or Amazon S3, and we will get back to you quickly.

About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner, AWS Migration Partner, AWS Data and Analytics Partner, AWS DevOps Competency Partner, Amazon QuickSight Service Delivery Partner, AWS EKS Service Delivery Partner, and Microsoft Gold Partner, helping people develop knowledge of the cloud and helping their businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.

To get started, go through our Consultancy page and Managed Services Package, CloudThat's offerings.

FAQs

1. Can we trigger AWS CodePipeline manually?

ANS: – Yes, you can trigger an AWS CodePipeline execution manually. AWS CodePipeline provides a web-based console, a CLI, and APIs for this.
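For example, from the CLI (the pipeline name is a placeholder):

  aws codepipeline start-pipeline-execution --name static-site-pipeline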

2. Can we encrypt artifacts stored in Amazon S3 buckets?

ANS: – Yes, we can encrypt the artifacts using AWS KMS (Key Management Service).

3. What options can be used as a source for AWS CodeBuild?

ANS: – You can choose between various options such as GitHub, Amazon ECR, Local system, AWS CodeCommit, Bitbucket, and Amazon S3.

WRITTEN BY Akshay Mishra
