Rolling Updates in Kubernetes using Jenkins with Zero Downtime
Hello everyone,
In this blog we are going to focus on a great concept: Rolling Updates. We are going to implement rolling updates in Kubernetes using an automation tool, i.e., Jenkins.
First of all, let's discuss some of the terms that we are going to use in this integration.
- Jenkins: Jenkins is an open-source automation tool written in Java, with plugins built for Continuous Integration purposes. Jenkins is used to build and test your software projects continuously, making it easier for developers to integrate changes into the project and for users to obtain a fresh build.
- Docker: Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and deploy it as one package.
- Kubernetes: Kubernetes is a container orchestration system for Docker containers that is more extensive than Docker Swarm and is meant to coordinate clusters of nodes at scale in production in an efficient manner.
- Rolling Updates: Rolling updates allow a Deployment's update to take place with zero downtime by incrementally replacing Pod instances with new ones. The new Pods will be scheduled on Nodes with available resources.
- Pipeline: A pipeline is a set of instructions, given in the form of code, for continuous delivery; it consists of the instructions needed for the entire build process. With a pipeline, you can build, test, and deliver the application.
- Node: The machine on which Jenkins runs is called a node.
Now let's come to the main agenda: the step-by-step approach for this integration.
1. Create a container image that has Linux and the other basic configuration required to run a slave for Jenkins (for example, here we require kubectl to be configured).
2. When we launch the job, it should automatically start on a slave, based on the label provided, for a dynamic approach.
3. Create a job chain of Job1 & Job2 using the Build Pipeline plugin in Jenkins.
4. Job1: Pull the GitHub repo automatically when a developer pushes code to GitHub, and perform the following operations:
a) Create a new image dynamically for the application and copy the application code into that Docker image.
b) Push that image to Docker Hub (a public repository).
(The GitHub code contains the application code and the Dockerfile to create the new image.)
5. Job2 (should run on the dynamic Jenkins slave configured with the kubectl command): Launch the application on top of the Kubernetes cluster, performing the following operations:
a) If launching for the first time, create a deployment of the pod using the image created in the previous job. Else, if the deployment already exists, do a rollout of the existing pods, giving zero downtime for the user.
b) If the application is created for the first time, expose it. Else, don't expose it again.
In this integration, we are going to use RHEL8. To configure Kubernetes and kubectl in RHEL8 refer to the link below:
To configure Jenkins refer to the link below :
Now let's start the integration, step by step.
Step 1: Creating the Docker image for the dynamic slave with kubectl configured.
For this, we will create a separate directory using the "mkdir Task04" command; you can give any name to your directory.
Create a Dockerfile for building the Docker image.
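A minimal sketch of such a Dockerfile is shown below (written here via a heredoc). The base image, the kubectl version, and the idea of baking the kubeconfig into the image are assumptions of mine; adjust them to your own cluster setup.

```
# Assumed sketch of the dynamic-slave Dockerfile, created inside the Task04 directory.
cat > Dockerfile <<'EOF'
FROM centos:7

# Java and git are needed so the container can run as a Jenkins agent
# and check out code from GitHub.
RUN yum install -y java-11-openjdk git && yum clean all

# Install kubectl so the slave can run Kubernetes commands.
RUN curl -L -o /usr/bin/kubectl \
      https://dl.k8s.io/release/v1.24.0/bin/linux/amd64/kubectl && \
    chmod +x /usr/bin/kubectl

# Copy a kubeconfig (and the client certificates it references) into the image
# so kubectl is pre-configured; these files are assumed to be in the build context.
COPY config /root/.kube/config
EOF
```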
Now, we will build the Docker image using the "docker build -t dynamicslave:latest ." command.
We need to push this Docker image to Docker Hub, a public repository, but first we will tag it: "docker tag SOURCE_IMAGE TARGET_IMAGE" creates a tag TARGET_IMAGE that refers to SOURCE_IMAGE.
For pushing the Docker image, we need to log in to Docker Hub using the "docker login" command.
Push the Docker image using the "docker push dynamicslave/dynamicslave:latest" command.
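Putting the tag, login, and push steps together, the commands look like this (the repository name dynamicslave/dynamicslave is just the one used in this example; replace it with your own Docker Hub username/repository):

```
# Tag the locally built image for Docker Hub, log in, and push it.
docker tag dynamicslave:latest dynamicslave/dynamicslave:latest
docker login            # enter your Docker Hub credentials
docker push dynamicslave/dynamicslave:latest
```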
After a few minutes, we can see on Docker Hub that our image has been pushed.
Step 2: Set up Docker so that it can be accessed by Jenkins.
We have to edit the docker.service file so that the Docker daemon also listens on an IP and port (a TCP socket).
After this, we need to reload the systemd daemon and restart the Docker engine.
Export DOCKER_HOST using the command "export DOCKER_HOST=<IP of VM>:<Port>"; the port will be the same as the one we added above.
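As a rough sketch, assuming we pick port 4243 (any free port works, as long as it is used consistently), the change and the follow-up commands look like this:

```
# In /usr/lib/systemd/system/docker.service, extend the ExecStart line so the
# daemon also listens on a TCP socket, e.g.:
#   ExecStart=/usr/bin/dockerd -H fd:// -H tcp://0.0.0.0:4243 ...

# Reload systemd and restart Docker to pick up the change.
sudo systemctl daemon-reload
sudo systemctl restart docker

# Point the docker CLI (and later the Jenkins Docker cloud) at that socket.
export DOCKER_HOST=tcp://<IP-of-VM>:4243
```

This same tcp://<IP of VM>:<Port> URI is what we will use later when configuring the Docker cloud in Jenkins.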
Step 3: Set up Jenkins in RHEL 8. We can do this in either of two ways: set it up dynamically inside a container, or install it normally on RHEL 8.
Now, we will reload the daemon using the "sudo systemctl daemon-reload" command.
After this, we will start and enable the Jenkins service so that we don't need to start it again after every shutdown of the VM.
We can check the status of the Jenkins service using the "systemctl status jenkins" command.
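For reference, the service commands for this step, collected in one place (assuming Jenkins was installed from its official repository), are:

```
# Reload systemd, then enable and start Jenkins so it survives reboots.
sudo systemctl daemon-reload
sudo systemctl enable --now jenkins

# Verify that the service is running.
sudo systemctl status jenkins
```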
Step 4: Access the Jenkins dashboard using <IP of VM>:8080, where 8080 is the default port for Jenkins.
For signing in, we need the initial admin password, which we can get using the "cat /var/lib/jenkins/secrets/initialAdminPassword" command.
We will choose to install all the suggested plugins.
After a few minutes, we will see the login page.
Finally, we get the Jenkins Dashboard.
Now we have to configure the cloud environment. First, install the Docker plugin from Manage Plugins.
Now go to Manage Jenkins → Manage Nodes and Clouds → Configure Clouds.
Here, add a Docker cloud: point its Docker Host URI at the tcp://<IP of VM>:<Port> socket we exposed in Step 2, and add an agent template that uses the dynamicslave/dynamicslave:latest image with a label for jobs to request.
So, all done. We have successfully configured the Docker cloud.
Step 5: Now we will start creating the jobs.
JOB1: Pull the GitHub repo automatically when a developer pushes code to the GitHub repository, and perform the following operations:
1. Create a new image dynamically for the application and copy the application code into that Docker image.
2. Push that image to Docker Hub (a public repository).
The Git repository will be: "https://github.com/Anshika-Sharma-as/DevOps_Task04.git". A sketch of Job1's shell build step is shown below.
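A hypothetical shell build step for Job1 could look like this. Jenkins has already cloned the repository into the workspace via the Git SCM configuration; the image name "yourname/webapp" is illustrative only, and the node is assumed to be already logged in to Docker Hub.

```
# Build the application image from the Dockerfile in the cloned repo
# (which copies the application code into the image), then push it.
docker build -t yourname/webapp:latest .
docker push yourname/webapp:latest
```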
Now, it's time to build Job2.
JOB2 (should run on the dynamic Jenkins slave configured with the kubectl command): Launch the application on top of the Kubernetes cluster, performing the following operations (a sketch of the shell build step follows this list):
1. If launching for the first time, create a deployment of the pod using the image created in the previous job. Else, if the deployment already exists, do a rollout of the existing pods, giving zero downtime for the user.
2. If the application is created for the first time, expose it. Else, don't expose it again.
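A hypothetical shell build step for Job2 is sketched below. The deployment name "mywebapp", the image "yourname/webapp:latest", and the assumption that the application listens on port 80 are illustrative only; because the latest tag is pulled with imagePullPolicy Always, a rollout restart picks up the freshly pushed image.

```
# Runs on the dynamic kubectl-enabled slave.
if kubectl get deployment mywebapp > /dev/null 2>&1
then
    # Deployment already exists: trigger a rolling update to the new image
    # and wait for it to finish -- no downtime for the user.
    kubectl rollout restart deployment/mywebapp
    kubectl rollout status deployment/mywebapp
else
    # First launch: create the deployment and expose it once.
    kubectl create deployment mywebapp --image=yourname/webapp:latest
    kubectl expose deployment mywebapp --type=NodePort --port=80
fi
```

Using "kubectl rollout status" makes the Jenkins job wait until the rolling update has completed successfully, so a failed rollout also fails the build.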
Now, we will add a post-build action so that as soon as Job1 finishes, Job2 will be started automatically.
Step 6: Create a Build Pipeline to visualize the execution flow.
Finally, we will create a build pipeline. For this, we need to install the "Build Pipeline" plugin.
After a successful installation, create a Build Pipeline view.
So, we have successfully created a build pipeline for visualizing the execution.
Step 7: Access the Web Page.
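Assuming the service name mywebapp from the Job2 sketch above, we can find the NodePort that Kubernetes assigned and open the page:

```
# Show the service and its NodePort, then request the page from any cluster node.
kubectl get svc mywebapp
curl http://<IP-of-a-node>:<NodePort>
```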
Finally, we have created an interesting integration which facilitates rolling updates with zero downtime.
I am very happy to share this! Feel free to ask if you face any issues.
Thank You!!
“Keep sharing, Keep learning”