Rolling Updates in Kubernetes using Jenkins with Zero Downtime

Anshika Sharma
8 min read · Aug 29, 2020



Hello everyone,

In this blog, we are going to focus on the concept of rolling updates, and we will perform rolling updates in Kubernetes using an automation tool, Jenkins.

First of all, let's discuss some of the terminology that we are going to use in this integration.

  1. Jenkins: Jenkins is an open-source automation tool written in Java, with plugins built for Continuous Integration purposes. Jenkins is used to build and test your software projects continuously, making it easier for developers to integrate changes to the project and for users to obtain a fresh build.
  2. Docker: Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and deploy it as one package.
  3. Kubernetes: Kubernetes is a container orchestration system for Docker containers that is more extensive than Docker Swarm and is meant to coordinate clusters of nodes at scale in production in an efficient manner.
  4. Rolling Updates: Rolling updates allow a Deployment's update to take place with zero downtime by incrementally replacing Pod instances with new ones. The new Pods are scheduled on Nodes with available resources (see the short kubectl sketch after this list).
  5. Pipeline: A pipeline is a set of instructions, given in the form of code, for continuous delivery; it consists of the instructions needed for the entire build process. With a pipeline, you can build, test, and deliver the application.
  6. Node: The machine on which Jenkins runs is called a node.
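
To make the rolling-update idea concrete, here is a minimal kubectl sketch of how a rolling update is usually triggered and observed (the deployment and image names are placeholders, not values from this setup):

# Point the deployment at a new image version; this triggers a rolling update
kubectl set image deployment/myapp myapp=myrepo/myapp:v2

# Watch old Pods being replaced incrementally by new ones
kubectl rollout status deployment/myapp

# If something goes wrong, roll back to the previous revision
kubectl rollout undo deployment/myapp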

Now let's come to the main agenda: the step-by-step approach for this integration:

  1. Create a container image that has Linux and the other basic configuration required to run it as a slave for Jenkins (for example, here we require kubectl to be configured).
  2. When we launch the job, it should automatically start on a slave based on the label provided, following the dynamic approach.
  3. Create a job chain of Job1 and Job2 using the Build Pipeline plugin in Jenkins.
  4. Job1: Pull the GitHub repo automatically when a developer pushes to GitHub and perform the following operations:
    a) Create a new image dynamically for the application and copy the application code into that Docker image.
    b) Push that image to Docker Hub (a public repository).
    (The GitHub repo contains the application code and the Dockerfile used to create the new image.)
  5. Job2 (should run on the dynamic slave of Jenkins configured with the Kubernetes kubectl command): Launch the application on top of the Kubernetes cluster, performing the following operations:
    a) If launching for the first time, create a deployment of the pod using the image created in the previous job. Otherwise, if the deployment already exists, roll out the existing pods with zero downtime for the user.
    b) If the application is created for the first time, expose it. Otherwise, don't expose it again.
Architecture of the integration

In this integration, we are going to use RHEL8. To configure Kubernetes and kubectl on RHEL8, refer to the link below:

To configure Jenkins, refer to the link below:

Now let's start the integration, step by step.

Step 1: Creating the Docker image for the dynamic slave, with kubectl configured.

For this, we will create a separate directory using the command "mkdir Task04"; you can give any name to your directory.

making new directory
going to the directory

Create a Dockerfile for building the Docker image.

creating Dockerfile
Dockerfile for configuring kubectl
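
The exact Dockerfile is in the screenshot above. As a rough sketch under my own assumptions (the base image, package names, kubectl download URL/version, and the copied kubeconfig/certificate files are all placeholders, and the image may additionally need an SSH server or agent depending on the connect method chosen later in the Docker cloud configuration), it could look something like this:

# Sketch: create the Dockerfile for the dynamic slave image
cat > Dockerfile <<'EOF'
FROM centos:7
# Java and git are needed so Jenkins can run its agent and pull code inside the container
RUN yum install -y java-11-openjdk git
# Install kubectl so the slave can talk to the Kubernetes cluster
RUN curl -Lo /usr/bin/kubectl https://dl.k8s.io/release/v1.19.0/bin/linux/amd64/kubectl && \
    chmod +x /usr/bin/kubectl
# Copy the client certificates and the kubeconfig that kubectl will use
COPY ca.crt client.crt client.key /root/
COPY config /root/.kube/config
EOF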

Now, we will build the Docker image using the "docker build -t dynamicslave:latest ." command.

Building Docker image
Building Docker image(continue)
Building Docker image(continue)

We need to push this Docker image to Docker Hub, a public repository. But first, we will tag the image.

"docker tag" creates a tag TARGET_IMAGE that refers to SOURCE_IMAGE.

Tagging Docker image
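
With the Docker Hub username used in the push command below, the tag command looks like this:

# Tag the locally built image with the Docker Hub repository name
docker tag dynamicslave:latest dynamicslave/dynamicslave:latest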

To push the Docker image, we need to log in to Docker Hub using the "docker login" command.

Login to Docker

Push the Docker image using the "docker push dynamicslave/dynamicslave:latest" command.

Pushing docker image

After a few minutes, we can see on Docker Hub that our Docker image has been published.

Step 2: Set up Docker so that it can be accessed through Jenkins.

Docker status

We have to add the IP and the port in the docker.service file so that the Docker daemon also listens on a TCP socket.

Adding IP and port

After this, we need to restart the Docker engine.

Restarting Docker service

Export DOCKER_HOST using the command "export DOCKER_HOST=<IP of VM>:<port>"; the port is the same one we added above.

Exporting Docker_Host
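
Summarising this step as commands (the TCP port 4243 below is only an example; use whichever IP and port you configured in docker.service):

# In /usr/lib/systemd/system/docker.service, extend the ExecStart line so the
# daemon also listens on a TCP socket, for example:
#   ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:4243 -H unix:///var/run/docker.sock

# Reload systemd and restart the Docker engine to apply the change
sudo systemctl daemon-reload
sudo systemctl restart docker

# On the machine whose Docker CLI should reach this daemon, point it at the TCP socket
export DOCKER_HOST=192.168.1.10:4243    # replace with your VM's IP and port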

Step 3: Set up Jenkins on RHEL8. We can use either of two ways: set it up dynamically inside a container, or install it normally on RHEL8.

installing jenkins
Jenkins install complete

Now, we will reload the daemon using the "sudo systemctl daemon-reload" command.

Reloading Daemon

After this, we will start and enable the Jenkins service so that we do not need to start it again after every shutdown of the VM.

Starting and Enabling Jenkins

We can check the status of the Jenkins service using the "systemctl status jenkins" command.

Status of Jenkins service
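
For reference, the whole of Step 3 as commands (the repository and key URLs follow the standard Jenkins documentation for RHEL/CentOS; package versions may differ on your system):

# Add the Jenkins yum repository and import its signing key
sudo wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat-stable/jenkins.repo
sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key

# Jenkins needs Java; install it together with Jenkins
sudo yum install -y java-11-openjdk jenkins

# Reload systemd, then start and enable the Jenkins service
sudo systemctl daemon-reload
sudo systemctl enable --now jenkins
sudo systemctl status jenkins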

Step 4: Access the Jenkins dashboard using <IP of VM>:8080; 8080 is the default port for Jenkins.

IP to access Jenkins
Sign In page for Jenkins

To sign in, we need the initial admin password, which we can get using the "cat /var/lib/jenkins/secrets/initialAdminPassword" command.

Authentication password

We will choose to install all the suggested plugins.

Install suggested Plugins
Plugin listed page for installation

After a few minutes, we will get the page for creating the admin user.

Creating Admin user
Install Configuration

Finally, we get the Jenkins Dashboard.

Jenkins Dashboard

Now we have to configure the cloud environment. First, install the Docker plugin from Manage Plugins.

Docker plugin installation

Now go to Manage Jenkins -> Manage Nodes and Clouds -> Configure Clouds.

Manage Nodes and Clouds

Open Configure Clouds.

Configure clouds
Choosing Docker cloud
Creating Docker cloud
Adding Docker template and adding credentials

So, all done. We have successfully configured the Docker cloud.

Step 5: Now we will start creating the jobs.

JOB1: Pull the GitHub repo automatically when a developer pushes code to the GitHub repository and perform the following operations:

1. Create a new image dynamically for the application and copy the application code into that Docker image.

2. Push that image to Docker Hub (a public repository).

The Git repository is: https://github.com/Anshika-Sharma-as/DevOps_Task04.git

Dockerfile code
creating job1
Configuring Job1
Console output
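
The actual build step is in the screenshot above. As a rough sketch, the "Execute shell" step of Job1 could look like this (the image name dynamicslave/webapp is a placeholder of mine, not the exact name used in the post):

# Hypothetical "Execute shell" build step for Job1
# Build the application image from the Dockerfile pulled along with the repo
docker build -t dynamicslave/webapp:${BUILD_NUMBER} ${WORKSPACE}

# Push it to Docker Hub so the Kubernetes cluster can pull it in Job2
docker push dynamicslave/webapp:${BUILD_NUMBER}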

Now, it's time to build Job2.

JOB2 (should run on the dynamic slave of Jenkins configured with the Kubernetes kubectl command): Launch the application on top of the Kubernetes cluster, performing the following operations:

1. If launching for the first time, create a deployment of the pod using the image created in the previous job. Otherwise, if the deployment already exists, roll out the existing pods with zero downtime for the user.

2. If the application is created for the first time, expose it. Otherwise, don't expose it again.

Deploying kubernetes instances
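
The screenshot shows the real shell step; the logic, roughly, is the following (the deployment, service, and image names are placeholders of mine):

# Hypothetical "Execute shell" build step for Job2, running on the
# kubectl-enabled dynamic slave
if kubectl get deployment myapp > /dev/null 2>&1
then
    # The deployment already exists: point every container at the new image,
    # which triggers a rolling update -- Pods are replaced incrementally,
    # so the user sees zero downtime
    kubectl set image deployment/myapp "*=dynamicslave/webapp:${BUILD_NUMBER}"
    kubectl rollout status deployment/myapp
else
    # First launch: create the deployment and expose it exactly once
    kubectl create deployment myapp --image=dynamicslave/webapp:${BUILD_NUMBER}
    kubectl expose deployment myapp --type=NodePort --port=80
fi

Because the Deployment's rolling-update strategy replaces Pods gradually, the Service keeps routing traffic to healthy Pods throughout the update.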

Now, we will add a post-build action so that as soon as Job1 finishes, Job2 starts automatically.

Adding post-build action to job1 for job2

Step 6: Create a build pipeline to demonstrate the execution flow.

Finally, we will create a build pipeline. For this, we need to install the "Build Pipeline" plugin.

After a successful installation, create a build pipeline view.

Creating build pipeline
Configuring build pipeline view

So, we have successfully created a build pipeline for visualizing the execution.

myView execution flow demonstration

Step 7: Access the Web Page.

Final Web Page

Finally, we have created an interesting integration that facilitates rolling updates with zero downtime.

I am very happy to share this! Feel free to ask if you face any issue.

Thank You!!

“Keep sharing, Keep learning”
