CI/CD Pipeline with GitHub, Jenkins & Kubernetes Using Static & Dynamic Clusters in Jenkins

Radhika Sharma
5 min read · Jun 4, 2020


An amazing trio…

[Image: complete demonstration of the pipeline]

In this article we will create a complete CI (continuous integration) & CD (continuous delivery) pipeline: as soon as a developer pushes code, it lands directly in the production environment without any human intervention, using GitHub as the SCM (source code management) tool, Jenkins as the continuous integration tool, Docker as the containerization tool, and Kubernetes as the container orchestration tool.

Here we also use the powerful Jenkins concept of "master-slave architecture" to run jobs. A Jenkins slave can be of two types: static or dynamic. For a static slave, we have to create the node in advance, before running any job. A dynamic slave (or dynamic cluster) is launched as soon as the requirement comes, runs the job inside the container, and is terminated once the job completes.

LET'S START:

1. Developer's Job:

  • The developer writes the application code and a Dockerfile, which builds the image used to launch the container that runs the application. The image contains all the software necessary to run the application.
  • Here we are taking the example of a web application, so the image should have the httpd service started, its configuration, and everything else required; the developer writes the Dockerfile accordingly (a sketch follows the screenshot below).
[Image: Dockerfile for the httpd server]
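Since the actual Dockerfile is only visible in the screenshot above, here is a minimal sketch of what it could look like (the base image, file names, and paths are assumptions, not taken from the screenshot):

    # Dockerfile for the httpd web-server image (sketch; details are assumed)
    FROM centos:7

    # install the Apache web server
    RUN yum install -y httpd

    # copy the web application code into the default document root (assumed path)
    COPY index.html /var/www/html/

    # run httpd in the foreground so the container keeps running
    EXPOSE 80
    CMD ["/usr/sbin/httpd", "-D", "FOREGROUND"]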
  • As soon as the developer commits the code, all the files are pushed automatically to GitHub, because we are using a post-commit hook here (see the sketch after the screenshot below).
[Image: files pushed automatically to GitHub after a commit]
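A post-commit hook is simply an executable script placed at .git/hooks/post-commit in the developer's repository. A minimal sketch (the remote and branch names are assumptions):

    #!/bin/bash
    # .git/hooks/post-commit -- Git runs this automatically after every commit
    # push the new commit straight to GitHub (remote/branch are assumed)
    git push origin master

Remember to make the hook executable (chmod +x .git/hooks/post-commit), otherwise Git will silently skip it.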
  • In GitHub we are using a webhook, which sends the push event to Jenkins (using ngrok to reach Jenkins's private IP). Jenkins is thus triggered and runs job-1, which uses the GitHub hook trigger as its Build Trigger option (a sketch of the ngrok setup is shown below).
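Roughly, the ngrok + webhook setup looks like this (the Jenkins port and the ngrok URL are assumptions; /github-webhook/ is the standard endpoint Jenkins exposes for GitHub webhooks):

    # expose the locally running Jenkins (assumed to be on port 8080) to the internet
    ./ngrok http 8080

    # then, in the GitHub repository settings, add a webhook whose payload URL looks like
    #   http://<random-id>.ngrok.io/github-webhook/
    # with content type application/json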

2. Jenkins’s Jobs:

  • The first job runs on a static slave node (the rhel-8-CLI node in my case). This job builds the Dockerfile it pulls from GitHub and pushes the resulting image to the Docker Hub account.
  • Understand that the image is created every time job-1 runs, so all the files of the web application are copied at image-build time. If there is any change in the code, the image changes, and so does the web application that is accessible to our client.
Here we have to set the label of the static cluster so that job-1 runs on the other slave node (rhel8 in my case).
To build and push the image I am using the Execute shell option; you can also use the options the Docker plugins provide in Jenkins for building and pushing an image to Docker Hub. A sketch of the shell step follows the job-1 output below.
  • Before pushing the image to Docker Hub, don't forget to set your Docker Hub credentials.
[Image: output of job-1, pushing the httpd image to Docker Hub]
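A sketch of the Execute shell step for job-1 (the image name, tag, and credential handling are assumptions; the workspace already contains the Dockerfile pulled from GitHub):

    # build the image from the Dockerfile in the job workspace
    docker build -t <dockerhub-user>/httpd-web:latest .

    # log in and push to Docker Hub
    # (in a real job, prefer Jenkins credential bindings over plain-text passwords)
    docker login -u <dockerhub-user> -p <dockerhub-password>
    docker push <dockerhub-user>/httpd-web:latest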
  • Now, if job-1 runs successfully, it triggers job-2, which launches the dynamic slave as a Docker container.
  • To launch the dynamic cluster we have to do the following setup to create a cloud node in Jenkins:

→ Here we need to access the Docker daemon remotely, so we need to enable its TCP socket with the following settings. Take reference from here.

[Image: enabling the TCP socket on the Docker host]
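On a systemd-based Docker host this usually means editing the ExecStart line of the docker service and restarting it (port 4243 is an assumption; any free port works, and an unauthenticated TCP socket should only be exposed on a trusted network):

    # /usr/lib/systemd/system/docker.service
    # ExecStart=/usr/bin/dockerd -H fd:// -H tcp://0.0.0.0:4243

    # reload the unit file and restart Docker so the TCP socket is opened
    systemctl daemon-reload
    systemctl restart docker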

→ On the Docker client you need to export the DOCKER_HOST variable.
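For example (the IP and port are assumptions, matching the TCP socket enabled above):

    # point the local docker CLI at the remote Docker daemon
    export DOCKER_HOST=tcp://<docker-host-ip>:4243
    docker ps   # should now list the containers running on the remote host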

→ We also need an image to launch the container. Because we will be launching our application inside the k8s cluster and this container acts as the dynamic cluster or slave, the image should consist of the following:

  • The sshd service running on port 22 (if using the SSH method for starting the slave agent on the slave node).
  • A jenkins user with a password.
  • Java installed.
  • All the application dependencies required for the build. For example, since we run k8s commands here, we need to install and configure kubectl inside the image (a sketch of such a Dockerfile follows the image below).
[Image: the kube image (the name of my image), used to launch a Docker container as a dynamic Jenkins slave]
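A sketch of what such a Dockerfile could contain (package versions, the password, and the kubeconfig path are assumptions):

    # image used as the dynamic Jenkins slave (sketch)
    FROM centos:7

    # sshd so Jenkins can start the agent over SSH, plus Java for the agent itself
    RUN yum install -y openssh-server java-1.8.0-openjdk && ssh-keygen -A

    # jenkins user with a password for the SSH launch method (change the password!)
    RUN useradd jenkins && echo "jenkins:jenkins" | chpasswd

    # kubectl so the build steps can talk to the Kubernetes cluster
    RUN curl -LO https://dl.k8s.io/release/v1.18.0/bin/linux/amd64/kubectl && \
        chmod +x kubectl && mv kubectl /usr/local/bin/

    # the kubeconfig and client certificates for the cluster would be copied in here
    # COPY .kube/ /home/jenkins/.kube/

    EXPOSE 22
    CMD ["/usr/sbin/sshd", "-D"]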

→ Push this image to your Docker Hub account.

→ To launch a Docker container from a Jenkins job we need the Docker plugin: go to Dashboard → Manage Jenkins → Manage Plugins and install the Docker plugin from there.

Now configure the cloud node:

[Image: cloud node configuration details]
[Image: using an SSH key for authentication when launching the slave agent]
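The key fields of the Docker cloud configuration are roughly these (the values are examples for my setup, not requirements):

    Docker Host URI : tcp://<docker-host-ip>:4243    (the TCP socket enabled earlier)
    Docker Image    : <dockerhub-user>/kube:latest   (the slave image pushed above)
    Labels          : <dynamic-cluster-label>        (referenced later by job-2)
    Connect method  : Connect with SSH, using the jenkins user's password or an SSH key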

Note: this whole task of setting up job-2 and the cloud node should be done in advance, before job-1 is launched.

  • For job-2 we need to set the label of the dynamic cluster. A sketch of its shell step is shown below.
[Image: setting the label that was given to the cloud node]
[Image: the command in the Execute shell option that launches the deployment]
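A sketch of the Execute shell step for job-2 (the deployment name, image, and port are assumptions; this is one possible way to express the create-the-first-time, roll-out-afterwards logic described here):

    # if the deployment already exists, roll out the new image; otherwise create and expose it
    if kubectl get deployment mywebapp > /dev/null 2>&1
    then
        kubectl rollout restart deployment/mywebapp
        kubectl rollout status deployment/mywebapp
    else
        kubectl create deployment mywebapp --image=<dockerhub-user>/httpd-web:latest
        kubectl expose deployment mywebapp --port=80 --type=NodePort
    fi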
  • Now job-2 is triggered by job-1. First of all, it launches the slave agent on the dynamic slave node.
[Image: launching the agent]
[Image: output of the agent connectivity]
  • Here we can see that a Docker container has been launched on the remote system.
[Image: dynamic cluster launched]
  • After the agent is launched, job-2 runs the commands given in its Execute shell step, which create a deployment using the httpd image that contains our application code.
[Image: creating and exposing the deployment]
[Image: the website running inside a pod in the k8s cluster]
  • This website runs inside a pod launched by the deployment. This is so powerful because if the pod is deleted for any reason, the deployment launches a new pod. We can now easily do rolling updates and rollouts, scaling, and load balancing of the pods.
  • Let's take the scenario where the developer changes the code. The production environment must change as well, without any downtime; this is where the rollout comes into play. Look at this:
  • The code changes, so the developer commits again → the code is pushed to GitHub → job-1 runs → a new image is created with the new code → job-2 runs → the pods already exist, so this time a rollout of the deployment is done, because we handle this case in the shell script given in the Execute shell option (see the job-2 sketch above).
[Image: rollout of the pods for the new version release]
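To verify (or, if needed, revert) the new release, the usual rollout commands can be used (the deployment name is the same assumed one as above):

    kubectl rollout status deployment/mywebapp    # watch the new pods come up
    kubectl rollout history deployment/mywebapp   # list previous revisions
    kubectl rollout undo deployment/mywebapp      # roll back if the new version misbehaves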
[Image: build-pipeline view of the jobs]

Done!! Here we have created a complete CI/CD (continuous integration & continuous delivery) pipeline.

Credit: I would like to thank Vimal Daga sir. This was one of the tasks given by him in the DevOps Assembly Lines training.
