Jenkins Docker

The document provides an overview of Jenkins, Docker, and AWS, focusing on the build process, unit testing, and the history and features of Jenkins as an automation tool for continuous integration and delivery. It explains the Jenkins pipeline, its types (CI/CD, scripted, and declarative), and the importance of Jenkinsfiles in defining pipelines. Additionally, it covers the setup of Jenkins on AWS and the role of Docker in containerization, highlighting the benefits of using these technologies in software development.

Jenkins, Docker & AWS

An Introduction
Build Process
• A build refers to the process of converting source
code into an executable application.
• Compiling the Code – Translating source code into
machine code or bytecode.
• Resolving Dependencies – Fetching external libraries
and dependencies.
• Running Tests – Executing unit, integration, or other
automated tests to validate the code.
• Packaging – Bundling the application into a
deployable format (e.g., a .jar, .war, .exe, or container
image).
• The build can be quite an involved and complicated process,
especially for large applications with multiple components
developed by teams.
• The build process is automated in CI/CD pipelines,
ensuring that code changes are continuously
integrated, tested, and prepared for deployment.
Unit Tests
• Test individual pieces of code (usually functions or methods) in
isolation to ensure they behave as expected.
• run during development or as part of continuous integration (CI).
• focus on the correctness of specific units of code, often testing the logic of a
function or method with various inputs and expected outputs.

• Example use case
• Checking that a function properly handles edge cases, performs calculations
correctly, or handles invalid input gracefully.
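As a concrete illustration of testing a unit in isolation, consider the following Groovy sketch; the add function and its expected behavior are invented for illustration, not taken from the document.

```groovy
// Hypothetical function under test: adds two integers.
int add(int a, int b) {
    return a + b
}

// Unit tests: exercise typical inputs and edge cases in isolation.
assert add(2, 3) == 5      // typical input
assert add(-1, 1) == 0     // edge case: opposite signs cancel
assert add(0, 0) == 0      // edge case: identity
```

In practice such checks would live in a test framework (for example JUnit or Spock) and run automatically as part of CI.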
Jenkins History

• Kohsuke Kawaguchi, a Java developer working at Sun Microsystems, was tired of
building the code and fixing errors repetitively. In 2004, he created an automation
server called Hudson that automated build and test tasks.
• In 2011, Oracle, which had acquired Sun Microsystems, had a dispute with the Hudson
open-source community, so the community forked Hudson and renamed the fork Jenkins.
• Both Hudson and Jenkins continued to operate independently. But in a short span of
time, Jenkins acquired a lot of projects and contributors while Hudson remained
with only 32 projects. With time, Jenkins became more popular, and Hudson is no
longer maintained.
What is Jenkins?
• Jenkins is an open-source free automation tool used to build and test software
projects.
• Makes it painless for developers to integrate changes to the project.
• The primary focus is to keep track of the version control system and initiate and monitor
a build system if there are any changes.

• Some typical reasons as to why Jenkins is so widely used are:


• Developers and testers use Jenkins to detect defects in the software development lifecycle
and automate the testing of builds.
• They use it to continuously monitor the code in real-time and integrate changes into the
build.
• Jenkins, as it turns out, is a great fit for building a CI/CD pipeline because of its
plugin capabilities and simple-to-use nature.
Features of Jenkins
• It is a free and open-source automation tool
• Jenkins provides a vast number of plugins
• It is easy to set up and install on multiple operating systems
• Provides pipeline support
• Fast-release cycles
• Easy upgrades
What are the requirements for using
Jenkins?
• To use Jenkins, you require the following:
• A source code repository that can be accessed, for example, a Git repository.
• A build script, for example, a Maven script.

• Some of the top continuous integration tools other than Jenkins are:
• TeamCity
• Travis CI
• Go CD
• Bamboo
• GitLab CI
• CircleCI
• Codeship
• GitHub Actions
Which of the following statements
best describes the purpose of a
software build?
A. To compile the source code into
executable files and ensure all
dependencies are correctly linked.
B. To update the documentation and
design specifications for the
software.
C. To conduct user testing and gather
feedback on the software's
interface.
D. To deploy the software to
production servers for user access.
What is the primary purpose of a
unit test in software development?
A. To check if the software meets
the user's requirements and
specifications.
B. To test individual components or
functions of the code to ensure
they work as expected.
C. To verify the software's
performance under heavy load
and stress.
D. To ensure that the user interface
is intuitive and easy to use.
Which of the following is NOT part of
the software build process?
A. Compiling the source code into
executable files
B. Linking dependencies and
libraries
C. Running unit tests to check
code functionality
D. Writing user documentation for
the software
Which of the following best
describes Jenkins in software
development?
A. A cloud service for hosting
software applications
B. A version control system for
managing code repositories
C. An open-source automation
server used for continuous
integration and continuous
delivery
D. A tool for designing the user
interface of a software
application
In which of the following scenarios
might you NOT benefit from using
Jenkins?
A. When you need to automate the
process of building, testing, and
deploying software frequently.
B. When you're working on a small
project with minimal code changes
and no need for automated testing or
deployment.
C. When you're managing a large project
with multiple developers collaborating
and contributing code.
D. When you're running a complex CI/CD
pipeline that requires integration with
multiple tools and services.
What is "Continuous Integration" with reference to
Jenkins?

• Continuous Integration is a development
practice where code changes are frequently
integrated into a shared repository.
• The practice uses automated verifications
for the early detection of code problems.
• Continuous Integration triggers the build to
find and identify bugs present in the code.
• It adds consistency to the build process.
• It’s a means to build things faster and
prevents broken code.
How does Jenkins work?
• Jenkins checks repositories for changes regularly, and developers must commit their code regularly.
• Once changes are committed, Jenkins detects them and uses them to prepare a new build.
• After that, Jenkins will traverse the various stages in its pipeline. As one stage completes,
the process moves on to the next stage.
• If a stage fails, the Jenkins build stops there, and Jenkins emails the team. When a stage
completes successfully, the code is deployed to the appropriate server so that testing can begin.
• After the successful testing phase, Jenkins shares the results with the team.
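The regular repository checking described above can be configured in a declarative Jenkinsfile with a polling trigger; a minimal sketch, where the cron expression and stage contents are placeholders:

```groovy
pipeline {
    agent any
    triggers {
        // Poll the SCM roughly every five minutes for new commits.
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Change detected - preparing a new build'
            }
        }
    }
}
```

Webhook-based triggering (covered later) is usually preferred over polling, since it reacts immediately and avoids unnecessary SCM queries.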
(What were called "master" and "slave" are now called controller and agent)
What is a Jenkins pipeline?
• The pipeline represents the continuous delivery and continuous
integration of all the jobs in the SDLC and DevOps life cycle.
• The Jenkins pipeline is a set of plugins that support implementation
and integration of continuous delivery pipelines into Jenkins,
defined in a particular format understood by Jenkins.
• The Jenkins pipeline solves several problems, like the maintenance of
thousands of jobs and maintaining deployments, without needing to resort
to other, more heavyweight methods.
What is a Jenkins pipeline?
• The three different types of Jenkins pipelines are:
• CI/CD Pipeline: automates the steps involved in building, testing, and
deploying software. It typically consists of stages such as building the
application, running tests, and deploying to various environments (e.g.,
development, staging, production). The CI/CD pipeline aims to deliver code
changes quickly, reliably, and frequently to production.
• Scripted Pipeline: Scripted pipelines in Jenkins are written in Groovy scripting
language. They offer a more flexible and imperative approach to defining
pipelines. In a scripted pipeline, you have full control over the flow of
execution, allowing you to use loops, conditions, and other programming
constructs to build complex workflows. Scripted pipelines are suitable for
scenarios where fine-grained control or advanced logic is required.
What is a Jenkins pipeline?
• The three different types of Jenkins pipelines are:
• Declarative Pipeline: Declarative pipelines provide a more structured and
simplified syntax for defining pipelines. They aim to offer a more opinionated
and easier-to-read approach compared to scripted pipelines. Declarative
pipelines use a series of predefined directives to specify the various stages,
steps, and conditions of the pipeline. They are designed to be concise and
maintainable, making them suitable for straightforward CI/CD workflows or
teams less familiar with Groovy scripting.
What is a Jenkinsfile?
- A text file that contains the definition of
a Jenkins pipeline and is checked
into the source control
repository.
- It enables code review and
iteration on the pipeline. It also
permits an audit trail for the
pipeline.

This Jenkinsfile defines a simple
pipeline with three stages: Build,
Test, and Deploy.

In each stage, there's a single step
demonstrating common actions like
checking out code from Git, building
the project, running tests, and
deploying to an environment. After
the pipeline completes, it sends
notifications based on whether the
pipeline succeeds or fails.
We're using the node block to
allocate an executor on a Jenkins
agent to execute the pipeline.
We're then checking out the
source code from a Git repository.
Inside each stage, we have steps
to perform specific tasks such as
building the project, running
tests, and deploying.
Finally, we have post-build actions
to send notifications based on the
result of the pipeline.
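The original example code is not reproduced in this copy, but a scripted Jenkinsfile matching the description above might look like the following sketch; the shell commands, mail address, and message bodies are assumptions, not the original slide's code.

```groovy
node {
    try {
        // Check out the source code from the configured Git repository.
        checkout scm

        stage('Build') {
            sh './build.sh'        // compile or package the project
        }
        stage('Test') {
            sh './run-tests.sh'    // execute the automated test suite
        }
        stage('Deploy') {
            sh './deploy.sh'       // deploy to the target environment
        }

        // Post-build notification on success (recipient is a placeholder).
        mail to: 'team@example.com', subject: 'Pipeline succeeded', body: 'All stages passed.'
    } catch (err) {
        // Post-build notification on failure, then re-throw to mark the build failed.
        mail to: 'team@example.com', subject: 'Pipeline failed', body: "Error: ${err}"
        throw err
    }
}
```

The node block at the top is what allocates an executor on a Jenkins agent, as the description notes.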
CI/CD
• CI (Continuous Integration): Automates build and test processes.
• CD (Continuous Delivery): Prepares code for production but requires
manual approval before deployment.
• CD (Continuous Deployment): No manual approval; code goes live
automatically after passing all tests
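The manual-approval gate that separates continuous delivery from continuous deployment can be modeled in a declarative Jenkinsfile with an input step; a sketch, with placeholder stage contents:

```groovy
pipeline {
    agent any
    stages {
        stage('Build and Test') {
            steps { echo 'build and run automated tests' }
        }
        stage('Approval') {
            steps {
                // Continuous delivery: pause until a human approves.
                // Removing this stage would make the pipeline
                // continuous deployment (no manual gate).
                input message: 'Deploy to production?'
            }
        }
        stage('Deploy') {
            steps { echo 'deploying to production' }
        }
    }
}
```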
Pipeline Stages
• Clone Repository – Pulls the latest code from
Git.
• Install Dependencies – Installs required
dependencies.
• Run Tests – Executes automated tests to catch
errors early.
• Build – Compiles or packages the application.
• Post-Build Actions – Stores build artifacts and
sends email notifications if the build fails.
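The five stages above map naturally onto a declarative Jenkinsfile. The sketch below assumes a Node.js-style project; the npm commands, artifact path, and mail address are illustrative assumptions:

```groovy
pipeline {
    agent any
    stages {
        stage('Clone Repository') {
            steps { checkout scm }          // pull the latest code from Git
        }
        stage('Install Dependencies') {
            steps { sh 'npm ci' }           // install required dependencies
        }
        stage('Run Tests') {
            steps { sh 'npm test' }         // catch errors early
        }
        stage('Build') {
            steps { sh 'npm run build' }    // compile/package the application
        }
    }
    post {
        success {
            archiveArtifacts artifacts: 'dist/**'   // store build artifacts
        }
        failure {
            mail to: 'team@example.com',
                 subject: "Build failed: ${env.JOB_NAME}",
                 body: "See ${env.BUILD_URL} for details."
        }
    }
}
```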
Which of the following correctly differentiates
between Continuous Integration (CI) and Continuous
Delivery (CD) in Jenkins?
A. CI focuses on automatically building and
testing code changes, while CD
automates the deployment process to
production or staging environments.
B. CI is used to deploy software to
production, while CD ensures code
quality through testing and integration.
C. CI involves manual testing of code,
whereas CD involves automated code
building.
D. CI ensures that software is deployed to
production, while CD handles code
integration and testing.
What is a pipeline in Jenkins?
A. A tool for monitoring server
health and performance.
B. A set of automated processes
that define the stages of
building, testing, and deploying
software.
C. A plugin used to manage code
repositories.
D. A user interface for interacting
with Jenkins configurations.
Which of the following are NOT
typical pipeline stages in Jenkins?
A. Build
B. Test
C. Deploy
D. Design
CI/CD | Jenkins Pipeline
• A CI/CD pipeline is an automated expression of your process for getting software from
version control right through to your users and customers.
• This process involves building the software in a reliable and
repeatable manner, as well as progressing the built software (called
a "build") through multiple stages of testing and deployment.
• Jenkins Pipeline (or simply "Pipeline") is a suite of plugins that supports
implementing CI/CD pipelines into Jenkins.
Jenkins Pipeline
• Jenkins allows modeling simple-to-complex delivery pipelines “as
code” via the Pipeline domain-specific language (DSL)
syntax.

• The definition of a Jenkins Pipeline is typically written into a text file
(called a Jenkinsfile) which in turn is checked into a project’s source
control repository (best practice).
Jenkins Pipeline
• A Pipeline can be created in one of the following ways:
• Through Blue Ocean - helps you write your Pipeline’s Jenkinsfile and commit it to
source control.
• Through the classic UI - enter a basic Pipeline directly in Jenkins through the classic
UI.
• In SCM - write a Jenkinsfile manually, which you can commit to your project’s
source control repository.

• The syntax for defining a Pipeline with either approach is the same, but
while Jenkins supports entering Pipeline directly into the classic UI, it is
generally considered best practice to define the Pipeline in a Jenkinsfile
which Jenkins will then load directly from source control.
Declarative versus Scripted Pipeline
syntax
• A Jenkinsfile can be written using two types of syntax — Declarative
and Scripted.
• Declarative and Scripted Pipelines are constructed fundamentally differently.
• Declarative Pipeline is a more recent feature of Jenkins Pipeline:
• provides richer syntactical features over Scripted Pipeline syntax, and
• is designed to make writing and reading Pipeline code easier.
• Important: many of the individual syntactical components (or "steps")
written into a Jenkinsfile, are common to both Declarative and
Scripted Pipeline.
• We will primarily use Declarative pipelines
Jenkinsfile Pipeline Concepts
• Pipeline
• A user-defined model of a CD pipeline. A Pipeline’s code can define your entire
build/deployment process, which typically includes stages for building an
application, testing it and then delivering it.
• The pipeline block defines all the work done throughout your entire Pipeline
and is a key part of Declarative Pipeline syntax.
• Node
• A node is a machine that is part of the Jenkins environment and is capable of
executing a Pipeline.
• Also, a node block is a key part of Scripted Pipeline syntax.
• Agent
• An agent is a specific process running on a node that listens for and executes
jobs assigned by the Jenkins controller.
• Stage
• A stage block defines a conceptually distinct subset of tasks performed
through the entire Pipeline (e.g. "Build", "Test" and "Deploy" stages), which is
used by many plugins to visualize or present Jenkins Pipeline status/progress.
• Step
• A single task. Tells Jenkins what to do at a particular point in time (or "step"
in the process).
• When a plugin extends the Pipeline DSL, that typically means the plugin has
implemented a new step.
Node vs. Agent
Declarative Pipeline

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                //
            }
        }
        stage('Test') {
            steps {
                //
            }
        }
        stage('Deploy') {
            steps {
                //
            }
        }
    }
}

Scripted Pipeline

node {
    stage('Build') {
        //
    }
    stage('Test') {
        //
    }
    stage('Deploy') {
        //
    }
}
Create a Jenkins Pipeline
Pipeline through classic UI
Which of the following best
describes a Jenkins pipeline?
A. A process for installing Jenkins
onto your system.
B. A set of code written checked
into your local repository
C. automated expression of your
process for getting software
from version control right
through to end users
D. is a suite of plugins that supports
implementing CI/CD
pipelines into Jenkins
What is a Jenkins agent?
A. Jenkins agents are machines or
containers where Jenkins jobs
are executed.
B. Software acting on behalf
of Jenkins
AWS Jenkins Setup
Connect to the Instance and Install
Jenkins
AWS EC2 Instance Configuration
• Install
• Java
• Jenkins
• Docker

• See uploaded files for configuration instructions


Jenkins on Amazon EC2
Configure
Configuring Jenkins to Use GitHub
Triggering Jenkins Builds from SCM: GitHub
WebHooks
• Let your integrations take an action in response to events that occur on GitHub.
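With the GitHub plugin installed on the Jenkins controller, a declarative pipeline can subscribe to webhook push events; a sketch, assuming the GitHub plugin's githubPush() trigger symbol is available:

```groovy
pipeline {
    agent any
    triggers {
        // Fires when GitHub delivers a push webhook to Jenkins.
        githubPush()
    }
    stages {
        stage('Build') {
            steps { echo 'Triggered by a GitHub push' }
        }
    }
}
```

This requires the GitHub repository's webhook to point at the Jenkins instance's GitHub webhook endpoint, as shown in the setup screenshots.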
Triggering Jenkins Builds from SCM
Jenkins Builds Triggered by GitHub
Webhooks
Configure Jenkins/Docker
What is Docker?
• An open-source platform that enables developers to build, deploy,
run, update and manage containers.

• Multiple components
• Docker Engine
• Docker Desktop
• Docker Build
• Docker Compose
• Docker Hub
What is Docker?
• Docker can refer to:
• Docker Engine, the runtime for building and running containers.
• Docker, Inc., the company that develops productivity tools built around its open-source
containerization platform.
• The Docker open-source ecosystem and community.

• Docker (Engine etc.) is the most widely utilized containerization tool, with an
82.84% market share.
• Developers can create containers without Docker by working directly with capabilities built
into Linux® and other operating systems, but Docker makes containerization faster and
easier.

• Docker packages, provisions, and runs containers.


How Does Containerization Work?
• Containers are executable units of software that combine
application source code with the operating system (OS)
libraries and dependencies required to consistently run
that code in any environment (desktop/traditional
IT/cloud, etc.).
• This single software package is abstracted away from the host
operating system. It stands alone and becomes portable, able to
run across any platform or cloud.
• Container technology is made possible by features such as
process isolation and virtualization capabilities built into
the OS.
• This facilitates running multiple containers on the same host OS,
where each container shares the resources/services of the
underlying OS.
Container vs. Virtual Machine
• Lightweight
• Containers don't carry the payload of an entire OS instance and hypervisor.
They include just the OS processes and dependencies necessary
to run the code.
• Container sizes are measured in megabytes (versus gigabytes for
some VMs), so containers make better use of hardware capacity and have
faster startup times.
• Multiple containers can run on the same compute capacity as a
single VM.

• Improved productivity
• Containerized applications can be written once and run
anywhere
• Compared to VMs, containers are quicker and easier to deploy,
provision and restart.

• Greater efficiency
• avoid the overhead of multiple OS instances
• they allow developers to run more application instances on the
same physical hardware, improving efficiency and lowering cloud
expenses
• Scaling workloads
Benefits of Containerization
• Portability: A container is abstracted away from (not tied to or dependent upon) the host
operating system. Hence, it is portable and able to run uniformly and consistently across any
platform or cloud.
• Agility: Developing and deploying containers increases agility and allows applications to work
in cloud environments that best meet business needs.
• Speed: Containers are "lightweight," meaning they share the machine's operating system (OS)
kernel. This not only drives higher server efficiencies but also reduces server and licensing
costs while speeding up start times, as there is no operating system to boot.
• Fault isolation: Each containerized application is isolated and operates independently of
others. Failure of one container does not affect the continued operation of any other
containers, and it is easier to identify and correct technical issues within one container
without downtime in other containers.
• Efficiency: Containers share the machine's OS kernel, and application layers within a container
can be shared across containers. Thus, containers are inherently smaller than a VM and require
less start-up time, allowing far more containers to run on the same compute capacity as a
single VM. This increases resource optimization, reducing server and licensing costs.
• Ease of management: Containerization, particularly when paired with a container orchestration
platform like Kubernetes, automates and simplifies provisioning, deployment and management of
containerized applications.
• Security: The isolation of applications as containers prevents malicious code from affecting
other containers or the host system. Security permissions can be defined to automatically block
unwanted components from entering containers or limit communications with unnecessary
resources.
Containerization architecture
Components1
• Underlying IT infrastructure
• Host operating system
• This layer runs on the physical or virtual machine.
The OS manages system resources and provides a
runtime environment for container engines.
• Container engine
• Also referred to as a runtime engine, the container
engine provides the execution environment for
container images (read-only templates containing
instructions for creating a container).
• Container engines run on top of the host OS and
virtualize the resources for containerized
applications.
• Containerized applications
• This final layer consists of the software
applications run in containers.
Docker and Containerization
• An open-source container runtime or
container engine (like Docker runtime
engine) is installed on the host’s operating
system and becomes the conduit for
containers to share an operating system
with other containers on the same
computing system.

• Docker images contain all the
dependencies needed to execute code
inside a container, so containers that
move between Docker environments
with the same OS work with no changes.
Why Docker?
• Why Containers?
• Containers simplify the development and delivery of distributed applications.
• Why Docker?
• With Docker, you get a single object that can reliably run anywhere.
• Docker makes it easy to deploy your code with standardized continuous
integration and delivery pipelines.
• Ship More Software Faster: Docker users on average ship software 7x more
frequently than non-Docker users.
• Standardize Operations: Small containerized applications make it easy to deploy,
identify issues, and roll back for remediation.
• Seamlessly Move: Docker-based applications can be seamlessly moved from local
development machines to production.
• Save Money: Docker containers make it easier to run more code on each server,
improving your utilization and saving you money.
Docker Architecture, Terms and
Tools
• Docker host: a physical or virtual machine running Linux (or another Docker-Engine compatible
OS).
• Docker Engine: a client/server application consisting of the Docker daemon, a Docker API that
interacts with the daemon, and a command-line interface (CLI) that talks to the daemon.
• Docker daemon: Docker daemon is a service that creates and manages Docker images, by using the
commands from the client. Essentially the Docker daemon serves as the control center for Docker
implementation.
• Docker client: provides the CLI that accesses the Docker API (a REST API) to communicate with
the Docker daemon over Unix sockets or a network interface. The client can be connected to a
daemon remotely, or a developer can run the daemon and client on the same computer
system.
• Docker objects: components of a Docker deployment that help package and distribute
applications. They include images, containers, networks, volumes, plug-ins and more.
Docker Architecture, Terms and
Tools
• Dockerfile: a text file containing instructions for how to build the Docker container image.
• Docker images: packaged executable application source code and all the tools, libraries and dependencies the
application code needs to run as a container. Docker images are read-only files.
• Docker images consist of layers, with each update adding a new top layer while preserving previous ones for rollbacks or
reuse.
• Multiple images can be created from a single base image, sharing common layers.
• Docker containers: live, running instances of Docker images.
• When a container runs, it adds a writable layer on top of the read-only image, allowing it to store temporary data and
state changes. However, containers are ephemeral, meaning they exist only as long as they are running. Once stopped or
deleted, any changes made inside the container that were not explicitly saved to persistent storage are lost.
• Multiple containers can be started from the same image, each running independently.
• Users can interact with containers, and administrators can adjust their settings and conditions by using Docker commands.
• Docker build: Docker build is a command for building Docker images.
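To make these terms concrete, a minimal Dockerfile for a hypothetical Python application might look like the following; the base image, file names, and command are assumptions for illustration:

```dockerfile
# Base image: each instruction below adds a read-only layer on top of it.
FROM python:3.12-slim

WORKDIR /app

# Copy the dependency list first so this layer is reused when only code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application source code.
COPY . .

# Command run when a container is started from this image.
CMD ["python", "app.py"]
```

Running `docker build -t myapp .` would build the image from this file, and `docker run myapp` would start a container from it, adding the writable layer described above.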
Docker Architecture, Terms and
Tools
• Docker registry: A Docker registry is a scalable, open-source storage
and distribution system for Docker images. It enables developers to
track image versions in repositories, using tags for identification.
• Docker Hub: Docker Hub is the public repository of Docker images,
calling itself the world's largest library and community for container
images.
Docker Registries
Docker Architecture, Terms and
Tools
• Docker Desktop: Docker Desktop is an application for Mac or Windows that
includes Docker Engine, Docker CLI client, Docker Compose, Kubernetes and
others. It also provides access to Docker Hub.

• Docker Compose: Docker Compose simplifies managing multi-container


applications by using a YAML file to define services and deploy them with a single
command on the same Docker host.
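A minimal docker-compose.yml illustrating the idea; the service names, image, and port mapping are assumptions:

```yaml
services:
  web:
    build: .            # build the app image from the local Dockerfile
    ports:
      - "8000:8000"     # host:container port mapping
    depends_on:
      - db              # start the database service first
  db:
    image: postgres:16  # off-the-shelf database image from Docker Hub
    environment:
      POSTGRES_PASSWORD: example
```

With this file in place, `docker compose up` starts both containers with a single command on the same Docker host.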

• Kubernetes: Managing a few containers with Docker Engine is simple, but large-
scale deployments require a container orchestration tool. Alternatives such as
Docker Swarm, Apache Mesos and Nomad exist, but Kubernetes is the industry standard.
Testing Docker
