This lightning talk will show you how simple it is to apply CI to the creation of Docker images, ensuring that each time the source is changed, a new image is created, tagged, and published. I will then show how easy it is to deploy containers from this image and run tests to verify the behaviour.
9. The New World Order: Containers Codify OS Config
[Diagram: four environments – Dev, QA, Staging, Prod – each running on its own server/VM with its own OS config. The app, its code, and its app-level OS config are packaged together and promoted unchanged through every environment.]
13. How Can You Use Jenkins & Docker Together?
1. Run Jenkins masters and slaves in Docker
2. Build, test, and deploy Docker images from Jenkins
14. 1. Run Jenkins Masters & Slaves in Docker
• Docker (Cloud) – use Docker images as standardized build environments to improve isolation and elasticity
• Docker Custom Build Environment – specify customized build environments as Docker containers
• CloudBees Docker Shared Config – manage Docker (or Swarm) host configuration centrally in CloudBees Jenkins Operations Center
15. 2. Build, Test, & Deploy Docker Images from Jenkins
• Build and Publish – build projects that have a Dockerfile and push the resultant tagged image to Docker Hub
• Docker Traceability – identify which build pushed a particular container and display the build / image details in Jenkins
• Docker Hub Notification – trigger downstream jobs when a tagged container is pushed to Docker Hub
17. Jenkins Workflow Primer
Jenkins-powered CD pipelines
[Diagram: a Jenkins Workflow pipeline – Commit → Build → Sonar Test → Selenium Test → Perf Test → Stage Deploy → Prod.]
Pipelines need:
• Branching
• Looping
• Restarts
• Checkpoints
• Manual input
18. Key Workflow Features
Entire flow is one concise Groovy script using the Workflow DSL
• For loops, try-finally, fork-join, …
Can restart Jenkins while a flow is running
Allocate slave nodes and workspaces
• As many as you want, when you want
Stages throttle concurrency of builds
Human input/approval integrated into the flow
Standard project concepts: SCM, artifacts, plugins
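These features can be sketched in a single Workflow script; the repository URL, slave label, and deploy script below are illustrative, not from the talk:

```groovy
// One concise Groovy script: stages, slave allocation, fork-join, and human input.
stage 'Build'
node('linux') {                        // allocate a slave node and workspace
    git 'https://github.com/example/app.git'
    sh 'mvn clean package'
}

stage 'Test'
parallel(                              // fork-join: run both test branches concurrently
    unit:  { node('linux') { sh 'mvn test' } },
    sonar: { node('linux') { sh 'mvn sonar:sonar' } }
)

stage 'Deploy'
input 'Deploy to staging?'             // human approval integrated into the flow
node('linux') { sh './deploy.sh staging' }
```

Because the flow runs detached from any one executor, the script can pause at the `input` step indefinitely and survive a master restart.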
20. Pipeline Stages
[Diagram: pipeline stages – Build and Unit Test App (SCM checkout, mvn package → war), Sonar Analysis (mvn sonar:sonar), Int Test (mvn verify), Prepare Release, Build Docker Image (docker build, docker tag → img), Test Docker Image (docker run, notify, cucumber, image.inside), Publish Docker Image (docker push, withServer).]
21. Build, unit test and package
[Same pipeline stage diagram as slide 20, highlighting the Build and Unit Test App stage.]
22. Build, unit test and package
stage 'Build App'                                                 // specify the stage name
node('docker') {                                                  // specify the slave label
    docker.image('maven:3.3.3-jdk-8').inside('-v /data:/data') {  // custom build env; mount volume from slave
        sh 'mkdir -p /data/mvn'                                   // .m2 repo location
        writeFile file: 'settings.xml', text: '(………)'
        git 'https://github.com/cloudbees/mobile-deposit-api.git'
        sh 'mvn -s settings.xml clean package'                    // checkout and build
        …
24. Test the app
[Same pipeline stage diagram as slide 20, highlighting the Sonar Analysis and Int Test stages.]
25. Test the app
node('docker') {
    docker.image('maven:3.3.3-jdk-8').inside('-v /data:/data') {  // same env as the build
        …
        stage 'Sonar analysis'
        sh 'mvn -s settings.xml sonar:sonar'                      // Sonar tests
        stage 'Integration-test'
        sh 'mvn -s settings.xml verify'                           // run API tests
        step([$class: 'JUnitResultArchiver', testResults: '**/target/surefire-reports/TEST-*.xml'])
    }
    …
26. Build, test and publish Docker image
[Same pipeline stage diagram as slide 20, highlighting the Build, Test, and Publish Docker Image stages.]
30. Traceability
Builds on Jenkins' existing artifact traceability
Tracks the creation and subsequent use of Docker containers in Jenkins
Combine with artifact fingerprinting for a comprehensive solution
Each build shows the image fingerprints it created
Identify which build pushed a particular container and display the build / image details in Jenkins
[Screenshot: image fingerprints]
31. Traceability – registering events
Jenkins can track actions against this image, such as:
• Creating a container
• Container events such as start/stop
To achieve this, it is necessary to call the Traceability API – see $(JENKINS_URL)/docker-traceability/api/
There are two endpoints to submit events to:
/docker-traceability/submitContainerStatus – submits the current container status snapshot with a minimal set of parameters. The output of docker inspect $(containerId) can be submitted directly to the Jenkins server with this command.
/docker-traceability/submitReport – submits a report using the extended JSON API. Scripts can use this endpoint to submit the full available info about the container and its environment in a single command.
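As a sketch of the first endpoint, the output of docker inspect can be posted from a shell step. The image name and the inspectData parameter name are assumptions based on the API description above, so check $(JENKINS_URL)/docker-traceability/api/ for the exact form:

```groovy
node('docker') {
    // Run a container, then register its inspect snapshot with the Traceability API.
    // The image name and the 'inspectData' form parameter are illustrative assumptions.
    sh '''
      CID=$(docker run -d mycompany/mobile-deposit-api:1.0)
      docker inspect $CID | curl -X POST \\
        --data-urlencode inspectData@- \\
        "$JENKINS_URL/docker-traceability/submitContainerStatus"
    '''
}
```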
36. Docker Hub Notification
Trigger downstream jobs when a tagged container is pushed to Docker Hub
The Docker Hub Notification Trigger plugin lets you configure Jenkins to trigger builds when an image is pushed to Docker Hub – e.g. to run verification for the container.
What are the steps?
• Set up a webhook account for notification
• Set up your Docker registry to make callbacks on image events
• Set up your builds
37. Docker Hub Notification – Docker Registry Webhook
The webhook URL takes the format:
http://<user>:<token>@<jenkins_url>/dockerhub-webhook/notify
40. Docker and Jenkins with Workflow is the proven CD Platform
Workflow CD pipeline triggers:
• New application code (i.e. a feature, bug fix, etc.)
• Updated certified stack (e.g. a security fix in Linux)
… either will lead to a new gold image being built and made available for TESTING, STAGING, and PRODUCTION – all taking place in a standardized, consistent environment.
[Diagram: certified Docker images with their OS config (Ubuntu, etc.) plus the app code (from git, etc.) feed Jenkins Workflow, which produces the company "gold" Docker image (roughly one per app).]
41. CloudBees: Leading the Way for Docker and CD
• Docker Workflow – provides first-class support for Jenkins Workflow to build real-world CD pipelines for containerized applications using Jenkins and Docker
• Build and Publish – builds projects that have a Dockerfile and pushes the resultant tagged image to Docker Hub
• Docker Hub Notification – triggers downstream jobs when a tagged container is pushed to Docker Hub
• Docker Traceability – identifies which build pushed a particular container that is running in production and displays that on the Jenkins build page
• Docker – uses Docker containers as standardized build environments to improve isolation and elasticity (Dockerized build slaves)
• Docker Custom Build Environment – specifies customized build environments as Docker containers
42. Getting started
Docker plugin documentation:
http://documentation.cloudbees.com/docs/cje-user-guide/docker-workflow.html
Workflow tutorial:
https://github.com/jenkinsci/workflow-plugin/blob/master/TUTORIAL.md
Example source code:
https://github.com/harniman/mobile-deposit-api/blob/master/flow.groovy
43. How Do You Manage CD at Enterprise Scale?
CloudBees Jenkins Platform
Jenkins at Enterprise Scale for CI and CD
About me:
I work for CloudBees as a Solution Architect, helping our customers understand how the CloudBees Jenkins Platform can help them achieve their goals.
I have been in engineering for over 20 years and have performed various Java development and architecture roles, including a stint as a build engineer and as a lead DevOps engineer.
I came to CloudBees from Sky, where I had responsibility for the online video platform. As part of my time there I designed and built an online platform for sales and service using Infrastructure as Code principles – DevOps before it was called that! QAs deployed many times a day via a self-service mechanism, with database redeployment/upgrade and flexible mocking options. We deployed to prod weekly with a full VM tear-down and rebuild via a scripted "next, next" approach.
I am interested in all things automation and DevOps, and especially how they apply in the cloud.
We've heard this meme over and over: Marc Andreessen said "software is eating the world."
What does this mean? Wherever we look, products are defined as much by the software they run as by their physical appearance. For instance, is a car defined just by its style, or by the driver-automation features implemented in software, such as auto parking, lane assist, adaptive cruise control, and self-driving? What about the recent emissions scandal involving a certain German manufacturer – was that attributed to hardware or software?
The software stakes have never been higher. Quality needs have never been higher – who wants their self-driving car to crash? – but speed to market of new features becomes critical as software becomes a key differentiator.
So, how do we do that? How do we deliver better software faster?
How do we take code developed by developers and rapidly move it to production as new features for users?
Whilst maintaining quality.
Well, Automation is the key.
Just as the Tesla Motor Company built a fully automated factory floor to produce their leading edge cars, we need to build a fully automated software factory using automation technologies.
Let's look at the advantages Docker brings to speeding up this process.
A typical full stack configuration looks like this:
Develop Code
Commit to SCM
Build and test app with Jenkins
Provision environment with Puppet
Test
Environment and app code are not bound tightly together. Environment changes do not propagate with app changes.
Testers find bugs; developers have to spend time investigating why it worked in DEV and not in PROD, and then re-working. This is not fast.
Use Docker to manage the environment config alongside the application. Propagate the same configuration across all environments.
If it works in Dev, it will work in prod.
Focus on new innovation rather than fault finding.
What does this look like in reality?
We package all app related OS config with the application code.
The same tested package is propagated across the environments.
This takes the single binary concept to the next level.
(NB we still have to manage data and network layers and provide consistent configuration; other tooling can address these needs.)
Images need to be built using a reliable, repeatable and automated process
These days it is not acceptable to build application artifacts by hand – so Docker Images need the same type of automation.
This is where Jenkins comes to our rescue
Jenkins is widely used for application CI and drives many CD initiatives (RebelLabs research showed 70% of Java projects use Jenkins).
Lets look at how Jenkins and Docker can be used together to take your delivery process to the next level.
Two patterns of use:
Use Docker to provide run-time environments for Jenkins components – Slaves and Masters (And Operations Center if running CloudBees Jenkins Platform)
Use Jenkins to build and test Docker Images
Firstly Docker can be leveraged as the runtime platform for Jenkins components such as Masters and Slaves.
There are standard Docker images for masters and for the CloudBees Jenkins Platform components. Docker can also be used to provision slave nodes on demand using the Docker Slaves plugin. Various images exist, or you can roll your own with all the required tools. It also integrates with Swarm and Kubernetes for scaling across many Docker hosts.
Sometimes you want a very controlled build environment – think clean room – or you need certain pre-configured credentials or other config to exist. The Custom Build Environment plugin allows you to achieve just this. Within your slave, a container is spun up from a predefined image, filesystems are mounted from the slave, and the build steps are executed within the container.
Users of the CloudBees Jenkins Platform are able to leverage the Shared Config capability to distribute the docker host and image/label configuration across the whole cluster of masters from a central point.
I won’t go into details of these now, as we want to focus on pipelines.
The second area that Jenkins and Docker deliver is the ability to create a fully automated pipeline to Build, Test and Deploy Docker images.
The Build and Publish plugin provides an easy-to-use abstraction of the Docker command line and adds Jenkins build steps for build, tag, push, etc.
Docker Traceability extends Jenkins' existing fingerprint capability, allowing identification of the underlying build that created a given image and tracing back from a running container.
Docker Hub Notification addresses two needs: how do I trigger a redeployment when an image is pushed, and, given Docker's layered approach, how do I rebuild my image if an upstream layer changes – i.e. my company pushes a new ubuntu-secure-base?
These plugins can be used in regular Jenkins jobs to assemble pipelines, but I want to show you how super simple this is using Jenkins Workflow.
Workflow is a new job type, launched in Nov 2014.
Workflow is available to OSS users.
A job now becomes the whole pipeline, with the power to model complex scenarios such as branching, looping, and handling human input.
A workflow also runs in a detached manner, which means that, as long as the real work is being performed on executors, it survives a master restart.
Jenkins Workflow has some really cool features…
Workflow has the concept of Stages.
This screen shot is using the Stage View plugin from CloudBees Jenkins Platform to show how a typical Docker pipeline might look.
Stages are fully customizable.
Let's look at this example pipeline in more detail.
We are going to build the app just like we do today – this will compile and unit test, produce a war (mvn package), run Sonar analysis (mvn sonar:sonar), and then integration test (mvn verify).
The difference here, is we are going to run this inside a specific Docker container using the Custom Build Environment plugin
We will then prepare the release – in this case it is grabbing the version from the POM
Now comes the real Docker integration.
We will create the docker image and tag it.
We then spin up a container from this image, notify Jenkins of the container (for traceability), and run tests against it – these could be functional, security, or performance tests; maybe you will have multiple test phases running in parallel against multiple containers.
If the tests pass, we then publish the image to our registry – public or private – the choice is yours.
Let's look at the build step in more detail.
We specify the stage name
Next, we need to run these steps on a slave. This slave needs docker installed.
Then we define the container we need to run the build in
And mount an additional data volume – we do this to provide a common Maven repo cache.
Which is why we use a custom settings file to point to the repo location
And then we perform a git checkout, and run mvn from the command line
A note about Docker slaves:
In the global config, you specify various Docker images that can be used as slaves and map them to labels.
There is an existing docker-in-docker plugin that I am using to spin up a container that can also run Docker.
Next we will look at the sonar and integration test steps
We add stage names
And then we run the mvn targets from the command line
This is typical workflow pipeline construction
Now we’ll focus on the image creation and use.
First we need to ensure we have access to a docker host.
You can see here I am referencing Jenkins credentials via their ID. The Docker plugins are fully integrated with the Jenkins Credentials API.
Within this block that performs the bind, I will then execute the workflow steps
You can see more stages defined – I’m not going to cover these in detail – it’s the same as before
Next we need to ensure we are executing commands within the context of the correct directory on the filesystem – the one that contains the Dockerfile.
We then invoke the build – providing the tag at the same time – note we obtain a reference to the image
Once the image is built, we want to provision a container. Note we also grab a handle to this so we can address it later.
The next step is to notify Jenkins that we have created a container from this image (I’ll show more details a bit later)
If the tests pass, we bind to the registry (the default is Docker Hub) – note we also supply a credentials reference here – and then push the image.
And voila, we have a tagged version that is fully tested.
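The narrated steps might look like the following Workflow sketch; the host address, image name, tag, and credential IDs are illustrative, not taken from the talk:

```groovy
docker.withServer('tcp://docker-host:2376', 'docker-host-credentials-id') {  // bind to a Docker host via a Jenkins credentials ID
    stage 'Build Docker Image'
    dir('mobile-deposit-api') {                       // directory containing the Dockerfile
        def img = docker.build('mycompany/mobile-deposit-api:1.0')  // build and tag; keep a reference to the image

        stage 'Test Docker Image'
        def container = img.run('-p 8080:8080')       // provision a container and keep a handle to it
        try {
            sh 'mvn -s settings.xml verify'           // run tests against the running container
        } finally {
            container.stop()
        }

        stage 'Publish Docker Image'
        docker.withRegistry('https://index.docker.io/v1/', 'docker-hub-credentials-id') {
            img.push()                                // push the tested, tagged image
        }
    }
}
```

The try/finally ensures the test container is stopped even when the tests fail.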
So I mentioned earlier I would talk more about traceability.
Identify which build pushed a particular container and display the build / image details in Jenkins
After we have spun up a container, we need to call the Jenkins traceability endpoint with details.
Fortunately we can pass in the output of “docker inspect”
What does this give us?
On the left hand menu we have a new Docker Traceability item
It shows the containers known to Jenkins
Clicking on one reveals the container's events – as logged via the Traceability API – and the build that created the image, so you can trace back to the source.
A final word on Docker Hub Notifications
Trigger downstream jobs when a tagged container is pushed to Docker Hub
You need to configure the Docker registry with a webhook and provide the user and token for access.
Then you configure the trigger conditions on the jobs.
It can either be automatic, from any Docker image used by the build – i.e. deploy a container from image x –
or you can list the dependent images explicitly.
OK, so in conclusion
Jenkins and Docker can be your key to Continuous Delivery.
The same automation engine that you already know and use for CI can fully power your docker based CD process as well.
Jenkins supports the creation and management of complex Delivery Pipelines
CloudBees has been working closely with Docker, the company, to create a number of Jenkins plugins that ensure that Docker is a first-class entity in the CD/DevOps ecosystem.
How can you get started?
Documentation on the Docker extensions for Workflow
Workflow Tutorial
Take a look at the example application and pipeline on my Github
So, how do you manage Jenkins at Enterprise scale?
If you are going to use Jenkins for CI or CD, then it will become a crucial part of your application delivery environment. You need to be confident that it will be there when needed.
That’s where we come in.
<click build>
CloudBees is the enterprise Jenkins company.
We offer subscription based access to CloudBees Jenkins Enterprise which is an enhanced, robust, and highly available version of Jenkins that is built on the same open source core that you know and trust.