How to build with Jenkins and Docker: an example | Duration: 11:11 · Language: EN

A compact tutorial on building Docker images with Jenkins pipelines, running tests, and pushing images to a registry, with CI pipeline best practices and tips.

Quick overview of the workflow

If you want repeatable builds that do not trash your build host and that actually run tests before you ship a container, you are in the right place. This guide walks through using Jenkins pipelines and Docker to build images, run unit tests, and push artifacts to a registry, all while keeping secrets out of public logs. Yes, it will still feel magical the first time it works.

What you need before you start

  • Source code in a Git repository next to a Dockerfile that defines the build steps
  • A host or VM with Docker installed and a running Jenkins service
  • Jenkins credentials stored in the Jenkins credential store for registry access

Prepare the repo and Dockerfile

Keep your Dockerfile layers small and cache friendly. That usually means copying only what you need early on and avoiding copying the entire source tree before running dependency installation. For a local sanity check, run a build with a command like

docker build -t myapp .

If that fails, you can iterate quickly on the Dockerfile before the Jenkins pipeline gets a chance to fail in public.
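As a sketch of what cache-friendly layering looks like, here is an illustrative Dockerfile for a Node.js app (the base image and commands are assumptions; adapt them to your stack). Copying the dependency manifests before the rest of the source means the slow install layer is reused until the manifests actually change:

```dockerfile
# Illustrative Node.js example; swap the base image and commands for your stack
FROM node:20-alpine

WORKDIR /app

# Copy only the dependency manifests first so this layer stays cached
# until package.json or package-lock.json changes
COPY package.json package-lock.json ./
RUN npm ci

# Copy the rest of the source after dependencies are installed
COPY . .

CMD ["npm", "start"]
```

Editing application code now only invalidates the final COPY layer, not the dependency install.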

Install Docker and Jenkins the sensible way

Install the Docker engine on the build host and run Jenkins as a service. Give the Jenkins user access to the Docker socket, typically by adding it to the docker group, so the Jenkins process can run containers without a fist fight with permissions. If you prefer isolation, run agent containers or use the Kubernetes plugin to keep things tidy.
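On a typical Linux host, granting socket access looks something like the following sketch (this assumes the Jenkins service user is named jenkins and that systemd manages the service; adjust for your distribution):

```shell
# Add the jenkins user to the docker group (created by the Docker package)
sudo usermod -aG docker jenkins

# Restart Jenkins so the new group membership takes effect
sudo systemctl restart jenkins

# Verify: the docker group should appear in the jenkins user's groups
id -nG jenkins
```

Note that docker group membership is effectively root-equivalent on the host, which is one more argument for isolated agents if you share the machine.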

Create a declarative Jenkinsfile

Write a Jenkinsfile with stages for checkout, build, test, and push. Declarative syntax keeps humans and Jenkins happy. Use a Docker agent for a consistent build environment, and run tests inside containers so your host stays clean.

Minimal example Jenkinsfile

pipeline {
  // Docker-capable agent; mount the host socket so the container can run docker commands
  agent {
    docker {
      image 'docker:latest'
      args '-v /var/run/docker.sock:/var/run/docker.sock'
    }
  }
  environment {
    DOCKER_USER = credentials('docker-user-id')
    DOCKER_PASS = credentials('docker-pass-id')
    IMAGE = 'registry.example.com/myapp'
  }
  stages {
    stage('Checkout') {
      steps {
        // Check out the branch and commit that triggered this build
        checkout scm
      }
    }
    stage('Build') {
      steps {
        // Tag with the registry name up front so the push stage finds the image
        sh 'docker build -t "$IMAGE:$BUILD_NUMBER" .'
      }
    }
    stage('Test') {
      steps {
        // Run unit tests inside the freshly built image
        sh 'docker run --rm "$IMAGE:$BUILD_NUMBER" npm test'
      }
    }
    stage('Push') {
      steps {
        sh 'echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin registry.example.com'
        sh 'docker push "$IMAGE:$BUILD_NUMBER"'
      }
    }
  }
}

This is intentionally minimal. Replace the image string with a real build image for your stack, and bind the Jenkins credentials to names that match your credential IDs.

Use a Docker agent for reproducible builds

Running the agent inside a container gives you a clean environment for each build. The Docker plugin and the Kubernetes plugin are both common choices, depending on whether you run a single build host or a cluster. Pick images that contain the build tools you need, or build a small helper image and reuse it across pipelines.
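As a sketch, a small helper image might layer your build tooling on top of the Docker CLI image; the packages below are illustrative for a Node project:

```dockerfile
# Hypothetical helper agent image: Docker CLI plus Node toolchain and git
FROM docker:latest
RUN apk add --no-cache nodejs npm git
```

Build it once, push it to your registry, and reference it in the agent block of every Jenkinsfile instead of installing tools in each pipeline run.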

Run the pipeline and push safely

Execute the pipeline to make sure the image builds and the tests pass. Use credentials from the Jenkins store, and never hard code registry passwords. The example above logs into the registry with environment-bound credentials and then pushes the image. Test the whole flow against a private registry before hitting the public one.
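One way to rehearse the push stage locally is with a throwaway registry. The registry:2 image is Docker's standard distribution registry; the port and image name below are just examples:

```shell
# Start a disposable local registry on port 5000
docker run -d -p 5000:5000 --name test-registry registry:2

# Tag the locally built image for that registry and push it
docker tag myapp localhost:5000/myapp
docker push localhost:5000/myapp

# Clean up when done
docker rm -f test-registry
```

Once tagging, login, and push all behave here, pointing the pipeline at the real registry is a one-line change.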

Common gotchas and quick tips

  • If Jenkins cannot talk to Docker check socket permissions or the Docker group membership for the Jenkins user
  • Keep layers cache friendly to make incremental builds faster
  • Run unit tests inside the build container to avoid host drift and flaky failures
  • Store credentials in Jenkins and bind them into the pipeline to avoid leaking secrets
  • Move from local testing to a shared Jenkins server once the pipeline is stable

Wrap up

Combining Jenkins and Docker gives you a CI pipeline that moves code from repo to container images while running tests in a controlled environment. It takes a little setup but once your Jenkinsfile behaves the whole team wins and you can spend less time debugging builds and more time blaming production for vague reasons.

