If you want repeatable builds that do not trash your build host, and that actually run tests before you ship a container, you are in the right place. This guide walks through using Jenkins pipelines and Docker to build images, run unit tests, and push artifacts to a registry, all while keeping secrets out of public logs. Yes, it will still feel magical the first time it works.
Keep your Dockerfile layers small and cache friendly. That usually means copying only what you need early on and avoiding copying the entire source tree before running dependency installation. For local sanity checks, run a build with a command like:
docker build -t myapp .
If that fails, you can iterate quickly on the Dockerfile before you make the Jenkins pipeline fail in public.
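Here is a minimal sketch of a cache-friendly Dockerfile, assuming a Node.js app (the base image and entrypoint are illustrative, not requirements). Copying the dependency manifests before the rest of the source means the expensive install layer stays cached until your dependencies actually change.

# Assumed Node.js app; swap the base image for your stack.
FROM node:20-alpine
WORKDIR /app
# Copy only the dependency manifests first so this layer stays cached
# until package.json or the lockfile changes.
COPY package.json package-lock.json ./
RUN npm ci
# Source edits invalidate only the layers from here down.
COPY . .
CMD ["node", "server.js"]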
Install the Docker engine on the build host and run Jenkins as a service. Give the Jenkins user access to the Docker socket, or add Jenkins to the docker group, so the Jenkins process can run containers without a fist fight over permissions. If you prefer isolation, run agent containers or use the Kubernetes plugin to keep things tidy.
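On a typical systemd host, that setup is a few commands; this sketch assumes the service runs as a user named jenkins:

# Add the jenkins user to the docker group so it can reach the socket.
sudo usermod -aG docker jenkins
# Restart the service so the new group membership takes effect.
sudo systemctl restart jenkins
# Sanity check: this should list containers without permission errors.
sudo -u jenkins docker ps

Keep in mind that docker group membership is effectively root-equivalent on that host, which is one more reason containerized agents are attractive.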
Write a Jenkinsfile with stages for checkout, build, test, and push. Declarative syntax keeps both humans and Jenkins happy. Use a Docker agent for a consistent build environment, and run tests inside containers so your host stays clean.
pipeline {
    // Any image with the Docker CLI works as the agent; mounting the host's
    // socket lets the containerized agent drive the host's Docker daemon.
    agent {
        docker {
            image 'docker'
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    environment {
        DOCKER_USER = credentials('docker-user-id')
        DOCKER_PASS = credentials('docker-pass-id')
        // Tag with the registry prefix up front so the Push stage can use it as-is.
        IMAGE = 'registry.example.com/myapp'
    }
    stages {
        stage('Checkout') {
            steps {
                // Check out the revision that triggered this build.
                checkout scm
            }
        }
        stage('Build') {
            steps {
                sh 'docker build -t "$IMAGE" .'
            }
        }
        stage('Test') {
            steps {
                // Run the test suite inside the freshly built image.
                sh 'docker run --rm "$IMAGE" npm test'
            }
        }
        stage('Push') {
            steps {
                // --password-stdin keeps the secret out of the process list
                // and out of the build log.
                sh 'echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin registry.example.com'
                sh 'docker push "$IMAGE"'
            }
        }
    }
}
This is intentionally minimal. Replace the image string with a real build image for your stack, and bind the Jenkins credentials to names that match your credential IDs.
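If you would rather store the registry login as a single username/password credential, Jenkins splits it into paired variables automatically; the credential ID below is an assumption:

environment {
    // A usernamePassword credential yields REGISTRY_USR and REGISTRY_PSW.
    REGISTRY = credentials('docker-registry-creds')
}
// The push stage can then log in with the derived pair:
// sh 'echo "$REGISTRY_PSW" | docker login -u "$REGISTRY_USR" --password-stdin registry.example.com'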
Running the agent inside a container gives you a clean environment for each build. The Docker plugin and the Kubernetes plugin are both common choices, depending on whether you run a single build host or a cluster. Pick images that contain the build tools you need, or build a small helper image and reuse it across pipelines, as in the sketch below.
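For example, a stage can pin its own build image so heavyweight toolchains never touch the host or the other stages; the node:20-alpine tag here is an assumption, not a recommendation:

// Stage-level agent: only this stage runs inside the build image.
stage('Test') {
    agent { docker { image 'node:20-alpine' } }
    steps {
        sh 'npm ci && npm test'
    }
}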
Execute the pipeline to make sure the image builds and the tests pass. Use credentials from the Jenkins store and never hard-code registry passwords; the example above logs into the registry with environment-bound credentials and then pushes the image. Test the whole flow against a private registry before hitting the public one.
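A low-risk way to rehearse the push stage is against a throwaway local registry; the registry:2 image is the stock way to run one:

# Start a disposable registry on localhost:5000.
docker run -d -p 5000:5000 --name test-registry registry:2
# Retag and push against it to rehearse the Push stage.
docker tag registry.example.com/myapp localhost:5000/myapp
docker push localhost:5000/myapp
# Tear it down when you are done.
docker rm -f test-registry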
Combining Jenkins and Docker gives you a CI pipeline that moves code from repo to container image while running tests in a controlled environment. It takes a little setup, but once your Jenkinsfile behaves, the whole team wins, and you can spend less time debugging builds and more time blaming production for vague reasons.