Install Jenkins from Docker Image & Create Jenkins Pipeline | Video upload date:  · Duration: 14:45 · Language: EN

A quick guide to running Jenkins from a Docker image and building a basic Jenkins pipeline for CI/CD, with commands and a sample Jenkinsfile.

If you want Jenkins up and humming inside a container while you sip coffee and pretend you planned this, this guide gets you there. We will pull the official Jenkins image, run it with persistent storage, and then create a simple pipeline using a Jenkinsfile. No magic rituals required, just Docker and a little patience.

Start Jenkins in Docker

The first step is practical and boring in the best possible way. Pull the LTS Jenkins image and start a container with a mounted volume so your configuration survives container tantrums.

docker pull jenkins/jenkins:lts

docker run -d --name jenkins -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts

Notes and gotchas

  • If you use a host path for jenkins_home, replace the volume spec accordingly.
  • Port 8080 is the web UI. Port 50000 is for agent connections.
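If you prefer a host directory over a named volume, the run command looks roughly like this. The host path here is just an example; adjust to taste:

```shell
# Example host directory for Jenkins data (any writable path works).
mkdir -p /srv/jenkins_home

# The official image runs as UID 1000, so the directory must be writable by it.
sudo chown 1000:1000 /srv/jenkins_home

# Same run command as above, but binding the host path instead of a named volume.
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v /srv/jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts
```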

Unlock and complete the initial setup

Open your browser to http://localhost:8080 and follow the unlock wizard. Jenkins asks for an initial admin password, which lives in the mounted home directory at /var/jenkins_home/secrets/initialAdminPassword. When the UI asks, just cat that file and paste the password like a civilized human.
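Rather than digging through the volume on disk, you can read that password straight from the running container, assuming it is named jenkins as above:

```shell
# Print the generated initial admin password from inside the container.
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword

# The same password is also printed in the container logs on first boot.
docker logs jenkins
```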

Install plugins and add credentials

Install the recommended plugins to avoid mysterious dependency fights. After that add whatever credentials you need for your source control and deployment targets. Typical entries include SSH keys for Git or tokens for hosted Git providers.

  • Go to Manage Jenkins, then Manage Plugins, and pick the recommended set.
  • Under Credentials, add system-scoped items such as SSH username with private key, or secret text tokens.
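If you prefer scripting over clicking, the official image ships with jenkins-plugin-cli. A sketch, where the plugin list is just an example:

```shell
# Install plugins inside the running container (git and the Pipeline
# aggregator plugin shown as examples; pick the ones you actually need).
docker exec jenkins jenkins-plugin-cli --plugins git workflow-aggregator

# Restart Jenkins so the newly installed plugins are loaded.
docker restart jenkins
```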

Create a Pipeline job and supply a Jenkinsfile

You can create a Pipeline job in the classic UI, or maintain the Jenkinsfile in source control, which is more sane long term. A minimal declarative Jenkinsfile looks like this and is perfect for proving the pipeline works.

pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        sh 'echo Building'
      }
    }
    stage('Test') {
      steps {
        sh 'echo Testing'
      }
    }
    stage('Deploy') {
      steps {
        sh 'echo Deploying'
      }
    }
  }
}

Either paste that Jenkinsfile into the pipeline job configuration or point the job at a repository that contains it. If you use multibranch pipelines, Jenkins will discover branches automatically and behave like a pleasant CI overlord.

Run the pipeline and inspect results

Click Build Now or push a change to your repo. The console log shows step-by-step output, and you can find produced artifacts under the workspace directory inside the job. For a nicer visual experience, install Blue Ocean and enjoy the pretty graphs.

  • Console logs are at the job build page. Read them closely when things go wrong and curse softly.
  • The workspace is on disk under the mounted jenkins_home so artifacts survive container restarts.
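To poke at logs and workspaces from the host, a couple of example commands; the job name my-pipeline below is hypothetical, substitute your own:

```shell
# List job workspaces on the mounted Jenkins home.
docker exec jenkins ls /var/jenkins_home/workspace

# Dump the console log of build 1 of a job named my-pipeline (example name).
docker exec jenkins cat /var/jenkins_home/jobs/my-pipeline/builds/1/log
```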

Tips from someone who has broken things before

  • Keep your Jenkins home on a volume so upgrades and container swaps do not erase your work.
  • Install only the plugins you need to reduce upgrade pain.
  • Use credentials store rather than hard coding secrets in Jenkinsfiles.
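Since everything lives in jenkins_home, backing it up takes one throwaway container. A sketch, assuming the named volume from earlier:

```shell
# Archive the jenkins_home volume to the current directory using a
# disposable Alpine container; the volume is mounted read-only for safety.
docker run --rm \
  -v jenkins_home:/var/jenkins_home:ro \
  -v "$(pwd)":/backup \
  alpine tar czf /backup/jenkins_home-backup.tgz -C /var/jenkins_home .
```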

There you go: Jenkins in Docker with a working pipeline and enough sarcasm to stay awake while it builds. If you want to go next level, we can add agents, run pipeline stages in containers, or integrate a real build tool. For now, enjoy automated success and try not to break prod before lunch.

