Jenkins Course
Pipeline Syntax
Every directive in a Declarative pipeline has a specific job. This lesson is your reference — the complete syntax with real examples, so you know exactly what to write and why.
This lesson covers
agent variants → environment → options → triggers → when conditions → parallel stages → input → the Snippet Generator — every syntax feature with working examples
Lesson 13 showed you a complete Jenkinsfile and explained what each block does at a high level. This lesson goes one level deeper — showing you the variants within each directive, the options inside each block, and the patterns that make pipelines production-grade.
Think of this lesson as the reference card you keep open when writing a new Jenkinsfile. By the end, you won't need to search the documentation for basic syntax — you'll have it.
agent — Where Your Pipeline Runs
The agent directive has several variants. You'll use different ones depending on where and how you want your pipeline to run.
// Variant 1: any — run on any available agent
// Simple, but not recommended for production — no control over the environment
agent any
// Variant 2: label — run on an agent with a specific label
// Most common production choice — routes to the right machine
agent { label 'linux && docker' }
// Variant 3: none — no global agent
// Use this when different stages need different agents
// Each stage must then declare its own agent
agent none
// Variant 4: docker — run inside a Docker container on the agent
// Jenkins pulls the image, runs the pipeline inside the container, then removes it
// Guarantees a clean, reproducible environment for every build
agent {
    docker {
        image 'maven:3.9-eclipse-temurin-21'  // the Docker image to use
        args '-v $HOME/.m2:/root/.m2'         // mount Maven cache to speed up builds
    }
}
// Variant 5: dockerfile — build and use your own Dockerfile
// Jenkins builds the image from the Dockerfile in your repo, then uses it
agent {
    dockerfile {
        filename 'docker/Dockerfile.ci'  // path to Dockerfile inside the repo
        dir 'docker'                     // context directory for the Docker build
    }
}
When to use each variant
- any — learning and experiments only. Never in production.
- label — most production pipelines. Routes to the right machine type.
- none — when stages need to run on different agent types (e.g. build on Linux, test on Windows).
- docker — when you want a clean, isolated build environment. Popular for teams practicing containerised CI.
- dockerfile — when your build environment is complex enough to need a custom Dockerfile.
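The `none` variant only makes sense together with stage-level agents. A sketch of that pattern (the labels and commands here are illustrative, not from the examples above):

```groovy
pipeline {
    agent none  // no global agent — each stage declares its own

    stages {
        stage('Build') {
            agent { label 'linux' }    // hypothetical Linux build agent
            steps { sh './gradlew assemble' }
        }
        stage('Windows Tests') {
            agent { label 'windows' }  // hypothetical Windows test agent
            steps { bat 'gradlew.bat test' }   // bat, not sh, on Windows agents
        }
    }
}
```

Note that each stage allocates its own executor, so the workspace is not automatically shared between them — use `stash`/`unstash` to pass files across agents.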
environment — Variables That Travel Everywhere
The environment block can be placed at the pipeline level (applies to all stages) or inside a specific stage (applies only to that stage). Both approaches are valid and are often used together.
pipeline {
    agent { label 'linux' }

    // Pipeline-level environment — every stage can read these
    environment {
        APP_NAME     = 'order-service'
        REGISTRY_URL = 'registry.acmecorp.com'

        // credentials() injects a username/password credential
        // Creates two variables: DOCKER_CREDS_USR and DOCKER_CREDS_PSW
        DOCKER_CREDS = credentials('docker-hub-creds')

        // For a secret text credential, it creates a single variable
        SLACK_TOKEN = credentials('slack-webhook-token')
    }

    stages {
        stage('Build') {
            // Stage-level environment — only available in this stage
            // Overrides pipeline-level variables if the name matches
            environment {
                BUILD_TARGET = 'production'
            }
            steps {
                // Reference pipeline-level and stage-level vars the same way
                echo "Building ${APP_NAME} for target: ${BUILD_TARGET}"
                sh "docker build -t ${REGISTRY_URL}/${APP_NAME}:${BUILD_NUMBER} ."
            }
        }
        stage('Test') {
            steps {
                // BUILD_TARGET is not available here — it was stage-scoped
                // APP_NAME is available — it was pipeline-scoped
                echo "Testing ${APP_NAME}"
                sh './gradlew test'
            }
        }
    }
}
Important: Never put actual secret values in the environment block directly. Always use credentials() to reference a stored Jenkins credential. A hardcoded password in your Jenkinsfile is a hardcoded password in your Git history — forever.
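To make the `credentials()` behaviour concrete, here is a sketch of how the derived `_USR` and `_PSW` variables might be used for a registry login (the credential ID and registry URL are assumptions, matching the example above):

```groovy
pipeline {
    agent { label 'linux' }
    environment {
        // assumed credential ID — creates DOCKER_CREDS_USR and DOCKER_CREDS_PSW
        DOCKER_CREDS = credentials('docker-hub-creds')
    }
    stages {
        stage('Push') {
            steps {
                // Single-quoted sh string: the SHELL expands the variables,
                // so the secret never passes through Groovy string interpolation
                // (Groovy interpolation can leak secrets into logs and process lists)
                sh 'echo "$DOCKER_CREDS_PSW" | docker login registry.acmecorp.com -u "$DOCKER_CREDS_USR" --password-stdin'
            }
        }
    }
}
```

The single-quote detail matters: with double quotes, Groovy would substitute the secret into the command string before the shell ever runs it.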
options — Pipeline Behaviour Settings
The options block controls how the pipeline itself behaves — timeouts, retries, build retention, and more. These are the options you'll reach for most often:
pipeline {
    agent { label 'linux' }

    options {
        // Abort the pipeline if it runs longer than this
        timeout(time: 45, unit: 'MINUTES')

        // Keep only the last 30 builds — saves disk space on the Jenkins controller
        buildDiscarder(logRotator(numToKeepStr: '30'))

        // Add a timestamp to every line of console output
        // Makes it easy to spot which step is slow
        timestamps()

        // If this pipeline is already running and a new build is triggered,
        // skip the new build rather than queue it
        // Useful for long-running pipelines triggered by frequent commits
        disableConcurrentBuilds()

        // Retry the entire pipeline up to 2 times on failure before giving up
        // Use sparingly — usually better to fix the flakiness than mask it
        retry(2)

        // Skip the automatic SCM checkout that Jenkins does at the start
        // Use when you want full control over how/when code is checked out
        skipDefaultCheckout()
    }

    stages {
        stage('Build') {
            // options can also be set at the stage level
            options {
                // Retry just this stage up to 3 times if it fails
                retry(3)
                // Timeout just this stage — useful for flaky integration tests
                timeout(time: 10, unit: 'MINUTES')
            }
            steps {
                sh './gradlew build'
            }
        }
    }
}
triggers — What Starts the Pipeline Automatically
Without a trigger, your pipeline only runs when someone clicks Build Now. Triggers make it automatic. There are three main types:
pipeline {
    agent { label 'linux' }

    triggers {
        // Trigger 1: cron — run on a fixed schedule using cron syntax
        // This runs every night at midnight
        // H means Jenkins picks a random minute to spread load across the hour
        cron('H 0 * * *')

        // Trigger 2: pollSCM — check the repo for new commits on a schedule
        // This checks every 5 minutes — if a new commit is found, build it
        // Less efficient than webhooks (Lesson 19) but works without external access
        pollSCM('H/5 * * * *')

        // Trigger 3: upstream — run when another Jenkins job completes successfully
        // Useful for chaining jobs: when 'order-service-test' passes, run this pipeline
        upstream(upstreamProjects: 'order-service-test', threshold: hudson.model.Result.SUCCESS)
    }

    stages {
        stage('Build') {
            steps {
                sh './gradlew build'
            }
        }
    }
}
Note on cron syntax: H 0 * * * reads as: H (random minute) at hour 0, every day of the month, every month, every day of the week. The five fields are minute, hour, day-of-month, month, day-of-week. The H symbol is Jenkins-specific — it hashes the job name into a consistent pseudo-random value, so jobs with the same schedule don't all fire at exactly midnight and overload the server.
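A few schedule patterns worth keeping to hand. These are illustrative examples of the cron syntax described above, not additional triggers to combine blindly:

```groovy
triggers {
    // cron('H 0 * * *')     — nightly, at a random minute within hour 0
    // cron('H H * * *')     — once a day, Jenkins picks both hour and minute
    // cron('H/15 * * * *')  — roughly every 15 minutes
    // cron('H 8 * * 1-5')   — weekday mornings around 08:00
    cron('H 8 * * 1-5')
}
```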
when — Conditional Stages
The when directive is one of the most powerful tools in Declarative syntax. It lets you skip entire stages based on conditions — without messy if/else logic inside steps.
pipeline {
    agent { label 'linux' }
    stages {
        // Only run this stage when building the main branch
        stage('Deploy to Production') {
            when { branch 'main' }
            steps { sh './deploy.sh production' }
        }

        // Only run when an environment variable equals a specific value
        stage('Integration Tests') {
            when { environment name: 'RUN_INTEGRATION', value: 'true' }
            steps { sh './gradlew integrationTest' }
        }

        // Only run when a specific file has changed in this commit
        // Useful for monorepos — only rebuild the service that changed
        stage('Rebuild Frontend') {
            when { changeset 'frontend/**' }
            steps { sh 'npm run build' }
        }

        // Combine conditions with allOf (AND) — all must be true
        // changelog matches the commit message against a regular expression
        stage('Deploy to Prod — Gated') {
            when {
                allOf {
                    branch 'main'
                    not { changelog '.*\\[skip ci\\].*' }
                }
            }
            steps { sh './deploy.sh production' }
        }

        // Combine conditions with anyOf (OR) — at least one must be true
        stage('Notify Team') {
            when {
                anyOf {
                    branch 'main'
                    branch 'release/*'
                }
            }
            steps { echo 'Notifying team of release branch build' }
        }
    }
}
parallel — Running Stages at the Same Time
By default, stages run one after another. If your unit tests take 2 minutes and your integration tests take 3 minutes, running them sequentially costs 5 minutes. Running them in parallel costs 3 minutes. On a busy team, this adds up fast.
Sequential (5 min total) vs Parallel (3 min total)
pipeline {
    agent { label 'linux' }
    stages {
        stage('Checkout') {
            steps { checkout scm }
        }

        // Parallel stage — runs the inner stages simultaneously
        // Each inner stage can run on a different agent if needed
        stage('Run Tests in Parallel') {
            // failFast true stops all other branches immediately when one fails
            // Without it (the default), all branches finish before the stage fails
            failFast true
            parallel {
                stage('Unit Tests') {
                    // This branch runs on any linux agent
                    agent { label 'linux' }
                    steps {
                        sh './gradlew test'
                    }
                    post {
                        always { junit 'build/test-results/test/**/*.xml' }
                    }
                }
                stage('Integration Tests') {
                    // This branch runs on a different agent simultaneously
                    agent { label 'linux && docker' }
                    steps {
                        sh './gradlew integrationTest'
                    }
                    post {
                        always { junit 'build/test-results/integrationTest/**/*.xml' }
                    }
                }
                stage('Static Analysis') {
                    agent { label 'linux' }
                    steps {
                        sh './gradlew checkstyle pmd'
                    }
                }
            }
        }

        stage('Deploy') {
            when { branch 'main' }
            steps { sh './deploy.sh staging' }
        }
    }
}
input — Human Approval Gates
Sometimes you want a pipeline to pause and wait for a human to approve before continuing — particularly before deploying to production. The input directive does exactly this. The pipeline pauses, the approver sees a button in the Jenkins UI, and the pipeline continues only when they click it.
pipeline {
    agent { label 'linux' }
    stages {
        stage('Test') {
            steps { sh './gradlew test' }
        }

        stage('Approve Production Deploy') {
            // input pauses the pipeline and waits for a human
            // The executor is freed while waiting — it doesn't block an agent slot
            input {
                // The message shown to the approver in the Jenkins UI
                message 'Deploy payments-service to PRODUCTION?'

                // Text on the approval button
                ok 'Yes, deploy now'

                // Only these users/groups can approve — everyone else sees it but can't approve
                submitter 'admin,release-managers'

                // Optional: collect additional input from the approver
                parameters {
                    choice(
                        name: 'DEPLOY_REGION',
                        choices: ['us-east-1', 'eu-west-1', 'ap-southeast-1'],
                        description: 'Which region to deploy to?'
                    )
                }
            }
            steps {
                // DEPLOY_REGION is available here — it was collected in the input block above
                echo "Deploying to region: ${DEPLOY_REGION}"
                sh "./deploy.sh production ${DEPLOY_REGION}"
            }
        }
    }
}
Where to practice all syntax examples: The Jenkins documentation has a live Snippet Generator built into every Jenkins install. Go to your Pipeline job → click Pipeline Syntax in the left sidebar → choose any step from the dropdown — Jenkins generates the correct syntax for you automatically. This is the fastest way to get the exact syntax for unfamiliar steps. Find it at http://YOUR-JENKINS-URL/pipeline-syntax/.
The Snippet Generator — Jenkins' Built-In Syntax Helper
You don't need to memorise every step's syntax. Jenkins ships with a tool called the Snippet Generator that lets you fill in a form and get the correct Groovy code for any pipeline step. It's one of the most underused features in Jenkins.
Choose a step, fill in the options, click Generate — and Jenkins gives you copy-paste ready code. Use this for any step whose exact syntax you're unsure about. It's available at /pipeline-syntax on any running Jenkins instance.
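For instance, filling in the form for the archiveArtifacts step might produce something like the following. The exact output depends on which options you tick, and the paths here are illustrative:

```groovy
// Possible Snippet Generator output for "archiveArtifacts: Archive the artifacts"
archiveArtifacts artifacts: 'build/libs/*.jar', fingerprint: true, allowEmptyArchive: false
```

Paste the generated line straight into a steps block of your Jenkinsfile and adjust the paths to match your project.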
Teacher's Note
Use the Snippet Generator freely — even experienced Jenkins engineers use it. Remembering exact syntax is less valuable than knowing which directive to reach for and why.
Practice Questions
1. What keyword do you use in a Declarative pipeline to run multiple stages at the same time?
2. Which Declarative pipeline directive pauses the build and waits for a human to approve before continuing?
3. Inside a when block, which keyword requires ALL conditions to be true before the stage runs?
Quiz
1. What does the disableConcurrentBuilds() option do?
2. Which agent variant runs the pipeline inside a Docker container pulled from a registry?
3. What does the Jenkins Snippet Generator do?
Up Next · Lesson 15
Stages and Steps
Stages and steps in depth — nesting, sequential stages inside parallel, the most useful built-in steps, and how to structure a pipeline that's easy to debug.