Using the Pipeline Plugin to Accelerate Continuous Delivery -- Part 2

Written by: apemberton

In this blog series, we provide an introduction and a step-by-step guide to using the Pipeline plugin. This is Part 2.


Integrating Your Tools

For a real-life pipeline, Jenkins needs to integrate with other tools, jobs and the underlying environment.

Tools

Integrating with tools is a core Jenkins capability. Tools can be added, and even automatically installed, on your build nodes. From Pipeline, you can simply use the tool DSL syntax:

def mvnHome = tool 'M3' 
sh "${mvnHome}/bin/mvn -B verify"

In addition to returning the path where the tool is installed, the tool command ensures the named tool is installed on the current node.

Global Variables

The env global variable allows accessing environment variables available on your nodes:

echo env.PATH

Because the env variable is global, changing it directly is discouraged: any change affects the environment for the rest of the pipeline. The withEnv syntax is preferred (see the example in the Full Syntax Reference Card at the end of this blog series).
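
A minimal sketch of withEnv, reusing the 'M3' Maven tool from above:

node {
     def mvnHome = tool 'M3'
     // PATH+MAVEN prepends the Maven bin directory to PATH for this block only
     withEnv(["PATH+MAVEN=${mvnHome}/bin"]) {
          sh 'mvn -B verify'
     }
}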

The currentBuild global variable can retrieve and update the following properties:

currentBuild.result 
currentBuild.displayName
currentBuild.description
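
For example, a short sketch that relabels the running build and downgrades its result (the display name and description are illustrative only):

node {
     currentBuild.displayName = "release-candidate-${env.BUILD_NUMBER}"
     currentBuild.description = 'Example from this blog post'
     // downgrade the result without failing the build outright
     currentBuild.result = 'UNSTABLE'
}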

Existing Jobs

Existing jobs can be triggered from your pipeline via the build command (e.g.: build 'existing-freestyle-job'). You can also pass parameters to your external jobs as follows:

def job = build job: 'say-hello', parameters: [[$class: 'StringParameterValue', name: 'who', value: 'Blog Readers']]
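
The build step also returns a handle to the downstream run, so (continuing the say-hello example above) you can inspect it once it completes; a brief sketch:

// 'job' is the value returned by the build step above
echo "Triggered build #${job.number}, which finished with result: ${job.result}"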

Controlling Flow

Because the Pipeline plugin is based on the Groovy language, there are many powerful flow control mechanisms familiar to developers and operations teams alike. In addition to standard Groovy flow control mechanisms like if statements, try/catch blocks and closures, there are several flow control elements specific to Pipeline.
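
For instance, plain Groovy conditionals and loops can decide which steps run; a minimal sketch (the module list and flag are purely illustrative):

node {
     def modules = ['api', 'web']        // hypothetical list of modules
     for (m in modules) {
          echo "Building module: ${m}"
     }
     def runFullSuite = true             // could be derived from a job parameter
     if (runFullSuite) {
          echo 'Running the full verification suite'
     }
}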

Handling Approvals

Pipeline supports approvals, manual or automated, through the input step:

input 'Are you sure?'

With the submitter parameter, the input step integrates with the Jenkins security system to restrict who is allowed to approve.
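
For example, a sketch that only lets members of a hypothetical 'ops-team' group approve:

input message: 'Deploy to production?', submitter: 'ops-team'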

The input step as rendered in the Jenkins Pipeline Stage View UI.

Timing

Timeouts let pipeline creators limit how long a block of steps may run before it is aborted:

timeout(time: 30, unit: 'SECONDS') { … }
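
A common pattern, sketched here, is to wrap the input step from above in a timeout so an unanswered approval eventually aborts the build:

timeout(time: 30, unit: 'MINUTES') {
     input 'Are you sure?'
}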

Parallel stages add a ton of horsepower to Pipeline, allowing simultaneous execution of build steps on the current node or across multiple nodes, thus increasing build speed:

parallel 'quality scan': {
     node {sh 'mvn sonar:sonar'}
}, 'integration test': {
     node {sh 'mvn verify'}
}

Jenkins can also wait for a specific condition to be true:

waitUntil { … }
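
For example, a sketch that polls the workspace until a marker file appears (the file name is purely illustrative):

node {
     waitUntil {
          // the closure is re-evaluated, with increasing pauses, until it returns true
          fileExists 'target/healthcheck.ok'
     }
}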

Handling Errors

Jenkins Pipeline has several features for controlling flow by managing error conditions in your pipeline. Of course, because Pipeline is based on Groovy, standard try/catch semantics apply:

try {

} catch (e) {

}
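
For example, a sketch that records the failure on the build before rethrowing:

node {
     try {
          sh 'mvn -B verify'
     } catch (e) {
          // mark the build as failed, then rethrow so the pipeline stops here
          currentBuild.result = 'FAILURE'
          throw e
     }
}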

Pipeline creators can also create error conditions if needed based on custom logic:

if(!sources) {
     error 'No sources'
}

Jenkins can also retry specific Pipeline steps when they fail intermittently for some reason:

retry(5) { … }
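
For example, a sketch that retries a network-dependent step (the script path is hypothetical):

node {
     retry(5) {
          // re-runs the enclosed step up to five times before failing the build
          sh './scripts/pull-test-data.sh'
     }
}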

Script Security

As you've seen, Pipeline is quite powerful. Of course, with power comes risk, so Pipeline has a robust security and approval framework that integrates with Jenkins core security.

By default, when creating pipelines as a regular user (that is, without the Overall/RunScripts permission), the Groovy Sandbox is enabled. When the Sandbox is enabled, Pipeline creators will only be allowed to use pre-approved methods in their flow.

As long as a pipeline only uses pre-approved methods, script changes do not require approval. When a script calls a method that has not yet been approved (such as a Java API), users will see a RejectedAccessException and an administrator will be prompted to approve usage of that specific API or method.

Deselecting the Use Groovy Sandbox option changes this behavior: with the Sandbox disabled, every script change or update by a non-administrator user must be approved by an administrator, and users will see an UnapprovedUsageException until their script is approved. Because approving individual edits may not scale well, the Groovy Sandbox is recommended for larger environments.

Accessing Files

During your pipeline development, you will very likely need to read and write files in your workspace.

Stashing Files

Stashing files between stages is a convenient way to capture files from your workspace and share them across different nodes:

stage 'build'
     node{
          git 'https://github.com/cloudbees/todo-api.git'
          stash includes: 'pom.xml', name: 'pom'
     }
stage name: 'test', concurrency: 3
     node { 
          unstash 'pom'
          sh 'cat pom.xml'
     }

Stash can be used to avoid cloning the same files from source control in different stages, while also ensuring that the exact same files used during compilation are the ones tested in later pipeline stages.

Archiving

Like other Jenkins job types, pipelines can archive their artifacts:

archive includes: '*.jar', excludes: '*-sources.jar'

Archives let you keep binaries from your build in Jenkins for easy access later. Unlike stash, which is temporary, archive keeps artifacts around after a pipeline execution is complete.

Beyond stashing and archiving files, the following Pipeline elements also work with the file system (more details at the end of this blog series):

pwd()
dir(''){}
writeFile file: 'target/results.txt', text: ''
readFile 'target/results.txt'
fileExists 'target/results.txt'
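
A minimal sketch tying these together (the directory and file names are illustrative only):

node {
     echo "Workspace is ${pwd()}"
     dir('target') {
          writeFile file: 'results.txt', text: 'pipeline output'
          if (fileExists('results.txt')) {
               echo readFile('results.txt')
          }
     }
}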

Using the Pipeline Plugin to Accelerate Continuous Delivery -- Part 1
Using the Pipeline Plugin to Accelerate Continuous Delivery -- Part 2
Using the Pipeline Plugin to Accelerate Continuous Delivery -- Part 3
