George V. Reilly

Decrypting Blackbox secrets at build time with Paperkey

“Security is 1% technology plus 99% following the procedures correctly” — Tom Limoncelli

Having dealt with GPG last week at work, I remembered that I had intended to write a blog post about how we used GPG, Blackbox, and Paperkey to store secrets in Git at my previous job.

We used Blackbox to manage secrets that were needed during development, build, deployment, and runtime. These secrets included AWS credentials, Docker registry credentials, our private PyPI credentials, database credentials, and certificates. We wanted these secrets to be under version control, but also to be secure.

For example, we had a credentials file that exported environment variables, which was managed by Blackbox:

# Save current value 

Jenkins #6: Miscellanea

[Previously published at the now defunct MetaBrite Dev Blog.]

A collection of miscellaneous tips on using Pipelines in Jenkins 2.0.

#6 in a series on Jenkins Pipelines

Environment Variables

Use the withEnv step to set environment variables. Don't manipulate the env global variable.

The confusing example in the documentation, PATH+WHATEVER=/something, simply means to prepend /something to $PATH. The +WHATEVER suffix has no other effect.
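A minimal sketch of both points (the variable names are hypothetical):

```groovy
// withEnv scopes the variables to its block; the env global is untouched.
withEnv(['DEPLOY_TARGET=staging', 'PATH+TOOLS=/opt/tools/bin']) {
    // Inside the block, /opt/tools/bin has been prepended to $PATH.
    sh 'echo $DEPLOY_TARGET && echo $PATH'
}
```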


The withEnv step should not be used to introduce secrets into the build environment. Use the withCredentials plugin instead.

withCredentials([
    [$class: 'StringBinding', credentialsId: 'GPG_SECRET', variable: 'GPG_SECRET'],
    [$class: 'AmazonWebServicesCredentialsBinding',
     credentialsId: '0defaced-cafe-f00d-badd-0000000ff1ce',
     accessKeyVariable: 'AWS_ACCESS_KEY_ID',
     secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']
]) {
    // $GPG_SECRET and the AWS variables are available only inside this block
}

Jenkins #5: Groovy

[Previously published at the now defunct MetaBrite Dev Blog.]

Jenkins Pipelines are written in a Groovy DSL. This is a good choice but there are surprises.

#5 in a series on Jenkins Pipelines

Groovy as a DSL

Groovy lends itself to writing DSLs (Domain-Specific Languages) with a minimum of syntactic overhead. You can frequently omit the parentheses, commas, and semicolons that litter other languages.

Groovy has interpolated GStrings, lists, maps, functions, and closures.


Closures are anonymous functions where state can be captured at declaration time to be executed later. The blocks that follow many Pipeline steps (node, stage, etc.) are closures.
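For instance, a plain Groovy closure capturing a variable from its enclosing scope (a generic sketch, not taken from the original post):

```groovy
def suffix = '!'                                          // captured from the enclosing scope
def shout = { String msg -> msg.toUpperCase() + suffix }  // one-parameter closure
assert shout('hello') == 'HELLO!'

// Pipeline steps such as node and stage take a closure as their body:
// node { stage('Build') { sh 'make' } }
```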

Here's an example of a Closure called acceptance_integration_tests, where the release_level parameter is a String which must be either "dev" or "prod".

Jenkins #4: The sh Step

[Previously published at the now defunct MetaBrite Dev Blog.]

If there isn't a built-in Pipeline step to accomplish something, you'll almost certainly use the sh step.

#4 in a series on Jenkins Pipelines

The sh step runs the Bourne shell—/bin/sh, not Bash aka the Bourne-again shell—with the -x (xtrace) and -e (errexit) options.

The xtrace option means that every command in the sh block is echoed to the Jenkins log, after it has been expanded by the shell. This is useful, but you could inadvertently echo the contents of passwords or secret keys. Use set +x in your sh block to control this.

The errexit option means that the shell aborts the sh block as soon as any command exits with a non-zero status.
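Putting both options together, a sketch of an sh step (the script contents here are hypothetical):

```groovy
node {
    // Jenkins invokes this script as /bin/sh -xe by default.
    sh '''
        echo "visible in the log"     # xtrace echoes each expanded command
        set +x                        # stop echoing before handling secrets
        export API_TOKEN="s3cr3t"     # hypothetical secret; not echoed now
        set -x                        # resume tracing
        make build                    # errexit: a failure here aborts the step
    '''
}
```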

Jenkins #3: GitHub Integration

[Previously published at the now defunct MetaBrite Dev Blog.]

Much of our code is in one large GitHub repository, from which several different applications are built. When changes are pushed to the master branch, we want only the applications in affected directories to be built. This was not easy to get right with “Pipeline script from SCM” builds.
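One way to approximate this is to diff the pushed commit against its parent and match the touched paths to application directories (a sketch of the general technique, not necessarily what we ended up with; the directory names are hypothetical):

```groovy
node {
    checkout scm
    // Find the top-level paths touched by the most recent commit.
    def changed = sh(script: 'git diff --name-only HEAD~1 HEAD',
                     returnStdout: true).trim().split('\n')
    // Hypothetical application directories within the monorepo.
    def apps = ['app1', 'app2', 'app3'].findAll { app ->
        changed.any { it.startsWith("${app}/") }
    }
    echo "Applications to rebuild: ${apps}"
}
```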

#3 in a series on Jenkins Pipelines


Jenkins #2: EC2 Slaves

[Previously published at the now defunct MetaBrite Dev Blog.]

The “slave” terminology is unfortunate, but the utility of running a Jenkins build on a node that you've configured at Amazon's EC2 is undeniable.

#2 in a series on Jenkins Pipelines

We needed to install system packages on our build nodes, such as Docker or Postgres. For obvious reasons, CloudBees—our Jenkins hosting provider—won't let you do that on their systems. You must provide your own build nodes, where you are free to install whatever you like.

We already use Amazon Web Services, so we chose to configure our CloudBees account with EC2 slaves. First, though, we had a long and fruitless detour through On-Premise Executors.

Jenkins #1: Migrating to Pipelines

[Previously published at the now defunct MetaBrite Dev Blog.]

The MetaBrite dev team migrated most of their builds from Atlassian's Bamboo Cloud to Jenkins Pipelines in late 2016/early 2017. This is a series of blog posts about that experience.

Jenkins Pipeline Series

The series so far:


For three years, we used Atlassian's hosted Bamboo Cloud service to build and deploy most of our code. In the summer of 2016, Atlassian announced that they were going to discontinue Bamboo Cloud on January 31st, 2017.

We looked around for a suitable replacement. We did not find one immediately.

Computed Parallel Branches in Jenkins Pipeline

I've been using Jenkins lately, setting up Pipeline builds. I have mixed feelings about that, but I'm quite liking Groovy.

Here's an example of a Closure called acceptance_integration_tests, where the release_level parameter is a String which must be either "dev" or "prod".

def acceptance_integration_tests = { String release_level ->
    assert release_level =~ /^(dev|prod)$/
    String arg = "--${release_level}"

    def branches = [
        "${release_level}_acceptance_tests": {
            run_tests("ci_acceptance_test", arg, '**/*nosetests.xml')
        },
        // second branch assumed symmetric with the first
        "${release_level}_integration_tests": {
            run_tests("ci_integration_test", arg, '**/*nosetests.xml')
        },
    ]
    parallel branches
}