Use AWS credentials with Jenkins pipelines

### Rationale

Let’s suppose you want to save some files to an S3 bucket from a Jenkins pipeline. However, you don’t want to hardcode the AWS credentials in the pipeline, since that pipeline may be stored in a Git repository.

To do that, you save those credentials in the Jenkins credentials store and then use them from the pipeline.

### Stuff to install in Jenkins

We only need the Pipeline: AWS Steps plugin. With this plugin, we can retrieve the credentials stored in the Jenkins credentials store and use them in our pipelines.
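
As a side note, the credentials themselves are usually stored as the “AWS Credentials” kind, which comes from the CloudBees AWS Credentials plugin. If you’d rather script that part than click through Manage Jenkins → Manage Credentials, here is a rough sketch for the Jenkins script console; the credentials ID, keys and description are placeholders, so replace them with your own:

```groovy
import com.cloudbees.plugins.credentials.CredentialsScope
import com.cloudbees.plugins.credentials.SystemCredentialsProvider
import com.cloudbees.jenkins.plugins.awscredentials.AWSCredentialsImpl

// Placeholder values: replace the ID, keys and description with your own.
def awsCreds = new AWSCredentialsImpl(
    CredentialsScope.GLOBAL,
    'Jenkins-AWS-S3-Credentials',   // the ID you will later pass to withAWS
    'AKIAXXXXXXXXXXXXXXXX',         // access key ID
    'xxxxxxxxxxxxxxxxxxxxxxxx',     // secret access key
    'AWS credentials for S3 uploads from Jenkins'
)

// Add them to the global credentials store and persist the change.
SystemCredentialsProvider.instance.credentials.add(awsCreds)
SystemCredentialsProvider.instance.save()
```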

### Usage

Once we have both the credentials and the plugin in place, we can code our pipeline.

I don’t know if there is another way to do it, but I usually configure the AWS credentials like this:

```groovy
pipeline {
    agent any
    options {
        timestamps()
        disableConcurrentBuilds()
        withAWS(region: 'eu-central-1', credentials: 'Jenkins-AWS-S3-Credentials')
    }
    stages {
        // ... your stages go here ...
    }
}
```

The really relevant part is the withAWS option. Note that for the credentials argument you need to use the same ID you gave the credentials when you saved them in the Jenkins credentials store. You’ll probably also want to set the region, otherwise withAWS will try to use us-west-1 (I believe).
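
By the way, about that “another way to do it”: withAWS is also a block-scoped step, so instead of putting it in options you can wrap only the steps that actually need AWS access. A minimal sketch, using the same credentials ID as above:

```groovy
steps {
    // Only the steps inside this block run with the AWS credentials applied.
    withAWS(region: 'eu-central-1', credentials: 'Jenkins-AWS-S3-Credentials') {
        // AWS-related steps (s3Upload and friends) go here
    }
}
```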

Finally, you can call s3Upload from your steps:

```groovy
steps {
    // bucketName and patternOfFiles are assumed to be defined elsewhere,
    // e.g. in an environment block or as pipeline parameters
    s3Upload(bucket: "${bucketName}", includePathPattern: "${patternOfFiles}")
}
```

That will take the files in your workspace matching $patternOfFiles and upload them to the bucket named in $bucketName. Note that Groovy only interpolates variables inside double-quoted strings, so don’t use single quotes here.
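
Putting it all together, here is a sketch of a complete Jenkinsfile. The bucket name and file pattern are made-up values defined in an environment block, just so the example is self-contained; replace them with your own:

```groovy
pipeline {
    agent any
    options {
        timestamps()
        disableConcurrentBuilds()
        withAWS(region: 'eu-central-1', credentials: 'Jenkins-AWS-S3-Credentials')
    }
    environment {
        // Placeholder values: replace with your own bucket and file pattern.
        bucketName     = 'my-example-bucket'
        patternOfFiles = 'build/**/*.zip'
    }
    stages {
        stage('Upload to S3') {
            steps {
                s3Upload(bucket: "${env.bucketName}", includePathPattern: "${env.patternOfFiles}")
            }
        }
    }
}
```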

You can find more information about the Pipeline: AWS Steps plugin on the project site (https://jenkins.io/doc/pipeline/steps/pipeline-aws/#pipeline-aws-steps).