An independent guide to building modern software for serverless and native cloud

Defining CodePipeline Resources in CloudFormation

This tutorial references the code in the aws-connectedcar-dotnet-serverless repository. If you're new to this course, see the introduction for information about setting up your workstation and getting the sample code.

One of the benefits of working with CodePipeline and its various components is that everything can be defined in CloudFormation templates. In this tutorial we’ll show you all the resources needed to build a pipeline that can deploy the sample code. This includes the IAM policies and service roles, two CodeBuild projects, and the parent CodePipeline resource with all its stages and actions.

Defining IAM Policy & Role Resources

Let’s start with the three IAM policies and the corresponding service roles that are assumed at runtime by CodeBuild, CodePipeline, and CloudFormation, respectively. Here's the policy and the service role used by the CodeBuild service, found in the pipeline.yaml template of the sample code:
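The actual definitions live in the pipeline.yaml template; as a rough sketch of their shape (the resource names and the exact action list here are illustrative, not copied from the repository), the policy-and-role pair looks something like this:

```yaml
# Illustrative sketch only -- see pipeline.yaml in the sample code for the real definitions.
CodeBuildPolicy:
  Type: AWS::IAM::ManagedPolicy
  Properties:
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Action:                     # scoped to the services the build needs,
            - logs:*                  # rather than broad administrative access
            - s3:*
            - cloudformation:*
          Resource: "*"

CodeBuildRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: codebuild.amazonaws.com   # CodeBuild assumes this role at runtime
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - !Ref CodeBuildPolicy                   # Ref on a managed policy returns its ARN
```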

The actions itemized on lines 63-70 of the policy, shown above, are somewhat “least-privilege” insofar as they don’t grant broad administrative permissions to the build agents in the pipeline. Note that it sometimes takes trial and error to uncover all the actions that a given build step needs. A production version of a policy like this would, ideally, itemize the exact actions to be allowed.

Next, we have the policy and role that are assumed at runtime by the parent pipeline, shown below. In this case, the allowed actions that you see in the policy on lines 98-101 are primarily those needed by the pipeline to access the other build and deployment services, such as CodeBuild and CloudFormation:
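A hedged sketch of this second pair follows; the action list is an assumption based on what the prose describes (access to CodeBuild, CloudFormation, and the artifact bucket), not the literal template contents:

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real definitions.
PipelinePolicy:
  Type: AWS::IAM::ManagedPolicy
  Properties:
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Action:
            - codebuild:StartBuild        # launch the build and test projects
            - codebuild:BatchGetBuilds
            - cloudformation:*            # drive the Deploy stage
            - s3:*                        # read/write pipeline artifacts
            - iam:PassRole                # hand the CloudFormation role to the service
          Resource: "*"

PipelineRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: codepipeline.amazonaws.com   # the parent pipeline assumes this role
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - !Ref PipelinePolicy
```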

Lastly, here’s the policy and role needed by the CloudFormation service to create the resources for the solution. Up to now, you’ve been creating CloudFormation stacks with deployment scripts that run under your locally configured user account, which has admin access. When CloudFormation runs from the pipeline, it instead needs the permissions in the policy shown below on lines 128-137 to deploy the solution.
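As a sketch, this third pair differs from the others mainly in its trust principal and in granting the service permissions over the solution’s resource types (the action list below is an illustrative guess, since the actual lines 128-137 aren’t reproduced here):

```yaml
# Illustrative sketch only -- the real action list is in pipeline.yaml, lines 128-137.
CloudFormationPolicy:
  Type: AWS::IAM::ManagedPolicy
  Properties:
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Action:                 # permissions over the solution's resource types
            - lambda:*
            - apigateway:*
            - dynamodb:*
            - iam:*
            - s3:*
          Resource: "*"

CloudFormationRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: cloudformation.amazonaws.com  # assumed when the pipeline runs CloudFormation
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - !Ref CloudFormationPolicy
```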

Defining CodeBuild Resources

Next, let’s look at the resources for the two CodeBuild projects, starting with the BuildProject, shown below:
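In outline (with illustrative values for the build image, buildspec path, and parameter names, which aren’t reproduced from the repository), the resource has this shape:

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real BuildProject resource.
BuildProject:
  Type: AWS::CodeBuild::Project
  Properties:
    ServiceRole: !GetAtt CodeBuildRole.Arn     # the CodeBuild service role defined earlier
    Artifacts:
      Type: CODEPIPELINE                       # output artifacts managed by the parent pipeline
    Source:
      Type: CODEPIPELINE                       # input artifacts likewise come from the pipeline
      BuildSpec: pipeline/build.buildspec.yml  # hypothetical path to the build's buildspec file
    Environment:
      Type: LINUX_CONTAINER                    # build agent type
      ComputeType: BUILD_GENERAL1_SMALL        # build agent capacity
      Image: aws/codebuild/standard:5.0        # illustrative build agent image
      EnvironmentVariables:                    # parameter values passed through to the agent
        - Name: BUCKET_NAME                    # hypothetical variable set by the pipeline script
          Value: !Ref BucketName
```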

Properties of interest in this resource include: line 159, which references the CodeBuild service role we looked at above; lines 160-161, which specify that the input and output artifacts for this build are managed by the parent pipeline; lines 163-165, which specify the type, capacity, and image of the build agent; and lines 167-175, whose environment variables pass the parameter values (set by our pipeline script) through to the build agent.

As we described in the previous lesson, the first CodeBuild project builds the Lambda deployment package and uploads it, along with all the CloudFormation templates and OpenAPI files, to the S3 deployment bucket. This prepares all the artifacts needed to run the CloudFormation stage of the pipeline. The second CodeBuild project is used to run automated API tests, which executes after the CloudFormation deployment has completed. This second CodeBuild project is shown below:
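Its shape mirrors the first project; a sketch might look like this (the image name is illustrative, though the test.buildspec.yml path is the one named later in this tutorial):

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real TestProject resource.
TestProject:
  Type: AWS::CodeBuild::Project
  Properties:
    ServiceRole: !GetAtt CodeBuildRole.Arn   # same CodeBuild service role as the first project
    Artifacts:
      Type: CODEPIPELINE
    Source:
      Type: CODEPIPELINE
      BuildSpec: pipeline/test.buildspec.yml # path to the test buildspec (BuildSpec property)
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/standard:6.0      # a different image, compatible with the test tooling
```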

This second CodeBuild resource is similar to the first, although it uses a different build agent image to be compatible with the specific software being installed. You’ll also notice the BuildSpec property on line 207, which specifies the path to the target buildspec file (we’ll cover buildspec files in the next two tutorials).

Defining the CodePipeline Resource

The last resource in the pipeline.yaml template is the one that defines the parent pipeline. The first properties to look at can be seen below. The ArtifactStore on lines 212-214 specifies the type and location of the store, which in this case is the same S3 bucket we’ve been using for deployments. Lines 215-217 specify the pipeline service role that we looked at above.
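A sketch of the opening of this resource (bucket and role logical names are illustrative):

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real Pipeline resource.
Pipeline:
  Type: AWS::CodePipeline::Pipeline
  Properties:
    ArtifactStore:
      Type: S3                              # artifact store type
      Location: !Ref DeploymentBucket       # the same S3 bucket used for deployments
    RoleArn: !GetAtt PipelineRole.Arn       # the pipeline service role defined earlier
    Stages:
      # The Source, Build, Deploy, and Test stages follow here
```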

Moving down the template, here’s the Source stage, which is the first of the four stages included in the sample pipeline:
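A fragment of the Stages list for this stage might look like the sketch below. The secret name, repository owner, and branch are illustrative; the artifact names and the GitHub provider come from the tutorial text:

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real Source stage.
- Name: Source
  Actions:
    - Name: ServerlessSource
      ActionTypeId:
        Category: Source                 # action category
        Owner: ThirdParty
        Version: "1"
        Provider: GitHub                 # action provider
      OutputArtifacts:
        - Name: SourceArtifact           # logical name of this action's output
      Configuration:
        OAuthToken: "{{resolve:secretsmanager:GitHubToken}}"  # dynamic reference; secret name illustrative
        Owner: !Ref GitHubOwner
        Repo: aws-connectedcar-dotnet-serverless
        Branch: main
    - Name: CommonSource                 # second action checks out the common repository
      ActionTypeId:
        Category: Source
        Owner: ThirdParty
        Version: "1"
        Provider: GitHub
      OutputArtifacts:
        - Name: CommonArtifact           # logical name of the second action's output
      Configuration:
        OAuthToken: "{{resolve:secretsmanager:GitHubToken}}"
        Owner: !Ref GitHubOwner
        Repo: aws-connectedcar-common    # hypothetical repository name
        Branch: main
```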

You can have multiple actions in each stage of a pipeline, and in this case we have two, one for each of the repositories from which we’re checking out code. Here, the ActionTypeId element specifies the Source category on line 223 and the Provider for the action on line 226, which in this case is GitHub. Lines 227-228 specify the logical names for the output artifacts: SourceArtifact for this action, and CommonArtifact for the second action.

Lines 230-234 show the configuration properties that specify the GitHub repository name, owner, and branch. Line 230, for the “OAuthToken” property, makes use of what’s called a dynamic reference, which retrieves the value of the named secret at runtime. We’ll show how this secret is set up in the labs later in this section.

Next, we have the build stage, shown below:
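A sketch of this stage follows; the artifact names and CodeBuild provider come from the tutorial text, while the action name is illustrative:

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real Build stage.
- Name: Build
  Actions:
    - Name: Build
      ActionTypeId:
        Category: Build                # action category
        Owner: AWS
        Version: "1"
        Provider: CodeBuild            # action provider
      InputArtifacts:
        - Name: SourceArtifact         # matches the Source stage's first output artifact
      OutputArtifacts:
        - Name: BuildArtifact          # this action's output destination
      Configuration:
        ProjectName: !Ref BuildProject # the first CodeBuild project defined above
```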

Like the source stage, the action specifies a category on line 255 and a provider on line 258. The logical name of the InputArtifacts for this action matches the OutputArtifacts name of the first action in the previous stage. This action sends its outputs to the “BuildArtifact” destination.

The next stage in the pipeline is the Deploy stage. This stage, like the deployment scripts we’ve been using up to now, creates or updates the CloudFormation stack:
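In sketch form (the stack name, template path, and parameter values are illustrative; the CloudFormation provider, SourceArtifact input, and static ParameterOverrides behavior come from the tutorial text):

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real Deploy stage.
- Name: Deploy
  Actions:
    - Name: Deploy
      ActionTypeId:
        Category: Deploy                            # action category
        Owner: AWS
        Version: "1"
        Provider: CloudFormation                    # action provider
      InputArtifacts:
        - Name: SourceArtifact                      # the stage reads templates from here
      Configuration:
        ActionMode: CREATE_UPDATE                   # create or update the stack
        StackName: connectedcar-dev                 # hypothetical stack name
        RoleArn: !GetAtt CloudFormationRole.Arn     # the CloudFormation service role
        Capabilities: CAPABILITY_NAMED_IAM
        TemplatePath: SourceArtifact::master.yaml   # hypothetical template file path
        ParameterOverrides: '{"EnvironmentName": "Dev"}'  # static values; change requires a stack update
```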

As with the previous stages, line 272 specifies the Deploy action category and line 275 specifies the provider. This stage also reads from the SourceArtifact location, which is where it finds the template file identified on line 288. The template parameters that are passed through to CloudFormation for this stage are specified on line 283. Note that these will be static values, changeable only if the pipeline stack is updated.

The final stage in the pipeline is the Test stage. This is another CodeBuild stage, one that executes the commands in the specified test.buildspec.yml file. In this case, the commands set up and call Newman, the command-line tool for executing Postman test collections:
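A sketch of this last stage, following the same pattern as the Build stage (the action name is illustrative):

```yaml
# Illustrative sketch only -- see pipeline.yaml for the real Test stage.
- Name: Test
  Actions:
    - Name: Test
      ActionTypeId:
        Category: Test                  # test action category
        Owner: AWS
        Version: "1"
        Provider: CodeBuild             # runs in a CodeBuild agent
      InputArtifacts:
        - Name: SourceArtifact
      Configuration:
        ProjectName: !Ref TestProject   # the second CodeBuild project, running test.buildspec.yml
```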