Implement Continuous Delivery with SAP Commerce Cloud
Overview
Most SAP Commerce Cloud projects have more than one developer working on a feature or bug fix at any given time. Setting up a Continuous Delivery pipeline is critical to ensuring you can deliver code changes you can trust into your production system. This article focuses on recommendations and tools you can use when setting up your Continuous Delivery pipeline. It also provides a sample Jenkins pipeline that can be used as a starting point for any SAP Commerce Cloud project.
Move Away from Manual Deployments
In the article Build and Deploy Your First SAP Commerce Cloud Project you learned how to manually set up, build, and deploy a starter solution on your local machine and in your SAP Commerce Cloud environments. You may also have read the series of articles dedicated to Effective Code Reviews, including how to Measure Code Quality with Sonar. Although these articles cover useful topics, they do not address how to create a Continuous Delivery pipeline where the changes made by your developers on their local machines make it into production with minimal intervention.
To be able to deliver your code changes continually, you first need to set up a continuous integration (CI) environment to fully automate build and testing of your committed code. You could rely on the builder that is part of SAP Commerce Cloud, but typically you only want to do this once you're confident that your build is ready for your cloud environments. Having your own CI environment in place will allow you full control over integration issues that come up after each commit.
An important factor in getting the most out of your CI environment is how frequently developers integrate and commit their code, as this directly affects how quickly issues are identified and how much time is spent analyzing and fixing integration-related problems. We recommend integrating code frequently to get the greatest benefit from a CI environment and to reduce the time spent integrating code. Once you are confident in your CI process, you can focus on taking the next step: delivering continuously instead of in larger releases.
Continuous Integration Environment Key Components
The suggested software components for a CI environment:
- Git-based code repository
- Continuous Integration software, e.g. Jenkins, Bamboo
- Code quality management, e.g. Sonar
- Release/distribution management, e.g. Nexus
- Integration server to build, run, and test your solution
Automated Build, Deployment and Testing
Manually building and running an SAP Commerce solution can be error-prone and time-consuming, which makes it an ideal area for automation. Most CI servers support some form of scripting to automate build, run, and test tasks. It is recommended that you have at least two types of builds in your CI environment:
- Incremental Build: Usually triggered by a commit to source control, this is a fast build that starts with the output of the previous build and updates it with any changes committed since then. The aim of this build is to provide rapid feedback, so it is designed to complete as quickly as possible, omitting slow test cases and most of the post-build steps performed by the full build.
- Nightly Build: A regular, full, 'from scratch' build. Usually run nightly, or at least weekly, this build starts with a completely empty workspace, does fresh checkouts from source control, and after building the system performs a number of comprehensive tests, code analysis, and reporting tasks.
An example of the steps in each type of build job is as follows:
Steps for Full Build | Steps for Incremental Build
---|---
Start with an empty workspace | Start from the output of the previous build
Fresh checkout from source control | Update with changes committed since the previous build
Build the system | Build the committed changes
Run comprehensive tests | Run fast tests only (slow test cases omitted)
Code analysis and reporting tasks | Provide rapid feedback to developers
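To make the distinction concrete, below is a minimal scripted-pipeline sketch of the two build types. It is not the sample project's actual pipeline: the ant targets shown (clean all, unittests, alltests) are the standard SAP Commerce platform targets, and the node label and workspace paths are assumptions that may differ in your project.

```groovy
// Minimal sketch of the two build types as scripted Jenkins pipelines.
// Assumes the pipeline is loaded from SCM ("checkout scm") and that an agent
// labelled 'subordinate' has the platform extracted under hybris/bin/platform.
node('subordinate') {
    stage('Checkout') {
        // The nightly job should start clean (e.g. deleteDir() first);
        // the incremental job reuses the existing workspace for speed.
        checkout scm
    }
    stage('Build') {
        dir('hybris/bin/platform') {
            sh '. ./setantenv.sh && ant clean all'
        }
    }
    stage('Test') {
        dir('hybris/bin/platform') {
            // Incremental build: fast feedback only.
            sh '. ./setantenv.sh && ant unittests'
            // The nightly build would run the comprehensive suite instead, e.g.:
            // sh '. ./setantenv.sh && ant alltests'
        }
    }
}
```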
You may also consider separating out your jobs further to allow for quicker feedback. These may include:
- Functional Tests Job: a job that automatically executes functional tests on a given environment
- Quality Job: a job dedicated to running quality tools such as Sonar (see the sketch after this list)
- Performance Test Job: a job that automatically executes performance tests on a given environment
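For example, the Quality Job could be a simple cron-triggered pipeline that runs the platform's Sonar ant target. The schedule, node label, and paths below are assumptions; the Sonar connection details (host, token, project key) would come from your configured sonar properties or the Jenkins SonarQube plugin.

```groovy
// Hypothetical nightly quality job; adjust the schedule, node label and
// Sonar connection settings to your project.
properties([pipelineTriggers([cron('H 2 * * *')])])

node('subordinate') {
    stage('Checkout') {
        checkout scm
    }
    stage('Static analysis') {
        dir('hybris/bin/platform') {
            // 'ant sonarcheck' is the platform's Sonar target; connection details
            // are read from the configured sonar/project properties.
            sh '. ./setantenv.sh && ant sonarcheck'
        }
    }
}
```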
When you're confident in your CI pipeline, you can move towards a Continuous Delivery (CD) pipeline that automatically pushes successful builds to your cloud environments, as shown in the diagram below. Although you could also automatically build and deploy to your SAP Commerce Cloud Stage and Production environments, these steps are often done manually to ensure the build is deployed at the right time without affecting business operations. If you have full confidence in your development practices and automated testing, you could theoretically create a pipeline that leverages SAP Commerce Cloud's zero-downtime deployments to automatically promote your code to production. This would allow you to get features out much more quickly, but it does rely on a high level of teamwork across your organization (developers, testers, system administrators, business users) to ensure the CD pipeline works smoothly.
Set Up a Working Jenkins Pipeline in 30 Minutes
In this section we walk you through how to set up a sample CI/CD pipeline that has many of the steps outlined in the diagram above, except for executing performance tests and packaging/storing the build.
Prerequisites
The project was created to work with SAP Core Commerce 2005, following the recommended repository structure for a Commerce Cloud in the Public Cloud project (see https://github.com/SAP-samples/cloud-commerce-sample-setup/ for examples). In the Jenkins project we have chosen to use Scripted Pipeline, a general-purpose DSL built with Groovy (more info here: https://www.jenkins.io/doc/book/pipeline/syntax/#scripted-pipeline), to provide flexibility and reuse.
Before using the example, you will need to have the following:
- Working Jenkins instance with admin access to Credentials/Global Configuration/Plugins.
- 2 git repositories:
- Your Commerce Cloud code. This is the same repository that you configured in Cloud Portal.
- A repository for storing the Jenkins pipeline configurations
- Sonarqube (https://docs.sonarqube.org/latest/setup/overview/). See Measure Code Quality with Sonar for more on how SonarQube is used
- Core Commerce zip (link) for the version you want to use (optional: Commerce Integration Pack if you will be using any of the available integrations)
- (optional) Additional servers that can be used as subordinate nodes for executing your pipelines. You may choose just to run the subordinate node on the same machine as your Jenkins installation
Configure Jenkins Nodes
To create a new subordinate node, go to Manage Jenkins > Manage Nodes and Clouds > New Node and define a new node, setting the "Labels" field to include the value "subordinate".
You can add more nodes or labels to split up your pipelines. By default, the sample pipelines (see the pipelines folder) execute on nodes labelled with 'subordinate', as shown in the snippet below.
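For reference, this label is what the scripted pipelines request when they allocate an agent; a trivial sketch:

```groovy
// Any pipeline step placed inside this block runs on an agent
// carrying the 'subordinate' label.
node('subordinate') {
    stage('Example') {
        echo "Running on ${env.NODE_NAME}"
    }
}
```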
Plugins
Install the following plugins:
- Extensible Choice Parameter (https://plugins.jenkins.io/extensible-choice-parameter/) - adds "Extensible Choice" as a build parameter and lets you select how choices are retrieved, including sharing choices among all jobs.
- GitHub Pull Request Builder (https://plugins.jenkins.io/ghprb/) - builds pull requests in GitHub and reports the results.
- Environment Dashboard (https://plugins.jenkins.io/environment-dashboard/) - creates a custom view that can be used as a dashboard showing which code release versions have been deployed to which test and production environments (or devices).
- SonarQube Scanner (https://plugins.jenkins.io/sonar/) - integrates SonarQube, the open source platform for continuous inspection of code quality.
- Job DSL (https://plugins.jenkins.io/job-dsl/) - allows jobs to be defined programmatically in human-readable files.
- Masked Password (https://plugins.jenkins.io/mask-passwords/) - masks passwords that may appear in the console, including those defined as build parameters.
- Pipeline Utility Steps (https://plugins.jenkins.io/pipeline-utility-steps/) - small, miscellaneous, cross-platform utility steps for Jenkins Pipeline jobs.
Credentials
Create the following credentials (Manage Jenkins > Credentials):
- GitHub user/token for the code repository containing your Commerce solution code. Give this credential the ID githubCodeRepoCredentials.
- SAP Commerce Cloud API token with the ID commerceCloudCredentials (used for building and deploying to SAP Commerce Cloud via its APIs; set the username to your subscription ID and the password to the API token). See the product documentation page on generating tokens.
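As an illustration of how these IDs are consumed, a pipeline step can bind the Commerce Cloud credential to environment variables. The variable names below are arbitrary choices for this sketch; only the credentialsId must match what you created above.

```groovy
// Sketch: binding the Commerce Cloud credential inside a pipeline.
node('subordinate') {
    withCredentials([usernamePassword(credentialsId: 'commerceCloudCredentials',
                                      usernameVariable: 'SUBSCRIPTION_ID',
                                      passwordVariable: 'API_TOKEN')]) {
        // The token can now be passed as a Bearer header to the Commerce Cloud APIs.
        sh 'echo "Using subscription $SUBSCRIPTION_ID"'
    }
}
```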
Additional Configurations
- Create a configuration for Sonar and add the host and token to the Sonar plugin with the name sonarQubeConfiguration: Manage Jenkins > Global Tool Configuration > scroll to SonarQube Scanner > Add SonarQube Scanner > set the name.
- In Manage Jenkins > Configure System, add a Global Shared Library with the name "shared-library".
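If the "shared-library" entry is not configured to load implicitly, each pipeline references it explicitly at the top of the file. The cleanProjectDir() call below is a hypothetical usage of one of the steps defined under vars/; check the actual step names and parameters in the sample code.

```groovy
// Explicitly import the global shared library configured above.
@Library('shared-library') _

node('subordinate') {
    stage('Prepare') {
        // Steps defined under vars/ become callable like built-in steps;
        // the parameters (none here) are hypothetical.
        cleanProjectDir()
    }
}
```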
Execution Steps
- Download the sample pipeline code to your local machine.
- Execute the following, filling in all the details:
. ./customize.sh
- Review dsl/builder.groovy to ensure all the fields you want committed to your repository are set correctly. These will become the default parameters for your pipeline jobs.
- Commit and push the changes to the repository you'll be using for managing your Jenkins configurations
- In order to use the "Build every day" job, you need to place the right SAP Commerce artifact in the root workspace folder of the machine that will be doing the builds, using the naming convention from the Download Center (e.g. CXCOM2005*-*.ZIP).
If you will be using the Integration Extension Packs for Core Commerce 2005+, you will also need to uncomment the relevant lines in extractCommerce.groovy and include that zip file in your workspace (the same place you put the CXCOM zip file), for example: subordinate/workspace/CXCOM200500P_3-70004955.ZIP
Note: the SonarQube step currently only works with Commerce Cloud version 2005, because of a problem with the Sonar ant task provided with earlier versions; if you want to run Sonar, the project must use a 2005 Commerce release.
- In Jenkins, click the "New Item" option from the left toolbar
- Give it a name to identify the job that will be used to build out the seed jobs, select 'pipeline' and click 'ok'
- On the next screen scroll down to the 'pipeline' section and select "Pipeline script from SCM". Fill in the remaining details to point to the repository you committed your Jenkins configs to in step #5
- Once the job is created, build it. It will generate all the seed jobs in the /pipelines folder
- Determine which pipeline job you wish to run and execute, filling in any parameters that differ from default
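For orientation, the jobs produced by the seed build are defined with the Job DSL plugin. A simplified, hypothetical definition of one such pipeline job is shown below; the job name, repository URL, and branch are placeholders, and the real dsl/builder.groovy also sets the default parameters and views mentioned above.

```groovy
// Simplified illustration of a Job DSL definition (placeholders, not the
// project's actual builder.groovy): it creates a pipeline job whose script is
// read from the pipelines/ folder of your Jenkins configuration repository.
pipelineJob('commerce-build-every-day') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://github.com/<your-org>/<your-jenkins-config-repo>.git')
                    }
                    branch('main')
                }
            }
            scriptPath('pipelines/pipelineBuildEveryDay.groovy')
        }
    }
}
```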
Additional Details
This section covers some more details on how the provided code is set up.
Project Structure - Jenkinsfile
The Jenkinsfile is the main file containing the code-based definition of the Jenkins pipelines. Because it is a single file it can grow very large, so the code makes use of Shared Libraries to reduce redundancy.
With Shared Libraries, the project is separated into pipeline definitions (in the pipelines folder) and method definitions (in the vars folder). Both directories contain code-based Groovy files.
Project Structure:
├── Jenkinsfile
├── README.md
├── customize.sh
├── dsl
│   └── builder.groovy
├── pipelines
│   ├── pipelineBuildEveryDay.groovy
│   └── pipelinePackageAndDeploy.groovy
└── vars
    ├── addProperty.groovy
    ├── buildCommerceCloud.groovy
    ├── buildCommerceCloudCheck.groovy
    ├── checkoutRepository.groovy
    ├── cleanProjectDir.groovy
    ├── commerceCloudDeploy.groovy
    ├── commerceCloudDeployCheck.groovy
    ├── executeAntTasks.groovy
    ├── executeInstallScript.groovy
    ├── extractCommerce.groovy
    ├── failIfBuildUnstable.groovy
    ├── replaceInProperties.groovy
    └── sonarqubeCheck.groovy
Purpose
- dsl - contains the main DSL seed job to generate the pipeline jobs and views
- pipelines - contains pipeline definitions that utilize the commands defined in vars
- vars - contains Groovy commands that are used in the pipelines
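Each file under vars/ defines a global step through a call() method. The snippet below is a simplified, hypothetical version of such a step (not the project's actual addProperty implementation) to show the pattern.

```groovy
// vars/addProperty.groovy (hypothetical sketch of the pattern, not the real code):
// defines a global step 'addProperty' that pipelines can call like a built-in step.
def call(String file, String key, String value) {
    // Append a key=value pair to a properties file in the current workspace.
    sh "echo '${key}=${value}' >> ${file}"
}
```

A pipeline under pipelines/ can then call addProperty('hybris/config/local.properties', 'some.key', 'some.value') like any built-in step, which is how redundancy between the pipeline definitions is kept down.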
Customization Script
The customize shell script should be run at the start of your project. It replaces placeholders in dsl/builder.groovy, which is used to generate the jobs. You will have to set:
- URL for the Git repository containing your commerce code
- URL for the Git repository containing your Jenkins configuration (this project)
- URL for your Sonar instance
- packages to test
Defined Jobs
The code comes with 2 sample pipelines that you can use to get started:
- Build Every Day
- Description: A job to build the project every day from the develop branch.
- Steps:
- Prepare environment
- Extract Core Commerce zip
- Checkout custom code branch
- Setup platform
- Run Sonarqube
- Run all tests
- Package and Deploy
- Description: A job to build a package and deploy to your Commerce Cloud environments. This will typically be done when you're ready to promote a stable build to your public cloud environments.
- Steps:
- Build code package on SAP Commerce Cloud
- Deploy package on SAP Commerce Cloud environment
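As a sketch of what these two steps can look like when automated from Jenkins, the snippet below calls the Commerce Cloud Management API using the commerceCloudCredentials credential created earlier. The base URL, endpoint paths, and payload fields are assumptions based on the publicly documented API and must be verified against the current product documentation; the build name, build code, environment code, and database update mode are placeholders.

```groovy
// Hedged sketch only: verify endpoints and request bodies against the SAP
// Commerce Cloud Management API documentation before using.
node('subordinate') {
    withCredentials([usernamePassword(credentialsId: 'commerceCloudCredentials',
                                      usernameVariable: 'SUBSCRIPTION_ID',
                                      passwordVariable: 'API_TOKEN')]) {
        stage('Build code package on SAP Commerce Cloud') {
            // Triggers a cloud build of the given branch; additional fields
            // (e.g. an application code) may be required by your API version.
            sh '''
                curl -sS -X POST \
                  -H "Authorization: Bearer $API_TOKEN" \
                  -H "Content-Type: application/json" \
                  -d '{"branch": "develop", "name": "jenkins-triggered-build"}' \
                  "https://portalrotapi.hana.ondemand.com/v2/subscriptions/$SUBSCRIPTION_ID/builds"
            '''
        }
        stage('Deploy package on SAP Commerce Cloud environment') {
            // <build-code> is a placeholder; in a real pipeline it would be read
            // from the build request's response once the build has succeeded.
            sh '''
                curl -sS -X POST \
                  -H "Authorization: Bearer $API_TOKEN" \
                  -H "Content-Type: application/json" \
                  -d '{"buildCode": "<build-code>", "environmentCode": "d1", "databaseUpdateMode": "NONE"}' \
                  "https://portalrotapi.hana.ondemand.com/v2/subscriptions/$SUBSCRIPTION_ID/deployments"
            '''
        }
    }
}
```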
Conclusion
You should now have an idea of the differences between Continuous Integration and Continuous Delivery when it comes to SAP Commerce Cloud. You are free to use the provided sample pipelines in your Jenkins server to quickly take the first steps towards building a CD pipeline. Often, once you've got a stable CI/CD pipeline, performance is the next issue to tackle, so we would encourage you to look at our series of articles on Managing Performance in an SAP Commerce Cloud Project.