Creating a CI/CD pipeline between Jenkins and Mobile Cloud Services

July 7, 2017 | 13 minute read

Introduction

Continuous Integration and Continuous Delivery (CI/CD) is a popular practice in modern development teams. The idea behind these concepts is to automate as much of the deployment cycle as possible, with minimal human interaction.
This is done by writing tests at several levels. Each time a change is made, the tests run and provide instant feedback if something goes wrong. This is the Continuous Integration part of the pipeline. One of the biggest advantages is that developers get instant feedback when a change breaks the build, so they can fix it while everything is still fresh in their minds.

Continuous Delivery is the next part of the pipeline: it automates the deployment to production when the CI step succeeds. This matters in a time when time to market is critical. Older paradigms, where a deployment can only happen once every three months, are no longer viable. By setting up a pipeline that automates everything, you not only reduce time to market but also reduce the number of manual interactions.

Main Article

In this article we will take a closer look at using Node.js in combination with Jasmine to test our code, and configure an automated test script that runs every time a developer pushes code to a specific branch in the code repository.

CI/CD with Mobile Cloud Service

Starting with Mobile Cloud Service 17.2.5, Oracle released a toolset that allows us to integrate testing and deployment from any CI/CD tool that supports running a shell command during its build step.
In this article we will take a look at how to use that toolset to set up a pipeline from Jenkins.
The following diagram shows what such a pipeline might look like:

CI/CD Pipeline

These are the steps from such a typical pipeline:

  • Developer makes changes to code and commits to source control
  • Jenkins picks up the commits and triggers a new build
  • Jenkins checks out the code and compiles it (if required)
  • Unit tests of the code are run
  • Deployment to a test Mobile Cloud Service instance is initiated
  • Integration tests are done which run against the deployed code on MCS
  • Deploy code to production instance of MCS

If any of the above steps fails, the build stops and the developer is notified.
By writing tests with 100% code coverage, you guarantee that all your code is tested, and your chance of finding a bug early in the process increases. Because the feedback arrives so early, it also takes less time to fix the bug.
When you have to fix a bug in code you wrote a month ago, it takes more time because you have to familiarize yourself with that part of the code again. That's why a CI pipeline is so important: the earlier in the process a fix is needed, the easier it is for the developer.

In this article we will focus on the first part of the pipeline:

cicdPipeline

The components used in this approach are the following:

  • Testing Framework: I am using Jasmine, as it's one of the most popular testing frameworks for Node.js.
  • Source Control Management: In this post we will be using a Git repository to store the code. I'm using Bitbucket as the host, but you can also use GitHub or any other source control management system. In a later blog post I will also focus on setting up a similar pipeline using Oracle's Developer Cloud Service.
  • CI/CD tool: We will be using Jenkins to set up a pipeline for automated testing and deployment. You can use any tooling, as long as it supports executing a shell command as part of the build.
  • Mobile Cloud Service: Oracle's MBaaS product, based upon Node.js

Test Driven Development

A good approach when working with a CI/CD pipeline is Test Driven Development (TDD). This requires you to write tests before writing the implementation. It's a shift in the developer's mindset, but it's a practice that gets the best results from your pipeline.

By writing tests first, you have to think about functionality before anything else. It also promotes loose coupling: your tests don't care about platform-specific infrastructure, only about functionality and business logic, so the developer is more likely to write code that is loosely coupled with the underlying infrastructure.

In this example we will write a simple Human Resources API that allows us to do the following:

  • Request a list of employees
  • Hire a new employee: this includes some business logic to set a default amount of vacation and salary
  • Request the available vacation for an employee
  • Book a vacation
  • Cancel a vacation
  • Fire an employee

Setting up the project

Before we can dive into writing our tests, we first need to define our project on Mobile Cloud Service and define a basic set of APIs for the project so we can download the scaffold.

Based upon the above description of our project, I have created following resources in an API on MCS:

resources

Now we can download the scaffold by selecting Implementation on the left-hand side and clicking the green JavaScript Scaffold button.
Once you have downloaded it, unzip it in the folder of your choice.

Writing unit tests

Before we can write our tests, we need to add the Jasmine module to our project. This can be done by executing the following command from the directory where package.json is stored:

npm install jasmine --save

This will download Jasmine as well as add the dependency to the package.json file.

By default, Jasmine looks in a folder called specs for your test specifications. When we execute a Jasmine command, it will look for all the JavaScript files in that directory and execute the tests.
In our example we create a file called hrSpec.js in the specs folder:

specs

Our test is as follows:

var HR = require("../hr").HR;
var hr = new HR();

describe("HR API", function() {
    describe("Employees", function() {
        it("Returns a list of employees", function() {
            var employees = hr.employees;
            expect(employees.length).toBeGreaterThan(0);
        });
        it("Hire an employee", function() {
            var employeesBefore = hr.employees.length;
            var emp = {firstName: "Tom", lastName: "Tomke", department: "Finance"};
            var newEmployee = hr.hire(emp);
            var employeesAfter = hr.employees.length;
            expect(employeesAfter).toBeGreaterThan(employeesBefore);
            expect(newEmployee.salary).toBeGreaterThan(1000);
            expect(newEmployee.vacation).not.toBe(undefined);
        });
    });
});

Obviously, when we run Jasmine, all the tests will fail because the required files don't exist yet.
The tests do show, however, that we need to create an HR object that contains the required functionality.

The next step is to implement our HR object so our tests can pass.

In order to do so, I create a file hr.js in the root of my project with the following content:

var employees = require("./employees");

var HR = function() {
    var self = this;
    self.employees = employees;
}

HR.prototype.hire = function(employee) {
    employee.salary = 10000;
    employee.vacation = {available: 20, balance: 20, taken: 0};
    this.employees.push(employee);
    return employee;
}

exports.HR = HR;
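The remaining operations from our list (booking and cancelling vacation, firing an employee) can be implemented along the same lines. Below is a minimal sketch of the vacation logic; the method names bookVacation and cancelVacation are my own assumptions, not part of the scaffold, and the employee list is inlined here so the snippet runs on its own:

```javascript
// Self-contained sketch: an HR object with the vacation business logic.
var HR = function() {
    this.employees = [
        {firstName: "John", lastName: "Doe", vacation: {available: 20, balance: 20, taken: 0}}
    ];
};

// Book a number of vacation days; reject the request when the balance is too low.
HR.prototype.bookVacation = function(employee, days) {
    if (days > employee.vacation.balance) {
        throw new Error("Insufficient vacation balance");
    }
    employee.vacation.balance -= days;
    employee.vacation.taken += days;
    return employee;
};

// Cancel previously booked days and restore the balance.
HR.prototype.cancelVacation = function(employee, days) {
    employee.vacation.balance += days;
    employee.vacation.taken -= days;
    return employee;
};

var hr = new HR();
var emp = hr.bookVacation(hr.employees[0], 5);
console.log(emp.vacation.balance + " days left, " + emp.vacation.taken + " taken");
// 15 days left, 5 taken
```

Written this way, the business rules stay free of any MCS-specific infrastructure, which is exactly what makes them easy to cover with plain Jasmine specs.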

I also created an employees.json file which contains the list of employees:

[
  {
    "firstName": "John",
    "lastName": "Doe",
    "department": "HR",
    "salary": "15000",
    "vacation": {
      "available": "20",
      "balance": "20",
      "taken": "0"
    }
  },
  {
    "firstName": "Jane",
    "lastName": "Do",
    "department": "HR",
    "salary": "15000",
    "vacation": {
      "available": "20",
      "balance": "18",
      "taken": "2"
    }
  },
  {
    "firstName": "Mark",
    "lastName": "Smith",
    "department": "IT",
    "salary": "16000",
    "vacation": {
      "available": "20",
      "balance": "10",
      "taken": "10"
    }
  },
  {
    "firstName": "Fred",
    "lastName": "Flins",
    "department": "Finance",
    "salary": "20000",
    "vacation": {
      "available": "30",
      "balance": "20",
      "taken": "10"
    }
  }
]

The last thing we need to do in order to execute our tests is add a test script to our package.json:

{
  "name": "hr",
  "version": "1.0.0",
  "description": "HR",
  "main": "hrAPI.js",
  "scripts": {
    "test": "./node_modules/.bin/jasmine-node specs"
  },
  "oracleMobile": {
    "dependencies": {
      "apis": {},
      "connectors": {}
    }
  },
  "dependencies": {
    "jasmine": "^2.6.0",
    "jasmine-node": "^1.14.5"
  }
}

Notice that in the scripts section we have added a test entry which points to the jasmine-node module. By doing so, we can easily execute the unit tests with the following command:

npm test

When everything goes well, we see the following output in the console:

D:\projects\Oracle\hrAPI>npm test

> hr@1.0.0 test D:\projects\Oracle\hrAPI
> jasmine-node specs

..

Finished in 0.008 seconds
2 tests, 4 assertions, 0 failures, 0 skipped

Adding code to source control

Now that we have our initial implementation, it's time to add it to our source control.
First we need to initialize our directory as a git repo:

git init

The next step is to add our files to the repo, commit and push:

git add .
git commit -m "Initial code"
git remote add origin <repoURL>
git push origin master

When all goes well we have committed and pushed our code to the remote repository.

Before we can configure Jenkins to deploy anything to MCS, we need to set up the MCS tooling.

Installing MCS Tooling

The tooling is part of the MCS SDK, which you can download from OTN: http://www.oracle.com/technetwork/topics/cloud/downloads/mobile-cloud-service-3636470.html
Depending on the platform you are building a mobile application for, you need to download the Android, iOS or Windows version of the SDK. All three contain a zip file called mcs-tools.zip.

Unzip that file, open a command shell in the resulting directory and run:

npm install -g

This will make the tooling globally available. To verify that the installation succeeded, you can execute

mcs-deploy --version

It should return "17.2.5" (or higher, if the SDK has been updated).

Configuring toolsConfig.json

The MCS tooling uses a toolsConfig file that contains information about the deployment, like the mobile backend ID, base URL and so on.
A default toolsConfig.json was created for you when you downloaded the scaffold. The only thing left to do is set the correct values. All of these values can be found on the Mobile Backend settings page:

mbe

baseUrl: the base URL for your MCS instance
mobileBackendID: the mobile backend ID, provided under the HTTP Basic section of the MBE settings page
anonymousKey: the key for anonymous authorization, which can be displayed by clicking the Show link next to Anonymous Key on the MBE settings page
tokenEndpoint: the endpoint for obtaining an OAuth token, which can be found under the Environment URLs section of the MBE settings page
clientId and clientSecret: the client ID and secret, which can be found in the OAuth Consumer section

This is what the first part of my toolsConfig.json looks like:

"apiName": "HR",
"apiVersion": "1.0",
"apiId": "3ee1fcb4-136e-4490-9a44-69bb0e6054c4",
"baseUrl": "https://mcs1-usoracleateamtrial90030.mobileenv.us2.oraclecloud.com:443",
"authorization": {
    "mobileBackendID": "6c5e7b3e-6a81-46a0-8112-80b273b0579d",
    "anonymousKey": "*******************",
    "oauth": {
        "tokenEndpoint": "https://usoracleateamtrial90030.identity.us.oraclecloud.com/oam/oauth2/tokens",
        "clientId": "bd3e9164-30c4-4f94-9524-991deb9cc4d9",
        "clientSecret": "*************"
    }
}

We can now validate the settings and try our initial deployment to MCS.
Open a terminal window in the project directory and deploy the code:

D:\projects\Oracle\hrAPI>mcs-deploy toolsConfig.json
Warning: Configuration property "proxy" is undefined
To display help and examples associated with warnings, use the --verbose option
prompt: Team member username:  yannick.ongena@oracle.com
prompt: Team member password:  ********
Deployment completed successfully

This shows that we are ready with the first deployment and everything is working.

The next step is to automate everything using Jenkins, so that every time we commit a change to the master branch, it automatically triggers a build which runs the unit tests and deploys to MCS.

Configuring a new item in Jenkins

In Jenkins we will create a new item that triggers a build whenever we make a change in our source control management. This can be done by configuring the Source Code Management section of our item:

scm

Because I'm using Bitbucket, I point the Git repository to my repository on Bitbucket, but you can enter whatever Git repository you are using.

In the trigger section I configure it to trigger whenever I make a change in the code:

buildTrigger

I'm using the Bitbucket plugin for Jenkins, which exposes a webhook that I need to add to my Bitbucket repository.
There are similar plugins for GitHub and other providers.

In addition to this, you can also choose to schedule a build if you don't want to build on every commit, but instead want to build overnight or every two hours or so.
There's also a polling mechanism where Jenkins polls your SCM repository and triggers a build whenever there is a change. This is the least recommended approach, as the polling generates a lot of traffic to your SCM repository.
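Both the scheduled build and the polling interval are expressed in Jenkins' cron-like syntax. A small illustration (the exact schedules are placeholders, pick whatever suits your team):

```
# Poll the SCM repository roughly every 15 minutes; the H token lets
# Jenkins spread the exact minute to avoid load spikes.
H/15 * * * *

# Or build once a night, somewhere between 2:00 and 4:59.
H H(2-4) * * *
```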

I also use the Jenkins credential store to store my MCS credentials, so they are injected into the build process and I don't have to hard-code them into the build. This also masks the password in the console output. This is the preferred way to work with credentials, and I would recommend that everybody follow the same practice!

credentials

By checking Use secret text(s) or file(s), you get access to Jenkins' global credential store and can bind credentials to environment variables.
In our case, I bind mcsUser and mcsPass to the credentials I use to access MCS. These variables become available during the build process so I can reference them.

The next part is to configure the build steps. We will use 3 different steps:

  • Install the node modules
  • Execute unit tests
  • Deploy to MCS

buildSteps

The last build step is the one we use to deploy to MCS, and where we need to reference the credentials from above.
On Windows we use %mcsUser% to reference the environment variable; on Unix-based operating systems that would be $mcsUser, so pay attention to the way your operating system references environment variables!
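On a Windows-based Jenkins node, the three build steps above come down to these commands (mcsUser and mcsPass are the environment variable names chosen in the credential binding earlier):

```bat
rem Step 1: install the node modules
npm install

rem Step 2: execute the unit tests; a non-zero exit code fails the build
npm test

rem Step 3: deploy to MCS, passing the bound credentials on the command line
mcs-deploy toolsConfig.json -u %mcsUser% -p %mcsPass%
```

On a Unix-based node the same three steps would reference $mcsUser and $mcsPass instead.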

We can save the item and we are ready.

Testing the pipeline

Now everything should be in place and we can test our pipeline by making a change in our code.

So far, we haven't actually implemented the API on MCS. We only wrote some code that passes our tests, but our API isn't working as it should.
We should also write tests to make sure the actual API works, but that's a topic for another blog post.

For now, we will implement one of the API endpoints using the HR object created before:

First of all, at the top of hrAPI.js we reference our HR object:

var HR = require("./hr").HR;
var hr = new HR();

We use that object in our API:

service.get('/mobile/custom/HR/employees', function(req, res) {
    var result = hr.employees;
    var statusCode = 200;
    res.status(statusCode).send(result);
});
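The hire operation from our unit tests could be exposed in much the same way. The sketch below replaces the MCS-provided service router and the HR object with minimal stand-ins so it runs on its own; the POST path and the fake request/response objects are assumptions for illustration only:

```javascript
// Minimal stand-in for the HR object from hr.js.
var hr = {
    employees: [],
    hire: function(employee) {
        employee.salary = 10000;
        employee.vacation = {available: 20, balance: 20, taken: 0};
        this.employees.push(employee);
        return employee;
    }
};

// Minimal stand-in for the Express-style router MCS passes to custom code.
var routes = {};
var service = {
    post: function(path, handler) { routes[path] = handler; }
};

// The handler itself mirrors the GET implementation above.
service.post('/mobile/custom/HR/employees', function(req, res) {
    var newEmployee = hr.hire(req.body);
    res.status(201).send(newEmployee);
});

// Exercise the route with a fake request/response pair.
routes['/mobile/custom/HR/employees'](
    {body: {firstName: "Tom", lastName: "Tomke", department: "Finance"}},
    {status: function(code) { this.code = code; return this; },
     send: function(result) { console.log(this.code, result.salary); }}
);
```

In the real hrAPI.js, only the service.post block would be added; the hr object is already available from the require at the top of the file.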

Once we commit these changes, it should trigger a build in Jenkins.
When the job is finished, we can look at the console output of the job and see what happened:

Started by user Yannick Ongena
Building in workspace D:\Program Files (x86)\Jenkins\workspace\HrAPI
 > git.exe rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git.exe config remote.origin.url https://yongena@bitbucket.org/yongena/hrapi.git # timeout=10
Fetching upstream changes from https://yongena@bitbucket.org/yongena/hrapi.git
 > git.exe --version # timeout=10
using GIT_ASKPASS to set credentials
 > git.exe fetch --tags --progress https://yongena@bitbucket.org/yongena/hrapi.git +refs/heads/*:refs/remotes/origin/*
Seen branch in repository origin/master
Seen 1 remote branch
 > git.exe tag -l # timeout=10
Checking out Revision b74dcd8f7fdd43f18de84ee76eedd3c5c8fd4086 (origin/master)
Commit message: "version"
 > git.exe config core.sparsecheckout # timeout=10
 > git.exe checkout -f b74dcd8f7fdd43f18de84ee76eedd3c5c8fd4086
 > git.exe rev-list b74dcd8f7fdd43f18de84ee76eedd3c5c8fd4086 # timeout=10
[HrAPI] $ cmd /c call C:\WINDOWS\TEMP\jenkins5478395335972590795.bat

D:\Program Files (x86)\Jenkins\workspace\HrAPI>npm install
npm WARN package.json hr@1.0.0 No repository field.
npm WARN package.json hr@1.0.0 No README data
npm WARN package.json hr@1.0.0 No license field.
[HrAPI] $ cmd /c call C:\WINDOWS\TEMP\jenkins4768277725396900554.bat

D:\Program Files (x86)\Jenkins\workspace\HrAPI>npm test

> hr@1.0.0 test D:\Program Files (x86)\Jenkins\workspace\HrAPI
> jasmine-node specs

..

Finished in 0.006 seconds
2 tests, 4 assertions, 0 failures, 0 skipped

[HrAPI] $ cmd /c call C:\WINDOWS\TEMP\jenkins7848470485361423407.bat

D:\Program Files (x86)\Jenkins\workspace\HrAPI>mcs-deploy toolsConfig.json -u **** -p ****
Warning: Configuration property "proxy" is undefined
To display help and examples associated with warnings, use the --verbose option
Deployment completed successfully
Finished: SUCCESS

You can easily see that the different build steps are executed and when they are all successful, our code is deployed to MCS within minutes of our commit.

Next Steps

The above scenario shows the initial pipeline between Jenkins and MCS, but we are far from finished.
In order to support a good CI/CD pipeline, we also need the following items:

  • Integration tests: these tests will call and validate the individual API endpoints we created on MCS. They are similar to unit tests, but they test whether our code works well in the context of the MCS environment
  • Local MCS sandbox: the MCS tooling is not just a toolset for deploying; it also comes with a framework to simulate certain aspects of MCS, so we can run our code on our local machine as if it were deployed to MCS. This is useful if you make calls to platform APIs like notifications, storage and connectors. These things are also covered by the integration tests, but it's good to be able to test them locally before deploying.
  • Branching support: in some cases you want to branch out certain features of the API. In these cases you want to work with different versions of an API. We can create a pipeline that supports this by using different branches in the code repository, where each branch uses a different toolsConfig file so we can deploy to a different API.
  • End-to-end tests: so far we have isolated our pipeline to MCS only; however, when we build a mobile application, we want to be able to test that application whenever we make a change to our API, so we are sure we aren't breaking anything for our users. This is done using end-to-end testing.
  • Load testing: another form of testing is the load test. We need to make sure our environment performs well under load. These tests don't need to run every time we commit, but they should run once in a while, and they can be configured in our pipeline as well.

I will write separate blog posts for some of the items above, after which I will update this post with links.

Conclusion

Continuous Integration and Delivery is an important aspect of modern development. It caters to the business need to go to market as soon as possible. By adopting a CI/CD paradigm, the quality of your products will improve significantly, as developers receive feedback as soon as they make changes. Because most of the actions are automated, human error is eliminated as much as possible.

By providing this tooling for Mobile Cloud Service, Oracle enables its users to benefit from the advantages of modern development ideas like Test Driven Development and Continuous Integration/Delivery.

Yannick Ongena

