
Best Practices from Oracle Development's A‑Team

Using Oracle Developer Cloud Service for semi-automated environment promotion handoffs

Continuous Integration and Continuous Delivery (CICD) promise great rewards if your organization can embrace the significant challenges involved.

However, traditional production environment management practice is often favoured in enterprise environments, because other concerns are much more important to the enterprise than the "latest features".

Here I am going to discuss a way to bring some of the automation rewards of CICD all the way to the production environment for such an enterprise. I am going to focus on using Oracle Developer Cloud Service's (DevCS) feature set to enable this in a relatively straightforward manner.

Organizational assumptions

Typically, production and (often) pre-production environments are managed by an organization other than the development team building the software targeted at them. We'll call the production environment management team "Operations", and the non-production environment team "Development".

As an aside, DevOps is usually considered the practice of having the development team own all environments associated with their software, including production - merging Development and Operations into a single unit for that software piece.

Within DevCS, we'll establish two projects: one for the day-to-day development, and one for the Operations team to use to manage the production systems. Projects within DevCS have distinct membership (though they all share the same Oracle Cloud identity domain), which means the Development members won't, by default, have access to the Operations project (a temporary grant may be needed to set up this system, but it can be revoked once everything is working).

Development process

Development will have a "Development integration server" and maybe a "System integration environment" where code from various sources can be brought together for testing and further promotion. There may be "throwaway" environments for immediate unit testing as well.

Development uses CICD automation to build and deliver software through a CICD-like pipeline, ultimately delivering tested and verified software to some system prior to production.

The build flow is modelled in the DevCS development project, including deployment tasks and artifact creation and tracking. DevCS is able to fully track artifacts from a build in its Maven-structured repository.

Development then needs to deliver the software package to the Operations team, who will take it forward into the production environments. This step is often fraught with difficulty, as Operations has typically deployed the software package using traditional manual methods. Worse, for a reasonably paced CICD cycle, they might have to action this relatively frequently.

Automation would significantly reduce the risks associated with this step, as well as potentially providing an audit trail of activity and artifacts all the way to production.

Cross-boundary deliverables

A typical deployment promotion will be composed of many parts, and usually those parts will need to move together as a unit to the next environment. They are tagged as versions, with artifact tracking in the CICD system, for future identification and attribution.

Moving this from development to production should then be a matter of using the version identifiers already recorded in the development environment to identify the artifacts to promote. Automation should be able to take this list of artifacts and use deployment scripts to promote them into production.

Walking through an example

The diagram below sketches a potential workflow, demonstrating how the artifact IDs are captured in one system and communicated to the other, and how people and automation work together to accomplish the deployment task.

Flow diagram

The first step is to create some kind of tracking ticket which can be referred to, containing this list of artifact identifiers. This ticket will probably be hosted in the Development project, since Development is far more cognisant of when a build fulfils the promotion criteria. We can use DevCS custom ticket fields to enumerate the artifacts we wish to deploy. In the screenshot below, we have defined a custom field "Artifact versions" and populated it with a comma-separated list of Maven artifact coordinates. The task id in this case is 1, so we would report "1" to the Operations team.

Example tracking ticket
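For illustration, a populated "Artifact versions" field might hold a value along these lines (the coordinates below are made up, not from a real project):

com.example.myapp:frontend:1.4.2,com.example.myapp:backend-service:1.4.2,com.example.myapp:config:1.4.2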

The second step is to trigger a simple build job in the Operations DevCS project that retrieves the ticket data, and acts upon it by triggering further build and deploy steps in the Operations environment. This build would be manually triggered by an Operations team member in coordination with the Development team. Rather than communicating a complex list of artifacts, the Developer simply gives Operations the issue number, and the automation does the rest. The below screenshot shows such a build task.

Reading and triggering builds

Here we review an example build script that can retrieve a DevCS ticket from another project sharing the same identity domain, and use the custom field data to trigger a suite of build jobs in the local DevCS project.

It comprises two Python scripts. The first, runbuild.py, is run directly by the Operations DevCS build job; it uses the second, triggersubbuild.py, to launch build jobs in the Operations DevCS with appropriate parameters. Both scripts use the DevCS REST API to accomplish their tasks, so they could trivially be written in any language; I use Python 3 for convenience.


Script runbuild.py

import requests
import os
import argparse
import sys

parser = argparse.ArgumentParser(description='Fetch a DevCS issue and trigger downstream builds',
                                 epilog='''
                                 The issue should have custom fields populated with maven artifact
                                 coordinates to be passed to the individual build jobs''')
parser.add_argument("--issueuser", dest='issueuser', default=os.environ.get('DEVCS_USER', 'missing'),
                    help='DevCS User for reading issues, defaults to DEVCS_USER in the environment')
parser.add_argument("--issuepasswd", dest='issuepasswd', default=os.environ.get('DEVCS_PASSWD', 'missing'),
                    help='DevCS passwd for reading issues, defaults to DEVCS_PASSWD in the environment')
parser.add_argument("--jobuser", dest='jobuser', default=os.environ.get('DEVCS_USER', 'missing'),
                    help='DevCS User for launching jobs, defaults to DEVCS_USER in the environment')
parser.add_argument("--jobpasswd", dest='jobpasswd', default=os.environ.get('DEVCS_PASSWD', 'missing'),
                    help='DevCS passwd for launching jobs, defaults to DEVCS_PASSWD in the environment')
parser.add_argument("--iddomain", dest='iddomain', help='DevCS identity domain', required=True)
parser.add_argument("--region", dest='region', help='DevCS region (us2 or em2 usually)', required=True)
parser.add_argument("--org", dest='org', help='DevCS organization', required=True)
parser.add_argument("--issueproject", dest='issueproject',
                    help='DevCS project (typically a name and code number: test_20181) with the issue',
                    required=True)
parser.add_argument("--jobproject", dest='jobproject',
                    help='DevCS project (typically a name and code number: test_20181) to send build jobs to',
                    required=True)
parser.add_argument("--issueid", dest='issueid',
                    help='DevCS issue ID to read from the project containing custom issue fields',
                    default=os.environ.get('ISSUE_ID', '0'))
cmdargs = parser.parse_args()

# Map each issue custom field to the build job that should be triggered for its artifacts
issuefieldmap = {
    'artifact_versions': 'triggerbuild'
}

# Session authenticated against the project holding the tracking issue
issuesession = requests.Session()
issuesession.auth = (cmdargs.issueuser, cmdargs.issuepasswd)

issueurl = 'https://developer.{a.region}.oraclecloud.com/' \
           '{a.org}-{a.iddomain}/rest/{a.org}-{a.iddomain}_{a.issueproject}' \
           '/issues/v2/issues/{a.issueid}'.format(a=cmdargs)

# Read the issue, including its custom fields
resp = issuesession.get(issueurl, params={"fields": "customFields,description,summary"},
                        headers={'content-type': 'application/json'})
if resp.status_code != 200:
    print("Received error {} from endpoint when reading issue {}, please check arguments and try again"
          .format(resp.status_code, cmdargs.issueid))
    print("=====")
    print(resp.content)
    sys.exit(1)
print("Received issue response successfully.")
data = resp.json()
print("Issue summary: {}".format(data['issue']['summary']))
print("Issue description: {}".format(data['issue']['description']))
print("Dispatching jobs")
# For each configured field, split the comma-separated artifact list and dispatch one sub-build per artifact
for field, job in issuefieldmap.items():
    artifacts = data['issue']['customFields'][field]
    if artifacts is None:
        continue
    for artifact in artifacts.split(','):
        os.system('python3 triggersubbuild.py --iddomain {a.iddomain} '
                  '--region {a.region} --org {a.org} '
                  '--project {a.jobproject} --job {job} '
                  'JOBTYPE={field} ARTIFACT={artifact}'.format(a=cmdargs, job=job, field=field, artifact=artifact))
print("Job dispatch complete")


This script reads an arbitrary list of custom fields from the issue (defined in issuefieldmap), attempts to parse each value as a comma-separated list of Maven-style artifact coordinates, and then triggers a new build job for each artifact using the second script, passing it the appropriate parameters from the ticket (ARTIFACT and JOBTYPE).
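To make that loop concrete, here is a minimal, self-contained sketch with made-up data: an issue whose artifact_versions field holds two comma-separated coordinates results in two triggersubbuild.py dispatches. The coordinates, identity domain, organization, and project names below are illustrative only.

# Minimal sketch of the dispatch loop in runbuild.py, using made-up example data
issuefieldmap = {'artifact_versions': 'triggerbuild'}
issue = {
    'customFields': {
        'artifact_versions': 'com.example.myapp:frontend:1.4.2,com.example.myapp:backend-service:1.4.2'
    }
}

for field, job in issuefieldmap.items():
    artifacts = issue['customFields'][field]
    if artifacts is None:
        continue
    for artifact in artifacts.split(','):
        # runbuild.py shells out to triggersubbuild.py at this point; the sketch just prints the command
        print('python3 triggersubbuild.py --iddomain iddomain1234 --region us2 --org dummyorg99999 '
              '--project test_98765 --job {} JOBTYPE={} ARTIFACT={}'.format(job, field, artifact))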


Script triggersubbuild.py

import os
import requests
import argparse
import sys

# StoreDictKeyPair for turning a=b into a dict from https://stackoverflow.com/a/42355279
class StoreDictKeyPair(argparse.Action):
    def __init__(self, option_strings, dest, nargs=None, **kwargs):
        self._nargs = nargs
        super(StoreDictKeyPair, self).__init__(option_strings, dest, nargs=nargs, **kwargs)

    def __call__(self, parser, namespace, values, option_string=None):
        my_dict = {}
        for kv in values:
            k, v = kv.split("=")
            my_dict[k] = v
        setattr(namespace, self.dest, my_dict)

parser = argparse.ArgumentParser(description='Trigger a DevCS build',
                                 epilog='Job parameters are supplied as key=value pairs to the command line')
parser.add_argument("--user", dest='user', default=os.environ.get('DEVCS_USER', 'missing'),
                    help='DevCS User, defaults to DEVCS_USER in the environment')
parser.add_argument("--passwd", dest='passwd', default=os.environ.get('DEVCS_PASSWD', 'missing'),
                    help='DevCS passwd, defaults to DEVCS_PASSWD in the environment')
parser.add_argument("--iddomain", dest='iddomain', help='DevCS identity domain', required=True)
parser.add_argument("--region", dest='region', help='DevCS region (us2 or em2 usually)', required=True)
parser.add_argument("--org", dest='org', help='DevCS organization', required=True)
parser.add_argument("--project", dest='project',
                    help='DevCS project (typically a name and code number: e.g. test_20181)', required=True)
parser.add_argument("--job", dest='job', help='DevCS Job name to trigger', required=True)
parser.add_argument("properties", metavar='KEY=VALUE', action=StoreDictKeyPair, nargs=argparse.REMAINDER,
                    help='KEY=VALUE pairs to pass to the Job as parameters')
cmdargs = parser.parse_args()

# Hudson "scheduleWithParams" endpoint for the named job in the target project
url = 'https://developer.{a.region}.oraclecloud.com/' \
      '{a.org}-{a.iddomain}/s/{a.org}-{a.iddomain}_{a.project}/hudson/rest/' \
      'projects/{a.org}-{a.iddomain}_{a.project}.{a.job}/scheduleWithParams'.format(a=cmdargs)

session = requests.Session()
session.auth = (cmdargs.user, cmdargs.passwd)

# Job parameters are sent as multipart form fields; a 204 response means the job was scheduled
params = {k: (None, v) for k, v in cmdargs.properties.items()}
r = session.post(url=url, files=params)
if r.status_code != 204:
    print("An error occurred, job {} with parameters {} not submitted, status code {}".
          format(cmdargs.job, cmdargs.properties, r.status_code))
    print("=====")
    print(r.content)
    sys.exit(1)

print("Job {} submitted successfully with parameters: {}".format(cmdargs.job, cmdargs.properties))
sys.exit(0)

This script uses the DevCS Hudson API to schedule a build job with the specified name and parameter values.
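For reference, a standalone invocation (outside of runbuild.py) would look something like the line below. All of the values shown are illustrative, and the credentials are picked up from DEVCS_USER and DEVCS_PASSWD in the environment unless --user/--passwd are supplied.

python3 triggersubbuild.py --iddomain iddomain1234 --region us2 --org dummyorg99999 \
    --project test_98765 --job triggerbuild \
    JOBTYPE=artifact_versions ARTIFACT=com.example.myapp:frontend:1.4.2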

Build jobs

Let's review how the build jobs themselves are set up in DevCS.

Firstly, there are some common parameters for these jobs. Best practice suggests that you should always use parameters to define credentials, so we will always have DEVCS_USER (String type) and DEVCS_PASSWD (Password type) parameters.

Secondly, the Python above takes a deconstructed DevCS URL as arguments to the scripts. Here's how to deconstruct your URL:

URL parts

Highlighted with blue circles are the key parts of the URL:

  • the region is us2
  • the organization is dummyorg99999
  • the iddomain is iddomain1234
  • the project is test_98765
  • the job is triggerbuild

You'll want to ensure you use the correct values for your own build jobs.
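Reassembling those example values into the URL templates used by the two scripts gives the endpoints they will call. These are shown purely for illustration; in practice the issue lives in the Development project and the job in the Operations project, so the project segments will differ.

Issues endpoint read by runbuild.py (issue 1 in our example):
https://developer.us2.oraclecloud.com/dummyorg99999-iddomain1234/rest/dummyorg99999-iddomain1234_test_98765/issues/v2/issues/1

Hudson scheduling endpoint called by triggersubbuild.py for the "triggerbuild" job:
https://developer.us2.oraclecloud.com/dummyorg99999-iddomain1234/s/dummyorg99999-iddomain1234_test_98765/hudson/rest/projects/dummyorg99999-iddomain1234_test_98765.triggerbuild/scheduleWithParams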

Primary build job

For the main job being triggered by the Operations person, there will be one additional parameter: ISSUE_ID (String type).

Build parameters

The build step itself invokes the runbuild.py script listed above. Typically one would have a dedicated git repository set up to contain this and the other script, and use that for the workspace.

Build steps
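As an illustration, a shell build step along the following lines would run it. The identity domain, region, organization, and job project are the example values from the URL section above, the Development issue project name is hypothetical, and the credentials default to the DEVCS_USER and DEVCS_PASSWD job parameters exposed in the environment.

python3 runbuild.py --iddomain iddomain1234 --region us2 --org dummyorg99999 \
    --issueproject development_12345 --jobproject test_98765 \
    --issueid "$ISSUE_ID"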

Example triggered build job

For a build job being triggered by the task above, one could have ARTIFACT and JOBTYPE parameters (both String type), and this job would have a couple of build steps. The first would retrieve the artifact from the Development Maven repository (where the artifact is stored). Subsequent build steps may be required to accomplish specific deployment tasks, or a pre-defined deploy task could be used (to use a pre-defined deploy task, you would need to capture the artifact into the Operations DevCS by using the "archive artifact" post-build option).

Here is a sample "mavensettings.xml" file you can use to refer to an artifact repository and run a "get" operation against it.

<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 https://maven.apache.org/xsd/settings-1.0.0.xsd">
  <localRepository/>
  <interactiveMode/>
  <usePluginRegistry/>
  <offline/>
  <pluginGroups/>
  <servers>
    <server>
      <id>developmentdevcs</id>
      <username>${env.DEVCS_USER}</username>
      <password>${env.DEVCS_PASSWD}</password>
      <configuration>
        <wagonProvider>httpclient</wagonProvider>
        <httpConfiguration>
          <all>
            <usePreemptive>true</usePreemptive>
          </all>
        </httpConfiguration>
      </configuration>
    </server>
  </servers>
  <mirrors>
  </mirrors>
  <proxies/>
  <profiles/>
  <activeProfiles/>
</settings>

This would be used with the maven configuration, running org.apache.maven.plugins:maven-dependency-plugin:2.8:get as the goal.

Example maven get build step

The above image shows what a successfully configured "get" build step looks like. It'll store the artifact it downloads as "output.jar".
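Run outside the DevCS UI, the equivalent command would look roughly like the following. The repository URL is a placeholder for your Development project's Maven repository, and ARTIFACT is the job parameter described above.

mvn --settings mavensettings.xml \
    org.apache.maven.plugins:maven-dependency-plugin:2.8:get \
    -DremoteRepositories=developmentdevcs::default::<Development project Maven repository URL> \
    -Dartifact="$ARTIFACT" \
    -Ddest=output.jar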

Short term workaround note

There is a small issue with the DevCS build environment wherein authentication is handled specially when running on DevCS. Unfortunately, for cross-project operations such as those outlined here, you may receive a "forbidden" error (HTTP status code 403), but only when running on DevCS. This is solved by enabling preemptive authentication, as shown in the sample above.
