
Best Practices from Oracle Development's A‑Team

Development Patterns in Oracle Sales Cloud Application Composer, Part 2

Introduction

In Part 1 of this post (http://www.ateam-oracle.com/development-patterns-in-oracle-sales-cloud-application-composer-part-1), we used the experiences of three tradespersons – a plumber, a carpenter, and an electrician – to make the case that planning is critical before building or remodeling a house. During their brief hypothetical conversation while working on a job site, they convinced each other that formal planning, normally executed in building construction projects by drafting blueprints, ensures that all of the individual sub-systems work together efficiently in the completed house. We used the gist of their conversation to reinforce the necessity of planning in software development, especially when mapping out how all of the individual components of a software development project will work together, optimizing, as much as possible, the relationships among the various components. This kind of planning is as much a fundamental requirement for successful software development projects as blueprints are for building construction.

Laying out the structural framework for more complex software development projects greatly increases the odds of successful outcomes. This should come as no surprise, but nonetheless planning is often given short shrift or even ignored entirely.  Also generally accepted, but also occasionally ignored, is the practice of not re-inventing the wheel with every new project. With both building construction and software development, it would be redundant to start the planning and design stages of every new project from scratch. Normally, proven and optimized patterns are available to jumpstart planning and design, and they should be utilized whenever possible. In fact, the main goal of Part 1 was to suggest a framework for how global functions and trigger functions could interact in an Oracle Sales Cloud (OSC) Application Composer extensibility project, using the Oracle Documents Cloud Service (DOCS) as a representative integration target.

The plan for Part 2 (even blog posts benefit from plans!) is to continue exploring the relationships between the global function library we started to build and other extensibility artifacts, adding new features to the extensibility project with an additional trigger function, again using the Oracle Documents Cloud Service (now in version 1.1 of the REST API) as the integration target. If they are designed correctly, the global functions should be able to support the addition of new features without major refactoring.

Availability of New REST Resource in DOCS: Folder Sharing

Think of the global function library as similar to the building’s foundation. With the foundation in place, the fundamental means of interacting with Oracle Documents Cloud Service REST services from Sales Cloud is ready to support the superstructure, which in the case of Application Composer usually takes the form of object trigger functions. With working trigger functions (covered in Part 1 of the post) that support three outbound calls to the Oracle Documents Cloud Service – (1) creating a dedicated document folder when a new opportunity is created, (2) renaming the folder if and when the opportunity is renamed, and (3) deleting the folder with its content if the opportunity is deleted -- the extensions are functional/usable and can at least be tested end-to-end even if they are not quite ready for a production rollout.

One set of new features added to version 1.1 of the DOCS REST API allows for programmatic sharing of folders and documents. To round out the integration with Sales Cloud, we would like to take advantage of the new feature by writing a trigger function that adds or removes contributors from the DOCS folder set up for the opportunity whenever a new team member is added to or removed from the opportunity. Adding this new trigger function will be an additional test of how well the global functions are designed. If we can implement the trigger function with minimal effort, it is a good sign that the global functions have been built correctly.

Implementing the New Feature

As a refresher, below is the sequence of how the global functions work together when making a generic REST call from an object function or object trigger function:

  1. Prepare request payload if required.  Required format: Map<String, Object>
  2. Call global function: callRest(String restHttpMethod, String urlExtension, Map requestPayload)
    1. Inside callRest function: use map2Json function to convert requestPayload Map to JSON format.
  3. General error checking after function returns (e.g. check that response is returned in the expected format)
  4. Process response: json2Map(responsePayload) converts JSON response to Map
  5. Detailed error checking based on response content
  6. Process converted Map content returned in response as needed (function-specific)
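The sequence above can be sketched as a short Groovy fragment. This is illustrative only: it assumes the global functions described in Part 1 (callRest, json2Map) and uses a hypothetical folder-creation payload and errorCode convention consistent with the trigger functions shown later in this post; error handling is deliberately minimal.

```groovy
// Step 1: prepare the request payload as a Map<String, Object>
def reqPayload = [name: 'NewFolder', description: 'Opportunity folder']
// Step 2: call the global function (POST in this hypothetical example)
def respPayload = adf.util.callRest('POST', '/folders/self', reqPayload)
// Step 3: general error checking -- was a response returned at all?
if (respPayload) {
  // Step 4: convert the JSON response to a Map
  def respMap = adf.util.json2Map(respPayload)
  // Step 5: detailed error checking based on response content
  if (respMap.errorCode != 0) {
    println 'REST call failed with errorCode ' + respMap.errorCode
  } else {
    // Step 6: process the converted Map content as needed,
    // e.g. capture the GUID of the newly created folder
    def folderGuid = respMap.id
  }
}
```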

In Sales Cloud Application Composer, the steps necessary for adding folder sharing support can be used as a prototype for adding virtually anything exposed in a REST API. Being able to address multiple sets of requirements is part of the advantage of having well-crafted global functions.

Below are the steps for designing and incorporating the new feature, including details and discussion of options:

  1. Identify the affected Sales Cloud object in the transaction. In this case we know we are working with an Opportunity or something related to an Opportunity (either a parent, a child, or another object with a defined relationship).
  2. Decide on the best-fit trigger event. Normally all but a few of the sixteen or so triggers available can be eliminated.
  3. Create the trigger function and write the code. This step may require a few iterations.
  4. Add error handling code. How this step is implemented will depend on whether or not any supporting global functions exist for error handling.

Identify the affected Sales Cloud object in the transaction that will trigger the outbound web service call.  Typically, it is easiest to work from the top down to identify the object for which to write the trigger function. For example, in the folder sharing case, the known top-level object is Opportunity. In OSC Release 8/9 there are eleven candidate objects, consisting of four child objects and seven objects with defined one-to-many relationships. (The lists of child and related objects are shown on the overview page for the top-level object.) It is obvious, as it will be in the vast majority of cases, that the object of interest is Opportunity Team Member. The trigger will be fired whenever a team member is added to or removed from the parent object. In the small number of cases where it may be impossible to isolate just one object, opening up the candidate child or related objects and examining the fields should lead to identifying (or eliminating) the object as the candidate for a trigger function.

Decide on the best-fit trigger event. For Release 8 and Release 9, refer to Table 1 for the available object triggers.

Table 1: Object Trigger Events for Groovy Function Code*

Event             Fires When?
After Create      New instance of an object is created
Before Modify     Field is initially modified in an existing row
Before Invalidate Field is initially modified in an existing object, or when a child row is created, removed, or modified
Before Remove     Attempt is made to delete an object
Before Insert     Before a new object is inserted
After Insert      After a new object is created
Before Update     Before an existing object is modified in the database
After Update      After an existing object is modified in the database
Before Delete     Before an existing object is deleted in the database
After Delete      After an existing object is deleted in the database
Before Commit     Before pending changes are committed
After Commit      After pending changes are committed
Before Rollback   Before pending changes are rolled back
After Rollback    After pending changes are rolled back
After Changes     After all changes have been posted but before the transaction is committed

*NOTE: Not all trigger events are exposed for every object. Minor variations exist across application containers and object types.

There are a number of behavioral caveats around the use of trigger events. Chief among them: whether or not the trigger function code performs data updates dictates which event is appropriate. For example, if a function updates a field value, it makes no sense to do so in any of the “After…” events, as the main database transaction will already have taken place. From a performance perspective, this forces another set of transactions to the database, which is bad enough, but to add insult to injury, all validation code runs another time needlessly. In the worst case, triggers could be called repeatedly, resulting in an endless loop, were it not for the safeguards App Composer has in place to prevent this from happening.

In some cases business logic will help make an informed choice of the best trigger event. For example, it makes little sense to add an opportunity team member to a DOCS folder as a contributor unless it is certain that the database transaction which adds or deletes the team member completes successfully. Since the team member trigger function is not making any data updates, it is not only safe, but also logical, to use one of the “After…” events.

Create the trigger function and write the code. Obviously this step will probably take up the bulk of the effort. To ease the amount of work required, look at what the global function(s) require for input parameters as well as what the global functions return. Structurally and functionally, that discovery process, in conjunction with the business need, will dictate a large part of what the trigger function needs to accomplish.

Below is the code for the new trigger function that adds a folder contributor:

println 'Entering AddFolderContributorTrigger'
def docFolderGuid = nvl(Opportunity?.DocFolderGuid_c, '')
if (docFolderGuid) {
  def restParamsMap = adf.util.getDocCloudParameters()
  // prepare URL extension
  def urlExt = '/shares/' + docFolderGuid
  // prepare request payload
  def userGUID = adf.util.getDocCloudServiceUserGUID(Name)
  def reqPayload = [userID: (userGUID), role: 'Contributor', message: 'adding you to Opportunity folder']
  // make REST call (this is POST method) and save response payload
  def respPayload = adf.util.callRest('POST', urlExt, reqPayload)
  // convert JSON to Map for ease of handling individual attributes
  def respMap = adf.util.json2Map(respPayload)
  //TODO: better error checking required here
  def errorCode = respMap.errorCode
  if (errorCode != 0) {
    // error occurred
  } else {
    println 'Team member successfully added as contributor'
  }
} else {
  println 'Opportunity folder has not been created for ' + Opportunity?.Name
}
println 'Exiting AddFolderContributorTrigger'

By leveraging the global functions, the object trigger script to add a contributor to a DOCS folder when a new opportunity team member is added amounts to only about a dozen lines of substantive code. The script, after checking to see if a DOCS folder exists, obtains a DOCS user GUID by querying the service with a REST call. Then a URL extension string is built and a Map of required key:value pairs is populated, both of which are fed to the global callRest function. The JSON response from the function is converted to a Map and rudimentary error checking is performed.

Below is the code for the new trigger function that removes an existing folder contributor:

println 'Entering RemoveFolderContributorTrigger'
def docFolderGuid = nvl(Opportunity?.DocFolderGuid_c, '')
if (docFolderGuid) {
  def restParamsMap = adf.util.getDocCloudParameters()
  // prepare URL extension
  def urlExt = '/shares/' + docFolderGuid + '/user'
  // prepare request payload
  def userGUID = adf.util.getDocCloudServiceUserGUID(Name)
  def reqPayload = [userID: (userGUID), role: 'Contributor', message: 'removing you from Opportunity folder']
  // make REST call (this is DELETE method) and save response payload
  def respPayload = adf.util.callRest('DELETE', urlExt, reqPayload)
  // convert JSON to Map for ease of handling individual attributes
  def respMap = adf.util.json2Map(respPayload)
  //TODO: better error checking required here
  def errorCode = respMap.errorCode
  if (errorCode != 0) {
    // error occurred
  } else {
    println 'Team member successfully removed from DOCS folder'
  }
} else {
  println 'Opportunity folder has not been created for ' + Opportunity?.Name
}
println 'Exiting RemoveFolderContributorTrigger'

The script to remove a folder contributor is also less than a dozen lines of code, and relies upon the global functions in the same way as the add contributor script.  Obviously, the REST DELETE method is specified instead of using a POST, as per the DOCS REST specifications.

One additional function to obtain a specific user GUID, or unique id, from DOCS is needed.  This function takes a search string, representing a user name, as input, and after making a REST call into the DOCS users resource, returns the user GUID.  Below is the code for the function:

println 'Entering getDocCloudServiceUserGUID'
def returnGUID = ''
// prepare URL extension
def urlExt = '/users/items?info=' + searchString
// no request payload
def reqPayload = [:]
// make REST call (this is GET method) and save response payload
def respPayload = adf.util.callRest('GET', urlExt, reqPayload)
// convert JSON to Map for ease of handling individual attributes
def respMap = adf.util.json2Map(respPayload)
//TODO: better error checking required here
def errorCode = respMap.errorCode
if (errorCode != 0) {
   // error occurred
   println 'DocCloudService error; errorCode ' + errorCode
} else {
   // get user GUID
   returnGUID = respMap.items[0].get('id')
}
println 'Exiting getDocCloudServiceUserGUID'
return returnGUID

It may make the most sense to create this as an object function under the Opportunity object, or perhaps as a global function.  The differences are minor, and function location is a matter of developer preference.

Add error handling code. Given the simple integration architecture set up for this example -- a SaaS Sales Cloud instance making REST calls into a PaaS Documents Cloud Service instance -- admittedly there are not many options available, other than reporting that something bad happened, when unexpected errors occur at runtime. In an environment where user actions – for example saving a new opportunity – trigger synchronous outbound web service calls, interrupting the user experience by blocking the database transaction may not be optimal.

The error handling options are few: (1) continue with the Sales Cloud transaction, in this case completing the create or edit of an Opportunity object, (2) back out of the Sales Cloud transaction if any failures are detected in the web service calls, or (3) take a hybrid approach and give the user a certain degree of control over what to do after an error.  Due to the non-critical nature of the transactions between Sales Cloud and DOCS in this example, reporting the error and moving on suffices.  If there is a need to create a DOCS folder for an Opportunity after the fact, an Action button could be added that calls into the same global functions with the same logic as the object trigger functions.
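Had option (2) been chosen instead, a "Before..." trigger could block the Sales Cloud transaction by raising a validation exception, a common App Composer pattern. The sketch below is illustrative only (this post uses option (1)); it assumes the respMap/errorCode response conventions used in the trigger functions above.

```groovy
// Sketch of option (2): back out of the Sales Cloud transaction on failure.
// Throwing a ValidationException from a "Before..." trigger rolls back the
// pending changes and surfaces the message to the user.
def respMap = adf.util.json2Map(respPayload)
if (respMap.errorCode != 0) {
  throw new oracle.jbo.ValidationException(
    'Could not update the DOCS folder for ' + Opportunity?.Name +
    ' (errorCode ' + respMap.errorCode + '); changes were not saved.')
}
```

The trade-off is user experience: blocking the save guarantees Sales Cloud and DOCS stay in sync, but it makes the Opportunity transaction dependent on the availability of an external service.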

Summary

Planning out what work is done in global functions and what gets done in object trigger scripts, if done correctly, can lead to major efficiencies when adding new features to an existing extensibility project. This example used existing global functions that make REST calls from Sales Cloud to Documents Cloud Service to implement support for maintaining a group of DOCS folder contributors as team members are added or removed from the Opportunity team. Due to prior planning and following guidelines laid out in Part 1 of this post, object trigger functions were extremely lightweight and were added to the extensibility project with minimal effort.
