Best Practices from Oracle Development's A‑Team

Load Data into Oracle HCM Cloud Using ICS


First of all, I would like to thank my A-Team colleague Christian Weeks for providing a few ingenious work-arounds for some limitations in the current version of ICS (16.4.5). These work-arounds are discussed in detail in later sections of this article. I would also like to thank another A-Team member, Angelo Santagata, who provided assistance with the proof of concept effort behind this article, and my manager, Pete Farkas, for reviewing this article and making it more reader friendly.


The requirement to integrate HCM (Human Capital Management) Cloud with Oracle ICS (Integration Cloud Service) comes up often with various customers. Oracle ICS can function as a central hub that takes in external data feeds, formats them correctly for HCM Cloud, and automates the final upload of the external data into HCM Cloud via HCM Data Loader. With all feeds flowing through ICS, customers gain greater visibility into the end-to-end data flow between HCM Cloud and external systems, and they can respond quickly to exceptions. This greatly reduces the integration cost. This post demonstrates one way to achieve this integration in ICS and covers the following topics:

  • Use case description
  • Overview of HCM Data Loader
  • Detailed description of the ICS integration implemented

There are two additional posts related to this subject: one on the native schema (NXSD) for the HCM HDL data format, and one on preparing HCM Talent data objects.

All three posts together provide a complete end-to-end tutorial for someone new to ICS and HCM, allowing them to get started creating ICS integration applications with HCM.

Use Case

Oracle HCM manages persons (workers) of an organization. Each person in HCM has an associated profile. A person's profile can contain all kinds of information relevant to the person and the organization they belong to. Different categories of information in a profile are organized as Content Sections. Each content section is associated with a Content Type. One example of a content section (or content type) is "Competencies", which describes a person's various competency levels.

In our use case, a custom content type called "Additional Qualifications" has been created in HCM. This content type has three content items defined: Artist, Athlete and Photographer. The use case in this example calls for adding additional qualifications to a person's profile (see image below).


In this example, these additional qualifications come from a 3rd party data source available in XML, and need to be uploaded in batch into HCM Cloud. In addition, the upload process should start with an XML input message as described in sample-input-payload.xml.

Overview of HCM Data Loader

Oracle HCM Cloud provides a Data Loader feature. It allows data files in HDL (HCM Data Loader) format to be imported into HCM. An HDL data file contains metadata that describes the actions HCM should take on the data, as well as the attributes of the data objects in the file. You can see a sample below of an HDL data file that contains the data objects used in this demonstration.


As one can see, the HDL format uses | (vertical bar) as a field separator. Lines 1 and 3 are metadata lines; lines 2 and 4 describe two data objects. Line 1 indicates that the object type is TalentProfile (second field), followed by the names of the data object attributes. Line 3 does the same for an input object of type "ProfileItem". This HDL data file tells HCM to add a content item (Artist) as a ProfileItem to the TalentProfile object associated with the person who has a PersonNumber of 98 (data attributes specified in the second line).
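If the sample image is unavailable, an HDL file for this use case would look roughly like the following. The exact attribute lists come from the HDL business-object documentation, so the columns shown here are illustrative only:

```
METADATA|TalentProfile|PersonNumber|ProfileCode
MERGE|TalentProfile|98|PERS_98
METADATA|ProfileItem|PersonNumber|ProfileCode|ContentType|ContentItem
MERGE|ProfileItem|98|PERS_98|Additional Qualifications|Artist
```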

There are two ways to get the data file uploaded and imported into HCM: manually or programmatically. The manual approach uses the HCM web-based UI, which is useful for checking the validity of data attributes and values. Our plan is to cover the HCM web UI tool relevant to this POC in a future post.

The programmatic approach covered here uses the importAndLoadData operation of the HCM SOAP API. The importAndLoadData operation schedules an Import and Load HCM File Data process in HCM. Before this process can run, the HDL data file must already exist in UCM storage. Uploading the HDL data file is accomplished by calling the UCM Generic SOAP Service.

For more information on HCM Data Loader, please check out the following documentation:

SOAP Web Services for Oracle HCM Cloud

HCM Data Loader Overview (blog)

HCM Data Loader: Business Object Documentation (Doc ID 2020600.1) (Login to Oracle Support site required)

For more information on the UCM Generic SOAP Service:

UCM Web Services for Integration

Automating File Import/Export (blog)

UCM Services as Web Services (blog)


Logical Steps

Before we get into the implementation details of the ICS integration, let's take a logical look at what steps are required to implement our use case.

  1. Accept an XML document as input
  2. Convert the input XML data into HDL format data
  3. Zip the HDL format data (HCM Data Loader requires a zipped file)
  4. Base64 encode the zipped HDL format data (Base64 encoding is required to transmit binary data in a SOAP/XML web service)
  5. Call the UCM Generic SOAP Service to upload the zip file content (the Base64 encoded zipped data from the prior step)
  6. Call the HCM Data Loader SOAP Service to schedule an Import and Load HCM File Data process
  7. Return the HCM process ID to the ICS caller
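Steps 2 through 4 can also be sketched outside of ICS. Below is a minimal Python sketch of building the HDL content, zipping it, and Base64 encoding it. The HDL attribute names are illustrative, and the real integration performs these steps with ICS activities, not Python:

```python
import base64
import io
import zipfile

def build_hdl(person_number, content_type, content_item):
    """Render a minimal TalentProfile/ProfileItem HDL payload.
    The attribute columns are illustrative, not authoritative."""
    lines = [
        "METADATA|TalentProfile|PersonNumber|ProfileCode",
        f"MERGE|TalentProfile|{person_number}|PERS_{person_number}",
        "METADATA|ProfileItem|PersonNumber|ProfileCode|ContentType|ContentItem",
        f"MERGE|ProfileItem|{person_number}|PERS_{person_number}|{content_type}|{content_item}",
    ]
    return "\n".join(lines) + "\n"

def zip_and_encode(hdl_text):
    """Zip the HDL text as TalentProfile.dat (the fixed name HDL expects)
    and Base64-encode the archive for transmission in a SOAP payload."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("TalentProfile.dat", hdl_text)
    return base64.b64encode(buf.getvalue()).decode("ascii")

encoded = zip_and_encode(build_hdl(98, "Additional Qualifications", "Artist"))
```

The `encoded` string is what ultimately travels inside the UCM check-in request in step 5.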

ICS Connections

ICS integrations usually start with connections. There are four connections used in this exercise; see the image below. The blank spaces are caused by removing the customer name.



ACME_SOAP_TalentProfile_Input Connection: Trigger


This connection is used as the trigger in the start node of the integration. Notice that security is set to username token, which is required to log in to ICS. The WSDL file used in this connection can be accessed at talentprofile_service_ep.wsdl.

ACME_POC-FA-HCM-Conn Connection: Invoke


This connection is used to call the importAndLoadData operation of the HCM Data Loader SOAP API. Its WSDL URL points to an HCM Cloud instance (hcmhostname.acme.com) in Oracle's demo environment. Basic authentication must be configured at the bottom of the screen in order to connect to the HCM instance successfully.

ACME_POC-FA-UCM-Conn Connection: Invoke


This connection is used to call the GenericSoapPort service of UCM to upload the data file to UCM. Basic authentication must be configured. The WSDL URL points to the UCM instance (ucmhostname.acme.com) associated with the HCM instance. Notice the slight difference in their host names.

ACME_POC-FTP Connection: Invoke


This FTP connection is used as a work-around for Base64 encoding (a solution found by my colleague Christian). Details will be discussed in the next section. The basic idea is that, after HDL data is zipped, the resulting binary data is written out to an FTP server. Then the same data is immediately read back as opaque data. The FTP adapter automatically Base64 encodes all opaque data and thus accomplishes our goal.

In the recently released 17.1.3 version of ICS, there is a new transformation function to encode and decode data in Base64. But I have not had a chance to try it out to see if it can be leveraged to make this FTP adapter work-around obsolete.

Security must also be configured although it is not shown in the image.



The ICS Integration Details


Please see the above image for the entire integration flow. The orchestration flow generally matches the logic flow described earlier. Details of each node are discussed in sequence in the following section.

1 addTalentProfileData

This start node of the integration is based on the ACME_SOAP_TalentProfile_Input connection (trigger). After the wizard completes, the summary page should look like the following:


A sample input XML document is at sample-input-payload.xml.

2 AssignFileName

There are two file names involved with HCM Data Loader. The first is the zip file name, which can be any name with a zip extension. The second is the name of the actual data file contained in the zip file: HCM Data Loader defines a fixed file name for each data object, and in our case the data file name must be TalentProfile.dat.

In our implementation, the zip file name has a pattern of "TPyyyymmddhhmmss". The zip extension is appended in a later step. See TP201722718564.zip for a sample of the generated zip file.
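For illustration, the file-name pattern can be sketched in Python (the actual integration builds this name inside ICS with an assignment; the sketch zero-pads each field, while the sample name above is unpadded):

```python
from datetime import datetime

def make_zip_name(now=None):
    """Return a name like TP20170227180506 (the ".zip" extension is appended later)."""
    now = now or datetime.now()
    return "TP" + now.strftime("%Y%m%d%H%M%S")
```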


3 writeInputAsHDLFormat (Mapper)

This third step maps the input XML data to a full XML data set that contains the additional metadata labels required by HDL. A deeper look at translating XML to HDL format with NXSD is planned for a future post. Below is a picture of the mapper UI. The resulting full XML data set can be viewed at sample-input-transformed-with-label.xml.


4 writeInputAsHDLFormat

The fourth step uses an early adopter feature called Stage File.

The Stage File activity allows read, write, and zip operations on files local to the ICS instance. When reading and writing files, Stage File allows translation of file content between XML and a native format via a native schema file (.nxsd). For common native formats such as CSV (Comma Separated Values), Stage File supplies a mapping tool for drag-and-drop mapping.

At this step of the integration, Stage File is used to write the full XML data set with labels from step 3 to a temporary file, with a translation defined by hcm-talentprofile.nxsd. The resulting HDL format data file looks like TalentProfile.dat. Notice that the file name is fixed to TalentProfile.dat, as required by HCM Data Loader.

Due to the complexity of the HDL format, the native schema file hcm-talentprofile.nxsd had to be created manually. Details of this nxsd file will be discussed in a future post.


5 zipHDLFile

As the name indicates, this step uses the same Stage File to zip the TalentProfile.dat created in step 4.


Work-around for Base64 Encoding

The Stage File activity supports a binary read operation, so it would be ideal to use the read operation of Stage File with an opaque nxsd to encode the file content into Base64 text. Unfortunately, Stage File in its early adopter version does not work with an opaque nxsd. My colleague Christian found a work-around: using the FTP adapter to read and encode binary data with an opaque nxsd. Of course, the drawback here is the introduction of an external FTP server into the system.


6 ftpSendZippedHDLFile (Mapper)

At this point of the integration, a zip file has been created on the ICS instance's local drive and is ready to be sent to the FTP server. This step maps the file reference from step 5 to the input XML data for the FTP adapter in step 7.


7 ftpSendZippedHDLFile

Step 7 sends the zipped HDL file from the ICS local drive to the FTP server. Notice that the File Name Pattern is set; in our implementation, however, this default file name pattern is overridden by the file name in the input XML data set in step 6.


8 ftpReadZippedHDLFileBase64 (Mapper)

Step 8 maps the file name and directory from the output of Step 7 to the input of Step 9.


9 ftpReadZippedHDLFileBase64

Step 9 does the actual read operation. It encodes zipped binary data with Base64 encoding as configured in the opaque.nxsd.


10 UploadHDLFileToUCM (Mapper)

At this point of the integration, a Base64 encoded text representing the zipped HDL data is ready to be uploaded to UCM. This step creates the XML payload required to invoke UCM Generic Soap Service to check in (upload) the file.


The most important fields are:

  • dDocName: its value is used as content ID for this document. Content ID is required in invoking HCM importAndLoadData SOAP service.
  • Contents: this field contains the actual Base64 encoded HDL data

Other fields contain mostly static data.

A sample of the resulting SOAP payload should look like sample-ucm-payload.xml.
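For readers without access to the sample file, the check-in request follows the UCM GenericRequest structure along the lines of the sketch below. The field values, and the dDocAccount and dSecurityGroup settings, are illustrative assumptions rather than values copied from the actual integration:

```
<ucm:GenericRequest webKey="cs" xmlns:ucm="http://www.oracle.com/UCM">
  <ucm:Service IdcService="CHECKIN_UNIVERSAL">
    <ucm:Document>
      <!-- dDocName becomes the Content ID used later by importAndLoadData -->
      <ucm:Field name="dDocName">TP20170227180506</ucm:Field>
      <ucm:Field name="dDocTitle">TalentProfile HDL upload</ucm:Field>
      <ucm:Field name="dDocType">Document</ucm:Field>
      <ucm:Field name="dSecurityGroup">FAFusionImportExport</ucm:Field>
      <ucm:Field name="dDocAccount">hcm$/dataloader$/import$</ucm:Field>
      <ucm:File name="primaryFile" href="TP20170227180506.zip">
        <ucm:Contents><!-- Base64 encoded zip content goes here --></ucm:Contents>
      </ucm:File>
    </ucm:Document>
  </ucm:Service>
</ucm:GenericRequest>
```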


If you complete steps 10 and 11 below and run a test to upload a file to UCM, you will get an error message from UCM like the following:

Content item 'UCMFA1846155' was not successfully checked in. A content type cannot contain spaces

Again, Christian diagnosed the cause and found a work-around. If this integration is exported and expanded into a folder, you can open the mapping file (.xsl) defined in step 10. The mapping file contains mappings similar to the following:

<nstrgmpr:Field xml:id="id_33">
   <xsl:attribute name="name" xml:id="id_34">dDocName</xsl:attribute>
   <xsl:value-of select="fn:upper-case($ReadZippedFile/nsmpr2:SyncReadFileResponse/nsmpr5:FileReadResponse/nsmpr5:FTPResponseHeader/nsmpr5:fileName)" xml:id="id_55"/>
</nstrgmpr:Field>


It turns out that the attribute tag adds extra blank spaces and a line break to the resulting XML. For some reason, the UCM Generic SOAP service does not tolerate these extra characters. The work-around is to remove the attribute tag and set the attribute directly, like the following:

<nstrgmpr:Field xml:id="id_33" name="dDocName">
   <xsl:value-of select="fn:upper-case($ReadZippedFile/nsmpr2:SyncReadFileResponse/nsmpr5:FileReadResponse/nsmpr5:FTPResponseHeader/nsmpr5:fileName)" xml:id="id_55"/>
</nstrgmpr:Field>

After the modifications are made to all Field tags, the folder should be repackaged (using jar). The new package can then be imported back into ICS; you can choose to override the original integration or create a new one.

11 UploadHDLFileToUCM

Step 11 executes a SOAP call to UCM to upload the zip file.


12 ScheduleImportProcessHcm (Mapper)

This step prepares the XML payload for invoking the HCM Data Loader SOAP service. The required Content ID value is taken from the dDocName field of the UCM XML payload. A sample of the generated XML payload is at sample-hcm-payload.xml.
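As a rough illustration, the request carries the Content ID along the lines of the sketch below. The element names and namespace are assumptions, not taken from the actual integration; verify them against your instance's HCM Data Loader WSDL:

```
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:typ="(types namespace from the HCM Data Loader WSDL)">
  <soap:Body>
    <typ:importAndLoadData>
      <!-- Content ID = the dDocName value from the UCM check-in -->
      <typ:ContentId>TP20170227180506</typ:ContentId>
      <typ:Parameters/>
    </typ:importAndLoadData>
  </soap:Body>
</soap:Envelope>
```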


13 ScheduleImportProcessHcm

Now we can execute the Data Loader SOAP call.


14 addTalentProfileData (Mapper)

Finally, map the process ID returned from HCM to response XML payload.


15 addTalentProfileData (End node)

This node ends the entire integration and returns response message to caller. A sample response message looks like sample-hcm-response-message.xml.



Testing the ICS Integration

Any SOAP client can be used to test the ICS integration. I use SOAPUI as my test client. The WSDL file location should be at


A sample request message should look like sample-ics-integration-request.xml.

Notice the username token and timestamp settings in the security section of the SOAP message. In SOAPUI, the username and password must be configured first. Then use the popup menu to add a WSS username token and WS timestamp as shown below.


Verify Data Loading in HCM

To verify that data is actually loaded into HCM and associated with the right person, we need to access the HCM web UI. For one of the HCM instances in Oracle's demo environment, the URL for the HCM web console should look like this:


Log in as an HR Specialist (betty.anderson), bring up the Navigator menu, and select Profiles. See the image below.


Search for the person targeted in the data load and you will see that the "Additional Qualifications" section has been added to that person's profile.




This blog post highlights the various techniques that can be used to insert data into HCM using ICS. It is worth noting, however, that this technique should only be used for sending small amounts of data into HCM; the approach documented above should not be used for bulk loading of data.

Related Blog Posts

Based on the experience I gained from this POC exercise, I realized that creating an NXSD for the HCM HDL format requires knowledge beyond familiarity with drag-and-drop mapping. The blog post "Native Schema for Oracle HCM Cloud HDL Data Format" goes into more detail on the NXSD schema used in this ICS integration.

Another important area of knowledge is understanding and managing HCM Talent Data Objects. As mentioned earlier, we plan to publish another blog post that will document basic knowledge and procedures in preparing Talent Data in HCM. We hope they will help jump start any future ICS and HCM related projects.


All artifacts mentioned in this post are available upon request.

Additional Information


The ResourceSystemOwner ID used in the sample data must exist in HCM. You can use the default value (FUSION). To create your own ID, follow the instructions at the link below.

Defining the Source-System Owner: Procedure
