Fusion HCM Cloud – Bulk Integration Automation Using Managed File Transfer (MFT) and Node.js

August 18, 2015 | 9 minute read

Introduction

Fusion HCM Cloud provides a comprehensive set of tools, templates, and pre-packaged integrations to cover various scenarios using modern and efficient technologies. One of these patterns is bulk integration, used to load data into and extract data from the cloud.

The inbound tool is File-Based Loader (FBL), which is evolving into HCM Data Loader (HDL). HDL is a powerful tool for bulk-loading data from any source into Oracle Fusion Human Capital Management (Oracle Fusion HCM). HDL supports both one-time data migration and incremental loads to support co-existence with Oracle applications such as E-Business Suite (EBS) and PeopleSoft (PSFT).

HCM Extracts is an outbound integration tool that lets you select HCM data, gather it from the HCM database, and archive it as XML. This archived raw XML data can then be converted into a desired format and delivered to recipients over supported channels.

HCM Cloud uses Oracle WebCenter Content, a component of Fusion Middleware, to store and secure data files for both inbound and outbound bulk integration patterns.

Oracle Managed File Transfer (Oracle MFT) enables secure file exchange and management with internal systems and external partners. It protects against inadvertent access to unsecured files at every step of an end-to-end file transfer. It is easy to use, especially for non-technical staff, so you can leverage more resources to manage file transfers. Extensive built-in reporting capabilities let you quickly check the status of a file transfer and resubmit it as required.

Node.js is a platform for running JavaScript on the server, outside the browser. It enables real-time, two-way connections in web applications with push capability, using a non-blocking, event-driven I/O paradigm. Node.js is built on an event-driven, asynchronous model: incoming requests do not block; each request is handed off to an asynchronous callback handler, which frees the main thread to respond to more requests.

This post focuses on how to automate HCM Cloud batch integration using Managed File Transfer (MFT) and Node.js. MFT can receive files, encrypt/decrypt them, and invoke Service Oriented Architecture (SOA) composites for various HCM integration patterns.

 

Main Article

 

Managed File Transfer (MFT)

Oracle Managed File Transfer (MFT) is a high-performance, standards-based, end-to-end managed file gateway. It features design, deployment, and monitoring of file transfers using a lightweight web-based design-time console that includes file encryption, scheduling, and embedded FTP and sFTP servers.

Oracle MFT provides built-in compression, decompression, encryption and decryption actions for transfer pre-processing and post-processing. You can create new pre-processing and post-processing actions, which are called callouts.

Callouts can be associated with either the source or the target. During a transfer, processing actions execute in the following sequence:

  1. Source pre-processing actions
  2. Target pre-processing actions
  3. Payload delivery
  4. Target post-processing actions

Source Pre-Processing

Source pre-processing is triggered right after a file has been received and a matching Transfer has been identified. This is the best place to validate the file, compress/decompress it, encrypt/decrypt it, and/or extend MFT.

Target Pre-Processing

Target pre-processing is triggered just before the file is delivered to the Target by the Transfer. This is the best place to send files to external locations over protocols not supported by MFT.

Target Post-Processing

Target post-processing occurs after the file is delivered. This is the best place for notifications, analytics/reporting, or renaming the file on the remote endpoint.

For more information, please refer to the Oracle MFT documentation.

 

HCM Inbound Flow

This is a typical Inbound FBL/HDL process flow:

inbound_mft

The FBL/HDL process for HCM is a two-phase web service process:

  • Upload the data file to WCC/UCM using the WCC GenericSoapPort web service
  • Invoke “LoaderIntegrationService” or “HCMDataLoader” to initiate the loading process (a Node.js sketch of both calls is shown below)
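
Both calls can be scripted directly from Node.js. The sketch below is a minimal outline only, not the mft2hcm implementation: the host name, endpoint paths, envelope builders (buildCheckinEnvelope, buildLoaderEnvelope) and the hard-coded content ID are illustrative assumptions to adapt to your environment and the service WSDLs, and a real client must also satisfy the OWSM message protection policy described later in this post.

// upload-and-load.js – minimal sketch of the two-phase FBL/HDL flow.
// Host, paths and envelope contents are illustrative assumptions; real calls
// must also satisfy the OWSM wss11 message protection policy.
var https = require('https');

var HOST = 'hcm-cloud.example.com'; // assumed HCM Cloud host

// Post a SOAP envelope and resolve with the raw response body.
function postSoap(path, envelope) {
  return new Promise(function (resolve, reject) {
    var req = https.request({
      host: HOST,
      path: path,
      method: 'POST',
      headers: {
        'Content-Type': 'text/xml; charset=utf-8',
        'Content-Length': Buffer.byteLength(envelope)
      }
    }, function (res) {
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () { resolve(body); });
    });
    req.on('error', reject);
    req.end(envelope);
  });
}

// Step 1: check the data file into WCC/UCM (hypothetical envelope builder;
// real field names must come from the GenericSoapPort WSDL, using the
// hcm/dataloader/import account).
function buildCheckinEnvelope(fileName, base64Content) {
  return '<!-- SOAP envelope for the UCM check-in goes here -->';
}

// Step 2: ask HCM to load the uploaded content (hypothetical envelope builder).
function buildLoaderEnvelope(contentId) {
  return '<!-- SOAP envelope for LoaderIntegrationService / HCMDataLoader -->';
}

function uploadAndLoad(fileName, base64Content) {
  return postSoap('/idcws/GenericSoapPort', buildCheckinEnvelope(fileName, base64Content))
    .then(function (checkinResponse) {
      var contentId = 'UCMFA000123'; // in practice, parse the content ID from checkinResponse
      return postSoap('/hcmCommonBatchLoader/LoaderIntegrationService',
                      buildLoaderEnvelope(contentId));
    });
}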

The following diagram illustrates the MFT steps with respect to “Integration” for FBL/HDL:

inbound_mft_2

HCM Outbound Flow

This is a typical outbound batch Integration flow using HCM Extracts:

extractflow

 

The “Extract” process for HCM has the following steps:

  • An Extract report is generated in HCM, either by a user or through the Enterprise Scheduler Service (ESS) – this report is stored in WCC under the hcm/dataloader/export account.
  • The MFT scheduler pulls the file(s) from WCC.
  • The data file(s) are either uploaded to the customer’s sFTP server as a pass-through, or handed to integration tools such as Service Oriented Architecture (SOA) for orchestration and processing into target applications in the cloud or on premises.

The following diagram illustrates the MFT orchestration steps in “Integration” for Extract:

 

outbound_mft

 

The extracted file can be delivered to the WebCenter Content server, and HCM Extracts can generate an encrypted output file. In the Extract delivery options, ensure the following are configured correctly:

  • Set the HCM Delivery Type to “HCM Connect”
  • Select one of the four supported Encryption Modes, or select None
  • Specify the Integration Name – this value is used to build the title of the entry in WebCenter Content

 

Extracted File Naming Convention in WebCenter Content

The file will have the following properties:
Author: FUSION_APPSHCM_ESS_APPID
Security Group: FAFusionImportExport
Account: hcm/dataloader/export
Title: HEXTV1CON_{IntegrationName}_{EncryptionType}_{DateTimeStamp}
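
Because the title encodes the integration name, encryption type and timestamp, an automation script can identify the entry it needs by splitting the title on its underscore separators. A minimal Node.js sketch; the sample title value is invented for illustration:

// parse-title.js – sketch: split a WebCenter Content title that follows
// HEXTV1CON_{IntegrationName}_{EncryptionType}_{DateTimeStamp}.
function parseExtractTitle(title) {
  var parts = title.split('_');
  if (parts.length < 4 || parts[0] !== 'HEXTV1CON') {
    return null; // not an HCM Extract entry
  }
  return {
    integrationName: parts[1],
    encryptionType: parts[2],
    dateTimeStamp: parts.slice(3).join('_')
  };
}

// Sample (invented) title value:
console.log(parseExtractTitle('HEXTV1CON_MyPayrollFeed_PGPUNSIGNED_20150818120000'));

Note that an integration name containing underscores would need a more careful split than this simple sketch.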

 

Fusion Applications Security

The content in WebCenter Content is secured through users, roles, privileges and accounts. The user could be any valid user with a role such as “Integration Specialist.” The role may have privileges such as read, write and delete. The accounts are predefined by each application. For example, HCM uses hcm/dataloader/import and hcm/dataloader/export for inbound and outbound files, respectively.
The FBL/HDL web services are secured through Oracle Web Service Manager (OWSM) using the following policy: oracle/wss11_saml_or_username_token_with_message_protection_service_policy.

The client must satisfy the message protection policy to ensure that the payload is encrypted or sent over the SSL transport layer.

A client policy that can be used to meet this requirement is: "oracle/wss11_username_token_with_message_protection_client_policy"

To use this policy, the message must be encrypted with a public key provided by the server; when the message reaches the server, it is decrypted with the server's private key. The certificate is imported into a keystore, which is then referenced in the client code.

The public key can be obtained from the certificate provided in the service WSDL file.

Encryption of Data File using Pretty Good Privacy (PGP)

All data files are transmitted over the network via SSL. In addition, HCM Cloud supports encryption of data files at rest using PGP.
Fusion HCM supports the following types of encryption:

  • PGP Signed
  • PGP Unsigned
  • PGPX509 Signed
  • PGPX509 Unsigned

To use this PGP encryption capability, a customer must exchange encryption keys with Fusion so that the following are possible (a customer-side sketch follows the list):

  • Fusion can decrypt inbound files
  • Fusion can encrypt outbound files
  • Customer can encrypt files sent to Fusion
  • Customer can decrypt files received from Fusion
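
On the customer side, the last two items are typically handled with a standard OpenPGP tool such as GnuPG. Below is a minimal sketch, invoked from Node.js for consistency with the rest of this post; the key ID and file names are placeholders, and the Fusion public key is assumed to have already been imported into the local keyring:

// pgp-sketch.js – customer-side PGP steps driven from Node.js via GnuPG.
// The key ID and file names are placeholders.
var execFile = require('child_process').execFile;

// Encrypt a file before sending it to Fusion (Fusion decrypts it on arrival).
function encryptForFusion(inFile, outFile, done) {
  execFile('gpg', ['--encrypt', '--recipient', 'fusion-hcm-key',
                   '--output', outFile, inFile], done);
}

// Decrypt a file received from Fusion (encrypted with the customer's public key).
function decryptFromFusion(inFile, outFile, done) {
  execFile('gpg', ['--decrypt', '--output', outFile, inFile], done);
}

encryptForFusion('Worker.dat.zip', 'Worker.dat.zip.pgp', function (err) {
  if (err) { console.error('Encryption failed:', err); }
});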

 

MFT Callout using Node.js

 

Prerequisites

To automate HCM batch integration patterns, the following components must be installed and configured:

 

Node.js Utility

A simple Node.js utility, “mft2hcm”, has been developed to be invoked from an MFT server callout: it uploads files to or downloads files from the Oracle WebCenter Content server and initiates the HCM SaaS loader service. It utilizes the node "mft-upload" package and provides SOAP substitution templates for WebCenter Content (UCM) and the Oracle HCM loader service.

Please refer to the “mft2hcm” node package for installation and configuration.

RunScript

RunScript is configured as “Run Script Pre 01”, a callout that can be injected into MFT pre- or post-processing. This callout always passes the following default parameters to the script (a sketch of reading them follows the list):

  • Filename
  • Directory
  • ECID
  • Filesize
  • Targetname (not for source callouts)
  • Sourcename
  • Createtime
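
A callout script can read these parameters when it starts. The sketch below assumes they arrive as simple name=value command-line arguments; how they are actually delivered depends on the “PreRunScript” callout configuration, so treat this as an assumption to verify against that package's documentation:

// read-callout-params.js – sketch of a script consuming the default callout
// parameters, assuming name=value command-line arguments.
var params = {};
process.argv.slice(2).forEach(function (arg) {
  var idx = arg.indexOf('=');
  if (idx > 0) {
    params[arg.substring(0, idx)] = arg.substring(idx + 1);
  }
});

console.log('Processing %s (%s bytes) from %s, ECID %s',
            params.Filename, params.Filesize, params.Directory, params.ECID);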

Please refer to “PreRunScript” for more information on installation and configuration.

 

MFT Design

MFT Console enables the following tasks depending on your user roles:

Designer: Use this page to create, modify, delete, rename, and deploy sources, targets, and transfers.

Monitoring: Use this page to monitor transfer statistics, progress, and errors. You can also use this page to disable, enable, and undeploy transfer deployments and to pause, resume, and resubmit instances.

Administration: Use this page to manage the Oracle Managed File Transfer configuration, including embedded server configuration.

Please refer to the MFT Users Guide for more information.

 

HCM FBL/HDL MFT Transfer

This is a typical MFT transfer design and configuration for FBL/HDL:

MFT_FBL_Transfer

The transfer can be designed with additional steps, such as compressing the file and/or encrypting/decrypting it using PGP, depending on the use case.

 

HCM FBL/HDL (HCM-MFT) Target

The MFT server receives files from any source protocol, such as SFTP, SOAP, the local file system, or a back-end integration process. The file can be decrypted, uncompressed, or validated before a Source or Target pre-processing callout uploads it to UCM and then notifies HCM to initiate the batch load. Finally, the original file is backed up to the local file system, a remote SFTP server, or a cloud-based storage service. Upon successful completion, an optional notification can also be delivered to the caller using a Target post-processing callout.

This is a typical target configuration in the MFT-HCM transfer:

Click on target Pre-Processing Action and select “Run Script Pre 01”:

MFT_RunScriptPre01

 

Enter the “scriptLocation” where the "mft2hcm" node package is installed, for example <Node.js-Home>/hcm/node_modules/mft2hcm/mft2hcm.js.

MFTPreScriptUpload

 

Do not check “UseFileFromScript”. This property replaces the MFT inbound (source) file with the file produced by the target execution; in FBL/HDL, the target execution response does not contain a file.

 

HCM Extract (HCM-MFT) Transfer

An external event or scheduler triggers the MFT server to search for a file in WCC using a search query. Once a document ID is identified, the file is retrieved using a “Source Pre-Processing” callout, which injects it into the MFT Transfer. The file can then be decrypted, validated, or decompressed before being sent to an MFT Target over any protocol, such as SFTP, file system, SOAP web service, or a back-end integration process. Finally, the original file is backed up to the local file system, a remote SFTP server, or a cloud-based storage service. Upon successful completion, an optional notification can also be delivered to the caller using a Target post-processing callout. The MFT server can live either on premises or in a cloud iPaaS-hosted environment. (A sketch of the WCC search call is shown below.)
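
The search itself is a GenericSoapPort call that runs the GET_SEARCH_RESULTS service with a QueryText filter built from the naming convention described earlier. The sketch below only builds the request envelope; the element names, query operator and any required security headers are assumptions to verify against your WCC instance before use:

// build-search-request.js – sketch: build a GenericSoapPort request that asks
// WCC for HCM Extract entries by title. Verify element names, the query
// operator and security requirements against your environment's WSDL.
function buildSearchEnvelope(integrationName) {
  // Universal query syntax; CDATA keeps the <substring> operator XML-safe.
  var queryText = 'dDocTitle <substring> `HEXTV1CON_' + integrationName + '`';
  return [
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"',
    '                  xmlns:ucm="http://www.oracle.com/UCM">',
    '  <soapenv:Body>',
    '    <ucm:GenericRequest webKey="cs">',
    '      <ucm:Service IdcService="GET_SEARCH_RESULTS">',
    '        <ucm:Document>',
    '          <ucm:Field name="QueryText"><![CDATA[' + queryText + ']]></ucm:Field>',
    '          <ucm:Field name="SortField">dInDate</ucm:Field>',
    '          <ucm:Field name="SortOrder">Desc</ucm:Field>',
    '        </ucm:Document>',
    '      </ucm:Service>',
    '    </ucm:GenericRequest>',
    '  </soapenv:Body>',
    '</soapenv:Envelope>'
  ].join('\n');
}

console.log(buildSearchEnvelope('MyPayrollFeed'));

The search response includes fields such as dID and dDocName that identify the document, which the callout can then retrieve (typically via a GET_FILE request) before injecting it into the transfer.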

This is a typical configuration of HCM-MFT Extract Transfer:

MFT_Extract_Transfer

 

In the Source definition, add “Run Script Pre 01” processing action and enter the location of the script:

MFTPreScriptDownload

 

“UseFileFromScript” must be checked because the source scheduler is triggered with the mft2hcm payload (UCM-PAYLOAD-SEARCH) to initiate WCC's search and get operations. Once the file is retrieved from WCC, this flag tells the MFT engine to substitute the source file with the file downloaded from WCC.

 

Conclusion

This post demonstrates how to automate HCM inbound and outbound patterns using MFT and Node.js. The Node.js package could be replaced with WebCenter Content native APIs and SOA for orchestration. This process can also be replicated for other Fusion Applications pillars such as Oracle Enterprise Resource Planning (ERP).

Jack Desai

A product strategist and solution/enterprise integration architect focused on innovating and automating complex integration patterns with Oracle SaaS applications.

