Using File Based Loader for Fusion Product Hub

File Based Loaders (FBL) offer a wide range of options to import batch data, either manually through user interaction or automatically via locally scheduled processes using existing APIs and Web Services. This article highlights the Fusion Product Hub specific capabilities for importing item data in batches via FBL. A more generic article about File Based Loaders can be found here.

The current FBL solution for Fusion Product Hub covers the following customer scenarios:

  • Manual (UI based) upload and import of item data to Fusion Product Cloud or Fusion Product Hub on-premise instances
  • Automated loader and import processes for item data to Fusion Product Cloud or Fusion Product Hub on-premise instances

This article describes a technical implementation that can be used the same way with Fusion Product Hub Cloud and with on-premise installations. It also covers some basic and necessary functional setup aspects. Please note that item import via FBL doesn't replace the other Product Hub solutions for data import, such as item batch imports via Fusion Desktop Integration. It can rather be seen as an additional offering for item imports.

Release Notes

Please note the following changes to this article since its initial publication in March 2015:

  • The section below called Optional: Checklist Security Setup in UCM (on-premise only) has been obsolete since Release 12, as the Entitlement Server (APM) is no longer part of the Fusion Apps tech stack and the described APM policies no longer exist.
  • The information in the UCM Security Setup section below (i.e. Account and Security Group in UCM) has changed since R12, as it was related to APM roles and is no longer relevant.

Main Article

File Based Loader for Fusion Product Hub uses standard technologies and components from the Fusion Apps technology stack on the backend, both in the cloud and on-premise. It's not necessary to install extra components or products in addition to Fusion Product Hub.

The figure below visualizes the available product data load options for manual (user interaction through portal or desktop integration) and automatic (Web Services, APIs) scenarios. This blog will explain how to use the various features.


Customers need only a thin technology footprint in their client environment to use the capabilities of FBL for Fusion Product Hub. The following runtime components and tools are sufficient to create a connection to FBL for uploading item data and triggering scheduling jobs:

  • Java Development Kit (JDK) 1.8.x
  • JDeveloper 12c
  • WebCenter Content Document Transfer Utility for Oracle Fusion Applications (free of charge utility available on Oracle Technology Network)

Especially for cloud customers, this footprint eliminates the need to install additional server components in their data center, while Fusion Apps on-premise or Fusion Middleware customers can leverage their existing infrastructure to run the FBL related client programs and tools.

FBL can be seen as an additional integration point, with the option to provide item loader data in Fusion Content Server (UCM) for further import processing. These tasks can be done as manual interactions (occasional item loads) or, alternatively, as automated tasks via scripts and APIs. Details are explained in the following sections:

  • Common functional setup for successful item imports
  • Loading data to Fusion Content Server
  • Initiating an item load scheduled job

Note: Other item import capabilities using Desktop Integration co-exist with the current FBL offering and remain an alternative import option for on-premise customers.

Part I: Functional Setup for Fusion Product Hub Item Loader

This blog will not cover all aspects of the functional setup. Instead, we'll give a basic introduction to a functional setup that is generic and equally valid for other item definitions. Fusion Product Hub offers a set of capabilities for defining custom item structures according to customer needs.

In a first step, after receiving information describing the item structure and validations, an authorized user creates custom attributes by running the setup task Manage Attribute Groups and Attributes as shown in the screenshot above. This step is optional and only needs to be carried out if attributes other than those available out of the box (operational attributes) in Product Hub are required.

Attribute Groups consist of attributes that describe specific features of an item. Attribute values can be validated by value sets or by more complex, coded validations. All these definitions are stored in an internal metadata repository called Extensible Flexfields (EFF).

Once these Attributes and Attribute Groups have been defined, they can be assigned to Item Classes as shown below. New items loaded via FBL belong to dedicated item classes after import.

Before running an item import, we must create a mapping between the import item structure and the equivalent item class in Fusion Product Hub. Once defined, we save the Import Map and refer to it later in the loader process.

The screenshot below shows a sample mapping in an overview page. The mapping process consists of assigning columns in the CSV structure to attributes defined per item.

The import structure for mapping is derived from a sample CSV file loaded to Fusion Product Hub. The first line (header) of the CSV file describes the columns in the import structure that have to be mapped to the target structure. This can be done via the UI by dragging target fields to source fields.

Last but not least, an Item Import runs in the context of a Spoke System in Fusion Product Hub. If it doesn't already exist, it must be created and assigned to an Item Organization. Every import job started via FBL must refer to a spoke system.

The functional setup for the FBL Item Import as shown above doesn't differ from any other item import, such as Desktop Integration. This is usually a one-time activity per import structure. The functional setup is complete after finishing the previous tasks.

Part II: Loading Product Data to Fusion Content Server

FBL leverages the Universal Content Management (UCM) server coming with Fusion Product Hub for storing import files. It's usually available under the following URL:


Customers can either use the FBL UI for occasional data loads or set up machine-to-machine communication instead. This chapter gives an overview of the folder structures, basic security structures and functionality in UCM for placing loader files into a staging area for further processing, covering both variants: manual and automated loader tasks.

Manual Steps

The login page of the Fusion Content Server is available via the URL above.

In our demo system we're using a Fusion Product Hub identity named PIMQA.

This user PIMQA is assigned to the roles shown below. These roles ensure that all permissions required to run File Based Loader are given.

File Based Loader requires two files to be available in UCM:

  • A data file in CSV format containing the item information
  • A manifest file defining the import mapping used (see above) and the path/name of the file containing the item data

Both files must exist and be accessible in Fusion Content Server before triggering the loader job. The screenshot below shows a sample of item data in CSV format.
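Purely as an illustration (the actual columns are defined by your own import map, so all column names and values here are hypothetical), such an item data file could look like this, with the header line naming the source columns to be mapped:

```
ITEM_NUMBER,ITEM_DESCRIPTION,COLOR,WEIGHT
AS-1001,Steel bracket,Silver,0.25
AS-1002,Steel bracket long,Silver,0.31
```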

As stated above, a manifest file describes the file and import mapping information for a dedicated item load, as shown below.

The staging area for Fusion Product Hub is predefined as /Contribution Folders/PIM.

New files must be uploaded into that folder via the menu item New Item. The screenshot below provides further details. The field Account must be filled with the correct value for accessibility permissions; in this FBL for Fusion Product Hub sample we used scm$/item$/import$. This account is a seeded value and can be used for this purpose. Users can set up their own accounts and use them instead. It's also possible to use Security Groups instead of Accounts when using UI-based file upload. More details about the security mechanisms are explained in the next section below.

Once all files have been uploaded, either manually or automatically via scripts, the required files reside in the UCM folder and the item load job can be triggered. The screenshot below shows a sample.

When double-checking the properties of the uploaded files, the screenshot below shows a typical setup (the meaning of Security Group and Account is explained further down in this document):

  • Folder: /Contribution Folders/PIM
  • Security Group: FAImportExport
  • Account: scm$/item$/import$

As soon as these files have been uploaded and the correct data has been provisioned, the UCM part of FBL is done and we can proceed to the next step.

Optional: Checklist Security Setup in UCM (on-premise only)

Normally there is no requirement to modify or extend the UCM security setup. The security features described above (i.e. Security Group, Account etc.) are supposed to exist. However, for troubleshooting it might be good to have a quick checklist of the UCM security options needed by FBL for Fusion Product Hub. Full documentation can be found on the product documentation site here.

The following relationships between Users, Roles, Policies and Resources exist:

  • UCM resources like Security Groups and Accounts define access to various folders and files
  • These resources are grouped in APM Policies
  • APM Policies are assigned to Application Roles
  • Application Roles are assigned to Fusion Product Hub Users

The best option to check the security setup is to use a privileged user like FAAdmin. Obviously that won't work in Fusion Product Cloud. When using cloud services, it's recommended to submit a Service Request in case of doubt that security options might not be set correctly.

After logging in as a privileged user, open the Administration sub-page. A list of Admin Applets appears in the right window pane after activating Administration -> Admin Applets (see below).

The applet User Admin shows the user PIMQA we're using in this sample as an external user. This means the user is registered in Fusion Identity Management. Only a few UCM built-in users are marked as local. Usually it's neither necessary nor recommended to touch any of these entries via this applet.

The screenshot below shows more details of our sample user.

Furthermore, it might be useful to show where the UCM Account scm$/item$/import$ (see usage in the section above) is defined, as this improves the understanding of the underlying security concepts.

Entries can be found via the Authorization and Policy Management (APM) page in Fusion Product Hub via a link like this:


You must use privileged user credentials like FAAdmin for a successful login to APM.

Once logged in, we can search for the UCM Account details as shown below.

Search for Application Roles -> IDCCS -> Resources

The next step is to search for resources starting with scm on the Search Resources page, as shown below.

Open the detail page for scm$/item$/import$ with the results shown below, and click the button Find Policies.

The Policies overview page shows the attached policies.

Opening these policies shows the details of the document permissions per resource as defined for Item Import in Fusion Content Server.


Programmatic Interface for Item Upload to Fusion Content Server

As an alternative to manual uploads of item data, we can use a publicly available toolset called WebCenter Content Document Transfer Utility for Fusion Apps. It can be downloaded from OTN as shown below.


This toolset provides Java programs to be used from a command line interface. Such an interface is useful for running periodic jobs in customer environments that upload new or changed item data without human interaction.

A processing pipeline could look like this:

  • Extract item data from a local system and transform it into the CSV format expected by Fusion Product Hub
  • Put the file into a staging area
  • Create a manifest file, or reuse an existing one if the file names on Fusion Content Server remain the same
  • Run the command line utility to upload the file(s)
  • Initiate further processing to load the item data by calling a web service that runs an import job
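The steps above can be sketched as a small shell script. This is a dry-run sketch only: the staging directory, the properties file, the client JAR name and the data columns are placeholders (assumptions), and the upload/trigger commands are echoed rather than executed. The UploadTool invocation follows the RIDC command line shown later in this article.

```shell
#!/bin/sh
# Dry-run sketch of the FBL pipeline; a real run would execute the echoed commands.
STAGING_DIR=/tmp/fbl_staging
UCM_FOLDER="/Contribution Folders/PIM/"
mkdir -p "$STAGING_DIR"

# 1) Extract/transform: produce the CSV in the structure expected by the import map
echo "ITEM_NUMBER,DESCRIPTION" > "$STAGING_DIR/ProductLoad.csv"

# 2) Upload the data file (and manifest) via the RIDC transfer utility
echo java -jar oracle.ucm.fa_client_11.1.1.jar UploadTool \
     --propertiesFile=./connection.properties \
     --primaryFile="$STAGING_DIR/ProductLoad.csv" \
     --k0=dCollectionPath --v0="$UCM_FOLDER"

# 3) Trigger the scheduled import job via the generated web service client
echo java -classpath client.jar FinancialUtilServiceSoapHttpPortClient
```

In a real deployment, steps 2 and 3 would run the printed commands directly, typically from cron or an enterprise scheduler.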

Recently, some related articles have been published on the Fusion Apps Developer Relations Blog, like this post. Please refer to those sources to learn more about tool usage.

In this section we will cover the tool usage as required for Fusion Product Hub item load.

The transfer utility provides two different interfaces to connect to Fusion Content Server:

  • The RIDC-based transfer utility, a feature-set Java library that encapsulates a proprietary protocol (RIDC) to communicate with Fusion Content Server via HTTPS.
  • A generic SOAP-based transfer utility using the Oracle JRF supporting libraries for JAX-WS over HTTPS to communicate with Fusion Content Server.

After downloading and extracting the transfer utility, two sub-directories will exist: ridc and generic. Details about the specific command line parameters and connection information can be found below.

In addition to these two sub-directories, a file WebCenter Content Document Transfer Utility Readme.html will be extracted, containing comprehensive documentation about tool usage, troubleshooting and additional options.

Upload via the RIDC Java library

Using an RIDC connection might be the preferred option for customers who have no FMW products in place. The Java library oracle.ucm.fa_client_11.1.1.jar (found in sub-directory ridc after extraction) can be used standalone and doesn't require any other libraries in addition to a JDK with a minimum release of 1.7 (for JRockit 1.6).

Connection information can be located in a configuration file with content like this:
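The exact property names are documented in the utility's readme file; a typical RIDC configuration file (property names assumed from that documentation, host and values hypothetical) might look like:

```
url=https://<fusion_content_server_host>/cs/idcplg
username=PIMQA
password=<password>
```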


In production environments it's strongly recommended not to save passwords in clear text in configuration files like this. Putting them into wallets and reading the values from there is the preferred approach.

A command line running the document upload via RIDC would look like this ("\" indicates line continuation where lines are too long):

${JAVA_HOME}/bin/java \
-jar ./oracle.ucm.fa_client_11.1.1.jar UploadTool \
--propertiesFile=./ \
--primaryFile=ItemManifest.csv \
--dDocTitle="ItemManifest.csv" --k0=dCollectionPath \
--v0="/Contribution Folders/PIM/" \

A successful execution will result in an output like this:

Oracle WebCenter Content Document Transfer Utility
Oracle Fusion Applications
Copyright (c) 2013-2014, Oracle. All rights reserved.
* Custom metdata set: "dCollectionPath"="/Contribution Folders/PIM/".
Performing upload (CHECKIN_UNIVERSAL) ...
Upload successful.
[dID=76 | dDocName=UCMFA000076]

The uploaded document from the example above resides in the Fusion Content Server with global document ID 76 and internal document name UCMFA000076. For further processing, we'd rather locate it by its logical file information /Contribution Folders/PIM/ItemManifest.csv.

Using the RIDC connection is apparently the first choice for cloud customers who don't run any Oracle Fusion Middleware environment. However, Fusion Product Hub on-premise customers can use this connection type as well.

Upload via the generic Java library

The generic approach connects to a web service in Fusion Content Server to perform a file upload. After extraction, a Java library oracle.ucm.fa_genericclient_11.1.1.jar can be found in the folder generic. For this type of connection, the configuration file will point to a different URL, as shown below:
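A sketch of the generic variant's connection file, under the assumption that it uses the same property names as the RIDC variant and the JAX-WS (idcws) endpoint (host and values hypothetical):

```
url=https://<fusion_content_server_host>/idcws
username=PIMQA
password=<password>
```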


It's important to mention that the tool can't run standalone, as we must add an additional library from a WebLogic Server runtime directory: jrf-client.jar. It can be found in the WLS directory oracle_common/modules/oracle.jrf_11.1.1. No other libraries need to be added to the classpath, as the remaining Oracle JRF web service libraries are referenced from jrf-client.jar.

The command line using the generic Java library would look like this:

${JAVA_HOME}/bin/java -classpath \
<WLS_HOME>/oracle_common/modules/oracle.jrf_11.1.1/jrf-client.jar:./oracle.ucm.fa_genericclient_11.1.1.jar \
oracle.ucm.idcws.client.UploadTool \
--propertiesFile=./ \
--primaryFile=/home/oracle/CASE_1-CsvMap.csv \
--dDocTitle="Product Item Import 0001" \
--k0=dCollectionPath --v0="/Contribution Folders/PIM/" \

The output looks identical to the RIDC version:

Oracle WebCenter Content Document Transfer Utility
Oracle Fusion Applications
Copyright (c) 2013-2014, Oracle. All rights reserved.
* Custom metdata set: "dCollectionPath"="/Contribution Folders/PIM/".
Performing upload (CHECKIN_UNIVERSAL) ...
Upload successful.
[dID=77 | dDocName=UCMFA000077]

As mentioned above, this connection type requires a fully installed WebLogic runtime environment and uses a standard web service interface.

Logging option

To adjust the level of logging information, the log level can be controlled through a properties file that can be added to the runtime Java call via the following option:


The content of this file could look like this:
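Assuming the utility uses the standard java.util.logging mechanism (consistent with the "standard Java feature" note below), such a file can be passed to the JVM with -Djava.util.logging.config.file=<path_to_properties_file> and could contain, for example:

```
handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=FINEST
.level=FINEST
```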


As this is a standard Java feature, the range of values spans:

  • SEVERE (highest value – least logging)
  • INFO
  • FINE
  • FINEST (lowest value – most logging)

Using the logging features might help in cases where the content transfer utility runs into issues with connections and/or file uploads.

Optional: Managing SSL Self-Signed Certificates

It's strongly recommended to use the HTTPS protocol when connecting to Fusion Content Server, despite the fact that plain HTTP connections would technically work as well. In scenarios using Fusion Product Cloud, the server certificates are signed by well-known authorities (trust centers) whose root certificates are normally part of JDK or browser distributions, and no special certificate handling is required.

When using Fusion Product Hub on-premise, there might be situations where self-signed certificates are used for SSL. When running Java programs, these certificates must be imported into the client's certificate store. Here is a short explanation of how to manage this:

A connection to a server with a self-signed certificate will produce a warning in web browsers. It's possible to take a closer look at the certificate details, as shown in the Firefox screenshots below:

  • A warning page appears stating that the connection can't be trusted
  • Click on “I understand the risks”
  • Click on “Add exception …”
  • Click on “View” and as a result the certificate details appear like shown below


Usually unknown certificates shouldn't be trusted, but in this special case we are the issuer and can make an exception.

We can download the certificate via the following access path in Firefox:

  • Click on tab “Details” as shown in screenshot above
  • Click on “Export … ” as shown in screenshot below
  • In File Save dialog choose “X.509 Certificate (DER)”
  • Save the file in a folder


Once saved, we must import this certificate into the certificate store of the Java runtime used to run the content transfer utility. The command line looks like this:

${JAVA_HOME}/bin/keytool -importcert \
-alias <name_referring_to_ssl_server> \
-keystore ${JAVA_HOME}/jre/lib/security/cacerts \
-file <path_to_der_certificate>

When asked for a password: if it has never been changed, the default value is "changeit".

Part III: Initiating the Item Data Load

In the previous section, the provisioning of files to Fusion Content Server was explained. The final step to import these items into Fusion Product Hub is running the loader and import job. This job runs seamlessly and includes the following steps:

  • Transfer item data from the file in Fusion Content Server to Item Interface Tables
  • Run a batch import from Item Interface Tables to Item tables

Step 2 above is identical to the Item Batch Import that has existed in Fusion Product Hub for a while, including exception handling and job status reporting.

Customers can initiate the scheduled job via the UI (occasional triggering) or wrap it in scripts for automated, periodic runs.

Manual Item Load via Fusion Product Hub UI

Initiating a manual item load is pretty straightforward: users just follow the standard dialog to trigger a job. For this purpose, use the Scheduled Processes menu entry in the Fusion Product Hub Navigator.

Search for a job called Schedule Product Upload Job as shown below.

Provide the parameters as required:

  • Manifest File Path: the location of the manifest file as previously uploaded to Fusion Content Server
  • Assigned Spoke System: the spoke system as defined in the functional setup (see Part I)

Once the job parameters have been provided, the job can be submitted for immediate execution or scheduled for a later time.

The execution status of jobs can be monitored via the same UI, as shown below. Once finished, the items have been transferred from the CSV file into the system and can be found in the standard Fusion Product Hub UI for further processing.


Programmatic execution of the loader job from command line

Similar to uploading files into Fusion Content Server, the triggering of loader jobs can be initiated from Java programs. For this purpose it's recommended to use Oracle JDeveloper 12c, which can be downloaded from OTN as shown below.


It's not necessary to download any other technology products to run the programmatic interface for job scheduling.

Create WebService Client to initiate a scheduling job

Technically, the job scheduler can be accessed via an existing web service interface. Oracle JDeveloper 12c offers a convenient way to generate the web service client code via a coding wizard.

The series of screenshots below documents the step-by-step procedure for generating the Java code. Once done, we have a skeleton of Java code and configuration files that requires some minor extensions in order to execute the web service.

As a very first step create a Java application with a custom project. Then choose Create Web Service Client and Proxy via “New …” and “Gallery …”.

As shown in the screenshots below, we must provide the information for the web service we intend to invoke in our code. For the Item Loader it has the following format:


Once provided, click Next and the wizard will determine the web service configuration by introspecting the provided WSDL.

As shown in the screenshot below, there is a choice to enter a custom root package for the generated web service client code. By default the code will use a package name like this:

In most cases customers want to reflect their own package naming conventions, and this screen is where to configure that.

In the next step, as shown in the screenshot below, it is not necessary to change any information and the user can click Next.

The next dialog gives users a choice to configure the client using synchronous or asynchronous methods. Scheduling a job is a synchronous activity, and therefore it's not required to generate asynchronous Java methods.

After reading and analyzing the web service, a WSM policy oracle/wss11_username_token_with_message_protection_server_policy is found on the server side. The code generator uses the corresponding client policy oracle/wss11_username_token_with_message_protection_client_policy to fulfill the server requirements. This value must be accepted, as communication between client and server will fail otherwise.

On the next dialog screen no changes are required and the user can press Next.

The last screenshot of this code generation dialog shows a summary of the methods being generated, matching the methods found in the web service WSDL. After clicking Finish, the code generation starts and might take up to one or two minutes.

After code generation finishes, the development environment will look as shown in the screenshot below.

This code generation saves a tremendous amount of time compared to programming manually. It's worth mentioning that some parts of the generated code are under the control of JDeveloper and might be overwritten if configuration changes happen. Developers must be careful to add their own code only in the sections foreseen and indicated in the code.

The generated code doesn't provide any details about:

  • Authentication by providing credentials
  • Message encryption as required by the WSM policy
  • The web service operations to be initiated by this Java code, here the Java method submitESSJobRequest() for the web service operation submitESSJobRequest
  • The parameters to be passed to these operation calls

All the additions above are manual tasks to be performed by the programmer.

Below is a piece of Java code showing a working example, kept simple for better readability. For production use we recommend the following improvements:

  • Put details for the keystore etc. in configuration files
  • The same applies to the username
  • Store passwords in a wallet
  • Important: as mentioned, the code is under the control of a code generator. To avoid unintentional code changes, it's strongly recommended to create your own class by copying the generated class.

Generated and modified Java File


import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.sun.xml.ws.developer.WSBindingProvider;

import oracle.webservices.ClientConstants;

import weblogic.wsee.jws.jaxws.owsm.SecurityPoliciesFeature;

// This source file is generated by Oracle tools.
// Contents may be subject to change.
// For reporting problems, use the following:
// Generated by Oracle JDeveloper 12c
public class FinancialUtilServiceSoapHttpPortClient {
  public static void main(String[] args) {
      FinancialUtilService_Service financialUtilService_Service =
            new FinancialUtilService_Service();

      // Configure security feature: the client-side policy matching the
      // server-side WSM policy found during code generation
      SecurityPoliciesFeature securityFeatures =
            new SecurityPoliciesFeature(new String[] {
                "oracle/wss11_username_token_with_message_protection_client_policy" });

      FinancialUtilService financialUtilService =
            financialUtilService_Service.getFinancialUtilServiceSoapHttpPort(securityFeatures);

      // Add your code to call the desired methods.
      WSBindingProvider wsbp = (WSBindingProvider) financialUtilService;
      Map<String, Object> reqCon = wsbp.getRequestContext();

      // Credentials for the username token (use a wallet in production)
      reqCon.put(WSBindingProvider.USERNAME_PROPERTY, "IntegrationUser");
      reqCon.put(WSBindingProvider.PASSWORD_PROPERTY, "Password");

      // Keystore entries for message encryption (see section Managing WS Security)
      reqCon.put(ClientConstants.WSSEC_KEYSTORE_TYPE, "JKS");
      reqCon.put(ClientConstants.WSSEC_KEYSTORE_PASSWORD, "Welcome1");
      reqCon.put(ClientConstants.WSSEC_ENC_KEY_ALIAS, "mykey");
      reqCon.put(ClientConstants.WSSEC_ENC_KEY_PASSWORD, "Welcome1");
      reqCon.put(ClientConstants.WSSEC_RECIPIENT_KEY_ALIAS, "mykeys");

      Long jobID = startEssJob(financialUtilService);

      System.out.println("Item Data Import Job started with ID: " + jobID);
  }

  private static Long startEssJob(FinancialUtilService fus) {
      Long essRequestId = new Long(-1);

      try {
          List<String> paramList = new ArrayList<String>();
          // UCM folder and file name
          paramList.add("/Contribution Folders/PIM/ProductLoad.csv");
          // Spoke System Code
          // Product Upload - static value here
          // Product Hub Portal Flow
          // (the corresponding paramList.add(...) calls with the
          // environment-specific values go here)
          essRequestId = fus.submitESSJobRequest(
             "ExtProductUploadSchedulingJobDef", paramList);
      } catch (ServiceException e) {
          // ServiceException is generated together with the proxy classes
          e.printStackTrace();
      }

      return essRequestId;
  }
}
Running this Java program from the command line doesn't require any additional libraries except those coming with a JDeveloper installation and a standard JDK.

It's recommended to package all files in the project into a JAR file via a Deployment Profile. Once done, a sample call to this web service client would look as follows:


${JAVA_HOME}/java \
-server \
-Djava.endorsed.dirs=${jdevDir}/oracle_common/modules/endorsed \
-classpath ${clientJar}:\
${modulesDir}/oracle.toplink_12.1.3/eclipselink.jar:\
${modulesDir}/oracle.toplink_12.1.3/org.eclipse.persistence.nosql.jar:\
${modulesDir}/oracle.toplink_12.1.3/\
${modulesDir}/ \

The output of this call will be the job ID of the scheduled item loader job. Job progress can be monitored via the application UI. There are other web services that can be used for checking job status, but their explanation is the subject of a future blog post.

Managing WS Security

As mentioned earlier in this blog, the web service policy for running a scheduler job is oracle/wss11_username_token_with_message_protection_server_policy. For our Java client this means that two core requirements must be satisfied, as shown in the code sample above:

  • Passing username/password (here: IntegrationUser/Password)
  • Encrypt the message content

For encryption we must use the public key of the web service as provided inside the WSDL. An example can be seen in the screenshot below.


The following steps are required to create an entry in the client side Java Key Store for message encryption:

  • Save the certificate from the WSDL in a certificate file in .pem or .der format
  • Import the certificate to an existing key store or create a new key store by importing the certificate
  • Refer to the certificate entries for message encryption as shown above in the sample class.

Open the WSDL in a web browser and search for the XML tag dsig:X509Certificate. Its content must be copied and pasted into a text file between a BEGIN CERTIFICATE and an END CERTIFICATE line, as shown below (sample data as copied from our test case):
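The resulting file follows the standard PEM layout; the base64 block between the marker lines is the value copied from the dsig:X509Certificate tag (shortened placeholder shown here):

```
-----BEGIN CERTIFICATE-----
MIIB...(base64 certificate content copied from the WSDL)...
-----END CERTIFICATE-----
```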


Save the file under a name like <fusion_product_hub_server>.der and create an entry in an existing Java keystore, or a new one, as follows:

${JAVA_HOME}/bin/keytool -importcert \
-alias <alias_as_referred_in_Java_code> \
-keystore <my_local_client_trust_store> \
-file <path_to_der_certificate_above>

If the keystore doesn't exist, it will be created and you will be asked to define a new password (Welcome1 in the Java code sample above). If the file exists, we must provide the keystore password in order to create the entry.

Once created, the key store will contain an entry like this:

$ ${JAVA_HOME}/bin/keytool -v -list -keystore <my_local_client_trust_store>
Enter keystore password:  

Keystore type: JKS
Keystore provider: SUN

Your keystore contains 1 entry

Alias name: mykeys
Creation date: Feb 24, 2015
Entry type: trustedCertEntry

Owner:, OU=defaultOrganizationUnit, O=defaultOrganization, C=US
Issuer:, OU=defaultOrganizationUnit, O=defaultOrganization, C=US
Serial number: 5397498d
Valid from: Tue Jun 10 20:08:13 CEST 2014 until: Sat Jun 10 20:08:13 CEST 2017
Certificate fingerprints:
	 MD5:  6D:BA:94:CE:84:E6:C0:A3:CA:A3:F1:8A:39:1E:E9:2E
	 SHA1: C7:3D:62:42:D8:E7:A0:DB:57:93:40:32:A8:54:E0:57:60:F0:8B:FD
	 SHA256: 40:D0:C3:81:CF:5D:6B:61:95:23:27:24:83:8D:1A:34:9F:31:C7:E5:15:BE:49:44:81:E6:D9:34:0A:69:FA:06
	 Signature algorithm name: SHA1withRSA
	 Version: 3



In this article we provided a 360° view of the tasks and activities for automating the use of File Based Loader for Fusion Product Hub. Everything discussed in this article applies to both cloud and on-premise deployments of Fusion Product Hub in the same way.
