Best Practices from Oracle Development's A‑Team

Reference Architecture - Fusion SaaS Data Replication into ADW : Using ODI Marketplace and BICC

Jay Pearson
Consulting Solutions Architect, Business Intelligence

Matthieu Lombard


* This blog was last tested on ODI V12 + Fusion 20D + ADW 18c *


This article walks through the steps to configure Oracle Data Integrator on Oracle Cloud Marketplace (ODI) with Business Intelligence Cloud Connector (BICC) to replicate data from Fusion SaaS into an Autonomous Data Warehouse (ADW) database.

This blog is the first in a series covering topics on how to utilize Oracle ETL / Data Integration tools to extract data from various data sources. Future blog topics will cover extracting from: Oracle SaaS Applications, Oracle on-premise applications (e.g. EBS), and other Oracle applications such as Responsys and Eloqua.

Suggested Prerequisite Reading Material:

* Using Oracle Data Integrator on Oracle Cloud Marketplace

Business Intelligence Cloud Connector (BICC) Prerequisite:

* Before starting this blog, first setup Oracle Fusion SaaS Business Intelligence Cloud Connector (BICC) to use Oracle Cloud Infrastructure (OCI) Object Storage by following the steps in this prerequisite blog.

User/Policy/Key Requirements:

* It is strongly recommended to configure OCI, BICC, and ODI with a local (non-Federated) user.

* The local OCI user must be granted these policies: "read buckets in compartment" and "manage objects in compartment".
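In the OCI policy language, those grants look like the following sketch (the group name odi-api-group is illustrative; attach the policy to whichever group the local user belongs to):

```
Allow group odi-api-group to read buckets in compartment <compartment-name>
Allow group odi-api-group to manage objects in compartment <compartment-name>
```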

Fusion Requirements:

* Fusion Applications must be 19C or higher.

* The Fusion user must be a local non-federated user - and granted the ORA_ASM_APPLICATION_IMPLEMENTATION_ADMIN_ABSTRACT role or a role that includes it.

Autonomous Database Requirements:

* BICC writes View Objects extracts to Object Storage. Therefore, the Autonomous Database needs access to Object Storage to get to the View Objects extracts.

Oracle Data Integrator (ODI) Requirements:

* This feature is also available with ODI on-prem starting v12.2.1.3. See What's New for more details.

Oracle Data Integrator on Oracle Cloud Marketplace (ODI) Requirements:

* Capacity for a VM.Standard2.4 shape or higher.

To confirm available service limits -> go to the OCI Console -> Hamburger -> Governance and Administration -> Governance -> Limits, Quotas and Usage.


 Deploy Oracle Data Integrator (ODI) Stack

a) Log into the Oracle Cloud Applications Console

b) Hamburger (top left) -> "Marketplace" -> Click Applications

c) Filter on Type = Stack | Category = Data Integration -> Click on "Data Integration". At the time of publishing, ODI Marketplace is available for free when the target database is in OCI (DBaaS, ADW). This promotion may not always be available in the future.

d) Select the Compartment -> Review and accept the Oracle Standard Terms and Restrictions -> Click Launch Stack

e) Change ODI name as desired or leave as default -> Click Next

f) Either create a new VCN or use an existing one -> Select an option for the ODI repository (database schema) -> Select the compartment -> Scroll down

g) Scroll down -> Select the ODI Node Shape (requires a minimum of VM.Standard2.4) -> Enter a SSH public key -> Select an availability zone -> Enter VNC password -> Click Next

Click here to learn how to generate the SSH Public key.

* Take a note of the ODI VNC Password

h) Review -> Click Create

If successful, the following will be shown (under Resource Manager -> Stacks -> Stack Details -> Job Details)

i) Check Compute -> Instances to see if the instance was created

Change the scope to your compartment

j) Click on the instance to get the details. Make a note of the Public IP to access with VNC (Port 5901).

k) In previous releases, the instance was not automatically created. If it was not created automatically, go to Resource Manager to debug.


l) Under Resource Manager -> Jobs -> Click on Jobs -> If the job was started, use the logs to debug any install errors

m) If the job wasn't started at all click Terraform -> Apply -> Monitor the logs

 VNC to the Marketplace Image and Launch ODI from the Desktop

a) Confirm Port 22 is available for SSH Remote Login.

Click Networking -> Virtual Cloud Networks -> Select the compartment -> Select the ODI Virtual Cloud Network -> Select the ODI Subnet -> Select the ODI Security Lists

b) Connect with a VNC client to the Public IP (step 2.j) on Port 5901, tunneling over SSH (Port 22) with the Private Key.

The example below uses MobaXterm. Another option is to use PuTTY with VNC Viewer.

Enter the VNC Pwd specified on install -> OK

c) Double click on the ODI Studio icon (it initially looks like a text document) -> click Trust and Launch.

d) Connect to the repository as SUPERVISOR

By default the SUPERVISOR password will be generated by ODI and saved (but hidden) when the studio is launched.

If necessary, retrieve the SUPERVISOR password by running:


 ODI Prerequisites

a) Create a LOCAL user <api_user>

Currently it is not possible to use a Federated user for the ODI configuration.

Administration privileges are required to create the local user.

Identity > Users -> click Create User

Enter a name for the user (e.g. api_user) -> enter a description -> click Create

Verify that the user has been created successfully and that it is NOT a Federated user.

Use a text editor or Excel to create a "Topology Definitions Table" to record the required ODI topology credentials. Save it to a secure location. See the example format below.

Topology Definition Table:

Tenancy OCID:
Tenancy Object Storage Namespace:



b) Create the private/public key pair

*** This key pair is NOT the SSH key that you used to access the compute instance.***

*** Instead, this key will be used to allow the OCI api_user to connect to Object Storage from ODI. ***

Summary steps are provided below on how to create the key pair. If required, detailed steps can be found here.

From the ODI Marketplace VM -> create an .oci directory to store the credentials: 

mkdir ~/.oci

Generate the private key with the following commands:

It is highly recommended to generate the key with a passphrase.

openssl genrsa -out ~/.oci/oci_api_key.pem -aes128 2048

Add the passphrase to the Topology Definition Table:




Note: For Windows, insert -passout stdin to be prompted for a passphrase. The prompt will just be the blinking cursor, with no text.

openssl genrsa -out ~/.oci/oci_api_key.pem -aes128 -passout stdin 2048

Ensure the private key is secure:

chmod go-rwx ~/.oci/oci_api_key.pem

Generate the public key: 

openssl rsa -pubout -in ~/.oci/oci_api_key.pem -out ~/.oci/oci_api_key_public.pem

For Windows, add -passin stdin, then type the passphrase at the blinking cursor.

openssl rsa -pubout -in ~/.oci/oci_api_key.pem -out ~/.oci/oci_api_key_public.pem -passin stdin

Add the Public Key to the Topology Definition Table:

Public key:


See this doc for additional information on generating the public/private key pair.

c) Generate the Fingerprint

Generate the fingerprint with the following OpenSSL command. (On Windows install Git Bash to generate the key.)

openssl rsa -pubout -outform DER -in ~/.oci/oci_api_key.pem | openssl md5 -c

When the public key is uploaded into the console the fingerprint is also displayed there.
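The key and fingerprint steps above can be sketched end-to-end as follows. This is a sketch only: the /tmp path and the omitted passphrase are illustrative; for real use keep the key in ~/.oci and protect it with a passphrase as described above.

```shell
#!/bin/sh
# Sketch: generate an API signing key pair and compute its fingerprint.
# The /tmp/oci-demo path and missing passphrase are illustrative only.
set -e
mkdir -p /tmp/oci-demo

# Private key (add -aes128 and a passphrase for real use)
openssl genrsa -out /tmp/oci-demo/oci_api_key.pem 2048 2>/dev/null
chmod go-rwx /tmp/oci-demo/oci_api_key.pem

# Matching public key - this is what gets pasted into the OCI Console
openssl rsa -pubout -in /tmp/oci-demo/oci_api_key.pem \
  -out /tmp/oci-demo/oci_api_key_public.pem 2>/dev/null

# Fingerprint: colon-separated MD5 of the DER-encoded public key
openssl rsa -pubout -outform DER -in /tmp/oci-demo/oci_api_key.pem 2>/dev/null \
  | openssl md5 -c | awk '{print $NF}'
```

The value printed by the last command should match the fingerprint the OCI Console displays after the public key is uploaded.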

Add the Fingerprint to the Topology Definition Table:




d) Attach the public key to the api_user

Note: This step requires admin privileges

Identity > Users > api_user -> click API Keys -> click Add Public Key

Enter the value of the public key saved in the Topology Definition Table -> click Add

Verify the key is correctly attached. The fingerprint should be the same as noted in the Topology Definition Table.

e) Generate the Auth Token for the api_user

Note: This step requires admin privileges

Identity > Users > api_user -> click Auth Tokens -> click Generate Token


Enter a description -> click Generate Token

Record the Generated Auth Token immediately. THIS WILL NOT BE DISPLAYED AGAIN!

Add the Auth Token to the Topology Definition Table:

Auth token



Note: There is a limit of two Auth Tokens per user.

f) From OCI Object Storage locate the bucket previously created for BICC [Step 1 of this prerequisite blog].

Note: Since the BICC blog and this blog were written at different times the bucket name is different in each blog.

Be sure to reference the same OCI Storage Bucket name in both BICC and ODI.

Object Storage > Object Storage

Locate the OCI Storage Bucket previously created and re-tested in BICC.

Add the Bucket Name to the Topology Definition Table:

Bucket Name



g) Upload a sample file to the bucket

Click on the bucket name

Objects -> Upload Objects

DO NOT enter an object name prefix.

Drag and drop a file from the file system -> click Upload Objects

Wait for the upload to complete

h) Create the ADW credentials that will be used later by ODI to load the data into ADW

Connect to the ADW instance with SQL Developer client or SQL Developer Web client.

Verify the existing credentials:

select * from USER_CREDENTIALS;

Drop the credential it exists:


  dbms_cloud.drop_credential(credential_name => 'ODI_ATEAM');


Re-create the credentials:

BEGIN
  dbms_cloud.create_credential(
    credential_name => 'ODI_ATEAM',
    username        => 'api_user',
    password        => '<Auth Token generated from OCI>');
END;
/

Verify the credential was created:

select * from USER_CREDENTIALS;

i) Validate the new ADW credentials

Connect to ADW with SQL Developer client or SQL Developer Web client.

Verify the existing files are attached to the bucket:

select object_name, bytes from dbms_cloud.list_objects('ODI_ATEAM','https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/<TENANCY>/bucket-bicc-saas-replication/');

j) If needed, see (Doc ID 2468298.1) to validate the Object Storage URI and credentials are correctly set for Autonomous Data Warehouse / Autonomous Transaction Processing.

 Configure ODI Topology

a) Create the Object Storage Physical Data Server

In the Physical Architecture -> right click the Oracle Object Storage Technology -> click New Data Server

Enter the information collected in the Topology Definition Table

Click "Test Connection"

Click Yes to save


Click OK

Select Local Agent -> click Test

Click OK

b) Create the Object Storage Physical Schema

From the Physical Data Server created -> right click -> select New Physical Schema


From the Bucket Name drop-down -> select the BICC extract bucket (previously created) -> enter a work directory


Click OK to save


c) Create Object Storage Logical Schema and set context

In the Logical Architecture -> right click on "Oracle Object Storage Technology" -> select "New Logical Schema"

Enter a Name -> set the Development and Global contexts to point to the "Object Storage Physical" schema previously created

Save -> Verify that the Logical Schema has been created for Object Storage

d) Verify BI Cloud Connector details and schedule a validation file extract 

From Fusion -> go to the "Oracle Business Intelligence Cloud Connector Console" - https://<Fusion-Apps-host>/biacm

Select "Configure Cloud Storage" -> "OCI Object Storage"

Open the External Storage connection that was created for BICC in the prerequisite blog.

Test that the connection still connects successfully.

Add the External Storage Name to the Topology Definition table

External Storage Name


Open "Manage Extract Schedules"

Click + to add a new schedule

From the "Job Type" drop-down -> Select "Application Data Extract" -> Enter a name -> click Next

Select the FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification datastore -> click Enabled for Extract -> click Next

Select the External Storage <DS_OCI_Ateam> that will be used to write the extract to the storage bucket.

Change the global parameter to use GZIP compression type -> click Save

Note: ZIP compression format is not supported in ADW. The ingestion from Object Storage will fail if ZIP is used.
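Since the extracts land as .csv files compressed in .gz format, a downloaded file can be sanity-checked locally. The sketch below uses an illustrative file name and fabricated sample content, not a real BICC extract:

```shell
#!/bin/sh
# Sketch: verify that an extract file is a valid gzip stream.
# File name and CSV contents are illustrative only.
set -e
printf 'TAX_CLASSIFICATION_CODE,MEANING\nVAT,Value Added Tax\n' > /tmp/extract_demo.csv
gzip -f /tmp/extract_demo.csv                 # produces /tmp/extract_demo.csv.gz
gzip -t /tmp/extract_demo.csv.gz              # exit status 0 = valid gzip
gzip -dc /tmp/extract_demo.csv.gz | head -1   # peek at the header row
```

A .zip archive would fail the gzip -t check, just as it fails ingestion into ADW.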

Wait for the job to complete

Once the job completes successfully -> go to the Object Storage bucket -> verify all three files are created

  1. Extract Status, in json format
  2. Manifest file
  3. Extracted datastore .csv, compressed in .gz format

e) Create the BI Cloud Connector Physical Data Server

From Topology Physical Architecture -> Technologies -> right click on "Oracle BI Cloud Connector" -> click "New Data Server"

Enter the BICC Service URL

Under Storage -> select "Oracle Object Storage" and choose the topology defined for Object Storage in the previous step.

The external storage name is the name of the storage defined in BICC as recorded in the Topology Definition Table.

Click Test Connection -> click Yes to save 

Click OK 

Select Local Agent -> click Test

Click OK

f) Create the BICC Physical Schema

Right click on the physical schema created in the previous step -> select "New Physical Schema"

Select FscmTopModel Schema

Save -> click OK

Verify that the physical schema is created

g) Create the BICC Logical Schema and set the context

From Logical Architecture -> right click on "Oracle BI Cloud Connector" technology -> select "New Logical Schema"

Enter a name -> Set the Development and Global Topology to point to the physical schema created in previous step

Save and confirm that the Logical Schema is created

h) Verify the ADW Physical Data server

ODI will list all ADW instances in the tenancy where the ODI Marketplace instance was provisioned

If you provisioned ADW after ODI you can follow this blog to manually configure the ADW connection.


ODI also imports the wallets to /home/opc/

If the wallet isn't already there, copy it using pscp (installed with PuTTY):

C:\<Install Path>\pscp.exe -i "C:\<SSH Private Key Path>\SSHPrivateKey.ppk" "Y:\<Wallet Path>\wallet_file.zip" opc@123.456.78.123:/home/opc/wallet_file.zip

From Topology -> double click on the DB201911251202-1837 data server -> click OK

Update the Data Server Connection fields as specified below:

Array Fetch Size:
Batch Update Size:


Uncheck "Use Credentials File" -> click Save 

Test the connection

Click OK


i) Create ADW Physical Schema

Right click on the ADW Physical Server created in the prior step -> select "New Physical Schema"

Enter ADMIN in Schema and Work Schema -> click Save

j) Create the ADW Logical Schema and set contexts

Right click on "Oracle Technology" -> Select "New Logical Schema"

Enter details below -> set the contexts -> click Save

 Create ODI Project and Models

a) Create the ODI Project

From Designer -> click the Project Icon -> select "New Project"

Enter a Project Name

Save -> confirm the project has been created in the Projects pane 

Rename the First Folder to FSCM

b) Import the BI Cloud Connector RKM if not present

Some early releases of ODI Marketplace shipped without the KM. If you run into this, you can download it from here:


Copy the RKM into the VM Downloads folder using scp ...

Copy the RKM BI Cloud Connector to the xml-reference folder


Expand the Knowledge Module tree, right click on Reverse Engineering (RKM) -> select "Import Knowledge Module"

Point the File import directory to the xml-reference folder -> select RKM Oracle BI Cloud Connector -> click OK

Wait for the import to complete -> click close

Verify that the RKM is correctly imported

c) Create the Fusion VOs Model and reverse engineer

Create a Model Folder 


Create the Model


Enter a name -> select "Oracle BI Cloud Connector Technology" -> select the "Logical Schema" created previously

Click the Reverse Engineering tab -> enter details below.

Select Table, View, System Table, Table Alias and Synonym.

Note: be cautious of leading / trailing spaces

  • Mask : %FinArCustomers%

  • Characters to Remove: FscmTopModelAM.

  • LIST_OF_OFFERINGS:       Financial


Reverse Engineer

Click OK

Click OK

Wait for the process to complete. It may take a few minutes depending on the mask and the type of PVOs being reverse engineered.

Verify that the VO datastores are created under the Fusion VOS model

d) Create the ADW Model and Datastore

Create a new Model folder for ADW

Right Click on the ADW model folder -> select "New Model"

Select Oracle Technology -> point to the logical schema previously created.


Create a new diagram under ADW

Enter a diagram name -> click OK

Dragging the objects (individually) from the FSCM model (source) to the target will simplify the datastore creation.

Click on Attributes

Reverse engineer to create the datastore

Verify the datastore is created 

 Create ODI Mapping

a) Create the mapping

Under the FSCM folder -> right click on Mappings -> select "New Mapping"

Enter a mapping name FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification

Uncheck "Create Empty Dataset" -> click OK

The new mapping is created

Drag and Drop the FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification datastore from Fusion BICC into the working area

Drag and drop the FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification datastore from ADW into the working area

Connect the output of the Fusion VO datastore to the ADW datastore -> click OK

Verify the fields are mapped from source to target

b) Set up the physical implementations

Click on Physical tab


Click on the first element in the target group -> Expand the Loading Knowledge Module section and select "LKM BICC to ADW External Table" -> Update the CREDENTIAL_NAME to ODI_ATEAM -> leave all other fields as defaults

Click the last element in the Target group -> Expand the Integration Knowledge Module section and select IKM Oracle insert -> Set the CREATE_TARGET_TABLE option to True.


c) Execute the mapping 

Select the mapping from the tree -> click the Execute button (green play arrow)

Select Context=Development -> Local Agent = Local (no Agent) -> click OK

Click OK

Wait for the session to complete -> Check the number of inserts. i.e. in this example 8 records were inserted.

d) Verify the data has been loaded into ADW

Check the table name ODI created from the session logs

Connect to SQL Developer or SQL Developer web and execute the SQL below
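As a sketch, if the session log showed that ODI created a table named FSCM_TAXCLASSIFICATION (a placeholder; your generated name will differ), the check is a simple count:

```
-- Placeholder table name: substitute the name from the ODI session log
SELECT COUNT(*) FROM ADMIN.FSCM_TAXCLASSIFICATION;

-- Optionally eyeball the rows themselves
SELECT * FROM ADMIN.FSCM_TAXCLASSIFICATION;
```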
Verify that the correct number of records were loaded into ADW (i.e. in this example 8 records were loaded)

 Want to Learn More?

Click here for more A-Team Oracle Data Integrator (ODI) Blogs

Click here to sign up to the RSS feed to receive notifications for when new A-team blogs are published

Click here to access the ODI Marketplace documentation library 


This article walked through the steps to configure Oracle Data Integrator on Oracle Cloud Marketplace (ODI) with Business Intelligence Cloud Connector (BICC) to replicate data from Fusion SaaS into an Autonomous Data Warehouse (ADW) database.

This blog is the first in a series covering topics on how to utilize Oracle ETL / Data Integration tools to extract data from various data sources. Stay tuned for more to come!

Bookmark this post to stay up-to-date on changes made to this blog as our products evolve.

Join the discussion

Comments ( 8 )
  • Onkar Tiwari Thursday, April 16, 2020
    Very helpful. Thanks for amazing article. Every step is explained very clearly.
  • William Lawrence Wednesday, April 22, 2020
    Does ODI provide native integration with BICC? Also is it possible to schedule replication in ODI?
  • Puneet Srivastava Thursday, July 16, 2020
    Hi Matthieu,
    Can we apply filter on the BICC side to extract a subset of data like "applicationid='10002345'"?
  • Jay Pearson Wednesday, July 29, 2020
    Hi Puneet,

    We can use query filters on the data store itself in BICC:


  • Jay Wednesday, July 29, 2020
    Hi William,
    Yes, ODI does provide native integration with BICC and it is possible to schedule the replication through ODI.
    Thanks, Jay
  • Pranay Sunday, August 2, 2020
    Dear Jay,
    What is the best option to load Fusion ERP data into on-premise EDW using ODI? Seems this solution only supports loading into ADW
  • Jay Tuesday, September 8, 2020
    Hi Pranay,

    We are working on a blog that would use a similar approach in ODI to schedule Fusion Extracts to OSS and use ODI tools to handle the manifest and load the data to a regular DB. Note: This would require a BYOL ODI Marketplace image.

  • Vasu Cheemakurthi Sunday, April 25, 2021
    Very nice article, It provided the detailed steps of implementation. Helpful, Thanks.