Reference Architecture - Fusion SaaS Data Replication into ADW: Using ODI Marketplace and BICC

December 17, 2019 | 38 minute read
Jay Pearson
Consulting Solutions Architect, Business Intelligence
Matthieu Lombard
Consulting Solution Architect

* This blog was last tested on ODI V12.2.1.4.201011 + Fusion 20D (11.13.20.10.0) + ADW 18c *

 Background

This article walks through the steps to configure Oracle Data Integrator on Oracle Cloud Marketplace (ODI) with Business Intelligence Cloud Connector (BICC) to replicate data from Fusion SaaS into an Autonomous Data Warehouse (ADW) database.

This blog is the first in a series covering how to use Oracle ETL / Data Integration tools to extract data from various data sources. Future blog topics will cover extracting from: Oracle SaaS Applications, Oracle on-premises applications (e.g. EBS), and other Oracle applications such as Responsys and Eloqua.

Suggested Prerequisite Reading Material:

* Using Oracle Data Integrator on Oracle Cloud Marketplace

* ODI Mapping Using "LKM BICC to ADW External Table" Processes Old Data (Doc ID 2655099.1)

Business Intelligence Cloud Connector (BICC) Prerequisite:

* Before starting this blog, first setup Oracle Fusion SaaS Business Intelligence Cloud Connector (BICC) to use Oracle Cloud Infrastructure (OCI) Object Storage by following the steps in this prerequisite blog.

User/Policy/Key Requirements:

* It is strongly recommended to configure OCI, BICC, and ODI with a local (non-Federated) user.

* The local OCI user must be granted these policies: "read buckets in compartment" and "manage objects in compartment" (for example, via a group policy such as: Allow group <odi-group> to read buckets in compartment <compartment>).

Fusion Requirements:

* Fusion Applications must be 19C (11.13.19.07.0) or higher.

* The Fusion user must be a local, non-federated user granted the ORA_ASM_APPLICATION_IMPLEMENTATION_ADMIN_ABSTRACT role, or a role that includes it.

Autonomous Database Requirements:

* BICC writes View Object extracts to Object Storage. Therefore, the Autonomous Database needs access to Object Storage to read those extracts.
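
For context, ADW reads Object Storage through the DBMS_CLOUD PL/SQL package using a stored credential; both are set up later in this post. A minimal sketch of the pattern, assuming the ODI_ATEAM credential and bucket-bicc-saas-replication bucket created in the ODI Prerequisites section:

-- List the BICC extract files visible to ADW (assumes the ODI_ATEAM credential
-- and the bucket-bicc-saas-replication bucket created later in this post)
SELECT object_name, bytes
FROM   dbms_cloud.list_objects(
         credential_name => 'ODI_ATEAM',
         location_uri    => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/<TENANCY>/bucket-bicc-saas-replication/');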

Oracle Data Integrator (ODI) Requirements:

* This feature is also available with on-premises ODI starting with v12.2.1.3. See What's New for more details.

Oracle Data Integrator on Oracle Cloud Marketplace (ODI) Requirements:

* Capacity for a VM.Standard2.4 shape or higher.

To confirm available service limits, go to the OCI Console -> Hamburger -> Governance and Administration -> Governance -> Limits, Quotas and Usage.

 

 Deploy Oracle Data Integrator (ODI) Stack

a) Log into the Oracle Cloud Infrastructure Console

b) Hamburger (top left) -> "Marketplace" -> Click Applications

c) Filter on Type = Stack | Category = Data Integration -> Click on "Data Integration". At the time of publishing, ODI Marketplace is available for free when the target database is in OCI (DBaaS, ADW). This promotion may not always be available in the future.

d) Select the Compartment -> Review and accept the Oracle Standard Terms and Restrictions -> Click Launch Stack

e) Change the ODI name as desired or leave the default -> Click Next

f) Either create a new VCN or use an existing one -> Select an option for the ODI repository (database schema) -> Select the compartment

g) Scroll down -> Select the ODI Node Shape (requires a minimum of VM.Standard2.4) -> Enter an SSH public key -> Select an availability domain -> Enter a VNC password -> Click Next

Click here to learn how to generate the SSH Public key.

* Make a note of the ODI VNC password

h) Review -> Click Create

If successful, the following will be shown (under Resource Manager -> Stacks -> Stack Details -> Job Details)

i) Check Compute -> Instances to see if the instance was created

Change the scope to your compartment

j) Click on the instance to get the details. Make a note of the Public IP to access with VNC (Port 5901).

k) In previous releases, the instance was not automatically created. If it was not created automatically, go to Resource Manager to debug.

 

l) Under Resource Manager -> Jobs -> click on the job -> if the job was started, use the logs to debug any install errors

m) If the job wasn't started at all, click Terraform -> Apply -> monitor the logs

 VNC to the Marketplace Image and Launch ODI from the Desktop

a) Confirm Port 22 is available for SSH Remote Login.

Click Networking -> Virtual Cloud Networks -> Select the compartment -> Select the ODI Virtual Cloud Network -> Select the ODI Subnet -> Select the ODI Security Lists

b) Connect with a VNC client to the Public IP (noted in step j above) on port 5901, tunneled over SSH (port 22) with the private key.

The example below uses MobaXterm. Another VNC option is to use PuTTY with VNC Viewer.

Enter the VNC password specified during install -> OK

c) Double click on the ODI Studio icon (it initially looks like a text document) -> click Trust and Launch.

d) Connect to the repository as SUPERVISOR

By default the SUPERVISOR password will be generated by ODI and saved (but hidden) when the studio is launched.

If necessary, retrieve the SUPERVISOR password by running:

/home/opc/oracle/odi/common/scripts/getPassword.sh

 ODI Prerequisites

a) Create a LOCAL user <api_user>

Currently it is not possible to use a Federated user for the ODI configuration.

Administration privileges are required to create the local user.

Identity > Users -> click Create User

Enter a name for the user, e.g. api_user -> enter a description -> click Create

Verify that the user has been created successfully and that it is NOT a Federated user.

Use a text editor or Excel to create a "Topology Definition Table" to record the required ODI topology credentials. Save it to a secure location. See the example format below.

Topology Definition Table:

User OCID: ocid1.user.oc1..****************
Tenancy: ****************
Tenancy OCID: ocid1.tenancy.oc1..****************
Tenancy Object Storage Namespace: ****************

 

b) Create the private/public key pair

*** This key pair is NOT the SSH key that you used to access the compute instance.***

*** Instead, this key will be used to allow the OCI api_user to connect to Object Storage from ODI. ***

Summary steps on how to create the key pair are provided below. If required, detailed steps can be found here.

From the ODI Marketplace VM -> create an .oci directory to store the credentials: 

mkdir ~/.oci

Generate the private key with the following command. It is highly recommended to generate the key with a passphrase:

openssl genrsa -out ~/.oci/oci_api_key.pem -aes128 2048

Add the passphrase to the Topology Definition Table:

Passphrase: *******

 

Note: For Windows, insert -passout stdin to be prompted for a passphrase. The prompt will just be the blinking cursor, with no text.

openssl genrsa -out ~/.oci/oci_api_key.pem -aes128 -passout stdin 2048

Ensure the private key is secure:

chmod go-rwx ~/.oci/oci_api_key.pem

Generate the public key: 

openssl rsa -pubout -in ~/.oci/oci_api_key.pem -out ~/.oci/oci_api_key_public.pem

For Windows, add -passin stdin, then type the passphrase at the blinking cursor.

openssl rsa -pubout -in ~/.oci/oci_api_key.pem -out ~/.oci/oci_api_key_public.pem -passin stdin

Add the Public Key to the Topology Definition Table:

Public Key:
-----BEGIN PUBLIC KEY-----
*******
-----END PUBLIC KEY-----

 

See this doc for additional information on generating the public/private key pair.

c) Generate the Fingerprint

Generate the fingerprint with the following OpenSSL command. (On Windows, install Git Bash to run the OpenSSL commands.)

openssl rsa -pubout -outform DER -in ~/.oci/oci_api_key.pem | openssl md5 -c

When the public key is uploaded into the console, the fingerprint is also displayed there.

Add the Fingerprint to the Topology Definition Table:

Fingerprint: 26:*******

 

d) Attach the public key to the api_user

Note: This step requires admin privileges

Identity > Users > api_user -> click API Keys -> click Add Public Key

Enter the value of the public key saved in the Topology Definition Table -> click Add

Verify the key is correctly attached. The fingerprint should be the same as noted in the Topology Definition Table.

e) Generate the Auth Token for the api_user

Note: This step requires admin privileges

Identity > Users > api_user -> click on Auth Tokens -> click Generate Token

 

Enter a description -> click Generate Token

Record the Generated Auth Token immediately. THIS WILL NOT BE DISPLAYED AGAIN!

Add the Auth Token to the Topology Definition Table:

Auth Token: *******

 

Note: There is a limit of two Auth Tokens per user.

f) From OCI Object Storage locate the bucket previously created for BICC [Step 1 of this prerequisite blog].

Note: Since the BICC blog and this blog were written at different times the bucket name is different in each blog.

Be sure to reference the same OCI Storage Bucket name in both BICC and ODI.

Object Storage > Object Storage

Locate the OCI Storage Bucket previously created and re-tested in BICC.

Add the Bucket Name to the Topology Definition Table:

Bucket Name: bucket-bicc-saas-replication

 

g) Upload a sample file to the bucket

Click on the bucket name

Objects -> Upload Objects

DO NOT enter an object name prefix.

Drag and drop a file from the file system -> click Upload Objects

Wait for the upload to complete

h) Create the ADW credentials that will be used later by ODI to load the data into ADW

Connect to the ADW instance with SQL Developer client or SQL Developer Web client.

Verify the existing credentials:

select * from USER_CREDENTIALS;

Drop the credential if it exists:

BEGIN
  dbms_cloud.drop_credential(credential_name => 'ODI_ATEAM');
END;
/

Re-create the credential:

BEGIN
  dbms_cloud.create_credential(
    credential_name => 'ODI_ATEAM',
    username        => 'api_user',
    password        => '<Auth Token generated from OCI>'
  );
END;
/

Verify the credential was created:

select * from USER_CREDENTIALS;

i) Validate the new ADW credentials

Connect to ADW with SQL Developer client or SQL Developer Web client.

Verify that the existing files in the bucket are visible:

select object_name, bytes from dbms_cloud.list_objects('ODI_ATEAM','https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/<TENANCY>/bucket-bicc-saas-replication/');

j) If needed, see (Doc ID 2468298.1) to validate that the Object Storage URI and credentials are correctly set for Autonomous Data Warehouse / Autonomous Transaction Processing.
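
As an extra check, the sample file uploaded in step g can be pulled down into the database with DBMS_CLOUD.GET_OBJECT. A minimal sketch; the object name sample.txt is a hypothetical placeholder for whatever file was uploaded:

-- Download the sample object into the ADW DATA_PUMP_DIR directory
-- ('sample.txt' is a placeholder; use the name of the file uploaded in step g)
BEGIN
  dbms_cloud.get_object(
    credential_name => 'ODI_ATEAM',
    object_uri      => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/<TENANCY>/bucket-bicc-saas-replication/sample.txt',
    directory_name  => 'DATA_PUMP_DIR');
END;
/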

 Configure ODI Topology

a) Create the Object Storage Physical Data Server

In the Physical Architecture -> right click the Oracle Object Storage Technology -> click New Data Server

Enter the information collected in the Topology Definition Table

Click "Test Connection"

Click Yes to save

 

Click OK

Select Local Agent -> click Test

Click OK

b) Create the Object Storage Physical Schema

From the Physical Data Server created -> right click -> select New Physical Schema


From the Bucket Name drop-down -> select the BICC extract bucket (previously created) -> enter a work directory


Click OK to save


c) Create Object Storage Logical Schema and set context

In the Logical Architecture -> right click on "Oracle Object Storage Technology" -> select "New Logical Schema"

Enter a Name -> set the Development and Global contexts to point to the "Object Storage Physical" schema previously created

Save -> Verify that the Logical Schema has been created for Object Storage

d) Verify BI Cloud Connector details and schedule a validation file extract 

From Fusion -> go to the "Oracle Business Intelligence Cloud Connector Console" - https://<Fusion-Apps-host>/biacm

Select "Configure Cloud Storage" -> "OCI Object Storage"

Open the External Storage connection that was created for BICC in the prerequisite blog.

Test that the connection can still connect successfully.

Add the External Storage Name to the Topology Definition Table:

External Storage Name: DS_OCI_Ateam

Open "Manage Extract Schedules"

Click + to add a new schedule

From the "Job Type" drop-down -> Select "Application Data Extract" -> Enter a name -> click Next

Select the FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification datastore -> click Enabled for Extract -> click Next

Select the External Storage <DS_OCI_Ateam> that will be used to write the extract to the storage bucket.

Change the global parameter to use GZIP compression type -> click Save

Note: ZIP compression format is not supported in ADW. The ingestion from Object Storage will fail if ZIP is used.

Wait for the job to complete

Once the job completes successfully -> go to the Object Storage bucket -> verify that all three files were created:

  1. Extract status, in JSON format
  2. Manifest file
  3. Extracted datastore .csv, compressed in .gz format
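
To see why the compression format matters (per the GZIP note above), ADW ingests these extracts through an external table over the Object Storage files, and DBMS_CLOUD can decompress gzip but not ZIP archives. A hedged sketch of that pattern, not the exact DDL ODI generates; the table name, file pattern, and column list are illustrative:

-- Illustrative external table over the gzipped BICC extract
-- (file name pattern and columns are assumptions, not ODI's generated DDL)
BEGIN
  dbms_cloud.create_external_table(
    table_name      => 'BICC_TAX_CLASS_EXT',
    credential_name => 'ODI_ATEAM',
    file_uri_list   => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/<TENANCY>/bucket-bicc-saas-replication/*TaxClassification*.csv.gz',
    column_list     => 'CLASS_CAT_CODE_MEANING VARCHAR2(400), CODE_ASSIGNMENT_CLASS_CODE VARCHAR2(400)',
    format          => json_object('type' value 'csv', 'compression' value 'gzip')
  );
END;
/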

e) Create the BI Cloud Connector Physical Data Server

From Topology Physical Architecture -> Technologies -> right click on "Oracle BI Cloud Connector" -> click "New Data Server"

Enter the BICC Service URL

Under Storage -> select "Oracle Object Storage" and choose the topology defined for Object Storage in the previous step.

The external storage name is the name of the storage defined in BICC as recorded in the Topology Definition Table.

Click Test Connection -> click Yes to save 

Click OK 

Select Local Agent -> click Test

Click OK

f) Create the BICC Physical Schema

Right click on the BICC physical data server created in the previous step -> select "New Physical Schema"

Select FscmTopModel Schema

Save -> click OK

Verify that the physical schema is created

g) Create the BICC Logical Schema and set the context

From Logical Architecture -> right click on "Oracle BI Cloud Connector" technology -> select "New Logical Schema"


Enter a name -> Set the Development and Global contexts to point to the physical schema created in the previous step

Save and confirm that the Logical Schema is created

h) Verify the ADW Physical Data server

ODI will list all ADW instances in the tenancy where the ODI Marketplace instance was provisioned

If you provisioned ADW after ODI you can follow this blog to manually configure the ADW connection.

 

ODI also imports the wallets to /home/opc/

If the wallet isn't already there, copy it using pscp (installed with PuTTY):

C:\<Install Path>\pscp.exe -i "C:\<SSH Private Key Path>\SSHPrivateKey.ppk" "Y:\<Wallet Path>\wallet_file.zip" opc@123.456.78.123:/home/opc/wallet_file.zip

From Topology -> double click on the DB201911251202-1837 data server -> click OK

Update the Data Server Connection fields as specified below.

Name: DB201909261013-MEDIUM
User: ADMIN
Password: ******
Array Fetch Size: 1000
Batch Update Size: 1000

Uncheck "Use Credentials File" -> click Save 

Test the connection

Click OK

 

i) Create ADW Physical Schema

Right click on the ADW Physical Server created in the prior step -> select "New Physical Schema"

Enter ADMIN in Schema and Work Schema -> click Save

j) Create the ADW Logical Schema and set contexts

Right on "Oracle Technology" -> Select "New Logical Schema"

Enter a name -> set the contexts -> click Save

 Create ODI Project and Models

a) Create the ODI Project

From Designer -> click the Project Icon -> select "New Project"

Enter a Project Name

Save -> confirm the project has been created in the Projects pane 

Rename the First Folder to FSCM

b) Import the BI Cloud Connector RKM if not present

Some early releases of ODI Marketplace shipped without the KM. If you run into this, you can download it from here:

km_rkm_oracle_bi_cloud_connector.xml

Copy the RKM into the VM Downloads folder using scp ...

Copy the RKM BI Cloud Connector to the xml-reference folder

/home/opc/oracle/odi/sdk/xml-reference

Expand the Knowledge Module tree, right click on Reverse Engineering (RKM) -> select "Import Knowledge Module"

Point the File import directory to the xml-reference folder -> select RKM Oracle BI Cloud Connector -> click OK

Wait for the import to complete -> click close

Verify that the RKM is correctly imported

c) Create the Fusion VOs Model and reverse engineer

Create a Model Folder 

Save 

Create the Model

 

Enter a name -> select "Oracle BI Cloud Connector Technology" -> select the "Logical Schema" created previously

Click the Reverse Engineering tab -> enter details below.

Select Table, View, System Table, Table Alias and Synonym.

Note: be cautious of leading / trailing spaces

  • Mask: %FinArCustomers%

  • Characters to Remove: FscmTopModelAM.

  • LIST_OF_OFFERINGS: Financial

Save

Reverse Engineer

Click OK

Click OK

Wait for the process to complete. It may take a few minutes depending on the mask and the type of PVOs being reverse engineered.

Verify that the VO datastores are created under the Fusion VOs model

d) Create the ADW Model and Datastore

Create a new Model folder for ADW

Right Click on the ADW model folder -> select "New Model"

Select Oracle Technology -> point to the logical schema previously created.

Save

Create a new diagram under ADW

Enter a diagram name -> click OK

Dragging the objects (individually) from the FSCM model (source) to the target will simplify the datastore creation.

Click on Attributes

Reverse engineer to create the datastore

Verify the datastore is created 
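
Since the IKM will later create the target table (see the mapping section), the ADW datastore corresponds roughly to DDL like the following. A hedged sketch: the quoted column names come from the reverse-engineered VO, while the VARCHAR2/TIMESTAMP/NUMBER data types are assumptions:

-- Rough shape of the target datastore (data types are assumptions)
CREATE TABLE "FscmTopModelAM.FinArCustomersP" (
  "ClassCatCodeMeaning"                    VARCHAR2(400),
  "CodeAssignmentClassCode"                VARCHAR2(400),
  "CodeAssignmentEndDateActive"            TIMESTAMP,
  "CodeAssignmentLastUpdateDate"           TIMESTAMP,
  "CodeAssignmentStartDateActive"          TIMESTAMP,
  "FiscalClassTypeClassificationTypeCode"  VARCHAR2(400),
  "FiscalClassTypeClassificationTypeName"  VARCHAR2(400),
  "FiscalClassTypeLastUpdateDate"          TIMESTAMP,
  "PartyTaxProfileId"                      NUMBER,
  "PartyTaxProfileLastUpdateDate"          TIMESTAMP
);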

 Create ODI Mapping

a) Create the mapping

Under the FSCM folder -> right click on Mappings -> select "New Mapping"

Enter a mapping name, e.g. FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification

Uncheck "Create Empty Dataset" -> click OK

The new mapping is created

Drag and Drop the FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification datastore from Fusion BICC into the working area

Drag and drop the FscmTopModelAM.FinArCustomersPublicModelAM.TaxClassification datastore from ADW into the working area

Connect the output of the Fusion VO datastore to the ADW datastore -> click OK

Verify the fields are mapped from source to target

b) Set up the physical implementations

Click on Physical tab


Click on the first element in the target group -> expand the Loading Knowledge Module section and select "LKM BICC to ADW External Table" -> update the CREDENTIAL_NAME to ODI_ATEAM -> leave all other fields at their defaults

Click the last element in the target group -> expand the Integration Knowledge Module section and select "IKM Oracle Insert" -> set the CREATE_TARGET_TABLE option to True.

Save
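
Conceptually, the two KMs reproduce the DBMS_CLOUD pattern sketched earlier: the LKM exposes the .csv.gz extract in Object Storage as an external table using the ODI_ATEAM credential, and the IKM (with CREATE_TARGET_TABLE = True) creates the target table and inserts from it. A hedged, simplified equivalent, reusing the illustrative BICC_TAX_CLASS_EXT external table from the earlier sketch:

-- Simplified equivalent of the LKM + IKM flow (names are illustrative)
CREATE TABLE BICC_TAX_CLASS_TGT AS
SELECT class_cat_code_meaning,
       code_assignment_class_code
FROM   bicc_tax_class_ext;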

c) Execute the mapping 

Select the mapping from the tree -> click the Execute button (green play arrow)

Select Context=Development -> Local Agent = Local (no Agent) -> click OK

Click OK

Wait for the session to complete -> check the number of inserts (in this example, 8 records were inserted).

d) Verify the data has been loaded into ADW

Check the session logs for the name of the table that ODI created

Connect with SQL Developer or SQL Developer Web and execute the SQL below:

SELECT
    "ClassCatCodeMeaning",
    "CodeAssignmentClassCode",
    "CodeAssignmentEndDateActive",
    "CodeAssignmentLastUpdateDate",
    "CodeAssignmentStartDateActive",
    "FiscalClassTypeClassificationTypeCode",
    "FiscalClassTypeClassificationTypeName",
    "FiscalClassTypeLastUpdateDate",
    "PartyTaxProfileId",
    "PartyTaxProfileLastUpdateDate"
FROM
    "FscmTopModelAM.FinArCustomersP";

Verify that the correct number of records were loaded into ADW (in this example, 8 records were loaded).
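
A quick way to cross-check the session's insert count, using the table name noted from the session logs:

-- Should match the insert count reported by the ODI session (8 in this example)
SELECT COUNT(*) FROM "FscmTopModelAM.FinArCustomersP";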

 Want to Learn More?

Click here for more A-Team Oracle Data Integrator (ODI) Blogs

Click here to sign up to the RSS feed to receive notifications for when new A-team blogs are published

Click here to access the ODI Marketplace documentation library 

 Summary

This article walked through the steps to configure Oracle Data Integrator on Oracle Cloud Marketplace (ODI) with Business Intelligence Cloud Connector (BICC) to replicate data from Fusion SaaS into an Autonomous Data Warehouse (ADW) database.

This blog is the first in a series covering topics on how to utilize Oracle ETL / Data Integration tools to extract data from various data sources. Stay tuned for more to come!

Bookmark this post to stay up-to-date on changes made to this blog as our products evolve.

Jay Pearson

Consulting Solutions Architect, Business Intelligence

Matthieu Lombard

Consulting Solution Architect

The Oracle A-Team is a central, outbound, highly technical team of enterprise architects, solution specialists, and software engineers.

The Oracle A-Team works with external customers and Oracle partners around the globe to provide guidance on implementation best practices, architecture design reviews, troubleshooting, and how to use Oracle products to solve customer business challenges.

I focus on data integration, data warehousing, Big Data, cloud services, and analytics (BI) products. My role includes acting as the subject-matter expert on Oracle Data Integration and Analytics products and cloud services such as Oracle Data Integrator (ODI) and Oracle Analytics Cloud (OAC, OA for Fusion Apps, OAX).

