Co-author: Ashlesh Bajpai, Senior Director, Product Management, Analytics Apps for ERP
Content validated on 12/9/2020 with
ODI Version 184.108.40.206.200304.2238
ADW Version Oracle Database 19c Enterprise Edition Release - Production Version 220.127.116.11.0
EPM Enterprise Cloud Planning Version 20.11.70
FAW Version 4.1.2
Fusion ERP Analytics provides accounting data sourced from ERP Cloud in a warehouse designed to support a broad set of analytics use cases. In addition to ERP Cloud, another important source of financial data is EPM Cloud, and finance users often need to combine the data from these two sources.
This blog is part of a series covering how to utilize Oracle ETL / Data Integration tools to extract data from various data sources to complement Fusion Analytics Warehouse factory data. Future blog topics will cover extracting from Oracle SaaS Applications, Oracle on-premises applications (e.g. EBS), other Oracle applications such as Responsys and Eloqua, and third-party data sources like Salesforce or Workday.
This blog walks through how to configure Oracle Data Integrator (ODI) on Oracle Cloud Marketplace to replicate data from Oracle EPM Cloud (EPM Enterprise Cloud Planning in particular) into the Autonomous Data Warehouse (ADW) database that is part of the FAW product.
Suggested Prerequisite Reading Material:
* Have access to an EPM Cloud instance
* Follow the steps in the first blog for instructions on:
Policies Assignment (Step 1)
SSH key-pair Generation (Step 4-b and 4-c)
ODI Marketplace Provisioning (Step 2)
ADW Configuration (Step 5-h, 5-i, 5-j and 6-d)
The diagram below details the architecture blueprint for this use case:
The data flow is as follows:
An EPM Extract will be created in EPM Cloud
EPM Automate and ODI MP will be installed in the same Linux VM
ODI will call EPM Automate to log in to EPM Cloud and execute the extract job, which stores its output in EPM Cloud as a ZIP file (containing multiple CSV files)
ODI will call EPM Automate to download the EPM extract ZIP file, decompress the CSV extract files, merge them into a single CSV file, and load it into ADW
All further transformations (unpivoting, aggregation, lookups, soft deletes,...) will happen inside the ADW as subsequent ODI processes (not documented as part of the scope of this blog).
EPM will be configured in such a way that EPM data can be extracted on a convenient schedule
The semantic model will be extended by customizing an existing subject area or adding a new subject area to allow reporting on the EPM data extracted into ADW, as per the CEAL Team blog available here.
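The download-and-merge portion of the flow above can be sketched in Python. This is an illustrative helper, not part of the ODI package itself (in practice ODI performs these steps with OdiOsCommand calls); file names and paths are assumptions:

```python
import csv
import glob
import zipfile
from pathlib import Path


def merge_epm_extract(zip_path: str, out_csv: str) -> int:
    """Unzip an EPM Cloud extract and merge its CSV files into one,
    keeping a single header row. Returns the number of data rows written."""
    # e.g. ExportPlan1Data.zip is extracted into an ExportPlan1Data/ folder
    work_dir = Path(zip_path).with_suffix("")
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(work_dir)

    rows_written = 0
    header_written = False
    with open(out_csv, "w", newline="") as out:
        writer = csv.writer(out)
        # Merge the CSV parts in a deterministic order
        for part in sorted(glob.glob(str(work_dir / "*.csv"))):
            with open(part, newline="") as f:
                reader = csv.reader(f)
                header = next(reader)
                if not header_written:
                    writer.writerow(header)  # write the header only once
                    header_written = True
                for row in reader:
                    writer.writerow(row)
                    rows_written += 1
    return rows_written
```

The same result can be achieved with plain Linux commands (unzip plus a header-aware cat), which is what the OdiOsCommand steps later in this blog rely on.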
In order for ODI to extract data from EPM Cloud, an extract job needs to exist as a prerequisite. Follow the steps below to define an EPM Cloud data extract job.
a) Sign in to EPM Cloud
b) Once logged in to EPM Cloud, click on the upper left hamburger menu and navigate to Application > Overview
c) Then click on Actions > Export Data
d) Click on Create
e) Enter a Name for the export (here ExportPlan1Data), select Outbox as the location and comma separated as the file type, choose to export labels and exclude dynamic members, make sure the rows and columns use dense dimensions, and select the dimension members that constitute the point of view.
The EPM cloud data extract job ExportPlan1Data can now be executed.
Follow the steps below to install the EPM Automate utility:
a) In EPM Cloud, click on your user name in the top right corner (here Administrator) and click Download
b) Select EPM Automate for Linux / Mac, download the file and close.
c) Copy the downloaded file to a convenient folder and unzip it, here under /home/opc/odi_local/epmautomate.
This completes the EPM Automate installation.
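With the utility installed, the call sequence that the ODI package will later issue can be sketched as a small helper. The command names (login, exportData, downloadFile, logout) come from the EPM Automate documentation; the install path, user, password file, and URL below are illustrative placeholders:

```python
from typing import List

# Assumed install location, matching the folder used in this blog
EPMAUTOMATE = "/home/opc/odi_local/epmautomate/bin/epmautomate.sh"


def epm_extract_commands(user: str, password_file: str, url: str,
                         job_name: str, zip_name: str) -> List[List[str]]:
    """Return the EPM Automate call sequence, one argv list per
    OdiOsCommand step: log in, run the extract, download it, log out."""
    return [
        [EPMAUTOMATE, "login", user, password_file, url],
        [EPMAUTOMATE, "exportData", job_name, zip_name],
        [EPMAUTOMATE, "downloadFile", zip_name],
        [EPMAUTOMATE, "logout"],
    ]
```

Each of these argv lists maps one-to-one onto an OdiOsCommand step in the package built later in this blog.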
Now the ODI Topology needs to be set up, specifically the File and ADW technologies.
a) Create a new File Data Server and name it EPM_FILES. Save -> Test the connection.
b) Create a Physical Schema and set the Schema and Work Schema to /home/opc/odi_local/epmautomate/bin
c) Create a Logical Schema and map the context as shown below:
Refer to this blog to review your ADW topology (Steps 5 h), i) and j).
As a prerequisite, the extract needs to be executed in EPM Cloud, and the resulting file downloaded from EPM Cloud and uploaded into the ODI Marketplace VM (using scp commands). In the next steps of this section, the assumption is that the CSV file is present under the /home/opc/odi_local/epmautomate/bin folder, as per the figure below:
a) In ODI Designer, create a new model folder -> Create a new model. Name it EPM_Files, select File Technology and EPM_Files logical architecture:
b) Create a new data store named and aliased EPM_Plan1. Select the demo CSV file mentioned in the section introduction.
c) Click on the File tab in the data store and set the format to Delimited, the heading to 1, the record separator to Unix, the field separator to comma, the text delimiter to double quotes, and the decimal separator to a period (.)
d) Click on Attributes and reverse engineer the attributes from the CSV file header. The column sizes and data types might need to be adjusted based on the actual column values (here, all columns are VARCHAR(255)).
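The reverse-engineering step simply reads the delimited header row. The attribute list ODI derives can be previewed outside ODI with a short script; the VARCHAR(255) default mirrors the datastore in this blog, and the file path is illustrative:

```python
import csv
from typing import List, Tuple


def csv_attributes(path: str, size: int = 255) -> List[Tuple[str, str]]:
    """Read the header row of a delimited file and return the attribute
    definitions ODI would reverse engineer (all VARCHAR by default)."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))  # first row holds the column names
    return [(name.strip(), f"VARCHAR({size})") for name in header]
```

This is a convenient sanity check that the header row, delimiters, and text qualifiers in the extract file match the datastore definition before running the mapping.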
a) Create a new model using the topology defined in the previous blog. Below are the figures for the physical schema, logical schema, and context mapping.
b) In Designer, create a new model named ODI_EPM
c) Create a new ADW diagram named ODI_EPM-> Drag and drop the EPM_Plan1 File datastore onto the diagram -> Click OK.
d) Rename and alias the datastore to EPM_PLAN1. Enter EPM_PLAN1 in the resource name.
e) Click on Attributes > Reverse Engineer to create the datastore on the ADW diagram.
f) Click OK.
g) Confirm that the data store is created and appears in the ADW diagram.
ODI will call a series of EPM Automate commands and execute a mapping to achieve the following:
a) From Designer -> Create a new Project named EPM Cloud. Rename the first folder into EPM Plan. Under the folder -> Create a new Mapping named EPM_PLAN1.
b) Drag and drop the File EPM_Plan1 datastore on the left side of the Mapping diagram. Drag and drop the ADW EPM_PLAN1 datastore on the right side of the Mapping diagram.
c) Connect the two datastores -> select "Match Options" = "By Name".
d) Click on the Physical implementation -> Select the first element in the target group. Confirm the Loading Knowledge Module is set to LKM File to Oracle (Built in).GLOBAL:
e) Select the second element in the TARGET_GROUP -> Confirm the IKM is set to IKM Oracle Insert.GLOBAL -> Ensure the CREATE_TARGET_TABLE and TRUNCATE_TARGET_TABLE options are set to true.
f) View the data on the EPM_Plan1 source.
Create an ODI Package that follows the steps described in the introduction of this section, using OdiOsCommand tool to call EPM Automate or linux commands.
The figure below details the package flow:
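Conceptually, the package runs each OdiOsCommand step in order and stops at the first non-zero return code. A minimal sketch of that control flow is shown below; the step list would hold the EPM Automate and Linux commands, and the runner parameter exists only so the flow can be exercised without a live EPM instance:

```python
import subprocess
from typing import Callable, List, Optional, Sequence


def run_epm_package(steps: List[Sequence[str]],
                    runner: Optional[Callable[[Sequence[str]], int]] = None) -> None:
    """Execute the package steps in order, stopping at the first failure,
    mirroring the OdiOsCommand flow (each step is one OS command)."""
    if runner is None:
        # Default: actually run the command and use its return code
        runner = lambda cmd: subprocess.run(cmd).returncode
    for cmd in steps:
        rc = runner(cmd)
        if rc != 0:
            # ODI would route this to the package's KO (error) path
            raise RuntimeError(f"step failed (rc={rc}): {' '.join(cmd)}")
```

In the actual package, the failure branch is the red KO arrow of each OdiOsCommand step, and the final step invokes the EPM_PLAN1 mapping to load the merged CSV into ADW.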
a) Run the package and confirm the successful execution
b) Verify the table has been created in ADW and that the EPM data was loaded
This concludes the activities in the blog.
Click here for more A-Team Oracle Data Integrator (ODI) Blogs.
Click here to sign up to the RSS feed to receive notifications for when new A-team blogs are published.
Click here to access the ODI Marketplace documentation library.
In addition to the capability described in this blog, customers have the option to load budget data into ERP Cloud from any EPM system; that data is then supported in Fusion ERP Analytics through the GL Budget subject area. This will be covered in a future blog post.
This article walked through the steps to configure Oracle Data Integrator (ODI) on Oracle Cloud Marketplace with EPM Automate to replicate data from EPM Cloud into an Autonomous Data Warehouse (ADW) database.
Bookmark this post to stay up-to-date on changes made to this blog as our products evolve.
The Oracle A-Team is a central, outbound, highly technical team of enterprise architects, solution specialists, and software engineers.
The Oracle A-Team works with external customers and Oracle partners around the globe to provide guidance on implementation best practices, architecture design reviews, troubleshooting, and how to use Oracle products to solve customer business challenges.
I focus on data integration, data warehousing, Big Data, cloud services, and analytics (BI) products. My role includes acting as the subject-matter expert on Oracle Data Integration and Analytics products and cloud services such as Oracle Data Integrator (ODI) and Oracle Analytics Cloud (OAC, OA for Fusion Apps, OAX).