Validation
Content validated on 11/30/2022 with
- Autonomous Data Warehouse - Oracle Database 19c Enterprise Edition Release - Production Version 19.11.0.0.0
- Oracle Cloud Application 22C (11.13.22.07.0)
- Fusion Analytics Warehouse Application Version 22.R2R3
Background
Oracle Fusion Analytics (FAW) provides analytics for Oracle Cloud applications, powered by Autonomous Data Warehouse (ADW) and Oracle Analytics (OAC). More specifically:
- Fusion ERP Analytics provides accounting data sourced from ERP Cloud in a warehouse designed to support a broad set of analytics use cases.
- Fusion HCM Analytics provides comprehensive workforce and turnover data sourced from Fusion HCM Cloud.
- Fusion SCM Analytics gives customers the ability to gain visibility into Supply Chain performance, correlate Supply Chain processes with business goals, and detect, understand and predict Supply Chain issues.
Fusion Accounting Hub (FAH) is a cloud application that unifies data from different financial systems, giving customer finance teams a complete view of financial data to increase forecasting accuracy, shorten reporting cycles, and simplify decision-making.
This blog is part of a series of blogs that details the Reference Architecture leveraging the FAW extensibility framework to use FAW as:
- an Inbound Data Store layer to:
- Standardize and group customer source financial systems data
- Transform customer source financial systems data into the FAH data file format
- Upload the transformed FAH data file to Fusion
- Run the Accounting Engine to generate Subledger Accounting entries in FAH
- an Outbound Data Store layer to:
- Enable analytics for the FAH data foundation
- Enable enrichment of the FAH data foundation via the AI / Machine Learning capabilities of the FAW platform
- Allow fine-grained subledger analysis, general ledger balance and supporting reference drill-downs with end-to-end traceability for operational and financial insights.
Figure 1: FAW and FAH Integration
In a previous blog, we leveraged Oracle Data Integration on the Marketplace to prepare and load data to the AHC_Loan FAH Application.
This blog will focus on the Inbound Data Flows and Orchestration to FAH Reference Architecture using the OCI Data Integration service.
Technical Architecture Overview
Rationale
Banks feed transactions from many upstream source systems to the Fusion Accounting Hub. These could be trading systems, loans, retail banking, foreign exchange deals, hedges, etc. Key characteristics of the upstream feeds:
- Very high volumes at a level of detail unnecessary for accounting purposes.
- Acquisitions often mean that source systems are duplicated. That is, transactions from different source systems warrant a similar accounting treatment.
- A large number of source system transaction attributes are irrelevant to accounting, but they must be made available to downstream operational reporting.
- Source systems are often not transactional in nature, in the sense that a current status can be extracted at end of day, but no coherent or robust history of transaction status is possible.
Customers have requirements to make source transactions available in a central, persistent and robust reporting platform, aka Transaction Data Store (TDS).
The source systems feeds must be prepared prior to the upload to the Accounting Hub. Transformations include:
- standardization - mapping source system attributes to a common list of values. For example, 0 and N = No; and 1 and Y = Yes.
- summarization - merging many source system lines using a subset of attributes and facts (i.e. amounts and counts).
- derivation - creating derived "records", derived fields, sequences, etc.
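As a minimal illustration of the standardization and summarization steps above, the sketch below maps source flag values to a common list of values and merges lines on a subset of attributes while summing the facts. The column names, flag values and sample rows are hypothetical, not taken from an actual source system feed.

```python
from collections import defaultdict

# Hypothetical mapping of source-specific flag values to a common value set
# (e.g. 0 and N = No; 1 and Y = Yes).
FLAG_MAP = {"0": "N", "N": "N", "1": "Y", "Y": "Y"}

def standardize(row):
    """Map source-specific attribute values to the common list of values."""
    row = dict(row)
    row["active_flag"] = FLAG_MAP[row["active_flag"]]
    return row

def summarize(rows, keys=("source_system", "product", "active_flag")):
    """Merge source system lines on a subset of attributes,
    aggregating the facts (amounts and counts)."""
    totals = defaultdict(lambda: {"amount": 0.0, "count": 0})
    for row in map(standardize, rows):
        key = tuple(row[c] for c in keys)
        totals[key]["amount"] += row["amount"]
        totals[key]["count"] += 1
    return dict(totals)

lines = [
    {"source_system": "LOANS_A", "product": "MORTGAGE", "active_flag": "1", "amount": 100.0},
    {"source_system": "LOANS_A", "product": "MORTGAGE", "active_flag": "Y", "amount": 50.0},
    {"source_system": "LOANS_A", "product": "MORTGAGE", "active_flag": "0", "amount": 25.0},
]
print(summarize(lines))
```

The two lines with equivalent flag values (1 and Y) collapse into a single summarized line, which is the behavior banks rely on to reduce very high source volumes to the level of detail needed for accounting.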
Value Proposition
In addition to the existing FAW value proposition (reporting tools, extensibility, pre-defined subject areas, etc.), the FAW FAH Integration architecture offers a single platform for pre-processing, enrichment and reporting. It eliminates data migration from the TDS to the enrichment and/or reporting platform, along with the associated reconciliation. The end-to-end process is faster, more reliable and cheaper.
Architecture Blueprint
FAW to FAH Architecture Blueprint is depicted in Figure 2 below. This architecture has a set of prerequisites:
- Having a clear Master Data Management strategy
- Having derived the Metadata Object Model for Financials products from the Master Data Management strategy, including mapping from source systems data elements to FAH data elements (such as Accounts, Products, Party, Events, FX rate lookups and mappings)
- Having defined the functional configuration in FAH applications according to best practices
Figure 2: FAW FAH Integration Reference Architecture Blueprint with OCI Data Integration
The high level data flow steps are as follows:
- Loading source transactions and instrument balances into FAW tables (staging tables and instrument balances).
- Based on the Metadata Object Model, invoking the process to cleanse, validate and re-format the source transactions and instrument balances in preparation for their upload to the Accounting Hub. The results will be "stored" in AH Input Analyses. The architecture technical components will capture and process any errors that occur.
- Extracting AH Input Analyses from FAW Staging tables, generating Accounting Hub input files and invoking the Accounting Engine.
- The Accounting Engine will process the transactions to generate detailed journal entries and financial balances. The architecture technical components will invoke supporting reference (SR) balance calculations.
- FAW Data Pipeline will load the Accounting Hub results (detailed journal entries and financial balances) into FAW ADW tables.
- FAW Data Pipeline will combine Accounting Hub detailed journal entries and financial balances with source transactions (staging tables and instrument balances). This process is known as enrichment. The resulting enriched source system transactions and financial balances will be available for further processing by Financial Services Industry specific analytical applications and operational reporting (FAW OAC).
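To make the file-generation step above more concrete, the sketch below packages transaction header and line CSVs into an in-memory zip archive and base64-encodes it for upload. This is only an illustration of the mechanics: the file names, column layout and sample values are assumptions, not the actual AHC_Loan application templates, which are defined by the FAH application configuration.

```python
import base64
import csv
import io
import zipfile

def build_ah_input_zip(headers, lines):
    """Package transaction header and line CSVs into an in-memory zip
    archive, then base64-encode it for inclusion in an upload payload."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        # File names below are illustrative placeholders.
        for name, rows in (("XlaTrxH.csv", headers), ("XlaTrxL.csv", lines)):
            out = io.StringIO()
            csv.writer(out).writerows(rows)
            zf.writestr(name, out.getvalue())
    return base64.b64encode(buf.getvalue()).decode("ascii")

# Illustrative rows only; real layouts come from the FAH application setup.
headers = [["LOAN-001", "2022-11-30", "ORIGINATION"]]
lines = [["LOAN-001", 1, 100000.00, "USD"]]
payload = build_ah_input_zip(headers, lines)
```

In the actual architecture, this packaging and the subsequent invocation of the Accounting Engine are handled by the OCI Data Integration orchestration rather than hand-written code.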
In this blog, we will leverage OCI Data Integration and the AHC_Loan FAH Application to implement this inbound architecture.
Figure 3: FAW FAH Integration Technical Architecture
The video
The following video walks you through the OCI artefacts created in order to implement the technical architecture described above:
Want to Learn More?
Click here to sign up to the RSS feed to receive notifications for when new A-team blogs are published.
Summary
In a previous blog, we leveraged Oracle Data Integration on the Marketplace to prepare and load data to the AHC_Loan FAH Application. The high level data flow steps were as follows:
- Loaded source transactions and instrument balances into FAW tables (staging tables and instrument balances).
- Based on the Metadata Object Model, invoked the process to cleanse, validate and re-format the source transactions and instrument balances in preparation for their upload to the Accounting Hub. The results were "stored" in AH Input Analyses. The architecture technical components captured and processed any errors that occurred.
- Extracted AH Input Analyses from FAW Staging tables, generated Accounting Hub input files and invoked the Accounting Engine.
- The Accounting Engine processed the transactions to generate detailed journal entries and financial balances. The architecture technical components invoked supporting reference (SR) balance calculations.
In this blog, we used the OCI Native Data Integration serverless service to implement this inbound architecture, and also demonstrated the Outbound Architecture from within FAW:
- The FAW Data Pipeline loaded the Accounting Hub results (detailed journal entries and financial balances) into FAW ADW tables.
- The FAW Data Pipeline combined the Accounting Hub detailed journal entries and financial balances with source transactions (staging tables and instrument balances). This process is known as enrichment. The resulting enriched source system transactions and financial balances are available for further processing by Financial Services Industry specific analytical applications and operational reporting (FAW OAC).
Bookmark this post to stay up-to-date on changes made to this blog as our products evolve.