Introduction

In a recent project with Oracle Business Intelligence Cloud Connector (BICC), a customer faced a challenge with filtering data during a full extract. They needed to restrict the dataset from a Public View Object (PVO), such as BalanceExtractPVO, to a specific BalancePeriodName instead of extracting all available data. In this blog, I’ll discuss the issue, the limitations encountered, and how I leveraged BICC APIs to meet their unique business integration needs.

 

The Business Requirement

When performing a full data extract in BICC, the default configuration retrieves the entire dataset for a given PVO. The customer’s requirement was to filter the data, extracting only the records tied to a specific BalancePeriodName.

One possible option would be to manually add filters through the BICC UI, but this approach wasn’t feasible due to the automated nature of the business integration process, which required a programmatic solution.

Given the limitations, I shifted focus to leveraging BICC’s APIs to achieve the desired filtering. BICC provides a set of RESTful APIs that allow for more granular control over data extracts, including the ability to define custom queries or parameters programmatically. This approach aligned with the need for automation and integration with external systems as part of the business workflow.

 

Solution: Using BICC APIs for Filtered Extracts

To address the filtering requirement, I developed a solution using BICC APIs to customize the extract process. Here’s a high-level overview of the steps I took:

  1. Leverage BICC REST APIs: Utilized the BICC API endpoints to initiate and configure the extract job. Specifically, I used the API to pass a custom query or parameter to filter data based on BalancePeriodName.
  2. Automate the Process: Integrated the API calls into the existing business integration workflow to ensure the filtered extract runs on a schedule without manual intervention.
  3. Validate the Output: Tested the solution to confirm that only the relevant data for the specified BalancePeriodName was extracted, reducing data volume and processing overhead.

 

BICC REST APIs

We used specific BICC API endpoints for our metadata-driven approach:

  • GET /biacm/rest/meta/offerings: Retrieves the list of offerings. The Offering ID value is required for the metadata table.
  • GET /biacm/rest/meta/datastores/{datastoreId}: Retrieves the current datastore metadata and configuration.
  • PUT /biacm/rest/meta/datastores/: Updates the datastore configuration with new filters and offerings (request body shape shown below).
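
For reference, the PUT call sends a body with two top-level keys: an offerings array and a dsMeta object that carries the filter. The shape below mirrors what the PL/SQL in Step 3 builds; the values are placeholders, not actual BICC metadata:

{
  "offerings": ["<OFFERING_CODE>"],
  "dsMeta": {
    ... fields returned by the GET call ...,
    "filters": "<filter expression on BalancePeriodName>"
  }
}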

 

Metadata-Driven Framework

The solution relies on a metadata table (BIACM_METADATA) that contains:

  • DATASTORE_ID: Unique identifier for each data source
  • OFFERINGS: Specific data offerings to be extracted
  • FILTERS: Dynamic filter criteria to be applied at the PVO level

 

Implementation Workflow

The process involves updating PVOs with filters using BICC APIs and then executing Data Flows in Oracle Data Transforms. Below are the high-level steps for the workflow:

 

Step 1: Create Objects and Set Up ACLs

 

-- Create an ACL that allows the integration user to connect to the BICC host
BEGIN
  DBMS_NETWORK_ACL_ADMIN.create_acl (
    acl         => 'bicc.xml',
    description => 'ACL for BICC API',
    principal   => '<DB Username>',
    is_grant    => TRUE,
    privilege   => 'connect',
    start_date  => SYSTIMESTAMP,
    end_date    => NULL);
END;
/

-- Add the resolve privilege so the Fusion host name can be resolved
BEGIN
  DBMS_NETWORK_ACL_ADMIN.add_privilege (
    acl       => 'bicc.xml',
    principal => '<DB Username>',
    is_grant  => TRUE,
    privilege => 'resolve');
END;
/

-- Assign the ACL to the Fusion host on the HTTPS port
BEGIN
  DBMS_NETWORK_ACL_ADMIN.assign_acl (
    acl        => 'bicc.xml',
    host       => '<fusion hostname>',
    lower_port => 443,
    upper_port => 443);
END;
/
COMMIT;

-- Grant the http privilege on the Fusion host (here to the ADMIN user)
BEGIN
  DBMS_NETWORK_ACL_ADMIN.append_host_ace (
    host => '<fusion hostname>',
    ace  => xs$ace_type(privilege_list => xs$name_list('http'),
                        principal_name => 'ADMIN',
                        principal_type => xs_acl.ptype_db));
END;
/
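
To confirm the grants took effect, you can query the standard data dictionary view for host ACEs, a quick sanity check along these lines:

-- Verify the ACEs assigned to the Fusion host
SELECT host, lower_port, upper_port, principal, privilege
FROM   dba_host_aces
WHERE  host = '<fusion hostname>';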

 

Step 2: Metadata Table Setup

Create a centralized metadata repository that defines the extraction parameters for each datastore. The BIACM_METADATA table stores the configuration details (DATASTORE_ID, OFFERINGS, and FILTERS) that drive the process, while BIACM_PROCESS_LOG captures each request and response for auditing.

-- Sequence for the log table primary key
CREATE SEQUENCE BIACM_PROCESS_LOG_SEQ;

-- Audit log of every GET/PUT exchange with BICC
CREATE TABLE BIACM_PROCESS_LOG (
  LOG_ID           NUMBER PRIMARY KEY,
  DATASTORE_ID     VARCHAR2(128),
  RAW_RESPONSE     CLOB,
  MODIFIED_REQUEST CLOB,
  FINAL_RESPONSE   CLOB,
  LOG_DATE         TIMESTAMP
);

-- Driver table: one row per datastore to be filtered
CREATE TABLE BIACM_METADATA (
  DATASTORE_ID VARCHAR2(128),
  OFFERINGS    VARCHAR2(128),
  FILTERS      VARCHAR2(4000),
  PRIMARY KEY (DATASTORE_ID)
);
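
To illustrate, a metadata row for the scenario from the introduction might look like the following; the datastore ID, offering code, and filter expression are placeholders to be replaced with the values for your PVO and BICC environment:

INSERT INTO BIACM_METADATA (DATASTORE_ID, OFFERINGS, FILTERS)
VALUES ('<datastore id of BalanceExtractPVO>',      -- as shown in the BICC console
        '<offering code>',                          -- offering the PVO belongs to
        '<filter expression on BalancePeriodName>');-- applied at the PVO level
COMMIT;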

 

Step 3: Dynamic PVO Filter Application

The core of our solution involves dynamically updating PVO configurations using BICC APIs. The process includes:

  1. Retrieve Current Configuration: Using GET API to fetch existing datastore metadata
  2. Apply Business Filters: Modify the configuration with specific filter criteria from metadata
  3. Update Datastore: Push the modified configuration back using PUT API
  4. Logging: Maintain comprehensive logs for auditing and troubleshooting

 

DECLARE
  v_host     VARCHAR2(256) := 'https://<fusion host>';
  v_username VARCHAR2(128) := '<bicc username>'; -- for production, store credentials in OCI Vault
  v_password VARCHAR2(128) := '<bicc password>'; -- for production, store credentials in OCI Vault
  -- For the Basic Auth header
  v_auth_string VARCHAR2(256);
  v_auth_raw    RAW(256);
  v_auth_b64    VARCHAR2(512);
  v_auth_header VARCHAR2(600);

  -- Metadata cursor and variables
  CURSOR c_meta IS
    SELECT DATASTORE_ID, OFFERINGS, FILTERS
      FROM BIACM_METADATA;
  v_url_get          VARCHAR2(512);
  v_url_put          VARCHAR2(512);
  v_buffer           VARCHAR2(32767);
  v_raw_response     CLOB;
  v_modified_request CLOB;
  v_final_response   CLOB;
  req            UTL_HTTP.req;
  resp           UTL_HTTP.resp;
  base_json      JSON_OBJECT_T;
  dsmeta         JSON_OBJECT_T;
  out_json       JSON_OBJECT_T;
  offerings_json JSON_ARRAY_T;

BEGIN
  -- Note: HTTPS calls through UTL_HTTP require a configured wallet
  -- (UTL_HTTP.set_wallet) unless you are on Autonomous Database.

  -- Build and encode the Basic Auth header
  v_auth_string := v_username || ':' || v_password;
  v_auth_raw    := UTL_I18N.STRING_TO_RAW(v_auth_string, 'AL32UTF8');
  v_auth_b64    := UTL_RAW.CAST_TO_VARCHAR2(UTL_ENCODE.BASE64_ENCODE(v_auth_raw));
  v_auth_header := 'Basic ' || v_auth_b64;

  FOR rec IN c_meta LOOP
    -- GET the current datastore configuration
    v_url_get      := v_host || '/biacm/rest/meta/datastores/' || rec.DATASTORE_ID;
    v_raw_response := NULL;

    BEGIN
      req := UTL_HTTP.begin_request(v_url_get, 'GET', 'HTTP/1.1');
      UTL_HTTP.set_header(req, 'Content-Type', 'application/json');
      UTL_HTTP.set_header(req, 'Authorization', v_auth_header);
      resp := UTL_HTTP.get_response(req);
      LOOP
        UTL_HTTP.read_line(resp, v_buffer, TRUE);
        v_raw_response := v_raw_response || v_buffer;
      END LOOP;
    EXCEPTION
      WHEN UTL_HTTP.end_of_body THEN
        UTL_HTTP.end_response(resp);
      WHEN OTHERS THEN
        BEGIN
          UTL_HTTP.end_response(resp);
        EXCEPTION
          WHEN OTHERS THEN
            NULL;
        END;
        v_raw_response := NULL;
    END;

    -- Skip this datastore if the GET call failed
    IF v_raw_response IS NULL THEN
      CONTINUE;
    END IF;

    -- Parse the response and apply the filter from the metadata table
    base_json := JSON_OBJECT_T.parse(v_raw_response);
    dsmeta    := JSON_OBJECT_T.parse(base_json.to_clob); -- deep copy
    dsmeta.put('filters', rec.FILTERS);

    -- Build the offerings array
    offerings_json := JSON_ARRAY_T();
    offerings_json.append(rec.OFFERINGS);

    out_json := JSON_OBJECT_T();
    out_json.put('offerings', offerings_json);
    out_json.put('dsMeta', dsmeta);

    v_modified_request := out_json.to_clob;

    -- Print only the first part; DBMS_OUTPUT lines are limited to 32767 characters
    DBMS_OUTPUT.put_line('v_modified_request: ' || DBMS_LOB.substr(v_modified_request, 3000, 1));

    -- PUT the modified configuration back to BICC
    v_url_put        := v_host || '/biacm/rest/meta/datastores/';
    v_final_response := NULL;

    BEGIN
      req := UTL_HTTP.begin_request(v_url_put, 'PUT', 'HTTP/1.1');
      UTL_HTTP.set_header(req, 'Content-Type', 'application/json');
      UTL_HTTP.set_header(req, 'Content-Length', TO_CHAR(DBMS_LOB.getlength(v_modified_request)));
      UTL_HTTP.set_header(req, 'Authorization', v_auth_header);
      -- Write the CLOB in chunks; UTL_HTTP.write_text accepts at most 32767 characters per call
      FOR i IN 0 .. TRUNC((DBMS_LOB.getlength(v_modified_request) - 1) / 8000) LOOP
        UTL_HTTP.write_text(req, DBMS_LOB.substr(v_modified_request, 8000, i * 8000 + 1));
      END LOOP;
      resp := UTL_HTTP.get_response(req);
      LOOP
        UTL_HTTP.read_line(resp, v_buffer, TRUE);
        v_final_response := v_final_response || v_buffer;
      END LOOP;
    EXCEPTION
      WHEN UTL_HTTP.end_of_body THEN
        UTL_HTTP.end_response(resp);
      WHEN OTHERS THEN
        BEGIN
          UTL_HTTP.end_response(resp);
        EXCEPTION
          WHEN OTHERS THEN
            NULL;
        END;
        v_final_response := NULL;
    END;

    DBMS_OUTPUT.put_line('v_final_response: ' || DBMS_LOB.substr(v_final_response, 3000, 1));

    -- Save the request/response payloads in the log table
    INSERT INTO BIACM_PROCESS_LOG (
      LOG_ID,
      DATASTORE_ID,
      RAW_RESPONSE,
      MODIFIED_REQUEST,
      FINAL_RESPONSE,
      LOG_DATE
    )
    VALUES (
      BIACM_PROCESS_LOG_SEQ.NEXTVAL,
      rec.DATASTORE_ID,
      v_raw_response,
      v_modified_request,
      v_final_response,
      SYSTIMESTAMP
    );
    COMMIT;
  END LOOP;

  DBMS_OUTPUT.put_line('Process completed.');
END;
/
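
To satisfy the no-manual-intervention requirement, the block can be wrapped in a stored procedure and scheduled. A minimal DBMS_SCHEDULER sketch, assuming the block above has been wrapped as a procedure named BIACM_APPLY_FILTERS (the job name and schedule below are illustrative):

BEGIN
  DBMS_SCHEDULER.create_job (
    job_name        => 'BIACM_FILTER_UPDATE_JOB',  -- illustrative name
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'BIACM_APPLY_FILTERS',      -- the block above as a procedure
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',      -- run before the extract window
    enabled         => TRUE);
END;
/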

 

Step 4: Data Flow Execution

Once the PVOs are updated with the appropriate filters, Oracle Data Transforms executes the actual data extraction, now with significantly reduced data volumes.

 

Workflow

[Workflow diagram: the metadata table drives PVO filter updates through BICC APIs, followed by Data Flow execution in Oracle Data Transforms]

 

Conclusion

By leveraging BICC APIs, I was able to implement a filtered data extract for BalanceExtractPVO that met the specific business requirement of limiting data to a particular BalancePeriodName. This solution not only addressed the immediate challenge but also provided a repeatable framework for similar use cases in the future.