Best Practices from Oracle Development's A‑Team

Loading Data into Oracle Fusion SaaS - Serverless Style!

Angelo Santagata


This article introduces how you can use serverless technology to extend SaaS. It is one of a series of articles that I, together with some colleagues, am authoring on how to use Oracle Cloud Infrastructure (OCI) to extend and integrate with SaaS. We've nicknamed the initiative "OCI4SaaS", and it would be good to see if that sticks!

Why Serverless for SaaS?

SaaS customers love the licensing model of SaaS: they pay by consumption and/or by the number of "entities" they create, and they manage nothing except their own data. They really love that everything is managed for them and it just works, and when building SaaS extensions they expect the same. They do not want the hassle of maintaining the operating system, the patches, the firewall, the runtime libraries and so on. This is where serverless comes to the rescue. Serverless does not mean there are no servers, but that the developer doesn't manage them, so they might as well not exist from the developer's point of view. In a serverless environment a developer "just" deploys their code and configuration to the cloud, and the cloud manages it - job done.

The cloud scales CPU up and down; if you need more storage you just use it, and delete it when you're done, and the list goes on. The other important aspect is that the customer only pays for what they consume, while it is being consumed. So if they aren't using functions they don't pay, and if they don't have anything stored then they don't pay.


One use case where serverless might be useful is for customers who load data into SaaS infrequently, or whose integration flows are straightforward. For data loading into SaaS, Oracle's premier product is Oracle Integration Cloud (OIC). OIC gives integrators the ability to create integrations using declarative constructs, perform transformations in the cloud, run the actual data load in the cloud, and automagically manage exceptions, retries and everything else without writing a single line of code. However, some customers want a "code centric" approach: they want complete control over what the code is doing and how it is deployed, and more importantly, if they aren't using it they want to pay nothing.. zilch..

Our Serverless SaaS Data Loading Example

To demonstrate how serverless can be used to load data into Oracle Fusion Cloud we have built a sample using only OCI serverless services and a 3GL, code-based approach. Before we start explaining the sample it's worth understanding how data gets loaded into Oracle Fusion ERP cloud.

There are typically a number of steps involved when loading data into Oracle Fusion ERP:

  1. Enriching the incoming data with some external data
  2. Transforming the data into the Oracle ERP data loading format (aka FBDI)
  3. Loading the file into Oracle Fusion ERP using REST, or SOAP, Services
  4. Listening for a call-back from Oracle Fusion ERP telling us that the process has completed and that the data has been processed.


Breaking this down 


When loading data into Oracle Fusion, a mandatory step is to transform the input data into the right CSV format and then zip the files up into a single ZIP file. In our example we demonstrate this transform step by accepting a "simplified" JSON data structure, which we then transform into the CSV format required by Oracle Fusion ERP. After this we wrap the files into a single ZIP file ready for uploading.
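To give a flavour of this transform step, here is a minimal sketch in Python. The function name, the CSV file name inside the ZIP and the handful of columns are all illustrative assumptions; the real FBDI invoice format has far more columns, as the sample rows later in this article show.

```python
import csv
import io
import json
import zipfile

def transform_to_fbdi_zip(json_text):
    """Illustrative sketch: turn a simplified JSON invoice payload into a
    ZIP containing a CSV file. A real FBDI file has many more columns and
    a second CSV for the invoice lines."""
    payload = json.loads(json_text)
    csv_buf = io.StringIO()
    writer = csv.writer(csv_buf)
    for invoice in payload["invoices"]:
        # Only a few of the FBDI columns are shown here for brevity
        writer.writerow([invoice["invoiceId"], invoice["businessUnit"],
                         invoice["source"], invoice["invoiceNumber"],
                         invoice["invoiceAmount"]])
    # Wrap the generated CSV into an in-memory ZIP ready for upload
    zip_buf = io.BytesIO()
    with zipfile.ZipFile(zip_buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("ApInvoicesInterface.csv", csv_buf.getvalue())
    return zip_buf.getvalue()
```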

Background reading: if doing this manually, the user normally downloads Excel macro (XLSM) files, which they can populate and use to generate the ZIP files. These XLSM files can be downloaded from Oracle Docs; for invoices you can find an example here: https://docs.oracle.com/en/cloud/saas/financials/20c/oefbf/payables.html#payablesstandardinvoiceimport-3200


Once the data has been transformed we want to load it into Oracle Fusion ERP. The response from Oracle Fusion contains some metadata and, more importantly, the JobID being used to process the data. We will need that later to determine which data file is being imported by which job.

Processing the Call back

Fusion issues a call-back to the client which tells us whether the job completed successfully or errored. It's worth noting that just because the job processed successfully does not mean the data loaded OK; there could have been duplicate records and so on.

Implementing this "Serverless" Style With Oracle Cloud Infrastructure (OCI)


Getting the Data into Oracle Cloud

We first need to get the data into OCI so we can process it efficiently. OCI has a service ideally suited for this: Object Storage buckets. OCI buckets provide huge amounts of storage and multiple ways of uploading files, namely a CLI, a REST API and the admin console. For more information see https://docs.cloud.oracle.com/en-us/iaas/Content/Object/Concepts/objectstorageoverview.htm

The data we are going to load looks like this:

{
  "invoices": [
    {
      "invoiceId": "222290",
      "businessUnit": "US1 Business Unit",
      "source": "External",
      "invoiceNumber": "111190",
      "invoiceAmount": "4242.00",
      "invoiceDate": "2019/02/01",
      "supplierName": "Staffing Services",
      "supplierNumber": 1253,
      "supplierSite": "Staffing US1",
      "invoiceCurrency": "USD",
      "paymentCurrency": "USD",
      "description": "New Invoice from global Angels",
      "importSet": "AP_Cloud_Demo",
      "invoiceType": "STANDARD",
      "paymentTerms": "Immediate",
      "termsDate": "2019/02/01",
      "accountingDate": "2019/02/01",
      "paymentMethod": "CHECK",
      "invoiceLines": [
        {
          "amount": "200",
          "description": "Invoice Line Description"
        },
        {
          "amount": "300",
          "description": "Invoice Line Description2",
          "invoiceQuantity": "10",
          "unitPrice": "5"
        }
      ]
    }
  ]
}

This is much simpler than the native-format ERP FBDI import ZIP, which comprises two CSV files looking something like this:

222284,US1 Business Unit,External,111184,4242.00,2019/02/01,Staffing Services,1253,Staffing US1,USD,USD,New Invoice from global Angels,AP_Cloud_Demo,STANDARD,,,,,,Immediate,2019/02/01,,,2019/02/01,CHECK,Standard,#NULL,,,,,,,,,,,#NULL,,,,,#NULL,#NULL,,,,21,#NULL,,,#NULL,#NULL,,,,#NULL,,,,,,,,,,,,,,,N,N,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,END


222284,1,ITEM,200,,,,Invoice Line Description,,,,,,,,,,,,,N,,#NULL,2019/02/01,,,,,#NULL,,,,,,,,,,,,,,,,,#NULL,,,N,1,,,N,,,,,,,N,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,END

Once we have created the storage buckets in OCI we can upload the source JSON file using the following CLI command:

       oci os object put -ns mynamespace -bn JSONIncoming --name mysimpliedJSONInvoice.json --file mysimpliedJSONInvoice.json 

The rest of our solution uses a number of buckets to store the file as it is being processed; as the process moves along, the file moves to the next bucket. As soon as the file is loaded into Oracle Fusion ERP, the file is renamed to include the ERP data loading JobID.

Say Hello to Oracle Cloud Functions

So now we have created our buckets to store the data, we need to process it. We have broken the functionality into three independent code units: the "transform function", the "load function" and the "callback function". These have been implemented using Oracle Cloud Functions and written in Python. Oracle Cloud Functions are perfectly suited to this job because data loading may happen with limited frequency (a couple of times an hour, or a day). One could have deployed the code in a Kubernetes container, but that would have the negative effect that the code would be "hot", ready and waiting to process data input, and of course the customer would be paying for the compute time. Oracle Cloud Functions are great here: the function is only called when there is something to do, and once it has finished processing it disappears. Additionally there is no operating system, routing etc. for the customer to maintain - it's serverless. For the implementation we chose Python rather than Java; there wasn't a hard and fast reason, but we wanted the code to be really easy to extend and understand, and lightning performance is not a major requirement here.

Our three functions are:

- One for transforming the file from a simplified JSON format to the Oracle Fusion ERP Specific Zip file
- One for loading the file into Oracle Fusion ERP
- One for handling the call-back from Oracle Fusion 

Each of these functions sources its data from an OCI Storage bucket, processes it and then puts it in another bucket.



The transform function gets the JSON data from the incoming JSON bucket, transforms it via code into CSV, zips it up and then stores it in the incoming ZIP bucket. The ERP load function takes this ZIP file and loads it into Oracle Fusion ERP, passing the right parameters for our job request (invoices in our case) and also a call-back URL for Oracle Fusion to "call".

When the source code is available to peruse on GitHub, you will see that the transform function uses a template approach to generate the CSV file.

Interacting with OCI Object Storage is really easy: you can put and delete objects with single calls, and if you need to copy large files you can do so asynchronously, although for this example we didn't need to.

put_object_response = object_storage_client.put_object(namespace, param_processing_bucket_name, data_file_name + "_ERPJOBID_" + erp_job_id, data_file.data.content)

Loading data into Oracle Fusion SaaS is pretty straightforward. For this example we used the standard importBulkData REST operation (see https://docs.oracle.com/en/cloud/paas/integration-cloud/erp-adapter/use-importbulkdata-operation-and-its-parameters.html) and passed it the right parameters. Here is another snippet of code showing how we can load data using the open source requests library:

erp_payload = {
    "OperationName": "importBulkData",
    "DocumentContent": base64_encoded_file,
    "ContentType": "zip",
    "FileName": data_file_name,
    "JobName": jobname,
    "ParameterList": paramlist,
    "CallbackURL": param_fa_callback_url,
    "NotificationCode": "10"
}
logging.info(f'Sending file to erp with payload {erp_payload}')
result = requests.post(
    erp_integration_url,  # the ERP importBulkData endpoint, elided in the original snippet
    auth=(erp_username, erp_password),
    headers={"Content-Type": JSON_CONTENT_TYPE},
    json=erp_payload)
if result.status_code != 201:
    message = "Error " + str(result.status_code) + " occurred during upload. Message=" + str(result.content)
    raise FA_REST_Exception(message)
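The DocumentContent property in the payload above carries the ZIP file base64 encoded. A minimal sketch of preparing it (the function name and file path are illustrative):

```python
import base64

def encode_zip_for_erp(path):
    """Read the FBDI ZIP file and return it as a base64 string, ready to
    embed in the importBulkData JSON payload as DocumentContent."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")
```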


Linking the objects together using Events

So how do the functions get run? This is the fun bit we missed out earlier in our description. When a file is uploaded into an OCI bucket, the bucket can be configured to emit an "event" when CRUD operations are performed. This event can be captured by the Events service, and by using rules you can say something like "if a new file is created in the bucket called INCOMING_JSON, then call the erp-transform serverless function". Configuring the bucket to emit events is a simple checkbox, and the creation of the rules is all simple and declarative.

This is really useful as it means you can tie operations which happen to a storage bucket (and other resources, by the way) to Function calls for processing - neat! This event-based functionality allows us to create a string of operations which are triggered by events in a truly decoupled manner.

Our current flow now looks like this: each bucket has been marked to emit events, and the Events service captures each event and invokes the appropriate Oracle Cloud Function, passing it an "event payload". Within the event payload we find the type of event emitted and, in our case, the bucket and file name we are interested in.
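A sketch of pulling those values out of the event payload follows. The field names mirror the CloudEvents-style structure OCI emits for Object Storage events, trimmed here to the handful of fields we care about:

```python
import json

def parse_object_storage_event(event_body):
    """Extract the event type, bucket name and object name from an OCI
    Object Storage event payload (CloudEvents style, trimmed)."""
    event = json.loads(event_body)
    data = event["data"]
    # resourceName is the object, additionalDetails carries the bucket
    return (event["eventType"],
            data["additionalDetails"]["bucketName"],
            data["resourceName"])
```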


Storing Passwords Securely

The LoadToSaaS function needs to be able to authenticate with Oracle ERP, so we need access to the username and password of the integration user. We can store the username as a Functions configuration variable, but it's probably not safe to store the password there. Thankfully OCI has a secure solution for this: the Vault service. The Vault service provides a secure place where an administrator can store passwords, encrypted of course. Within code we can query the vault and extract the secret, and once retrieved we can use it to execute an authenticated REST call to Oracle ERP. It is also interesting to note that the OCID of the secret does not change when you update it, so you can rotate passwords safely without breaking the solution.


The Python code to extract the secret isn't complex at all; check out this snippet of code:

import base64
import oci

signer = oci.auth.signers.get_resource_principals_signer()
secret_client = oci.secrets.SecretsClient(config={}, signer=signer)
secret_response = secret_client.get_secret_bundle("ocid1.vaultsecret.oc1.phx.xxxxxxxx")
# The secret content comes back base64 encoded, so decode it before use
base64_secret_bytes = secret_response.data.secret_bundle_content.content.encode('ascii')
base64_message_bytes = base64.b64decode(base64_secret_bytes)
print("secret value is " + base64_message_bytes.decode('ascii'))


Waiting For the Call-back

Now that we have the data in Oracle Fusion ERP, Fusion will do its thing. When data is loaded into Oracle Fusion, Fusion performs the following (simplified) steps:

  • The data is loaded into the Fusion UCM repository
  • An ESS job transfers the file contents into the ERP integration tables
  • An ESS job imports the data into the transactional tables
  • A report is generated into UCM, showing which rows were inserted where
  • A call-back is made to the client with a status payload


The final function is the one which implements the client side, the receiving end of this call-back from Fusion. The call-back from Fusion is an HTTP call with an XML payload. Oracle Cloud Functions are not REST endpoints, so to be able to receive the HTTP call from Fusion we need to front our Function with API Gateway.

Again this is a declarative operation and simply involves mapping the endpoint and resource URL to a function.

Our erp-callback function gets triggered when Fusion issues a call-back. The function decodes the XML payload and extracts the JobID and status. With the JobID we can determine which file in the processing bucket the event was for, and then move the file from the processing bucket to either the success or the errored bucket. It's worth noting that in ERP lingo, just because a job succeeded doesn't mean that the data was loaded into Oracle ERP; there may be duplicate data, an unknown business organisation and so on. A future enhancement would be for this function to download the report from UCM and introspect it to determine whether all rows were inserted successfully.
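The XML decoding can be sketched with the standard library. The JOBID and STATUS element names below are illustrative assumptions about the call-back payload; matching on local names keeps the code indifferent to whatever XML namespaces Fusion uses:

```python
import xml.etree.ElementTree as ET

def parse_erp_callback(xml_payload):
    """Illustrative sketch: extract the job id and status from the Fusion
    call-back XML. Element names are assumptions; we match on local names
    so any XML namespace prefixes are ignored."""
    root = ET.fromstring(xml_payload)
    values = {}
    for elem in root.iter():
        local_name = elem.tag.split('}')[-1]  # drop any namespace prefix
        if local_name in ("JOBID", "STATUS"):
            values[local_name] = (elem.text or "").strip()
    return values.get("JOBID"), values.get("STATUS")
```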

Extending the Solution by Subscribing To Notifications 

So the flow is done, everything works, and now we get to the benefits of integrating with OCI. Because we've designed our flow as small, perhaps micro, services, and we are using native services like Events, we can leverage other services like Notifications. The OCI Notification service allows us to create "topics" we can post messages to; these topics can have subscribers which listen for a message and then send it somewhere. In our sample code we're just sending emails, but in the real world the subscriber could be another Function, a PagerDuty channel or something totally different. This loosely coupled architecture lends itself to being extended in many ways. For example, you could add a function which inserts data into a Grafana dashboard; the notification could call this function with some data, which is then inserted into Grafana.

Here is a snippet, a helper function which demonstrates how to send a notification to a topic (via its OCID):

def publish_ons_notification(topic_id, msg_title, msg_body):
    try:
        signer = oci.auth.signers.get_resource_principals_signer()
        logging.info("Publish notification, topic id " + topic_id)
        client = oci.ons.NotificationDataPlaneClient({}, signer=signer)
        msg = oci.ons.models.MessageDetails(title=msg_title, body=msg_body)
        client.publish_message(topic_id, msg)
    except oci.exceptions.ServiceError as serr:
        logging.critical(f'Exception sending notification to OCI, is the OCID of the topic correct? {serr}')
    except Exception as err:
        logging.critical(f'Unknown exception occurred when sending notification, please see log {err}')


The Final Architecture

The diagram above shows the final architecture. It is worth noting again that all the components are serverless and you only pay for what you use.



The above example demonstrates how one can create a serverless architecture for loading data into Oracle Fusion ERP, using a pattern of OCI Buckets, Events, Functions and other OCI services to implement the integration pipeline. It also highlights how the pattern can be extended, for example with notifications, to further enrich the integration. The pattern is code based and totally serverless, and unless it is running there is zero cost to the consumer (except the storage of the files in OCI Object Storage). If a customer has a lot of integrations to implement, and those integrations change frequently, then perhaps Oracle Integration Cloud (OIC) is the right service. OIC provides a rich declarative development environment where integrations can be built and deployed quickly and scale to meet your needs. Additionally, OIC gives you a lot of extra functionality OOTB, such as error handling, retrying of messages, dashboards, reporting, state tracking of previous requests and more.

However, if you want a totally serverless approach, don't mind the code-centric style of integration and relish the flexibility and extensibility this pattern provides, then the serverless cloud is your oyster!

Source Code

Code to support the above blog is available as open source on https://github.com/oracle/sample-serverless-saas-erp-dataload , pull requests and enhancements welcome.

Join the discussion

Comments ( 2 )
  • Periasamy Ramamoorthy Friday, December 4, 2020
    Dear Angelo, Wow - this is very cutting edge ! I was researching for new integration patterns for our data conversion tool and came across your blog! I had come across serverless functions in one of Oracle's presentation earlier this year but had only paid cursory attention to it thinking it is too futuristic. Your blog has opened my mind to possibilities of using Oracle Cloud Functions. Some of the concepts outlined by you are new to me as I come from a predominately EBS/ERP background. But I would definitely explore your idea and see how I can implement one or two simple data conversion uses cases. With warm regards Peri (check out our products on https://cloudascend.in/)
  • Simon Wigg Monday, April 19, 2021
    Github link does not work.