Automate CPQ Data Table Exports

April 3, 2024 | 4 minute read

Introduction

Did you know that you can automate exporting your data table data using CPQ REST APIs? If not, you've come to the right place. The export process in CPQ lets you group multiple tables together, optionally apply filter criteria, and creates an asynchronous task to handle the request. We'll walk through the steps needed to manage the export task: starting the process, checking the status of the task, retrieving information about a completed task, and downloading your data.

Start Exporting Data Tables

You start exporting one or more tables by calling the Export Data Tables API. Here's an example request payload.

POST /rest//datatables/actions/export HTTP/1.1
Content-Type: application/json
Authorization: Basic xxxxxxxx
 
{ "selections": ["Oracle_BomItemDef", "Oracle_BomItemMap"] }

As you can see above, the request is a POST that uses basic authentication to authenticate each request. Make sure you base64 encode your login credentials and include them in the Authorization header.

The request accepts a JSON body which includes a string array of data table names. Please be aware that the data table names are case sensitive.
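If you'd rather script this request than send it by hand, here's a small Node.js sketch of the two pieces it needs: the Basic Authorization header value and the JSON body. The function names are my own; the credentials shown are placeholders.

```javascript
// Build the Basic auth header value from plain-text credentials.
// Note: this is base64 *encoding*, not encryption.
function basicAuthHeader(username, password) {
  return 'Basic ' + Buffer.from(username + ':' + password).toString('base64');
}

// Build the JSON body for the Export Data Tables call.
// Table names are case sensitive, so pass them exactly as defined in CPQ.
function exportRequestBody(tableNames) {
  return JSON.stringify({ selections: tableNames });
}

// POST this body, with the Authorization and Content-Type headers,
// to the Export Data Tables endpoint shown above.
const headerValue = basicAuthHeader('admin', 'secret');
const body = exportRequestBody(['Oracle_BomItemDef', 'Oracle_BomItemMap']);
```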

You can also include filtering criteria for any of the tables you specify in the selections array. The filtering syntax follows the MongoDB query specification; see the API Query Collections documentation for more information. Here's an example that applies filter criteria to both of the selected tables.

{
 "selections": [
    "Oracle_BomItemDef",
    "Oracle_BomItemMap"
 ],
 "filterCriteriaForSelections": [{
    "name": "Oracle_BomItemDef",
    "q": "{$or:[{RootVariableName:'ROOT_VAR'},{Optional:'Y'}]}"
  }, {
    "name": "Oracle_BomItemMap",
    "q": "{$and:[{VariableName:'TEST_1'},{BomItemVarName:'TEST_2'}]}"
  }]    
}
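If you're generating payloads like the one above in a script, a small helper can assemble the selections and per-table filters together. This is a sketch (the `exportWithFilters` name is my own); the filter strings are passed through untouched, exactly as they'd appear in the request body.

```javascript
// Assemble an export payload with optional per-table filter criteria.
// `filters` maps a table name to a MongoDB-style query string; tables
// without an entry are exported in full.
function exportWithFilters(tableNames, filters) {
  const payload = { selections: tableNames };
  const criteria = tableNames
    .filter((name) => filters[name])
    .map((name) => ({ name: name, q: filters[name] }));
  if (criteria.length > 0) {
    payload.filterCriteriaForSelections = criteria;
  }
  return payload;
}

// Example matching the request above:
const filteredBody = exportWithFilters(
  ['Oracle_BomItemDef', 'Oracle_BomItemMap'],
  {
    Oracle_BomItemDef: "{$or:[{RootVariableName:'ROOT_VAR'},{Optional:'Y'}]}",
    Oracle_BomItemMap: "{$and:[{VariableName:'TEST_1'},{BomItemVarName:'TEST_2'}]}"
  }
);
```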

The Export Data Tables API kicks off a long-running task and responds with information about the task it created. Here's a sample response payload.

{
  "links": [{
     "rel": "related",
     "href": "https:///rest//tasks/123456"
  }],
  "taskId": 123456
}

Checking Task Status

A task is initiated when you export data tables. After exporting the data tables, you can view the status of the export, cancel the export, and download the exported data tables or log files using the Task APIs. Please note that the Task APIs are only available to admin users.

As you can see from above, the new task identifier is included in the data table export response. You'll use that identifier to check the status and download your files. If you're using Postman, here's sample code that checks the response and saves the task identifier.

var data = JSON.parse(responseBody);

pm.test("Task Id present in response", function () {
    pm.expect(data.taskId).to.not.eql(null);
    // Save the task id for later requests
    pm.collectionVariables.set('taskId', data.taskId);
    console.log(data.taskId);
});

Once you've saved the task identifier, you use it in a request to Get Task Status. Here's how that looks.

GET /rest//tasks/ HTTP/1.1
Authorization: Basic xxxx
 
HTTP/1.1 200 OK
Content-Type: application/json
 
{
    "id": 123456,
    "dateModified": "04/1/2024 8:11 PM",
    "name": "Data Tables (2) Export",
    "category": {
        "lookupCode": "26",
        "displayValue": "Data Table Export"
    },
    "status": "Completed",
    "result": "datatable_xxxxx",
    "executionTime": "04/1/2024 8:11 PM",
    "dateAdded": "04/1/2024 8:11 PM"
}

When the task status value is "Completed", you can move on to the next step: getting the task file.
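If you're scripting the whole flow in Node.js rather than Postman, the status check above is naturally wrapped in a polling loop. This is a sketch: the `waitForTask` name, polling interval, and attempt limit are my own choices, and the status-fetching function is injected so any HTTP client can be plugged in.

```javascript
// Poll Get Task Status until the task reports "Completed".
// `fetchStatus` is any async function that returns the parsed status
// payload, so the actual HTTP client (fetch, a Postman runner, etc.)
// can be swapped in. Tune the interval and attempt limit to your data size.
async function waitForTask(fetchStatus, { intervalMs = 5000, maxAttempts = 60 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const task = await fetchStatus();
    if (task.status === 'Completed') {
      return task;
    }
    // Not done yet: wait before polling again.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for the export task to complete');
}
```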

Getting the Task File

After the task has completed, call the Get Task File List API using the task id from the Export Data Tables API response. Here's how that looks.

GET /rest//tasks//files HTTP/1.1
Authorization: Basic xxxx
 
HTTP/1.1 200 OK
Content-Type: application/json

{
  "items": [{
    "links": [{
      "rel": "related",
      "href": "https:///rest//tasks//files/"
    }],
   "name": "",
   "type": "application/zip"
  }]
}

Using Postman, you can check the response payload and save a variable for the task file. Here's sample code to do so.

var data = JSON.parse(responseBody);

pm.test("Task file items present", function () {
    pm.expect(data.items.length).to.be.above(0);
});

pm.test("Task file links present", function () {
    console.log(data.items[0].links.length);
    pm.expect(data.items[0].links.length).to.be.above(0);
    // Save the task file URL from the first link's href
    pm.collectionVariables.set('taskFile', data.items[0].links[0].href);
    console.log(pm.collectionVariables.get('taskFile'));
});

Downloading the Task File

Once you've stored the task file's full path in the taskFile variable, you can download the task file directly. The task file is a zip archive that includes a separate CSV file for each data table you specified in the Export Data Tables request. Here's an example of the GET request and response.

GET /rest//tasks//files/ HTTP/1.1
Authorization: Basic xxxx
 
HTTP/1.1 200 OK
Content-Type: application/octet-stream
Transfer-Encoding: chunked
Content-Encoding: gzip

<<CONTENT>>
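If you're saving the archive from a Node.js script, one small detail is picking a local file name from the taskFile URL stored earlier. The `localFileName` helper below is hypothetical, and the commented-out download assumes the built-in fetch available in recent Node versions.

```javascript
// Derive a local file name from the task-file href (its last path
// segment), falling back to a default when nothing usable is present.
function localFileName(href) {
  const segment = href.split('/').filter(Boolean).pop() || 'export';
  return segment.endsWith('.zip') ? segment : segment + '.zip';
}

// Sketch of the download itself (taskFile and the auth header come
// from the earlier steps):
// const fs = require('fs');
// async function downloadTaskFile(taskFile, authHeader) {
//   const res = await fetch(taskFile, { headers: { Authorization: authHeader } });
//   const bytes = Buffer.from(await res.arrayBuffer());
//   fs.writeFileSync(localFileName(taskFile), bytes);
// }
```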

Summary

Exporting your data tables is a simple process and easy to script. Start by submitting a POST request to Export Data Tables with a request body specifying which tables to include.
Exporting initiates a task, and the data table zip file is associated with that task. After exporting the data tables, you can view the status of the export, cancel the export, and download the exported data tables or log files.

To view the status of the export, call Get Task Status using the task identifier. Once the task has completed, call the Get Task File List API to get the task file's full path. Finally, download the task file zip archive, which contains a separate CSV file for each table you selected.

Happy hunting!

~sn

Shea Nolan

