Launching Digital Assets

January 20, 2020 | 5 minute read
Dolf Dijkstra
Cloud Solutions Architect

Launching a rocket is a three-step process: bring the rocket to the platform, count down, and then launch. Uploading a DigitalAsset to Oracle Content and Experience with REST is a similar three-step process, though it ends with the countdown.

The three steps are:

  1. Upload the asset as a Document, using the /documents API

  2. Add the uploaded Document to the Repository

  3. Poll the add-to-repository job for completion

In this blog article I will detail the three steps with code samples. The samples expect that you have at least contributor rights to the Repository of choice.

The samples are written in JavaScript and run with Node.js. The HTTP library used is node-fetch, and form-data is used to create the multi-part MIME message. The samples make use of Promises and async/await constructs.
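
All the samples share a little setup: the imports and two variables, host and auth. A minimal sketch (the host value is a placeholder for your own instance; auth is built in the Authenticating section below):

  // shared setup for all samples
  const fetch = require('node-fetch')   // HTTP client
  const FormData = require('form-data') // builds multi-part MIME bodies
  const fs = require('fs')

  // placeholder: the base URL of your Oracle Content and Experience instance
  const host = 'https://myinstance.example.com'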

Bring your documents to the platform

To add DigitalAssets to your Repository you first need to upload the binary as a Document in the Documents part of Oracle Content and Experience. The API call for this is simple: make an HTTP POST call to /documents/api/1.2/files/data with a multi-part MIME message as the body.

  const uploadToFolder = async (folderID, filename) => {
    // build the multi-part MIME body: the JSON parameters part first,
    // then the binary itself
    const form = new FormData()
    form.append(
      'jsonInputParameters',
      JSON.stringify({
        parentID: folderID
      }),
      { contentType: 'application/json' }
    )
    form.append('primaryFile', fs.createReadStream(filename))
    return postFile(`${host}/documents/api/1.2/files/data`, form)
  }

  const postFile = async (url, body) => {
    // POST the form; form-data supplies the multipart Content-Type header
    const options = {
      method: 'post',
      body,
      headers: {
        'x-requested-with': 'XMLHttpRequest',
        Authorization: auth
      }
    }
    const response = await fetch(url, options)
    if (response.ok) {
      return response.json()
    }
    console.error(`Error ${response.status} on ${url}`)
    const text = await response.text()
    console.error(text)
    throw new Error(`HTTP POST error ${response.status} on ${url}`)
  }

As you can see, the uploadToFolder method takes two arguments: folderID and filename. The filename is the name of a file on disk, but the implementation is easy to rewrite to take another source for the binary, as the form-data Form object accepts a generic ReadStream as input.
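
For example, inside an async function (the path is illustrative):

  const doc = await uploadToFolder(folderId, './images/rocket.jpg')
  console.log(doc.id, doc.errorCode)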

The folderID can be passed in if you know it, or it can easily be discovered with another API call. The sample below creates a first-level Folder in Documents if the folder does not exist. The return value is an ID, the folderID that can be passed to uploadToFolder.


  const createFolder = async name => {
    // list the first-level items, filtered by name; the response also
    // carries the id of the 'self' (home) folder, used as the parent below
    const { id, items } = await get(
      `${host}/documents/api/1.2/folders/self/items?filterName=${name}`
    )
    const existing =
      items &&
      items.find(
        i => i.type === 'folder' && i.name.toLowerCase() === name.toLowerCase()
      )
    if (existing) {
      return existing.id
    }
    const body = JSON.stringify({
      name: name,
      description: 'Temporary folder to import digital assets'
    })
    const newOne = await post(`${host}/documents/api/1.2/folders/${id}`, body)
    return newOne.id
  }

  const get = async url => {
    const options = {
      method: 'get',
      headers: {
        'Content-Type': 'application/json',
        'x-requested-with': 'XMLHttpRequest',
        Authorization: auth
      }
    }
    const response = await fetch(url, options)
    if (response.ok) {
      return response.json()
    }
    console.error(`Error ${response.status} on ${url}`)
    throw new Error(`HTTP GET error ${response.status} on ${url}`)
  }

  const post = async (url, body) => {
    const options = {
      method: 'post',
      body: typeof body === 'string' ? body : JSON.stringify(body),
      headers: {
        'Content-Type': 'application/json',
        Accept: 'application/json',
        'x-requested-with': 'XMLHttpRequest',
        Authorization: auth
      }
    }
    const response = await fetch(url, options)
    if (response.ok) {
      // asynchronous operations answer with 202 and a Location header
      // pointing to a job that can be polled (see Countdown below)
      if (response.status === 202) {
        const Location = response.headers.get('location')
        return { Location }
      }
      return response.json()
    }
    console.error(`Error ${response.status} on ${url}`)
    console.error(body)
    const text = await response.text()
    console.error(text)
    throw new Error(`HTTP POST error ${response.status} on ${url}`)
  }

The name argument can be any valid Folder name.

After the successful upload of the binary, you receive a JSON response. The response contains, among others, the fields id and errorCode. If errorCode is “0”, the operation was a success and id contains the ID of the newly uploaded Document. We need this ID later to add the Document to the Repository as a DigitalAsset.
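
An abbreviated response might look like this (other fields omitted; the id value is purely illustrative):

  {
    "errorCode": "0",
    "id": "D0000000000000000001"
  }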

Authenticating

The three methods that execute the HTTP operations (get, post and postFile) need to provide an OAuth token in the Authorization header. How to get this token is documented in the Oracle Content and Experience REST documentation. In short: in a browser where you are logged in, make a call to /documents/web?IdcService=GET_OAUTH_TOKEN and grab the tokenValue. To make the samples work you need to provide an auth variable with the contents “Bearer <tokenValue>”.
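
In code, that is all the samples need (paste your own tokenValue):

  // tokenValue as copied from the GET_OAUTH_TOKEN response
  const tokenValue = '<tokenValue>'
  const auth = `Bearer ${tokenValue}`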

In a future blog post I will document how to get the token programmatically for a client application.

Launching

After the successful upload of the binary, we can move on to the next step: adding a DigitalAsset to the Repository. To do this you make a single API call to /content/management/api/v1.1/bulkItemsOperations.

As the name bulkItemsOperations suggests, this API performs operations in bulk. The REST call is a POST with a JSON body. In that body the IDs of the Documents and the ID of the Repository need to be provided.

  const addToRepository = async (repositoryId, docIds) => {
    const externalIds = Array.isArray(docIds) ? docIds : [docIds]

    const body = {
      operations: {
        addToRepository: {
          storageProvider: 'docs',
          repositoryId: repositoryId,
          externalIds
        }
      }
    }
    const url = `${host}/content/management/api/v1.1/bulkItemsOperations`
    const { Location } = await post(url, body)

    const job = await monitorJob(Location)
    return job
  }

As in the previous step, where you needed to provide the ID of the Folder, here you need to provide the ID of the Repository to which you want the DigitalAssets to be added.

  const getRepoGuid = async repoName => {
    const repo = await getRepositoryByName(repoName)

    const repositoryId =
      Array.isArray(repo) && repo.length > 0 ? repo[0].id : null
    if (repositoryId == null) {
      throw new Error(`Repository by name '${repoName}' is not found.`)
    }
    return repositoryId
  }
  const getRepositoryByName = async name => {
    if (!name || name.length === 0) {
      throw new Error(`repository name should not be blank`)
    }
    const url = `${host}/content/management/api/v1.1/repositories?roleName=viewer&fields=all`
    const json = await get(url)
    return json.items.filter(r => r.name === name)
  }

Note that if your system has a large number of Repositories, the repository of choice might not be in the first page of results, and you will need to paginate.
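
A paginated variant of getRepositoryByName could look like the sketch below. The limit and offset parameters and the hasMore response field follow the usual paging conventions of the v1.1 management API; verify them against the REST documentation for your version:

  const findRepositoryByName = async name => {
    const limit = 50
    let offset = 0
    for (;;) {
      const url = `${host}/content/management/api/v1.1/repositories?roleName=viewer&fields=all&limit=${limit}&offset=${offset}`
      const { items, hasMore } = await get(url)
      // stop as soon as this page contains the repository we are looking for
      const match = items.find(r => r.name === name)
      if (match) return [match]
      if (!hasMore) return []
      offset += limit
    }
  }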

The result of the addToRepository call is an HTTP response with a Location header. addToRepository is an asynchronous operation, as it might take more than a couple of seconds before all the Documents are added to the Repository. This brings us to the last step: the countdown.

Countdown

Because addToRepository is an asynchronous operation, it returns a URL to a job that we can poll for its status. Once the status of the job is completed, all the Documents have been added to the Repository and we have a list of DigitalAsset IDs and matching Document IDs.


  const monitorJob = Location => {
    return new Promise((resolve, reject) => {
      const check = async () => {
        try {
          const job = await get(Location)
          // the job also carries startTime, progress, message and result fields
          const { id, completedPercentage, completed, error } = job
          console.log(
            'job',
            id,
            completedPercentage,
            completed || false,
            error || ''
          )
          if (error) {
            return reject(error)
          } else if (completed) {
            return resolve(job)
          }
          // not done yet: check again in two seconds
          setTimeout(check, 2000)
        } catch (e) {
          // without this, a failing GET would be an unhandled rejection
          reject(e)
        }
      }

      setTimeout(check, 2000)
    })
  }

On success, the response of the monitorJob function is an object of this shape:

{
  "result": {
    "body": {
      "operations": {
        "addToRepository": [{"id":"","externalId":""}]
      }
    }
  }
}

The id is the ID of the DigitalAsset and the externalId is the ID of the Document. In this way you can relate the file that you uploaded to the DigitalAsset that was created.
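
Putting the three stages together, a minimal launch sequence could look like this (the folder name, file name and repository name are placeholders):

  const launch = async () => {
    const folderId = await createFolder('asset-import')
    const doc = await uploadToFolder(folderId, './rocket.jpg')
    const repositoryId = await getRepoGuid('MyRepository')
    const job = await addToRepository(repositoryId, doc.id)
    // relate each new DigitalAsset back to the Document it was created from
    for (const { id, externalId } of job.result.body.operations.addToRepository) {
      console.log(`DigitalAsset ${id} created from Document ${externalId}`)
    }
  }
  launch().catch(console.error)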

Closing remarks

The samples provided are just that: samples. For production quality, the error handling and logging need to be improved.

To speed up the whole operation when you need to upload multiple DigitalAssets, parallelize the upload of the Documents and then call addToRepository with an array of IDs, as sketched below. It is currently not documented how many Documents can be added in one call; I have made successful calls with 50 images, but Your Mileage May Vary in space.
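
A sketch of that approach, building on the functions above:

  const uploadAll = async (folderId, filenames, repositoryId) => {
    // upload all binaries in parallel ...
    const docs = await Promise.all(
      filenames.map(f => uploadToFolder(folderId, f))
    )
    // ... then add them to the Repository in one bulk call
    return addToRepository(repositoryId, docs.map(d => d.id))
  }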


