In Oracle Cloud Infrastructure (OCI), cost and usage reports provide essential insights into your cloud spending. Today, you can view and manually download these reports. However, this process becomes cumbersome if you need to build a custom dashboard using analytics services like Oracle Analytics Cloud (OAC), or connect to third-party services that provide a centralized cost view for all your resources across multiple cloud environments.

In this blog post, we will show you how to automate the export of your Cost Management data to an Object Storage bucket in your tenancy. This approach streamlines data access and enables integration with other systems, custom data enrichment, and analysis in external tools like dashboards or financial systems, offering enhanced insights and simplified reporting.

FOCUS (the FinOps Open Cost and Usage Specification) is an open-source cloud billing data specification that provides consistency and standardization to simplify cloud cost reporting and analysis across multiple sources. Refer to this blog post for additional information.

Prerequisites

  • Object Storage Bucket: Create a bucket in OCI Object Storage (for example, Cost_Usage_Reports) to store the cost and usage reports.
  • Dynamic Group:
    • Create a dynamic group (for example, dg-fn-copy-CUR-reports) that matches the function so it can copy files to Object Storage:
      ALL {resource.type = 'fnfunc', resource.compartment.id = 'ocid1.compartment.oc1..xxx'}

Creating Automated Daily Reports 

To automate the daily copying of the FOCUS cost and usage reports, we will set up an OCI Function that performs this task once a day.

Step 1 – Create an OCI Function to Copy FOCUS Reports to Object Storage

  1. Set up an OCI Function that will be responsible for copying the FOCUS reports to your Object Storage bucket. Refer to this guide for detailed steps to create an OCI Function.
  2. Use the following code in your function:
import io
import json
import logging
import oci
from datetime import datetime, timedelta
from fdk import response


def handler(ctx, data: io.BytesIO = None):
    try:
        # Do not modify this value: cost and usage reports are always
        # published to the 'bling' namespace.
        reporting_namespace = 'bling'
        # The source bucket name is your own tenancy OCID.
        reporting_bucket = 'ocid1.tenancy.oc1..aaaaaaaaa3qmjxr43tjexx75r6gwk6vjw22ermohbw2vbxyhczksgjir7xdq'  # Replace with your tenancy OCID

        # Reports for a given day can take a few days to publish;
        # adjust the offset to match the lag you observe in your tenancy.
        report_date = datetime.now() - timedelta(days=3)
        prefix_file = f"FOCUS Reports/{report_date.year}/{report_date.strftime('%m')}/{report_date.strftime('%d')}"
        logging.getLogger().info(f"prefix is {prefix_file}")
        destination_path = '/tmp'

        dest_namespace = 'oci_os_namespace'  # Replace with your Object Storage namespace
        upload_bucket_name = 'Cost_Usage_Reports'  # Replace with your target bucket for uploads

        # Authenticate as the function's resource principal
        signer = oci.auth.signers.get_resource_principals_signer()
        object_storage = oci.object_storage.ObjectStorageClient(config={}, signer=signer)
        report_bucket_objects = oci.pagination.list_call_get_all_results(
            object_storage.list_objects, reporting_namespace, reporting_bucket, prefix=prefix_file)

        for o in report_bucket_objects.data.objects:
            object_details = object_storage.get_object(reporting_namespace, reporting_bucket, o.name)
            filename = o.name.rsplit('/', 1)[-1]
            local_file_path = destination_path + '/' + filename
            # Stream the report to local storage in 1 MiB chunks
            with open(local_file_path, 'wb') as f:
                for chunk in object_details.data.raw.stream(1024 * 1024, decode_content=False):
                    f.write(chunk)
            # Upload the file to the destination bucket
            with open(local_file_path, 'rb') as file_content:
                object_storage.put_object(
                    namespace_name=dest_namespace,
                    bucket_name=upload_bucket_name,
                    object_name=filename,
                    put_object_body=file_content
                )
    except (Exception, ValueError) as ex:
        logging.getLogger().error('error copying reports: ' + str(ex))
        return response.Response(
            ctx, response_data=json.dumps({"message": "Error: " + str(ex)})
        )
    return response.Response(
        ctx, response_data=json.dumps({"message": "Processed files successfully"})
    )

Note: In the Function configuration, increase the execution time limit to 300 seconds to accommodate potential data processing requirements.
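To make the function's lookup logic easier to follow, the date-based prefix it builds can be sketched in isolation. This is a minimal illustration extracted from the function above; the "FOCUS Reports/YYYY/MM/DD" layout matches the path structure the function queries in the reporting bucket:

```python
from datetime import datetime

def build_focus_prefix(report_date: datetime) -> str:
    """Build the Object Storage prefix under which FOCUS reports
    for a given day are published (same logic as the function above)."""
    return (f"FOCUS Reports/{report_date.year}/"
            f"{report_date.strftime('%m')}/{report_date.strftime('%d')}")

# Example: reports for May 7, 2024 live under this prefix
print(build_focus_prefix(datetime(2024, 5, 7)))  # FOCUS Reports/2024/05/07
```

Listing objects with this prefix returns only that day's report files, which keeps each daily run small and incremental.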

Step 2 – Schedule the Function invocation using Resource Scheduler

Configure the Resource Scheduler to invoke this function daily. Follow the instructions in this blog post to set up a daily job.

Step 3 – Set Up Required Policies

Using the dynamic group created in the Prerequisites section, ensure you have the following policies in place:

  • Policy 1 – Allow the Resource Scheduler to invoke the function.
    • Resource Scheduler Policy:
      • Allow any-user to manage functions-family in tenancy where all {request.principal.type='resourceschedule', request.principal.id='ocid1.resourceschedule.oc1.iad.xxxxx'}
  • Policy 2 – Permit the function to read the reports and write to your Object Storage bucket.
    • Function Access Policies:
      • Allow dynamic-group dg-fn-copy-CUR-reports to manage objects in compartment <your-compartment>
      • define tenancy usage-report as ocid1.tenancy.oc1..aaaaaaaaned4fkpkisbwjlr56u7cj63lf3wffbilvqknstgtvzub7vhqkggq
      • endorse dynamic-group dg-fn-copy-CUR-reports to read objects in tenancy usage-report
      • Allow dynamic-group dg-fn-copy-CUR-reports to inspect compartments in tenancy
      • Allow dynamic-group dg-fn-copy-CUR-reports to inspect tenancies in tenancy

Note: For the purposes of this blog post, broad access permissions were granted. For production deployments, it is strongly recommended to use fine-grained permissions.

Once these configurations are complete, the Resource Scheduler will automatically trigger the function each day at the scheduled time, copying the FOCUS reports to your designated Object Storage bucket.
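The copied report files are gzip-compressed CSVs, so you can start exploring them with nothing more than the Python standard library. The sketch below is illustrative: the file path and the two column names (BilledCost, ServiceName) are stand-ins for the real FOCUS schema, which contains many more columns as defined by the specification:

```python
import csv
import gzip

def read_report_rows(path: str):
    """Read rows from a gzip-compressed CSV report file."""
    with gzip.open(path, mode='rt', newline='') as f:
        return list(csv.DictReader(f))

# For illustration only: write a tiny sample file with FOCUS-style columns.
# In practice you would download a real report from your bucket instead.
sample = "BilledCost,ServiceName\n1.25,Object Storage\n0.40,Functions\n"
with gzip.open('/tmp/sample_report.csv.gz', 'wt') as f:
    f.write(sample)

rows = read_report_rows('/tmp/sample_report.csv.gz')
total = sum(float(r['BilledCost']) for r in rows)
print(f"{len(rows)} rows, total billed cost {total:.2f}")  # 2 rows, total billed cost 1.65
```

The same read pattern scales to loading the daily files into a dataframe or pushing them into an analytics pipeline.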

In the coming weeks, we will publish follow-up blog posts detailing how you can use these files for custom reporting with Oracle Analytics Cloud and apply machine learning models, like anomaly detection, to identify unusual spending patterns.