OCI Security Fundamentals Dashboards - Manage Logging Analytics Storage

October 21, 2023 | 6 minute read
Amine Tarhini
Principal Security and Management Platform Specialist
Johannes Murmann
Master Principal Security Cloud Architect.
Royce Fu
Principal Database and O&M Solution Architect

The widgets on the Security Fundamentals Dashboards (SFD) are saved searches that query log data that has been ingested into the Logging Analytics service. Therefore, it’s important to manage your Logging Analytics storage in a cost-effective way. This blog post describes various options for purging and/or archiving your log data, along with examples, to help you decide what data to keep, where, and for how long.

Please note that data ingested into the Logging Analytics service is stored in storage managed by Oracle and is not directly accessible by customers. To view detailed information about your storage usage, navigate in the OCI console to "Observability & Management" -> "Logging Analytics" -> "Administration", then click "Storage" (under "Resources") in the left-hand navigation pane.
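If you prefer to check this programmatically, storage usage can also be retrieved with the OCI Python SDK. The following is a minimal sketch, assuming a default configuration in ~/.oci/config; the field names on the returned StorageUsage object (for example, active_data_size_in_bytes) should be verified against your SDK version.

  import oci

  config = oci.config.from_file()  # default profile in ~/.oci/config

  # The Logging Analytics namespace is the tenancy's Object Storage namespace
  namespace = oci.object_storage.ObjectStorageClient(config).get_namespace().data

  la_client = oci.log_analytics.LogAnalyticsClient(config)
  usage = la_client.get_storage_usage(namespace_name=namespace).data

  print("Active data (GB):  ", usage.active_data_size_in_bytes / 1024**3)
  print("Archived data (GB):", usage.archived_data_size_in_bytes / 1024**3)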

Logging Analytics costs are based on the amount of storage used. There’s no additional cost for ingesting or using the data (by searching, analyzing, or visualizing).

Ingested data is first placed in active storage, where it’s available for use. By default, the data remains in this storage indefinitely, unless you archive or purge it.

Archiving the data moves it to a lower-cost storage type, but the data is not readily usable. You need to make a “Recall” request to make archived data available for use. The recall request may take anywhere from a few minutes to a few hours to complete, depending on the amount of data recalled and other factors. Purging the data, on the other hand, removes it completely from Logging Analytics.
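Recalls can also be submitted programmatically. The sketch below uses the recall_archived_data operation of the OCI Python SDK; the RecallArchivedDataDetails fields shown (time window and data type) and their exact names are assumptions to verify against your SDK version.

  import oci
  from datetime import datetime, timezone

  config = oci.config.from_file()
  namespace = oci.object_storage.ObjectStorageClient(config).get_namespace().data
  la_client = oci.log_analytics.LogAnalyticsClient(config)

  # Recall archived log data for a given time window back into active storage
  details = oci.log_analytics.models.RecallArchivedDataDetails(
      time_data_started=datetime(2023, 6, 1, tzinfo=timezone.utc),
      time_data_ended=datetime(2023, 7, 1, tzinfo=timezone.utc),
      data_type="LOG",
  )
  la_client.recall_archived_data(namespace_name=namespace,
                                 recall_archived_data_details=details)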

Archive Log Data

If you're using only recent logs for your search and analysis tasks in Oracle Logging Analytics, enable archiving to optimize storage cost. Note that you can enable archiving only after you have the minimum specified amount of data in active storage (currently 1 TB). Also, the minimum Active Storage Duration (Days) for logs before they can be archived is 30 days.

To enable log data archiving in the OCI console (a minimal SDK sketch follows these steps):

  • Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.
  • The administration resources are listed in the left-hand navigation pane under Resources. Click Storage.
  • The Storage page is displayed.
  • Click Enable Archiving. In the Enable Archiving dialog box, enter the count of the days after which the log data in the active storage must be archived in the field Active Storage Duration (Days), and click Enable.
  • The count is calculated based on the timestamp of the logs. For example, if your logs have the timestamp November 4 and you've specified the Active Storage Duration as 30 days, then the logs will typically be moved to archive storage on December 3.
  • Click Save Changes.
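Here is a minimal sketch of the same operation with the OCI Python SDK; it assumes the archiving configuration accepts ISO 8601 durations (for example, P30D for 30 days), and the model and field names should be verified against your SDK version.

  import oci

  config = oci.config.from_file()
  namespace = oci.object_storage.ObjectStorageClient(config).get_namespace().data
  la_client = oci.log_analytics.LogAnalyticsClient(config)

  # Keep logs in active storage for 30 days, then archive them for 120 days
  archiving = oci.log_analytics.models.ArchivingConfiguration(
      active_storage_duration="P30D",     # assumption: ISO 8601 duration strings
      archival_storage_duration="P120D",
  )
  details = oci.log_analytics.models.UpdateStorageDetails(archiving_configuration=archiving)
  la_client.update_storage(namespace_name=namespace, update_storage_details=details)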

Purge Log Data

Purging enables you to bring down your storage usage and thereby reduce charges. Oracle Logging Analytics can purge log data automatically on a set schedule or manually, based on your need.

There are different ways to purge log data.

  • By purging on-demand: All log data in the specified compartment that was created prior to the selected date and time and that matches your query filter is purged.
  • By creating a purge policy: Old log data is purged automatically according to a schedule you specify and an optional query that filters the data to purge.

To automate the purge activity, create a purge policy by selecting the log data to purge (based on age of the data, and optionally a query filter), specifying the purge schedule, and enabling the policy.

Keep in mind that purging log data, whether on-demand or with a purge policy, purges data in active storage, not archive storage. Data remains in archive storage for the duration you set under “Archival Storage Duration (Days)”.
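For an on-demand purge, the equivalent SDK operation is purge_storage_data. The sketch below purges log entries older than a chosen cut-off that match a query filter; the compartment OCID, cut-off date, and query are illustrative, and the PurgeStorageDataDetails field names are assumptions to verify against your SDK version.

  import oci
  from datetime import datetime, timezone

  config = oci.config.from_file()
  namespace = oci.object_storage.ObjectStorageClient(config).get_namespace().data
  la_client = oci.log_analytics.LogAnalyticsClient(config)

  # Purge VCN Flow log entries with a timestamp before June 1, 2023
  details = oci.log_analytics.models.PurgeStorageDataDetails(
      compartment_id=config["tenancy"],      # or a child compartment OCID
      compartment_id_in_subtree=True,
      time_data_ended=datetime(2023, 6, 1, tzinfo=timezone.utc),
      purge_query_string="'Log Source' in ('OCI VCN Flow Unified Schema Logs', 'OCI VCN Flow Logs')",
      data_type="LOG",
  )
  response = la_client.purge_storage_data(namespace_name=namespace,
                                          purge_storage_data_details=details)
  print("Purge work request:", response.headers.get("opc-work-request-id"))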

Create a purge policy to purge logs based on a query or age

  • Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.

  • The administration resources are listed in the left-hand navigation pane under Resources. Click Storage.
  • The Storage page is displayed.
  • Under Purge Policies, click Create. The Create Purge Policy dialog box opens.
  • Enter a name for the new purge policy.
  • Under Purge Logs Older Than, select how old the log data must be before it is purged.
  • Under Schedule Interval, select the periodicity and time of the purge action.
  • In the Query field, enter the query to select a specific set of log data. For example, to select the logs from the VCN Flow and OCI Audit Log sources, specify the query 'Log Source' in ('OCI VCN Flow Unified Schema Logs', 'OCI VCN Flow Logs', 'OCI Audit Logs').
  • Click Create.

The purge policy is created.

To delete a policy, click the Actions icon next to the policy name, and click Delete.

To view the purge activities performed, on the Storage page, under Resources, click Activity Report. The Activity Report page is displayed, summarizing all storage activities. Use the Status and Time filters to view the purge activities of interest.
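The same information can be pulled programmatically by listing storage work requests, as in the sketch below; the list_storage_work_requests operation, its parameters, and the work-request fields printed are assumptions to verify against your SDK version.

  import oci

  config = oci.config.from_file()
  namespace = oci.object_storage.ObjectStorageClient(config).get_namespace().data
  la_client = oci.log_analytics.LogAnalyticsClient(config)

  # List storage work requests (purges, recalls, releases) in the tenancy
  work_requests = la_client.list_storage_work_requests(
      namespace_name=namespace,
      compartment_id=config["tenancy"],
  ).data.items

  for wr in work_requests:
      print(wr.operation_type, wr.status, wr.time_started)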

Recommendations for SFD Customers

When you deployed the Security Fundamentals Dashboards, you first onboarded the Logging Analytics service and configured ingestion of OCI audit and VCN flow log data into Logging Analytics. We recommend that you create a purge policy that meets your company's data retention requirements. If you believe you may need to query the log data at some point in the future, consider archiving it for a certain period of time.

Consider the following examples, which assume you have enabled only OCI audit and VCN flow logs in Logging Analytics (as is the case when you deployed SFD):

Example 1

  • You need ingested log data to be available for searching, analysis, and visualization for a period of 2 months, and do not need it at all after this period of time

In this case, create a purge policy that purges the log data after 2 months:

  • Policy Name: Purge all log data older than 2 months
  • Purge Logs Older Than: 2 months
  • Schedule Interval: Every Day
  • Query: Leave blank (this will purge all log entries with timestamp older than 2 months)

You do not need to enable archiving in this use case.

Example 2

  • You need ingested log data to be available for searching, analysis, and visualization for a period of 2 months
  • You may need to query this data occasionally for a period of 4 months after the initial required retention period

In this case, enable archiving with the following parameters:

  • Active Storage Duration (Days): 60
  • Archival Storage Duration (Days): 120

There's no need to create a purge policy for this use case.

Example 3

  • You need ingested log data to be available for searching, analysis, and visualization for a period of 2 months
  • You may need to query only the audit data, occasionally, for a period of 4 months after the initial required retention period

In this case, create a purge policy that purges the VCN Flow log data after 2 months:

  • Policy Name: Purge VCN Flow log data older than 2 months
  • Purge Logs Older Than: 60 Days
  • Schedule Interval: Every Day
  • Query: 'Log Source' in ('OCI VCN Flow Unified Schema Logs', 'OCI VCN Flow Logs')

Now enable archiving of the remaining log data (in this case, the audit log data) with the following parameters:

  • Active Storage Duration (Days): 61 (this ensures the VCN Flow logs have already been purged from active storage)
  • Archival Storage Duration (Days): 120

Policy Required to Allow Users to Purge Log Data

To purge log data, first set up the right permissions by creating the following dynamic group and IAM policies (a scripted example using the OCI SDK follows the list):

  • Create a dynamic group scoped to the compartments in which you want to allow purges (this allows the Logging Analytics scheduler to purge within Logging Analytics log groups that reside in a compartment):
    • ALL {resource.type='loganalyticsscheduledtask', resource.compartment.id='<compartment ocid>'}
    • Alternatively, to allow purges on all compartments:
    • ALL {resource.type='loganalyticsscheduledtask'}
  • Create policies in the root compartment to allow the dynamic group to perform purge operations:
    • allow dynamic-group <group_name> to read compartments in tenancy
    • allow dynamic-group <group_name> to {LOG_ANALYTICS_STORAGE_PURGE} in tenancy
    • allow dynamic-group <group_name> to {LOG_ANALYTICS_STORAGE_WORK_REQUEST_CREATE} in tenancy
    • allow dynamic-group <group_name> to {LOG_ANALYTICS_LOG_GROUP_DELETE_LOGS} in tenancy
    • allow dynamic-group <group_name> to {LOG_ANALYTICS_QUERY_VIEW} in tenancy
    • allow dynamic-group <group_name> to {LOG_ANALYTICS_QUERYJOB_WORK_REQUEST_READ} in tenancy
  • Additionally, ensure that the user has MANAGE permission on loganalytics-features-family and loganalytics-resources-family. If the user creating the on-demand or scheduled purge has Administrator privileges, then the required permissions are already available:
    • allow group <group_name> to MANAGE loganalytics-features-family in tenancy
    • allow group <group_name> to MANAGE loganalytics-resources-family in tenancy
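If you prefer to script this setup, the dynamic group and the policy can be created with the OCI Python SDK, as sketched below; the group and policy names are placeholders, and the statements mirror the ones listed above (the user-group policies for the MANAGE permissions would be created the same way).

  import oci

  config = oci.config.from_file()
  tenancy = config["tenancy"]
  identity = oci.identity.IdentityClient(config)

  # Dynamic group matching Logging Analytics scheduled tasks in all compartments
  identity.create_dynamic_group(oci.identity.models.CreateDynamicGroupDetails(
      compartment_id=tenancy,
      name="la-purge-dg",                  # placeholder name
      matching_rule="ALL {resource.type='loganalyticsscheduledtask'}",
      description="Allows Logging Analytics scheduled tasks to purge log data",
  ))

  # Root-compartment policy granting the purge-related permissions
  group = "la-purge-dg"
  statements = [
      f"allow dynamic-group {group} to read compartments in tenancy",
      f"allow dynamic-group {group} to {{LOG_ANALYTICS_STORAGE_PURGE}} in tenancy",
      f"allow dynamic-group {group} to {{LOG_ANALYTICS_STORAGE_WORK_REQUEST_CREATE}} in tenancy",
      f"allow dynamic-group {group} to {{LOG_ANALYTICS_LOG_GROUP_DELETE_LOGS}} in tenancy",
      f"allow dynamic-group {group} to {{LOG_ANALYTICS_QUERY_VIEW}} in tenancy",
      f"allow dynamic-group {group} to {{LOG_ANALYTICS_QUERYJOB_WORK_REQUEST_READ}} in tenancy",
  ]
  identity.create_policy(oci.identity.models.CreatePolicyDetails(
      compartment_id=tenancy,
      name="la-purge-policy",              # placeholder name
      statements=statements,
      description="Permissions for Logging Analytics purge operations",
  ))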

Reference

OCI Logging Analytics documentation - Manage Storage

Amine Tarhini

Principal Security and Management Platform Specialist

Amine is a member of the North America Technology Platform Specialist Team at Oracle Corporation. Amine specializes in Oracle Observability & Management platform (O&M), and Oracle Enterprise Manager (OEM).

Johannes Murmann

Master Principal Security Cloud Architect.

Royce Fu

Principal Database and O&M Solution Architect

Royce Fu is the Principal Database Solution Architect of the North America Cloud Technology and Engineering Team. Royce's area of specialty is core Database Technology and OCI O&M, especially Database Platform Engineering, Architecture, and Integration. He started his career as a Java software engineer and spent over a decade in database engineering and architecture.

