Best Practices from Oracle Development's A‑Team

Performance Management Post Go Live


A common practice for handling performance considerations during software implementations is to include a phase near the end of the development cycle for load and stress testing, and then to tune the system based on the results of those tests. However, software systems are not static. A system changes constantly through the addition of new data, through patches and regular maintenance, and through variations in usage, whether from growth or from seasonal change.

This dynamic and growing nature of the system causes performance variations that can significantly impact an operating environment. Continuous performance monitoring and analysis is critical to evolving systems.

Main Article

There are four basic steps to performance management, be it prior to go-live or afterwards. These are:

  • Measure key aspects of system performance
  • Analyze the results and compare to a baseline or desired results
  • Optimize the system based on the findings
  • Validate that your optimizations solved the performance concerns (and did not create others)

Ideally, a single goal is addressed at a time, as multiple simultaneous adjustments can have side effects that are difficult to untangle.
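The four-step cycle can be sketched as a simple loop. This is a minimal illustration, not a real monitoring API; the measure, analyze, and optimize callables are hypothetical placeholders you would supply for your own system.

```python
# Minimal sketch of the measure -> analyze -> optimize -> validate cycle.
# The measure/analyze/optimize callables are hypothetical placeholders,
# not part of any real monitoring product.

def performance_cycle(measure, analyze, optimize, baseline):
    """Run one iteration of the cycle; return the analysis findings."""
    results = measure()                    # 1. measure key aspects
    findings = analyze(results, baseline)  # 2. compare to the baseline
    if findings:
        optimize(findings)                 # 3. tune based on the findings
        # 4. validate: re-measure and confirm the concern is resolved
        if analyze(measure(), baseline):
            raise RuntimeError("optimization did not resolve the issue")
    return findings

# Toy usage: a latency reading that an "optimization" brings back under baseline
state = {"latency_ms": 700}
measure = lambda: state["latency_ms"]
analyze = lambda latency, baseline: latency > baseline * 1.2
def optimize(_findings):
    state["latency_ms"] = 480  # pretend a tuning step fixed the regression

performance_cycle(measure, analyze, optimize, baseline=500)
```

Note that the validation step re-runs the same measurement and analysis, which is what makes tackling one goal at a time tractable.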

Post Go-Live

Prior to go-live, performance measurements are frequently based on preconceived notions and desired user experience rather than on historical data or a baseline. One advantage of performance measuring post go-live is that a comparison can be done to previous measurements.

As an example, a pre go-live measurement might be:

75% of the page requests are complete from the user perspective within 500ms

However, a post go-live comparison can be event driven as in:

Raise an issue if average page requests for any single hour are 20% slower than the baseline

This assumes that the baseline measurements reflect acceptable performance and that there has been no planned system change that may account for a change in that baseline.
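An event-driven check like this is straightforward to script. The following is a minimal sketch; the sample data, the 500 ms baseline, and the 20% threshold are illustrative, not values from any particular deployment.

```python
from statistics import mean

def check_hourly_response_times(hourly_samples_ms, baseline_ms, threshold=0.20):
    """Flag any hour whose average page response time exceeds the
    baseline by more than the threshold (20% by default)."""
    issues = []
    for hour, samples in hourly_samples_ms.items():
        avg = mean(samples)
        if avg > baseline_ms * (1 + threshold):
            issues.append((hour, avg))
    return issues

# Example: baseline of 500 ms; hour 14 averages 650 ms (30% slower)
samples = {13: [480, 510, 495], 14: [640, 660, 650]}
print(check_hourly_response_times(samples, baseline_ms=500))  # [(14, 650)]
```

In practice the baseline itself would come from recorded historical measurements rather than a constant.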

Fusion Applications

Fusion Applications is a complex environment with a number of components, each of which could be measured. The simplest approach is to measure the user experience in terms of page rendering and the duration of reports and other long-running processes. However, this approach is reactive in that it cannot predict when performance issues might arise.

For instance, monitoring common database queries could reveal that additional data is causing queries to underperform. These stepped degradations in performance could be discovered by comparing the timings of a set of common queries with their baseline equivalents. In this way, optimizations could be made before the issue rises to the level of the user experience.
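The same baseline idea applies to query monitoring. Below is a hedged sketch that times a fixed set of representative workloads and reports those running more than a tolerance slower than their recorded baselines; the query names, callables, and tolerances are purely illustrative.

```python
import time

def time_query(run_query):
    """Time a single query callable in seconds using a monotonic clock."""
    start = time.perf_counter()
    run_query()
    return time.perf_counter() - start

def degraded_queries(queries, baselines_s, tolerance=0.20):
    """Return queries running more than `tolerance` slower than baseline.

    queries:     mapping of name -> zero-argument callable that runs the query
    baselines_s: mapping of name -> baseline duration in seconds
    """
    slow = {}
    for name, run in queries.items():
        elapsed = time_query(run)
        if elapsed > baselines_s[name] * (1 + tolerance):
            slow[name] = elapsed
    return slow
```

Run periodically against production-like data, a report like this surfaces the stepped degradations before users notice them.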

Key to predicting and solving performance issues is live, ongoing measurement. A number of systems can perform this type of measurement, including Oracle’s Enterprise Manager Cloud Control (EMCC), an integrated system that can monitor the various components within Fusion Applications. The tool also supports diagnosis of issues by providing a single-pane view of the system along with the ability to drill down into specific areas: for example, taking a page refresh issue, determining the sessions it is associated with, viewing the operations those sessions conducted down to the individual system component, and measuring the performance of each of those operations.



While load and stress testing are important for performance measurement, they do not fully address the dynamic post go-live environment. Continuous monitoring and analysis are required to ensure appropriate performance over time. Without this monitoring, there is a risk of being surprised by poor performance, or even system outages, as the system grows and varies over time.

