I've been tasked at work with investigating performance baselines. Fortunately, as we use 10g and have paid for the diagnostics and tuning packs, baselines (or preserved snapshot sets) can easily be created with dbms_workload_repository.create_baseline. However, there are a couple of things that would be really nice to have that are not available, namely:-
1. The ability for Enterprise Manager to generate notifications when the DB time of a database deviates from that of a specified baseline by a specified threshold. This can be worked around via the custom metrics that the Enterprise Manager framework allows you to create. I'm a tad surprised that this functionality does not exist, as DB time is the cornerstone of the Oracle performance infrastructure from 10g onwards.
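As a rough sketch of what such a custom metric might be built on, the query below pulls the DB time (in seconds) accumulated between the two most recent AWR snapshots. It assumes a single instance that has not been restarted between the snapshots; the comparison against the baseline and the threshold logic are left out.

```sql
-- DB time (seconds) between the two most recent AWR snapshots.
-- dba_hist_sys_time_model holds cumulative values in microseconds,
-- so the delta between snapshots gives the interval's DB time.
select (max(value) - min(value)) / 1000000 as db_time_secs
from   dba_hist_sys_time_model
where  stat_name = 'DB time'
and    snap_id in (select max(snap_id)     from dba_hist_snapshot
                   union all
                   select max(snap_id) - 1 from dba_hist_snapshot);
```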
2. Consider the scenario where you have an existing production database and you want to test the impact of adding new functionality and / or an increased load to that database. Obviously you don't want to do this without some prior testing; fortunately you have a test environment with hardware that is identical to production. You test the changes on test and, after some tuning, deem them satisfactory to go into production; you have also used dbms_workload_repository to create a baseline on test. At this stage it would be really nice to have the ability to export the baseline from the test environment and import it into production. This ability does exist but it is not documented. Here is the solution as proposed by Oracle support; the speed at which Oracle support came back with it suggests this is something other people have asked for before. Here it is:-
Though AWR data can be exported and imported for transporting it into other databases, this is not supported by Oracle and is to be used only for troubleshooting purposes. It is implemented with an internal package which is created with the script awrextr.sql under $ORACLE_HOME/rdbms/admin.
The steps to follow are:
To download the data into a Data Pump dump file, use the procedure AWR_EXTRACT:
begin
  dbms_swrf_internal.awr_extract(
    dmpfile => 'awr_data.dmp',
    dmpdir  => 'TMP_DIR',
    bid     => 302,
    eid     => 305);
end;
/
dmpfile => the name of the target file for the data. This is a Data Pump export file; if no filename is given, the default value awrdat.dmp is used.
dmpdir => the directory object where the dump file is written. In this case, you may have defined a directory TMP_DIR pointing at /tmp.
bid => the snapshot ID of the beginning snapshot of the period.
eid => the snapshot ID of the ending snapshot of the period.
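To choose the bid and eid values, and to create the directory object if it does not already exist, something along these lines can be used (the /tmp path is just an example):

```sql
-- Create the directory object the dump file will be written to.
create directory tmp_dir as '/tmp';

-- List the available snapshots to pick the begin (bid) and end (eid) IDs.
select snap_id, begin_interval_time, end_interval_time
from   dba_hist_snapshot
order  by snap_id;
```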
Once the AWR data has been dumped to awr_data.dmp, transfer the file to the new location and load it using another procedure in the same package. The data must first be loaded into a temporary staging schema before being moved into the SYS schema; in this case the data will initially be loaded into the AWR_TEST schema.
begin
  dbms_swrf_internal.awr_load(
    schname => 'AWR_TEST',
    dmpfile => 'awr_data',   -- the filename is given without the .dmp extension
    dmpdir  => 'TMP_DIR');
end;
/
After staging, the data is moved into the SYS schema (note that the schema name must match the staging schema used by AWR_LOAD):
begin
  dbms_swrf_internal.move_to_awr(schname => 'AWR_TEST');
end;
/
Once this is completed, the AWR data can be used for any performance analysis, or you can build a central repository of AWR data collected from multiple databases.
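For example, once the data is in place an AWR report can be generated across the imported snapshot range with dbms_workload_repository.awr_report_text. The dbid, instance number and snapshot IDs below are purely illustrative; pick real values from dba_hist_snapshot after the import.

```sql
-- Generate a text AWR report for the imported snapshot range.
-- Arguments: dbid, instance number, begin snap_id, end snap_id.
select output
from   table(dbms_workload_repository.awr_report_text(
               1234567890, 1, 302, 305));
```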