
Using awrreport, awrdump, ashplot


Overview

The three utilities, awrreport, awrdump and ashplot, can be used to create awr or ash reports in html or text format, to create a dump (extract) of the awr repository, or to create a plot of ash data. All they require is the ability to connect to the database using an account with DBA privileges, typically named SYSTEM, ADMIN or PDBADMIN. They can therefore also be used with an autonomous database.

The utilities are included in the full distribution of rwloadsim and are also available as stand-alone binaries. In both cases, you will also need an Oracle client distribution such as Instant Client; Basic Light is sufficient. It is recommended that you choose the latest version available, which can be retrieved directly via this link for Basic Light. For ashplot, you additionally need to have gnuplot installed. It is also highly recommended that you get SQL*Plus from this direct link. Note that these direct links are for the Linux x86_64 variants.
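
As a minimal sketch of the client setup on a Linux x86_64 host, assuming an Oracle Linux style package manager and a downloaded Basic Light zip file (the package manager, zip file name and version numbers below are illustrative and depend on your platform and the release you download):

sudo dnf install -y gnuplot
unzip instantclient-basiclite-linux.x64-21.12.0.0.0dbru.zip -d $HOME/oracle
export LD_LIBRARY_PATH=$HOME/oracle/instantclient_21_12:$LD_LIBRARY_PATH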

Installation

You can either install the full rwloadsim distribution as described here, or you can install the simpler stand-alone binaries. The former, full installation is suggested if you generally want to use the RWP*Load Simulator; the latter is suggested if you are only interested in the awrreport, awrdump and ashplot utilities (in addition to ociping, connping and netthroughput, which are also included).
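
A minimal sketch of a stand-alone installation, assuming a downloaded tar archive; the archive name and the layout under the target directory are illustrative and depend on the release you download:

mkdir -p $HOME/rwloadsim
tar -xzf rwloadsim-bin.tgz -C $HOME/rwloadsim
export PATH=$HOME/rwloadsim/bin:$PATH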

After installation, you have access to awrreport, awrdump and ashplot; each will show a brief help text if called with just the -h option, as shown here:

awrreport options

RWP*AWR Report Release 3.0.3.17 Development on Thu, 24 Nov 2022 10:57:16 UTC
Show awr repository contents and create awr/ash reports
-l u/p@c                      - compulsory option providing database credentials
--dbid=<N>                    - list snapshots in this dbid; default list all dbids
--instance=<N>                - limit listing to this instance
--fromtime=YYYY.MM.DDTHH24:MI - list only snapshots after this time
--totime=YYYY.MM.DDTHH24:MI   - list only snapshots before this time
--no-headings                 - do not output headings
To create an awr report, use the following options in addition to dbid and instance
--begin-snap=<N>              - first awr snapshot; default is not to generate awr
--end-snap=<N>                - last awr snapshot in report, default begin-snap+1
To create an ash report, use the following options in addition to fromtime and totime
--ash-slot-width=<D>          - slot width in seconds; default is not to generate ash
To set directory, name and format of awr or ash, use any of these options:
--output-directory=<dirname>  - create report in this directory rather than current
--output-file=<file>          - specify file name rather than automatically generated
--output-text                 - generate txt output rather than html

awrdump options

RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 10:58:09 UTC
Create an awr dump and potentially copy it to a cloud bucket
-l usr/pwd@con            - required option providing database credentials
--dbid=<N>                - dbid of the database from which awr is dumped
--begin-snap=id           - first snapshot to export
--end-snap=id             - last snapshot to export
--directory=<directory>   - output to this directory
--filename=<filename>     - output to this filename (.dmp extension will be added)
--remove-file             - if the file already exists, remove it before creating dump
--logfile=filename        - copy the logfile from data pump to the named local file
If copy to a cloud bucket is wanted, provide all these
--credential=<credential> - name of credential created using dbms_cloud.create_credential
--region=<region>         - name of the region such as us-phoenix-1
--tenancy=<tenancy>       - name of the tenancy owning the cloud bucket
--bucket=<bucket>         - the name of the bucket to where the dump is copied
--copyonly                - do not create the dump, only copy an existing dumpfile or logfile

ashplot options

RWP*ASH Plot Release 3.1.0.30 Development on Fri, 05 Jan 2024 15:01:25 UTC
Create a plot of ash data as image file or using X-Windows
-l usr/pwd@con                - required option providing database credentials
--dbid=<N>                    - query dba_hist_active_sess_history
                                instead of gv$active_session_history
--instance=<N>                - only include sessions in this instance
--include-sys                 - Include sessions from SYS, default is
                                to exclude SYS
--fromtime=YYYY.MM.DDTHH24:MI - first timestamp to include in graph
--totime=YYYY.MM.DDTHH24:MI   - last timestamp to include in graph
--autoupdate=<N>              - output updated graph every N seconds
--maxautoupdate=<N>           - stop autoupdate after this many seconds (3600)
--output-X                    - generate output as X-Windows image
--output-Windows              - generate output as MS Windows image
--output-png=<file.png>       - create image file as png
--output-svg=<file.svg>       - create image file as svg
--svgmouse                    - embed svgmouse code in svg file
--output-image=<file>         - create image files named file.svg
                                and file.png
--maxsessions=<N>             - Highest number of sessions to show
                                on y-axis
--w-pixels=<N>                - set the width in pixels
--h-pixels=<N>                - set the height in pixels
The following options are normally not used:
--boxwidth=N                  - set boxwidth
--ashplot-tempdir=<directory> - use directory for intermediate files
--ashplot-debug               - output some debug, save tempdir

Complete manual pages

You will find complete manual pages using rwlman if you are using the full installation of rwloadsim. If you are using the stand-alone binaries, the manual pages can be found at:
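
With the full installation, a manual page is displayed by giving its name to rwlman; the page name below is an assumption, so consult rwlman's own index if it differs:

rwlman awrreport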

Initial access to awr data

You will need a username, password and connect string that allow you to access your database as a DBA. In the following, this will be written username/{password}@database_tp, which could for example be a connection via a downloaded wallet to an autonomous database. To see which awr repositories are available, use awrreport with only the credentials and no other options:

awrreport -l username/{password}@database_tp

RWP*AWR Report Release 3.0.3.17 Development on Thu, 24 Nov 2022 10:35:20 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
       dbid con snap cnt instance list
----------- --- -------- -------------
  734638888  /      6506 1,2,3,4,5,6,7,8
 2286054548  P      6614 1,2,3,5,6,7,8

As the next step, list the snapshots per instance for a particular database by providing the --dbid option:

awrreport -l username/{password}@database_tp --dbid=2286054548

RWP*AWR Report Release 3.0.3.17 Development on Thu, 24 Nov 2022 10:35:20 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
i#  losnap  hisnap  snaps       losnap_end      hisnap_end
-- ------- ------- ------ ---------------- ---------------
 1   94164   97271   1750 2022.10.23T12:54 2022.11.24T10:16
 2   96906   96907      2 2022.11.20T12:46 2022.11.20T12:56
 3   96908   97271    364 2022.11.20T13:49 2022.11.24T10:16
 5   93965   94165    201 2022.10.21T12:59 2022.10.23T13:05
 6   94165   94183     19 2022.10.23T13:05 2022.10.23T18:06
 7   94172   95530   1354 2022.10.23T15:11 2022.11.06T12:06
 8   93965   96905   2924 2022.10.21T12:59 2022.11.20T11:54

The next steps depend on your goal: you can either create an awr report by finding some "interesting" snapshots, or you can create a datapump dump (extract) of all data or of a selected snapshot range.

Creating awr reports

To create an awr report, you first need to find the snapshots. If, as an example, you are interested in what happened on instance 3 on 2022.11.21, execute the following to see all available snapshots:

awrreport -l username/{password}@database_tp --dbid=2286054548 --instance=3 --fromtime=2022.11.21 --totime=2022.11.22

RWP*AWR Report Release 3.0.3.17 Development on Thu, 24 Nov 2022 10:35:20 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
i#     snap    snap_end_time   instance_start
--  ------- ---------------- ----------------
 3    96950 2022.11.21T00:02 2022.11.20T13:23
 3    96951 2022.11.21T00:13 2022.11.20T13:23
 3    96952 2022.11.21T00:16 2022.11.20T13:23
 3    96953 2022.11.21T00:23 2022.11.20T13:23
 3    96954 2022.11.21T00:26 2022.11.20T13:23
...
 3    97044 2022.11.21T23:29 2022.11.20T13:23
 3    97045 2022.11.21T23:32 2022.11.20T13:23
 3    97046 2022.11.21T23:58 2022.11.20T13:23
 3    97047 2022.11.22T00:00 2022.11.20T13:23

Finally, create the actual awr report by specifying two snapshots of interest:

awrreport -l username/{password}@database_tp --dbid=2286054548 --instance=3 --begin-snap=97029 --end-snap=97031

RWP*AWR Report Release 3.0.3.17 Development on Thu, 24 Nov 2022 10:35:20 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Generating awr to ./2286054548_3_97029_97031.html

If you want to provide your own name for the file, use the --output-file option; if you want to put it in a different directory, use the --output-directory option. If you omit --end-snap, it defaults to one more than --begin-snap.
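
As an illustration (the directory and file name below are arbitrary), the report above could instead be written to /tmp/instance3.html with:

awrreport -l username/{password}@database_tp --dbid=2286054548 --instance=3 --begin-snap=97029 --end-snap=97031 --output-directory=/tmp --output-file=instance3.html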

Extract the awr repository to a datapump dump file

Once you have identified a database (i.e. its dbid) and the range of available awr data, you may want to extract the repository for some range of snapshots to a datapump dump file. This can be done using the awrdump utility. You first need to identify the database directories where the dump can be written; these will be shown if you give only the database credentials and the --dbid option to awrdump:

awrdump -l username/{password}@database_tp --dbid=2286054548

RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 11:01:42 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
directory                      path
------------------------------ ----
ORACLE_BASE                    /u01/app/oracle
ORACLE_HOME                    /u01/app/oracle/product/19.0.0.0/dbhome_1910
DATA_PUMP_DIR                  /u03/dbfs/93BE77A8B880DA47E0535318000A4656/data/dpdump
...
SQL_TCB_DIR                    /u03/dbfs/93BE77A8B880DA47E0535318000A4656/data/tcb_dir

One of these will, as shown above, typically be named DATA_PUMP_DIR. To actually create the dump, provide --directory (which is the name shown in the directory column, not the actual path), --begin-snap and --end-snap. As an example, if you want to dump the whole day of 2022.11.21 as listed above, use this command (it will run for some time):

awrdump -l username/{password}@database_tp --dbid=2286054548 --begin-snap=96950 --end-snap=97047 --directory=DATA_PUMP_DIR

RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 11:01:42 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Dump file 2286054548_96950_97047.dmp of size 140652544 (134.1MiB) found in directory DATA_PUMP_DIR

Preparing upload to an Oracle Cloud Infrastructure bucket

If your database is an autonomous database in Oracle Cloud, you do not have access to directories on the database server, including the DATA_PUMP_DIR above. To make the dump accessible, it must therefore first be uploaded to a bucket in your region and tenancy. This requires some preparation for your tenancy, user and region, which involves the following steps:

Creating an Auth Token and adding it as a credential

In order to copy an awr dump file from a database directory to a cloud bucket, you need to have an Auth Token. You may already have one, which you can use to create the necessary cloud credential.

If you do not have an Auth Token, you can create one from your cloud console: Go to your own user under Identity->Users and select the option to create an Auth Token. Note that you can only have very few Auth Tokens and that the token identifier (a string of some 20 random characters) will only be displayed once, so you need to save it safely.

You subsequently need to add your Auth Token as a credential in the ADMIN account of your autonomous database, which is done using dbms_cloud.create_credential. A typical call to do this would be to execute something like the following using sqlplus:

begin
  dbms_cloud.create_credential
  ( 'credname'
  , username=>'first.last@example.com'
  , password=>'abc.123-defgji>XYZ('
  );
end;
/

where the password is the contents of the Auth Token and the username is your email address. If you have access to multiple autonomous databases in the same tenancy, you need to add the Auth Token by calling dbms_cloud.create_credential once for each database.
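
Assuming the block above is saved in a file, say create_cred.sql (the file name is arbitrary), it can be run against each database with SQL*Plus:

sqlplus admin/{password}@database_tp @create_cred.sql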

Creating a bucket

From your OCI console, go to Object Storage and create a bucket. This step needs to be repeated for each region you use, but the same bucket can very well be used for multiple databases in the same region.
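
If you prefer the command line and have the OCI CLI configured, a bucket can also be created with something like the following sketch; the bucket name and compartment OCID are placeholders:

oci os bucket create --name awrbucket --compartment-id ocid1.compartment.oc1..exampleocid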

Uploading the dump file to a bucket

A bucket is identified by a total of four values: The name of your cloud tenancy, the name of the region, the name of the credential created in the previous step, and the actual bucket name. All these must be provided to awrdump in order to do the actual upload. A sample call to both create a dump and to upload it is:

awrdump -l username/{password}@database_tp --dbid=2286054548 --begin-snap=96950 --end-snap=97047 \
  --directory=DATA_PUMP_DIR \
  --credential=credname --region=us-phoenix-1 \
  --tenancy=mytenancy --bucket=awrbucket
RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 11:01:42 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Dump file 2286054548_96950_97047.dmp of size 140652544 (134.1MiB) found in directory DATA_PUMP_DIR
Copying 2286054548_96950_97047.dmp to https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/mytenancy/awrbucket/2286054548_96950_97047.dmp

If you have already created the dump file and just want to upload it to a bucket, add the --copyonly option, which causes the dump creation step to be skipped.
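
As an illustration, reusing all values from the example above, the previously created dump file could be uploaded on its own with:

awrdump -l username/{password}@database_tp --dbid=2286054548 --begin-snap=96950 --end-snap=97047 \
  --directory=DATA_PUMP_DIR \
  --credential=credname --region=us-phoenix-1 \
  --tenancy=mytenancy --bucket=awrbucket --copyonly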

Finally, you can use the cloud console to locate the file and either download it or create a pre-authenticated URL for it.

Plotting the ash data

The Active Session History data is available in two places: there is historic data available in the dba_hist tables, and there is online, in-memory data available in the gv$ tables. The ashplot utility allows you to plot either over a certain date/time interval.

To plot historic data, you should initially use awrreport as described above to identify a period of interest, and then use ashplot with the --dbid option in addition to --fromtime and possibly --totime. Without the latter, all data starting at --fromtime will be plotted.
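
As a sketch reusing the dbid and dates from the awr examples above (the output name is arbitrary), a plot of the historic data for 2022.11.21 could be created with:

ashplot -l username/{password}@database_tp --dbid=2286054548 --fromtime=2022.11.21T00:00 --totime=2022.11.22T00:00 --output-image=ash20221121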

To plot the online, in-memory data, do not specify --dbid.

A sample usage is to initially just provide the credentials using the -l option, which may give you:

ashplot -l username/{password}@//hostname/service

RWP*ASH Plot Release 3.1.0.30 Development on Fri, 05 Jan 2024 16:39:57 UTC
Connected default database to:
Oracle Database 19c EE Extreme Perf Release 19.0.0.0.0 - Production
Data available between 2024.01.05T15:34 and 2024.01.05T16:39
No output specified, nothing to do

which shows the date/time range for the online, in-memory data. You will need to tell how the output should be created using one of the --output options. If you have an X11 environment with a correctly set DISPLAY, you can use --output-X; otherwise use one of the other --output options to create an svg file, a png file or both. The next call may therefore be

ashplot -l username/{password}@//hostname/service --fromtime=2024.01.05T15:33 --output-image=apsampl

which will create two image files, apsampl.svg and apsampl.png, shown here:

[image: apsampl]

Running output of live ash data

You can also use ashplot to generate a running graph using gnuplot, with output to either X11 or MS Windows. To do so, provide --autoupdate=N where N is the interval in seconds between updates. By default, this implies --output-X; if you are on Microsoft Windows, you additionally need --output-Windows. By default, --fromminutes=5 will be in effect, meaning the last 5 minutes of ash data will be shown. The running graph will terminate after 1 hour unless stopped by ctrl-c or a different runtime is set using --maxautoupdate. Several options, including --dbid, --fromtime and --totime, are unavailable when using --autoupdate.
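
As an illustration (the update interval is arbitrary), a live graph refreshing every 10 seconds on an X11 display could be started with:

ashplot -l username/{password}@//hostname/service --autoupdate=10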