
Monday, June 27, 2016

Step by step - How to refresh UAT/DEV/QA/TEST database schemas from PRODUCTION database

Schema Refresh

Schema refresh is one of the routine tasks in a DBA's life. Whenever the application team wants to test something against production data, they first run the test in a dev/uat/qa/test database.

For this they need the latest copy of the data from the production database in the dev/uat/qa/test database. We can achieve this with a schema refresh.

Follow the steps below on the source and target database sides. Let's assume we are refreshing the USER1 schema from the production database to the development database.

Source Database Side :

Step 1: Check the Schema size in the source database.

SQL> select owner, sum(bytes)/1024/1024/1024  "GB"  from dba_segments where owner='USER1';

Step 2: Check the default tablespace assigned to the USER1 schema.

SQL> select default_tablespace from dba_users where username='USER1';

Step 3: Check the size of the tablespace assigned to the USER1 schema.
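
Assuming Step 2 returned USERS as the default tablespace (the name here is just an example), a query like this shows the allocated size:

SQL> select tablespace_name, sum(bytes)/1024/1024/1024 "GB"
     from dba_data_files
     where tablespace_name='USERS'
     group by tablespace_name;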

Step 4: Take a count of the schema objects; this will be useful after the refresh to compare the source and target schema objects.

SQL> select object_type, count(*) from dba_objects where owner='USER1' group by object_type;

Step 5: Before the export, check the size of the mount point where we are going to store the export dump files; if the mount point doesn't have sufficient space, the export job will fail. So make sure we have enough space in the mount point.
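
On Linux/Unix we can check the free space with df; the mount point below is just an example:

$ df -h /u01/app/oracle/dump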

Step 6: Create a data pump directory at the database level for the export. If we do not create a directory, the job will use the default directory DATA_PUMP_DIR, so if we are using this default we need to make sure it is mapped to the correct path.
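
If a dedicated directory is preferred, it can be created and verified like this (EXPDP_DIR and the path are only examples):

SQL> create or replace directory EXPDP_DIR as '/u01/app/oracle/dump';

SQL> select directory_name, directory_path from dba_directories;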

Step 7: Now take the export of the schema (or schemas).

            $ nohup expdp \'/ as sysdba\' directory=DATA_PUMP_DIR \
                dumpfile=proddb1_8August_%U.dmp logfile=proddb1_8August.log \
                schemas=USER1 parallel=8 &


Step 8: Copy the dump files from the source server to the destination server. We typically use the scp command, but any other server-to-server copy tool works as well. It is always better to copy the dump files into the location that will be used as the impdp directory path; otherwise we will have to copy them again to the impdp directory path later.
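
For example (the hostname and paths below are just placeholders):

$ scp /u01/app/oracle/dump/proddb1_8August_*.dmp oracle@devserver:/u01/app/oracle/dump/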

We have completed all the steps on the source database side; now we move to the target database side.

Target Database Side :

Step 1: Check that we have enough space available in the data pump directory path to hold the dump files coming from the source database server.
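
We can find where DATA_PUMP_DIR points and then check the free space at the OS level; the path below is only an example (the default maps to $ORACLE_BASE/admin/<SID>/dpdump):

SQL> select directory_path from dba_directories where directory_name='DATA_PUMP_DIR';

$ df -h /u01/app/oracle/admin/devdb1/dpdump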

Step 2: Check that we have enough space available in the tablespace of the user that is going to be refreshed. For this, it is always better to check the tablespace size of that particular user in the prod database and add that much space before the refresh starts.
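
If more space is needed, a datafile can be added, for example (the tablespace name, file path, and sizes are placeholders):

SQL> alter tablespace USERS add datafile '/u01/oradata/devdb1/users02.dbf'
     size 10g autoextend on next 1g maxsize 30g;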

Step 3: It is always recommended to take an export backup of the schema on the target side as well before we refresh it.

               $ nohup expdp \'/ as sysdba\' directory=DATA_PUMP_DIR \
                   dumpfile=devdb1_8August_%U.dmp logfile=devdb1_8August.log \
                   schemas=USER1 parallel=8 &

Step 4: Drop the schema objects on the target side only, i.e. on the development database. Double-check before dropping anything in the database. Make sure we drop only the schema objects, not the complete schema. Some people drop the complete schema, but that is not recommended, since the user along with its grants and privileges would then have to be recreated.
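
One safe approach is to generate the drop statements from the data dictionary, spool and review them, and only then run them. This is just a sketch; indexes, triggers, and package bodies are excluded because they are dropped along with their parent objects:

SQL> select 'drop '||object_type||' "'||owner||'"."'||object_name||'"'||
            decode(object_type,'TABLE',' cascade constraints purge;',';') "DDL"
     from dba_objects
     where owner='USER1'
     and object_type in ('TABLE','VIEW','SEQUENCE','SYNONYM','PROCEDURE','FUNCTION','PACKAGE');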


Step 5: Import the data into the target schema (schema refresh).

               $ nohup impdp \'/ as sysdba\' directory=DATA_PUMP_DIR \
                   dumpfile=proddb1_8August_%U.dmp logfile=imp_8August.log \
                   schemas=USER1 parallel=8 &
       
Note: While the import is running, it is always better to monitor the alert log file of the database and also the import log file. Sometimes an error is reported in the alert log that is not captured in the import log, so monitor both logs.
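
For example, both logs can be tailed in separate sessions (the paths are examples; the alert log location assumes the 11g+ diag layout):

$ tail -f /u01/app/oracle/dump/imp_8August.log

$ tail -f $ORACLE_BASE/diag/rdbms/devdb1/devdb1/trace/alert_devdb1.log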

Step 6: Gather statistics on the schema we just refreshed.
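
For example, using DBMS_STATS (degree=8 is just an example; cascade=>true gathers index statistics as well):

SQL> exec dbms_stats.gather_schema_stats(ownname => 'USER1', degree => 8, cascade => true);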

Step 7: Count the objects and match them against the production database. Make sure everything was imported properly before confirming completion to the application team.
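
Running the same query from Step 4 of the source side against the target makes the comparison easy, and checking for invalid objects is also worthwhile:

SQL> select object_type, count(*) from dba_objects where owner='USER1' group by object_type;

SQL> select count(*) from dba_objects where owner='USER1' and status='INVALID';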

Suggestions are most welcome :)
