Migrate Databases to Oracle Cloud using Data Pump

Migrating Oracle databases to the cloud is not much different from migrating databases to any other host. An Oracle Database Cloud instance is like any other host running in your data center, with a few exceptions: it is elastic and scalable in nature. Still, I thought it would be a good idea to write a few blogs about different database migration methods and provide some clarity around this subject.

This blog will focus on the Oracle Data Pump utility. This method is useful when your data set is small (perhaps a few GB) and there are no dependencies between schemas. I would say this method is well suited to schema-level migrations, and it pretty much lets you migrate any database version to the latest Oracle Database version. Follow the steps below to migrate databases to Oracle Cloud using the Data Pump utility.

  1. Check the data size and dependencies on the source database using the following query, and make a note of the tablespace names and their sizes.
select substr(OWNER,1,10), substr(TABLESPACE_NAME,1,10), sum(bytes)/1024/1024/1000 "SIZE_IN_GB"
from dba_segments
where owner not in ('SYS','SYSTEM','APEX_04020','WMSYS','XDB','MDSYS','AUDSYS','DBSNMP','SCOTT')
group by owner, TABLESPACE_NAME
order by 3 desc, 1;

SUBSTR(OWN SUBSTR(TAB SIZE_IN_GB
---------- ---------- ----------
SALES      USERS       .6370625
META7      USERS       .611875
APEX_04020 SYSAUX      .193875
ORDDATA    SYSAUX      .0160625
DVSYS      SYSAUX      .0044375
CTXSYS     SYSAUX      .0038125
GSMADMIN_I SYSAUX      .001375
OUTLN      SYSTEM      .0005625
ORDSYS     SYSAUX      .0004375
OJVMSYS    USERS       .000375
LBACSYS    SYSTEM      .0003125
  2. Export all the schemas using the Oracle expdp utility. Note that you also need a database directory object; you can use an existing directory or create a new one as I did. For this blog I will only be migrating one schema (SALES). It is best to use the compression option, since you will be uploading this dump file to Oracle Cloud.
CREATE OR REPLACE DIRECTORY migration AS '/oradata';
GRANT READ, WRITE ON DIRECTORY migration TO sales;

expdp sales/sales schemas=sales directory=migration dumpfile=sales.dmp logfile=sales_expdp.log compression=all
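When the command line gets long, Data Pump also accepts a parameter file. A sketch of an equivalent parfile for the export above (the file name is my own choice, not from the original run):

```text
# sales_expdp.par -- illustrative Data Pump export parameter file
schemas=sales
directory=migration
dumpfile=sales.dmp
logfile=sales_expdp.log
compression=all
```

It would then be invoked as `expdp sales/sales parfile=sales_expdp.par`, which keeps the options under version control and out of the shell history.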
  3. Now, on the cloud side, create the target database and size it based on your requirements, then make a note of the data file location using show parameter db_create_file_dest.
[oracle@META7DB META7DB]$ df -h
Filesystem            Size  Used Avail Use% Mounted on
/dev/xvdb3             25G   14G   11G  58% /
tmpfs                 7.3G     0  7.3G   0% /dev/shm
/dev/xvdb1            477M  148M  300M  34% /boot
/dev/xvde1             59G   11G   46G  19% /u01
/dev/mapper/dataVolGroup-lvol0
25G  4.1G   20G  18% /u02
/dev/mapper/fraVolGroup-lvol0
6.8G  3.1G  3.4G  48% /u03
/dev/mapper/redoVolGroup-lvol0
26G  3.4G   21G  14% /u04

[oracle@META7DB META7DB]$ sqlplus / as sysdba
SQL*Plus: Release 12.1.0.2.0 Production on Tue Feb 7 16:37:32 2017
Copyright (c) 1982, 2014, Oracle.  All rights reserved.

Connected to:
Oracle Database 12c EE High Perf Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Advanced Analytics
and Real Application Testing options

SQL> show parameter db_create_file_dest

NAME                                 TYPE        VALUE
------------------------------------ ----------- ----------------------
db_create_file_dest                  string      /u02/app/oracle/oradata
  4. Create the required user, tablespaces, and database directory object based on your requirements. Note that if you are using a container database, set your container before creating the user or tablespace.
ALTER SESSION SET CONTAINER = PDB1;

create tablespace sales datafile size 1G;
create user sales identified by sales default tablespace sales temporary tablespace temp;
grant create session to sales;
ALTER USER sales QUOTA UNLIMITED ON sales;

CREATE OR REPLACE DIRECTORY migration AS '/u04/app/oracle/migration';
GRANT READ, WRITE ON DIRECTORY migration TO sales;
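One detail worth calling out: CREATE DIRECTORY only records the path inside the database; the OS directory itself must exist and be writable by the oracle OS user before the dump file arrives. A minimal sketch (using a demo path; on the real instance this would be /u04/app/oracle/migration, created as the oracle user):

```shell
# Create the OS directory backing the Oracle DIRECTORY object.
# DEMO_DIR is a stand-in so the sketch runs anywhere; substitute the
# real path (/u04/app/oracle/migration) on the cloud instance.
dir=${DEMO_DIR:-$(mktemp -d)/app/oracle/migration}
mkdir -p "$dir"          # -p creates intermediate directories as needed
ls -ld "$dir"            # confirm it exists and check ownership/permissions
```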
  5. Copy the dump file to the Oracle Cloud instance. This can be achieved in many different ways; for example, you can use WinSCP or scp to copy the export dump file to the Oracle Cloud instance.
[oracle@oraclenode1 dmp]$ scp -i /backup/lunix_rsa_cloudkey  sales.dmp  oracle@129.144.9.60:/u04/app/oracle/migration
sales.dmp     100%  358MB   1.5MB/s   04:06
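A slow transfer like the one above can be silently corrupted or truncated, so it is worth comparing checksums on both ends before importing. A minimal sketch (the file here is a stand-in for the real dump file, e.g. /oradata/sales.dmp on the source host):

```shell
# Stand-in file for demonstration; in practice run sha256sum against
# the actual dump file on each host and compare the two values.
dumpfile=$(mktemp)
echo "demo dump contents" > "$dumpfile"

# Checksum on the source host before the copy...
src_sum=$(sha256sum "$dumpfile" | awk '{print $1}')

# ...and the same command on the cloud instance after scp completes.
dst_sum=$(sha256sum "$dumpfile" | awk '{print $1}')

[ "$src_sum" = "$dst_sum" ] && echo "checksums match"
```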
  6. Now run the Oracle import utility (impdp) to load the data into your Oracle Cloud instance.
[oracle@testdb1 migration]$ impdp sales/sales@pdb1 directory=migration dumpfile=sales.dmp logfile=sales_expdp.log

Import: Release 12.1.0.2.0 - Production on Sat Feb 18 20:51:29 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 12c EE High Perf Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Master table "SALES"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
import done in AL32UTF8 character set and AL16UTF16 NCHAR character set
export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
WARNING: possible data loss in character set conversions
Starting "SALES"."SYS_IMPORT_FULL_01": sales/********@pdb1 directory=migration dumpfile=sales.dmp logfile=sales_expdp.log
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"SALES" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "SALES"."UTILIZATION" 12.19 MB 7679886 rows
. . imported "SALES"."UTILIZATION_NOPART" 9.578 MB 6401656 rows
. . imported "SALES"."UTILIZATION_FACT_STAR" 3.925 MB 2559962 rows
. . imported "SALES"."DOBJECTS" 1.718 MB 124714 rows
. . imported "SALES"."DDBA_TAB_PRIVS" 519.8 KB 51649 rows
. . imported "SALES"."DINDEX" 286.1 KB 9560 rows
. . imported "SALES"."REV_MARGIN_F" 132.8 KB 3766 rows
. . imported "SALES"."CALENDER_DIM" 29.25 KB 731 rows
. . imported "SALES"."CALENDER_DIM_BK" 39.48 KB 1096 rows
. . imported "SALES"."CALENDER" 30.87 KB 731 rows
. . imported "SALES"."HW_SALE_F" 38.63 KB 890 rows
. . imported "SALES"."TIME_CALENDAR_DIM" 13.58 KB 365 rows
. . imported "SALES"."CLIENT_DIM" 5.062 KB 10 rows
. . imported "SALES"."CUSTOMER" 5.015 KB 2 rows
. . imported "SALES"."DUSERS" 7.640 KB 50 rows
. . imported "SALES"."EMPLOYEES" 6.421 KB 21 rows
. . imported "SALES"."ORACLE_USERS" 7.625 KB 50 rows
. . imported "SALES"."PRODUCT_DIM" 5.156 KB 6 rows
. . imported "SALES"."PROJECT_DIM" 5.242 KB 12 rows
. . imported "SALES"."REV_MAR_SUM_F" 5.390 KB 2 rows
. . imported "SALES"."SKILLS" 5.312 KB 20 rows
. . imported "SALES"."UTILIZATION_FACT" 5.609 KB 63 rows
. . imported "SALES"."ZIP_DESC" 4.796 KB 2 rows
. . imported "SALES"."SALES_FACT" 0 KB 0 rows
. . imported "SALES"."SKILLS_F" 0 KB 0 rows
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/BITMAP_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/BITMAP_INDEX/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "SALES"."SYS_IMPORT_FULL_01" completed with  error(s) at Sat Feb 18 20:52:24 2017 elapsed 0 00:00:48
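The ORA-31684 warning above is expected here, since the SALES user was pre-created in step 4, and is why the job reports "completed with error(s)". When the target schema or tablespace names differ from the source, impdp can also remap them at import time; a sketch of the relevant parameters (the target names below are hypothetical):

```text
# illustrative impdp parameters -- adjust names to your environment
directory=migration
dumpfile=sales.dmp
logfile=sales_impdp.log
remap_schema=sales:sales_cloud        # hypothetical target schema
remap_tablespace=users:sales          # hypothetical target tablespace
table_exists_action=skip              # skip, append, truncate, or replace
```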

  7. Count all the database objects and compile any invalid objects in your cloud database to complete your migration.
SQL> SELECT owner,
       object_type,
       object_name,
       status
     FROM dba_objects
     WHERE status = 'INVALID'
     ORDER BY owner, object_type, object_name;

no rows selected
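If the query does return invalid objects, they can usually be recompiled in bulk rather than one by one. One common approach, run as SYSDBA (UTL_RECOMP is a standard Oracle-supplied package; this is a sketch, not part of the original run):

```sql
-- Recompile all invalid objects serially (NULL = all schemas);
-- alternatively run $ORACLE_HOME/rdbms/admin/utlrp.sql.
EXEC UTL_RECOMP.RECOMP_SERIAL(NULL);

-- Re-check: the count should drop to zero, or leave only objects
-- with genuinely missing dependencies to investigate.
SELECT count(*) FROM dba_objects WHERE status = 'INVALID';
```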