Data Pump Full Transportable
1. On the on-premises database host, prepare the database for the Data Pump full transportable export by placing the user-defined tablespaces in READ ONLY mode.
2. On the on-premises database host, invoke Data Pump Export to perform the full transportable export.
3. Use a secure copy utility to transfer the Data Pump Export dump file and the datafiles for all of the user-defined tablespaces to the Database Classic Cloud Service compute node.
4. Set the on-premises tablespaces back to READ WRITE.
5.On the Database Classic Cloud Service compute node, prepare the database for the tablespace import.
6.On the Database Classic Cloud Service compute node, invoke Data Pump Import and connect to the database.
7.After verifying that the data has been imported successfully, you can delete the dump file.
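The overview above refers to the user-defined tablespaces and their datafiles several times. One way to list them on the source database is a query like the following sketch; it assumes the administrative tablespaces are named SYSTEM and SYSAUX (the usual defaults), since those stay with the target database and are not transported:

```sql
-- List user-defined permanent tablespaces and their datafiles on the source.
-- SYSTEM and SYSAUX (and undo/temp) are not part of the transportable set.
SELECT t.tablespace_name, f.file_name
FROM   dba_tablespaces t
JOIN   dba_data_files  f ON f.tablespace_name = t.tablespace_name
WHERE  t.tablespace_name NOT IN ('SYSTEM', 'SYSAUX')
AND    t.contents = 'PERMANENT'
ORDER  BY t.tablespace_name, f.file_name;
```

Keep this output handy: it is the set of tablespaces to place in READ ONLY mode and the list of datafiles to copy to the compute node.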
Steps:
1. CREATE DIRECTORY Exp_for_cloud AS '/u01/Exp_for_cloud';
2. SELECT tablespace_name, file_name FROM dba_data_files; --Save the output
3. SELECT DISTINCT 'ALTER TABLESPACE '||tablespace_name||' READ ONLY;' FROM dba_data_files; -- execute the output to place the tablespaces in READ ONLY mode
4. expdp system FULL=y TRANSPORTABLE=always VERSION=12 DUMPFILE=expdat.dmp DIRECTORY=Exp_for_cloud
5. scp -i Your_private_key_file /u01/Exp_for_cloud/expdat.dmp oracle@IP_address_of_your_machine:/u01/Imp_from_prim
6. scp -i Your_private_key_file Your_datafile_location/*.dbf oracle@IP_address_of_your_machine:/u01/Imp_from_prim -- copy all of the datafiles from the step 2 output to the cloud compute node
7. SELECT DISTINCT 'ALTER TABLESPACE '||tablespace_name||' READ WRITE;' FROM dba_data_files; -- execute the output to set the tablespaces back to READ WRITE mode
8. CREATE DIRECTORY Imp_from_prim AS '/u01/Imp_from_prim'; -- on the Database Classic Cloud Service compute node
9. impdp system FULL=Y TRANSPORTABLE=always DIRECTORY=Imp_from_prim DUMPFILE=expdat.dmp \
   TRANSPORT_DATAFILES='Your_datafile_location/example01.dbf', \
   'Your_datafile_location/fsdata01.dbf',\
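Steps 3 and 7 build the ALTER TABLESPACE statements by string concatenation inside the database; the same statements can also be generated outside it. A minimal shell sketch, assuming a hypothetical tablespaces.txt holding one user-defined tablespace name per line (the sample names below are illustrative):

```shell
# Hypothetical input: one user-defined tablespace name per line.
printf '%s\n' USERS EXAMPLE FSDATA > tablespaces.txt

# Generate the READ ONLY statements to run before the export (step 3)...
while read -r ts; do
  echo "ALTER TABLESPACE ${ts} READ ONLY;"
done < tablespaces.txt > make_read_only.sql

# ...and the READ WRITE statements to run afterwards (step 7).
while read -r ts; do
  echo "ALTER TABLESPACE ${ts} READ WRITE;"
done < tablespaces.txt > make_read_write.sql

cat make_read_only.sql
# ALTER TABLESPACE USERS READ ONLY;
# ALTER TABLESPACE EXAMPLE READ ONLY;
# ALTER TABLESPACE FSDATA READ ONLY;
```

Run the generated scripts in SQL*Plus before and after the export; keeping both files around makes it easy to confirm every tablespace that went read-only was set back to read-write.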