Oracle DataPump Utility
Version 20c

General Information
Data Pump comes in two forms: the command-line executables expdp and impdp in the $ORACLE_HOME/bin directory, covered on this page, and the built-in package DBMS_DATAPUMP, linked at page bottom.
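For orientation, a minimal DBMS_DATAPUMP sketch of a schema-mode export follows, assuming the UWCLASS schema and DATA_PUMP_DIR directory object used in the demos below; the expdp demos on this page are command-line equivalents of calls like these.

DECLARE
 h      NUMBER;        -- Data Pump job handle
 jstate VARCHAR2(30);  -- final job state
BEGIN
 -- open a schema-mode export job
 h := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'UW_API_DEMO');
 -- name the dump and log files in an existing directory object
 dbms_datapump.add_file(h, 'api_demo.dmp', 'DATA_PUMP_DIR');
 dbms_datapump.add_file(h, 'api_demo.log', 'DATA_PUMP_DIR',
   filetype => dbms_datapump.ku$_file_type_log_file);
 -- restrict the job to a single schema
 dbms_datapump.metadata_filter(h, 'SCHEMA_EXPR', 'IN (''UWCLASS'')');
 dbms_datapump.start_job(h);
 dbms_datapump.wait_for_job(h, jstate);
 dbms_output.put_line('Job state: ' || jstate);
END;
/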
Data Dictionary Objects
AMGT$DATAPUMP DBMS_DATAPUMP_INT SQL$TEXT_DATAPUMP_TBL
CDB_DATAPUMP_JOBS DBMS_DATAPUMP_UTL SQL$_DATAPUMP
CDB_DATAPUMP_SESSIONS DBMS_STREAMS_DATAPUMP SQL$_DATAPUMP_TBL
DATAPUMP_DDL_TRANSFORM_PARAMS DBMS_STREAMS_DATAPUMP_UTIL SQLOBJ$AUXDATA_DATAPUMP
DATAPUMP_DIR_OBJS GV$DATAPUMP_JOB SQLOBJ$AUXDATA_DATAPUMP_TBL
DATAPUMP_OBJECT_CONNECT GV$DATAPUMP_SESSION SQLOBJ$DATA_DATAPUMP
DATAPUMP_PATHMAP GV_$DATAPUMP_JOB SQLOBJ$DATA_DATAPUMP_TBL
DATAPUMP_PATHS GV_$DATAPUMP_SESSION SQLOBJ$PLAN_DATAPUMP
DATAPUMP_PATHS_VERSION KU$_DATAPUMP_MASTER_10_1 SQLOBJ$PLAN_DATAPUMP_TBL
DATAPUMP_REMAP_OBJECTS KU$_DATAPUMP_MASTER_11_1 SQLOBJ$_DATAPUMP
DATAPUMP_TABLE_DATA KU$_DATAPUMP_MASTER_11_1_0_7 SQLOBJ$_DATAPUMP_TBL
DATA_PUMP_DIR KU$_DATAPUMP_MASTER_11_2 TSDP$DATAPUMP
DATA_PUMP_XPL_TABLE$ KU$_DATAPUMP_MASTER_12_0 USER_DATAPUMP_JOBS
DBA_DATAPUMP_JOBS KU$_DATAPUMP_MASTER_12_2 V$DATAPUMP_JOB
DBA_DATAPUMP_SESSIONS OLS$DATAPUMP V$DATAPUMP_SESSION
DBA_DV_DATAPUMP_AUTH ORACLE_DATAPUMP V_$DATAPUMP_JOB
DBMS_DATAPUMP PSTDY_DATAPUMP_SUPPORT V_$DATAPUMP_SESSION
DBMS_DATAPUMP_DV_LIB SQL$TEXT_DATAPUMP  
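Of these objects the job views are the ones most often queried; for example, to list Data Pump jobs and their current state:

SELECT owner_name, job_name, operation, job_mode, state, attached_sessions
FROM dba_datapump_jobs
ORDER BY 1, 2;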
Data Move Methods
Method Description
Conventional Path When there are conflicting table attributes, Oracle Data Pump uses conventional path to move data.
Data File Copy The fastest method of moving data is to copy the database data files to the target database without interpreting or altering the data.
Direct Path After data file copying, direct path is the fastest method of moving data. In this method, the SQL layer of the database is bypassed and rows are moved to and from the dump file with only minimal interpretation.
External Table If a data file copying method is not selected, and the data cannot be moved using direct path, this may be a viable option.
Network Link When the Import NETWORK_LINK parameter is used to specify a network link for an import operation, the direct path method is used by default. Review supported database link types.
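To verify which method Data Pump actually chose for a given job, run it with METRICS=yes; the log file then records the access method used for each table. A sketch using the demo schema and directory object from the sections below:

expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=method_check.dmp LOGFILE=method_check.log METRICS=yes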
File Move Methods
Method Description
Transportable Tablespaces The TRANSPORT_TABLESPACES parameter is used to specify a transportable tablespace export. Only metadata for the specified tablespaces is exported.
Transportable = ALWAYS The TRANSPORTABLE=ALWAYS parameter is supplied on a table mode export (specified with the TABLES parameter) or a full mode export (specified with the FULL parameter) or a full mode network import (specified with the FULL and NETWORK_LINK parameters).
Export Modes
Mode Description
Full Use the FULL parameter: The entire database is unloaded. The EXP_FULL_DATABASE role is required.
Schema Use the SCHEMAS parameter: The default export mode. If you have the EXP_FULL_DATABASE role, then you can specify a list of schemas and optionally include the schema definitions themselves, as well as system privilege grants to those schemas. If you do not have the EXP_FULL_DATABASE role, you can export only your own schema.
Table Use the TABLES parameter. A specified set of tables, partitions, and their dependent objects are unloaded. You must have the EXP_FULL_DATABASE role to specify tables that are not in your own schema. All specified tables must reside in a single schema. Note that type definitions for columns are not exported in table mode. It is expected that the type definitions already exist in the target instance at import time.
Tablespace Use the TABLESPACES parameter. Only tables contained in a specified set of tablespaces are unloaded. If a table is unloaded, its data, metadata, and dependent objects are also unloaded. In tablespace mode, if any part of a table resides in the specified set, then that table and all of its dependent objects are exported. Privileged users get all tables. Nonprivileged users get only the tables in their own schemas.
Transportable Tablespace Use the TRANSPORT_TABLESPACES parameter. In transportable tablespace mode, only the metadata for tables, and dependent objects, within a specified set of tablespaces are unloaded. EXP_FULL_DATABASE role required. The degree of parallelism must be 1.
Legacy Data Pump enters legacy mode once it determines a parameter unique to original Export is present, either on the command line or in a script. As Data Pump processes the parameter, the analogous Data Pump Export parameter is displayed.
Import Modes
Mode Description
Full A full import is specified using the FULL parameter. In full import mode, the entire content of the source (dump file set or another database) is loaded into the target database. This is the default for file-based imports. You must have the DATAPUMP_IMP_FULL_DATABASE role if the source is another database.

Cross-schema references are not imported for non-privileged users. For example, a trigger defined on a table within the importing user's schema, but residing in another user's schema, is not imported.
Schema Schema import is specified using the SCHEMAS parameter. In a schema import, only objects owned by the specified schemas are loaded. The source can be a full, table, tablespace, or schema-mode export dump file set or another database. If you have the DATAPUMP_IMP_FULL_DATABASE role, then a list of schemas can be specified and the schemas themselves (including system privilege grants) are created in the database in addition to the objects contained within those schemas.

Cross-schema references are not imported for non-privileged users unless the other schema is remapped to the current schema. For example, a trigger defined on a table within the importing user's schema, but residing in another user's schema, is not imported.
Table Table-mode import is specified using the TABLES parameter. In table mode, only the specified set of tables, partitions, and their dependent objects are loaded. The source can be a full, schema, tablespace, or table-mode export dump file set or another database. You must have the DATAPUMP_IMP_FULL_DATABASE role to specify tables that are not in your own schema.

Use the transportable option during a table-mode import by specifying the
TRANSPORTABLE=ALWAYS parameter with the TABLES parameter. This requires use of the NETWORK_LINK parameter, as well.
Tablespace Tablespace-mode import is specified using the TABLESPACES parameter. In tablespace mode, all objects contained within the specified set of tablespaces are loaded, along with the dependent objects. The source can be a full, schema, tablespace, or table-mode export dump file set or another database. For unprivileged users, objects not remapped to the current schema will not be processed.
Transportable Tablespace Transportable tablespace import is specified using the TRANSPORT_TABLESPACES parameter. In transportable tablespace mode, the metadata from a transportable tablespace export dump file set or from another database is loaded. The datafiles, specified by the TRANSPORT_DATAFILES parameter, must be made available from the source system for use in the target database, typically by copying them over to the target system. Encrypted columns are not supported in transportable tablespace mode. This mode requires the DATAPUMP_IMP_FULL_DATABASE role. The degree of parallelism must be 1.
Legacy Data Pump enters legacy mode once it determines a parameter unique to original Import is present, either on the command line or in a script. As Data Pump processes the parameter, the analogous Data Pump Import parameter is displayed.
 
Exporting Schemas
Demo Setup conn / as sysdba

desc dba_directories

col owner format a10
col directory_path format a70

SELECT * FROM dba_directories;

CREATE OR REPLACE DIRECTORY data_pump_dir AS 'c:\temp';
-- default is $ORACLE_BASE/admin/<database_name>/dpdump

GRANT exp_full_database TO uwclass;

Note: Verify that the environment variables ORACLE_HOME and ORACLE_SID are set properly in your shell. If they are not set then you must set them at the command line to proceed.
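For example, in a Linux bash shell (the paths shown are placeholders; substitute your installation's values):

export ORACLE_HOME=/u01/app/oracle/product/20.0.0/dbhome_1
export ORACLE_SID=orabase
export PATH=$ORACLE_HOME/bin:$PATH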
   
 
Basic Export Types
Note: After each export, SELECT table_name FROM user_tables in the schema that ran the export; in the following examples, uwclass or abc. You will find tables with names such as SYS_EXPORT_FULL_01 and SYS_EXPORT_SCHEMA_01. Examine their contents.
Full Export FULL=<NO | YES>
expdp uwclass/uwclass@pdbdev DIRECTORY=data_pump_dir DUMPFILE=demo01.dmp FULL=yes
Schema Export SCHEMAS=<schema_name_list>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo02.dmp SCHEMAS=uwclass,scott
Table Export TABLES=<[schema_name.]table_name[:partition_name] [, ...]>
expdp uwclass/uwclass@pdbdev DUMPFILE=data_pump_dir:demo03a.dmp TABLES=servers,serv_inst

expdp sh/sh DUMPFILE=data_pump_dir:demo03b.dmp TABLES=sales:sales_q3_2003
Tablespace Export TABLESPACES=<comma_delimited_tablespace_list>
expdp uwclass DUMPFILE=data_pump_dir:demo04.dmp TABLESPACES=uwclass,users TRANSPORT_FULL_CHECK=yes
Transportable Tablespace Export TRANSPORT_TABLESPACES=<tablespace_name [, ...]>

The default tablespace of the user performing the export must not be set to one of the tablespaces being transported.
expdp uwclass/uwclass@pdbdev DIRECTORY=data_pump_dir DUMPFILE=demo05.dmp
TRANSPORT_TABLESPACES=users,example TRANSPORT_FULL_CHECK=yes LOGFILE=demo5exp.log

conn / as sysdba

ALTER TABLESPACE users READ ONLY;
ALTER TABLESPACE example READ ONLY;

expdp uwclass/uwclass@pdbdev DIRECTORY=data_pump_dir DUMPFILE=demo05.dmp
TRANSPORT_TABLESPACES=users,example TRANSPORT_FULL_CHECK=yes LOGFILE=demo5exp.log

ALTER TABLESPACE users READ WRITE;
ALTER TABLESPACE example READ WRITE;
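While the tablespaces are still read only, between the second export and the ALTER ... READ WRITE statements above, copy their data files to the target system. One server-side option is DBMS_FILE_TRANSFER; a sketch, assuming a directory object SRC_DIR on the source, DST_DIR on the target, and a database link named target_db (all three are assumptions):

BEGIN
  -- push one data file across the database link to the target host
  dbms_file_transfer.put_file(
    source_directory_object      => 'SRC_DIR',
    source_file_name             => 'users01.dbf',
    destination_directory_object => 'DST_DIR',
    destination_file_name        => 'users01.dbf',
    destination_database         => 'target_db');
END;
/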
Legacy Mode Mappings
Legacy Export Cmd How Handled
BUFFER This parameter is ignored because Data Pump does not make use of conventional mode.
COMPRESS In original Export, the COMPRESS parameter affected how the initial extent was managed. Setting COMPRESS=n caused original Export to use current storage parameters for the initial and next extent. The Data Pump COMPRESSION parameter is used to specify how data is compressed in the dump file, and is not related to the original Export COMPRESS parameter.
CONSISTENT If original Export used CONSISTENT=y, then Data Pump Export determines the current time and uses FLASHBACK_TIME. A consistent export is always recommended.
CONSTRAINTS If original Export used CONSTRAINTS=n, then Data Pump Export uses EXCLUDE=CONSTRAINTS.
The default behavior is to include constraints as part of the export.
DIRECT This parameter is ignored. Data Pump automatically chooses the best export method.
FEEDBACK The Data Pump STATUS=30 command is used. Note that this is not a direct mapping because the STATUS command returns the status of the export job, as well as the rows being processed. In original Export, feedback was given after a certain number of rows, as specified with the FEEDBACK command. In Data Pump, the status is given every so many seconds, as specified by STATUS.
FILE Data Pump attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access.
GRANTS If original Export used GRANTS=n, then Data Pump uses EXCLUDE=GRANT. If original Export used GRANTS=y, then the parameter is ignored and does not need to be remapped because that is the Data Pump default behavior.
INDEXES If original Export used INDEXES=n, then Data Pump uses the EXCLUDE=INDEX parameter. If original Export used INDEXES=y, then Data Pump uses the INCLUDE=INDEX parameter.
LOG Data Pump attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access.
OBJECT_CONSISTENT This parameter is ignored because Data Pump processing ensures that each object is in a consistent state when being exported.
OWNER The Data Pump SCHEMAS parameter is used.
RECORDLENGTH This parameter is ignored because Data Pump automatically takes care of buffer sizing.
RESUMABLE This parameter is ignored because Data Pump automatically provides this functionality.
RESUMABLE_NAME This parameter is ignored because Data Pump automatically provides this functionality.
RESUMABLE_TIMEOUT This parameter is ignored because Data Pump automatically provides this functionality.
ROWS If original Export used ROWS=y, then Data Pump Export uses the CONTENT=ALL parameter.

If original Export used ROWS=n, then Data Pump Export uses the CONTENT=METADATA_ONLY parameter.
STATISTICS This parameter is ignored because statistics are always saved for tables as part of a Data Pump operation.
TABLESPACES If original Export also specified TRANSPORT_TABLESPACE=n, then Data Pump ignores the TABLESPACES parameter. If original Export also specified TRANSPORT_TABLESPACE=y, then Data Pump takes the names listed for the TABLESPACES parameter and uses them.
TRANSPORT_TABLESPACE If original Export used TRANSPORT_TABLESPACE=n (the default), then Data Pump uses the TABLESPACES parameter. If original Export used TRANSPORT_TABLESPACE=y, then Data Pump uses the TRANSPORT_TABLESPACES parameter and only the metadata is exported.
TRIGGERS If original Export used TRIGGERS=n, then Data Pump Export uses the EXCLUDE=TRIGGER parameter.
TTS_FULL_CHECK If original Export used TTS_FULL_CHECK=y, then Data Pump uses the TRANSPORT_FULL_CHECK parameter. If original Export used TTS_FULL_CHECK=n, then the parameter is ignored.
VOLSIZE When the VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Export dump file format does not support tape devices. Therefore, this operation terminates with an error.
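For example, this hypothetical invocation uses the original Export parameters FILE, OWNER, and GRANTS, so expdp enters legacy mode and maps them to DUMPFILE, SCHEMAS, and EXCLUDE=GRANT as described above:

expdp uwclass/uwclass@pdbdev FILE=legacy.dmp OWNER=uwclass GRANTS=n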
 
Additional Export Parameters
ABORT_STEP export ABORT_STEP=<n | -1>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo01.dmp SCHEMAS=uwclass ABORT_STEP=-1
ACCESS_METHOD export ACCESS_METHOD=<AUTOMATIC | DIRECT_PATH | EXTERNAL_TABLE | INSERT_AS_SELECT>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo02.dmp SCHEMAS=uwclass ACCESS_METHOD=DIRECT_PATH
ATTACH export ATTACH=<[schema_name.]job_name>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo03.dmp ATTACH=uw_job NOLOGFILE=yes
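Once attached, the client is in interactive-command mode, from which the job can be monitored, stopped, and resumed. A sketch of a typical session with the job above:

expdp uwclass/uwclass@pdbdev ATTACH=uw_job

Export> STATUS
Export> STOP_JOB=IMMEDIATE

expdp uwclass/uwclass@pdbdev ATTACH=uw_job

Export> START_JOB
Export> CONTINUE_CLIENT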
CHECKSUM export CHECKSUM=<NO | YES>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo04.dmp CHECKSUM=yes
CHECKSUM_ALGORITHM export CHECKSUM_ALGORITHM=<CRC32 | SHA256 | SHA384 | SHA512>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo05.dmp CHECKSUM=yes CHECKSUM_ALGORITHM=SHA512
CLUSTER export CLUSTER=<Y | N>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo06.dmp CLUSTER=no
COMPRESSION export COMPRESSION=<ALL | DATA_ONLY | METADATA_ONLY | NONE>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo07.dmp COMPRESSION=all
COMPRESSION_ALGORITHM export COMPRESSION_ALGORITHM=<BASIC | LOW | MEDIUM | HIGH>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo08.dmp COMPRESSION=DATA_ONLY COMPRESSION_ALGORITHM=LOW
CONTENT export CONTENT=<ALL | DATA_ONLY | METADATA_ONLY>
expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo09.dmp CONTENT=metadata_only
CREDENTIAL export CREDENTIAL=<user_credential>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo10.dmp CREDENTIAL=fraud_investigation
DATA_OPTIONS export DATA_OPTIONS=<GROUP_PARTITION_TABLE_DATA | VERIFY_STREAM_FORMAT>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo11.dmp LOGFILE=data_pump_dir:demo11exp.log DATA_OPTIONS=GROUP_PARTITION_TABLE_DATA
DIRECTORY export DIRECTORY=<directory_object | DATA_PUMP_DIR>
See FULL Demo Below
DUMPFILE export DUMPFILE=<expdat.dmp | file_name.dmp>
See FULL Demo Below
ENABLE_SECURE_ROLES export ENABLE_SECURE_ROLES=<NO | YES>
expdp uwclass DIRECTORY=data_pump_dir  DUMPFILE=demo12.dmp ENABLE_SECURE_ROLES=yes
ENCRYPTION export Default: The default mode depends on which other encryption-related parameters are used. If only the ENCRYPTION parameter is specified, then the default mode is TRANSPARENT.

If the ENCRYPTION_PASSWORD parameter is specified and the Oracle Encryption Wallet is open, then the default is DUAL. If the ENCRYPTION_PASSWORD parameter is specified and the Oracle Encryption Wallet is closed, then the default is PASSWORD.


ENCRYPTION=<ALL|DATA_ONLY | ENCRYPTED_COLUMNS_ONLY | METADATA_ONLY | NONE>
ENCRYPTION_ALGORITHM=<AES128 | AES192 | AES256>
ENCRYPTION_MODE=<dual | password | transparent>
ENCRYPTION_PASSWORD=<user_supplied_pwd>
ENCRYPTION_PWD_PROMPT=<NO | YES>
expdp uwclass/uwclass@pdbdev DIRECTORY=data_pump_dir DUMPFILE=demo13.dmp ENCRYPTION=all ENCRYPTION_ALGORITHM=aes256 ENCRYPTION_MODE=dual ENCRYPTION_PASSWORD="a1pha!Betaamma6" ENCRYPTION_PWD_PROMPT=yes
ESTIMATE export ESTIMATE=<BLOCKS | STATISTICS>
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo14.dmp ESTIMATE=blocks

expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo15.dmp ESTIMATE=statistics
ESTIMATE_ONLY export ESTIMATE_ONLY=<Y | N>
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass ESTIMATE_ONLY=y

-- then open export.log with an editor
EXCLUDE export EXCLUDE=<exclude_criterion>
-- exclude all (nonreferential) constraints, except for NOT NULL constraints and any constraints needed for successful table creation and loading
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo16.dmp EXCLUDE=constraint

-- exclude referential integrity (foreign key) constraints
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo17.dmp EXCLUDE=ref_constraint

-- exclude object grants on all object types and system priv grants
expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo18.dmp EXCLUDE=grant

-- excludes the definitions of users
expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo19.dmp EXCLUDE=user

-- excludes views
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo20.dmp EXCLUDE=view,package,function

-- to exclude a specific user and all objects of that user, specify a filter such as the following
-- (where hr is the schema name of the user you want to exclude):

expdp uwclass/uwclass@pdbdev FULL=yes DIRECTORY=data_pump_dir DUMPFILE=demo21.dmp EXCLUDE=SCHEMA:\"='HR'\"
FILESIZE export FILESIZE=<0 | integer[B | K | M | G]>

The default, zero, means unlimited.
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo%U.dmp COMPRESSION=none FILESIZE=4G
FLASHBACK_SCN export FLASHBACK_SCN=<scn_value>
conn / as sysdba

SELECT dbms_flashback.get_system_change_number
FROM dual;

exit

expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo22.dmp FLASHBACK_SCN=36477000
FLASHBACK_TIME export FLASHBACK_TIME=<timestamp_value>
conn / as sysdba

SELECT dbms_flashback.get_system_change_number
FROM dual;

SELECT SCN_TO_TIMESTAMP(dbms_flashback.get_system_change_number)
FROM dual;

exit

expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo23.dmp FLASHBACK_TIME=SYSTIMESTAMP

or

expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo23.dmp FLASHBACK_TIME=\"TO_TIMESTAMP('01-FEB-2020 08:08:08', 'DD-MON-YYYY HH24:MI:SS')\"
FULL export expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo01.dmp FULL=yes
HELP export HELP=<YES | NO>
expdp uwclass HELP=y
INCLUDE export INCLUDE=<include_list>
expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo24.dmp INCLUDE=table

expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo25.dmp INCLUDE=\"IN ('SERVERS', 'SERV_INST')\"

expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo26.dmp INCLUDE=procedure

expdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo27.dmp INCLUDE=INDEX:\"LIKE 'PK%'\"
JOB_NAME export JOB_NAME=<job_or_master_table_name>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo28.dmp JOB_NAME=uwjob
KEEP_MASTER export KEEP_MASTER=<NO | YES>
expdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo29.dmp LOGFILE=data_pump_dir:demo29exp.log KEEP_MASTER=yes
LOGFILE export LOGFILE=<export.log | directory_object:file_name>
expdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo30.dmp LOGFILE=data_pump_dir:demo30exp.log
LOGTIME export LOGTIME=<ALL | LOGFILE | NONE | STATUS>
expdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo31.dmp LOGFILE=data_pump_dir:demo31exp.log LOGTIME=all
METRICS export METRICS=<NO | YES>
expdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo32.dmp LOGFILE=data_pump_dir:demo32exp.log METRICS=yes
NETWORK_LINK export NETWORK_LINK=<source_database_link>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo33.dmp NETWORK_LINK=fixed_user
NOLOGFILE export NOLOGFILE=<NO | YES>
expdp uwclass TABLES=servers, airplanes DUMPFILE=data_pump_dir:demo34.dmp NOLOGFILE=yes
PARALLEL export PARALLEL=<1 | parallel_degree>
expdp uwclass TABLES=servers DUMPFILE=data_pump_dir:demo35.dmp PARALLEL=8
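With PARALLEL greater than 1, include the %U substitution variable in DUMPFILE so that each worker can write to its own file; a single fixed file name serializes the workers. A sketch:

expdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo35_%U.dmp PARALLEL=8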
PARFILE export PARFILE=<[directory_object.]file_name>
-- create this as a text file in the data_pump_dir directory

TABLES=servers DUMPFILE=data_pump_dir:demo36.dmp
LOGFILE=data_pump_dir:demo36exp.log
PARALLEL=2

expdp uwclass PARFILE=data_pump_dir:parfile.par
QUERY export QUERY=<[schema.][table_name:]query_where_clause>
expdp uwclass TABLES=airplanes DUMPFILE=data_pump_dir:demo37.dmp QUERY=airplanes:\"WHERE program_id = '737'\"
REMAP_DATA export REMAP_DATA=<[schema_name.]table_name.column_name:[schema.]pkg.function>
conn uwclass/uwclass@pdbdev

CREATE OR REPLACE PACKAGE remap IS
  FUNCTION timestwo (inval NUMBER) RETURN NUMBER;
END remap;
/

CREATE OR REPLACE PACKAGE BODY remap IS
  FUNCTION timestwo (inval NUMBER) RETURN NUMBER IS
  BEGIN
    RETURN inval*2;
  END timestwo;
END remap;
/

expdp uwclass TABLES=servers DUMPFILE=data_pump_dir:demo33.dmp REMAP_DATA=uwclass.servers.srvr_id:uwclass.remap.timestwo
REUSE_DUMPFILES export REUSE_DUMPFILES=<N | Y>
expdp uwclass TABLES=servers DUMPFILE=data_pump_dir:demo34.dmp REUSE_DUMPFILES=y
SAMPLE export SAMPLE=<[[schema_name.]table_name:]sample_percent>
expdp uwclass/uwclass@pdbdev TABLES=airplanes DUMPFILE=data_pump_dir:demo35.dmp SAMPLE=\"UWCLASS.AIRPLANES:10\"
SCHEMAS export See BASIC EXPORT TYPES: Above
SERVICE_NAME export SERVICE_NAME=<service_name>
expdp uwclass TABLES=airplanes DUMPFILE=ctemp:demo36.dmp SERVICE_NAME=pdbprod
SOURCE_EDITION export SOURCE_EDITION=<edition_name>
expdp uwclass TABLES=airplanes DUMPFILE=ctemp:demo37.dmp SOURCE_EDITION=ORA$BASE
STATUS export STATUS=<0 | seconds> -- how often job status is displayed
expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo38.dmp STATUS=5
TABLES export See BASIC EXPORT TYPES: Above
TABLESPACES export See BASIC EXPORT TYPES: Above
TRANSPORT_DATAFILES_LOG export TRANSPORT_DATAFILES_LOG=<directory_object_name:file_name>
expdp uwclass TRANSPORT_DATAFILES_LOG=data_pump_dir:mig_file_list.log
TRANSPORT_FULL_CHECK export TRANSPORT_FULL_CHECK=<NO | YES>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo39.dmp TRANSPORT_TABLESPACES=userdata TRANSPORT_FULL_CHECK=yes LOGFILE=data_pump_dir:tts20200320exp.log
TRANSPORT_TABLESPACES export TRANSPORT_TABLESPACES=<table_space_name_comma_delimited_list>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo40.dmp TRANSPORT_TABLESPACES=userdata TRANSPORT_FULL_CHECK=yes LOGFILE=data_pump_dir:tts20200320exp.log
TRANSPORTABLE export TRANSPORTABLE=<ALWAYS | NEVER>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo41.dmp TABLES=uwclass.servers TRANSPORTABLE=always
TTS_CLOSURE_CHECK export TTS_CLOSURE_CHECK=<FULL | OFF | ON | TEST_MODE>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo43.dmp TABLES=uwclass.servers  TTS_CLOSURE_CHECK=FULL
VERSION export VERSION=<COMPATIBLE | LATEST | version_string>
expdp version

expdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo44.dmp VERSION=latest
VIEWS_AS_TABLES export VIEWS_AS_TABLES=<[schema_name].view_name:table_name, ...>
expdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo45.dmp VIEWS_AS_TABLES=uwclass.servers_view:servers, uwclass.serv_inst_view:serv_inst
 
Importing Schemas
Demo Setup conn / as sysdba

desc dba_directories

col owner format a10
col directory_path format a70

SELECT *
FROM dba_directories
ORDER BY 1;

CREATE OR REPLACE DIRECTORY data_pump_dir AS 'c:\temp';
-- default is $ORACLE_BASE/admin/<database_name>/dpdump

GRANT imp_full_database TO uwclass;
Query a list of database dependent objects SELECT object_path
FROM database_export_objects
WHERE object_path like 'TABLE%'
ORDER BY 1;

OBJECT_PATH
-------------------------------------------
TABLE
TABLE/AUDIT_OBJ
TABLE/CLUSTERING
TABLE/COMMENT
TABLE/CONSTRAINT
TABLE/CONSTRAINT/REF_CONSTRAINT
TABLE/FGA_POLICY
TABLE/GRANT
TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
TABLE/INDEX
TABLE/INDEX/STATISTICS
TABLE/INSTANCE_CALLOUT
TABLE/MATERIALIZED_VIEW_LOG
TABLE/POST_INSTANCE/GRANT/PROCDEPOBJ_GRANT
TABLE/POST_INSTANCE/PROCDEPOBJ
TABLE/POST_INSTANCE/PROCDEPOBJ_AUDIT
TABLE/POST_TABLE_ACTION
TABLE/PRE_TABLE_ACTION
TABLE/PROCACT_INSTANCE
TABLE/RADM_POLICY
TABLE/RLS_CONTEXT
TABLE/RLS_GROUP
TABLE/RLS_POLICY/RLS_POLICY
TABLE/TRIGGER
TABLESPACE
TABLESPACE_ILM_POLICY
TABLESPACE_QUOTA
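The companion views SCHEMA_EXPORT_OBJECTS and TABLE_EXPORT_OBJECTS list the object paths valid for schema-mode and table-mode jobs, and are queried the same way; for example:

SELECT object_path, comments
FROM schema_export_objects
WHERE object_path LIKE '%GRANT%'
ORDER BY 1;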
 
Basic Import Types
Full Import FULL=<N | Y>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo01.dmp FULL=y
Schema Import SCHEMAS=<schema_name_list>
SQL> conn / as sysdba

SQL> CREATE USER abc
     IDENTIFIED BY abc
     DEFAULT TABLESPACE users
     TEMPORARY TABLESPACE temp
     QUOTA unlimited ON users;

SQL> GRANT create session, create table TO abc;
SQL> GRANT read on directory data_pump_dir TO abc;
SQL> GRANT write on directory data_pump_dir TO abc;

SQL> conn abc/abc

SQL> CREATE TABLE zzyzx AS
     SELECT * FROM all_tables;

expdp uwclass/uwclass@pdbdev DUMPFILE=data_pump_dir:demo02.dmp SCHEMAS=abc

SQL> DROP TABLE zzyzx;

impdp uwclass DUMPFILE=data_pump_dir:demo02.dmp SCHEMAS=abc

impdp abc DIRECTORY=data_pump_dir DUMPFILE=demo02.dmp SCHEMAS=abc

SQL> SELECT owner, object_type, created
     FROM dba_objects_ae
     WHERE object_name = 'ZZYZX';
Table Import TABLES=<[schema_name.]table_name[:partition_name] [, ...]>
impdp uwclass/uwclass@pdbdev DUMPFILE=data_pump_dir:demo04.dmp TABLES=servers,serv_inst
Tablespace Import TABLESPACES=<tablespace_list>
impdp uwclass DUMPFILE=data_pump_dir:demo05.dmp TABLESPACES=uwclass,users TRANSPORT_FULL_CHECK=y
Transportable Tablespace Import TRANSPORT_TABLESPACES=<tablespace_name [, ...]>

The default tablespace of the user performing the import must not be set to one of the tablespaces being transported.
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo06.dmp TRANSPORT_TABLESPACES=users,example TRANSPORT_FULL_CHECK=yes LOGFILE=demo06imp.log

conn / as sysdba

ALTER TABLESPACE users READ ONLY;
ALTER TABLESPACE example READ ONLY;

impdp uwclass/uwclass@pdbdev DIRECTORY=data_pump_dir DUMPFILE=demo07.dmp
TRANSPORT_TABLESPACES=users,example TRANSPORT_FULL_CHECK=yes LOGFILE=demo07imp.log

ALTER TABLESPACE users READ WRITE;
ALTER TABLESPACE example READ WRITE;
Legacy Mode Mappings
Legacy Import Cmd How Handled
BUFFER This parameter is ignored because Data Pump does not make use of conventional mode.
CHARSET This parameter is desupported and will cause the Data Pump Import operation to abort.
COMMIT This parameter is ignored. Data Pump Import automatically performs a commit after each table is processed.
COMPILE This parameter is ignored. Data Pump Import compiles procedures after they are created.
CONSTRAINTS If original Import used CONSTRAINTS=n, then Data Pump Import uses the EXCLUDE=CONSTRAINT parameter. If original Import used CONSTRAINTS=y, the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
DATAFILES The Data Pump Import TRANSPORT_DATAFILES parameter is used.
DESTROY If original Import used DESTROY=y, then Data Pump Import uses the REUSE_DATAFILES=yes parameter. If original Import used DESTROY=n, the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
FEEDBACK The Data Pump Import STATUS=30 command is used. Note that this is not a direct mapping because the STATUS command returns the status of the import job, as well as the rows being processed. In original Import, feedback was given after a certain number of rows, as specified with the FEEDBACK command. In Data Pump Import, the status is given every so many seconds, as specified by STATUS.
FILE Data Pump attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access.
FILESIZE This parameter is ignored because the information is already contained in the Data Pump dump file set.
FROMUSER The Data Pump SCHEMAS parameter is used. If FROMUSER was used without TOUSER also being used, then import schemas that have the IMP_FULL_DATABASE role cause Data Pump Import to attempt to create the schema and then import that schema's objects. Import schemas that do not have the IMP_FULL_DATABASE role can only import their own schema from the dump file set.
GRANTS If original Import used GRANTS=n, then Data Pump uses the EXCLUDE=OBJECT_GRANT parameter. If original Import used GRANTS=y, the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
IGNORE If original Import used IGNORE=y, then Data Pump Import uses the TABLE_EXISTS_ACTION=APPEND parameter. This causes the processing of table data to continue. If original Import used IGNORE=n, the parameter is ignored and does not need to be remapped because that is the Data Pump Import default behavior.
INDEXES If original Import used INDEXES=n, then Data Pump uses the EXCLUDE=INDEX parameter. If original Import used INDEXES=y, the parameter is ignored and does not need to be remapped because that is the Data Pump default behavior.
INDEXFILE The Data Pump Import SQLFILE=[directory_object:]file_name and INCLUDE=INDEX parameters are used. The same method and attempts made when looking for a directory object described for the FILE parameter also take place for the INDEXFILE parameter. If no directory object was specified on the original Import, then Data Pump uses the directory object specified with the DIRECTORY parameter.
LOG Data Pump attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access.
RECORDLENGTH This parameter is ignored because Data Pump automatically takes care of buffer sizing.
RESUMABLE This parameter is ignored because Data Pump automatically provides this functionality.
RESUMABLE_NAME This parameter is ignored because Data Pump automatically provides this functionality.
RESUMABLE_TIMEOUT This parameter is ignored because Data Pump automatically provides this functionality.
ROWS If original Import used ROWS=n, then Data Pump uses the CONTENT=METADATA_ONLY parameter. If original Import used ROWS=y, then Data Pump uses the CONTENT=ALL parameter.
SHOW If SHOW=y is specified, the Data Pump Import SQLFILE=[directory_object:]file_name parameter is used to write the DDL for the import operation to a file. Only the DDL (not the entire contents of the dump file) is written to the specified file. (Note that the output is not shown on the screen as it was in original Import.)
STATISTICS This parameter is ignored because statistics are always saved for tables as part of a Data Pump operation.
STREAMS_CONFIGURATION This parameter is ignored because Data Pump Import automatically determines it; it does not need to be specified.
STREAMS_INSTANTIATION This parameter is ignored because Data Pump Import automatically determines it; it does not need to be specified.
TABLESPACES If original Import also specified TRANSPORT_TABLESPACE=n (the default), then Data Pump ignores the TABLESPACES parameter. If original Import also specified TRANSPORT_TABLESPACE=y, then Data Pump takes the names supplied for this TABLESPACES parameter and applies them to the Data Pump TRANSPORT_TABLESPACES parameter.
TOID_NOVALIDATE This parameter is ignored. OIDs are no longer used for type validation.
TO_USER The REMAP_SCHEMA parameter is used. There may be more objects imported than with original Import. Data Pump may create the target schema if it does not already exist.
TRANSPORT_TABLESPACE If original Import used TRANSPORT_TABLESPACE=n, then Data Pump Import ignores this parameter.
TTS_OWNERS This parameter is ignored because this information is automatically stored in the Data Pump dump file set.
VOLSIZE When the VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump dump file format does not support tape devices. The operation terminates with an error.
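For example, this hypothetical invocation uses the original Import parameters FROMUSER, TOUSER, and IGNORE, so impdp enters legacy mode and maps them to REMAP_SCHEMA and TABLE_EXISTS_ACTION=APPEND as described above:

impdp uwclass/uwclass@pdbdev FILE=legacy.dmp FROMUSER=abc TOUSER=def IGNORE=y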
 
Additional Import Parameters
ABORT_STEP import ABORT_STEP=<n | -1>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo01.dmp SCHEMAS=uwclass ABORT_STEP=-1
ACCESS_METHOD import ACCESS_METHOD=<AUTOMATIC | DIRECT_PATH | EXTERNAL_TABLE | INSERT_AS_SELECT>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo02.dmp SCHEMAS=uwclass ACCESS_METHOD=DIRECT_PATH
ATTACH import ATTACH[=[schema_name.]job_name]
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo03.dmp ATTACH=uw_job NOLOGFILE=yes
CLUSTER import CLUSTER=<YES | NO>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo04.dmp CLUSTER=no
CONTENT import CONTENT=<ALL | DATA_ONLY | METADATA_ONLY>
impdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo05.dmp CONTENT=metadata_only
CREDENTIAL import CREDENTIAL=<user_credential>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo06.dmp CREDENTIAL=fraud_investigation
DATA_OPTIONS import DATA_OPTIONS=<DISABLE_APPEND_HINT | SKIP_CONSTRAINT_ERRORS | REJECT_ROWS_WITH_REPL_CHAR | GROUP_PARTITION_TABLE_DATA | TRUST_EXISTING_TABLE_PARTITIONS | VALIDATE_TABLE_DATA | ENABLE_NETWORK_COMPRESSION | CONTINUE_LOAD_ON_FORMAT_ERROR>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo07.dmp LOGFILE=data_pump_dir:demo07imp.log DATA_OPTIONS=DISABLE_APPEND_HINT
DIRECTORY import DIRECTORY=<directory_object | DATA_PUMP_DIR>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo08.dmp
DUMPFILE import DUMPFILE=<expdat.dmp | file_name.dmp> -- can be used more than 1X at once in a parfile
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo09.dmp
ENABLE_SECURE_ROLES import ENABLE_SECURE_ROLES=<NO | YES>
impdp uwclass DIRECTORY=data_pump_dir  DUMPFILE=demo10.dmp ENABLE_SECURE_ROLES=yes
ENCRYPTION_PASSWORD import On import, only the password-related encryption parameters apply; the encryption mode and algorithm used at export time are recorded in the dump file set.

ENCRYPTION_PASSWORD=<user_supplied_pwd>
ENCRYPTION_PWD_PROMPT=<NO | YES>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo13.dmp ENCRYPTION_PWD_PROMPT=yes
ESTIMATE import ESTIMATE=<BLOCKS | STATISTICS>
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo13.dmp ESTIMATE=blocks

impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo14.dmp ESTIMATE=statistics
EXCLUDE import EXCLUDE=<exclude_criterion>
-- exclude all (nonreferential) constraints, except for NOT NULL constraints and any constraints
-- required for successful table creation and loading

impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo15.dmp EXCLUDE=constraint

-- exclude referential integrity (foreign key) constraints
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo16.dmp EXCLUDE=ref_constraint

-- exclude object grants on all object types and system priv grants
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo17.dmp EXCLUDE=grant

-- excludes the definitions of users
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo18.dmp EXCLUDE=user

-- excludes views
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo19.dmp EXCLUDE=view,package,function

-- to exclude a specific user and all objects of that user, specify a filter such as the following
-- (where hr is the schema name of the user you want to exclude):

impdp uwclass/uwclass@pdbdev FULL=yes DIRECTORY=data_pump_dir DUMPFILE=demo20.dmp EXCLUDE=SCHEMA:\"='HR'\"
FLASHBACK_SCN import FLASHBACK_SCN=<scn_value>
conn / as sysdba

SELECT dbms_flashback.get_system_change_number FROM dual;

exit

impdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo21.dmp FLASHBACK_SCN=36477000
FLASHBACK_TIME import FLASHBACK_TIME=<timestamp_value>
conn / as sysdba

SELECT dbms_flashback.get_system_change_number
FROM dual;

SELECT SCN_TO_TIMESTAMP(dbms_flashback.get_system_change_number)
FROM dual;

exit

impdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo22.dmp
FLASHBACK_TIME=\"TO_TIMESTAMP('01-FEB-2020 08:08:08', 'DD-MON-YYYY HH24:MI:SS')\"
FULL import FULL=<NO | YES>
impdp uwclass/uwclass@pdbdev DIRECTORY=data_pump_dir DUMPFILE=demo23.dmp FULL=yes
HELP import HELP=<NO | YES>
impdp uwclass HELP=yes
INCLUDE import INCLUDE=<include_list>
impdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo24.dmp INCLUDE=table

impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo25.dmp INCLUDE=\"IN ('SERVERS', 'SERV_INST')\"

impdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo26.dmp INCLUDE=procedure

impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo27.dmp INCLUDE=INDEX:\"LIKE 'PK%'\"
JOB_NAME import JOB_NAME=<job_or_master_table_name>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo28.dmp JOB_NAME=uwjob
KEEP_MASTER import KEEP_MASTER=<NO | YES>
impdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo29.dmp LOGFILE=data_pump_dir:demo29.log KEEP_MASTER=yes
LOGFILE import LOGFILE=<export.log | directory_object:file_name>
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo30.dmp LOGFILE=data_pump_dir:demo30imp.log
LOGTIME import LOGTIME=<ALL | LOGFILE | NONE | STATUS>
impdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo31.dmp LOGFILE=data_pump_dir:demo31imp.log LOGTIME=all
MASTER_ONLY import MASTER_ONLY=<NO | YES>
impdp uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir DUMPFILE=demo32.dmp LOGFILE=data_pump_dir:demo32imp.log MASTER_ONLY=yes
METRICS import METRICS=<NO | YES>
impdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo33.dmp LOGFILE=data_pump_dir:demo33.log METRICS=yes
NETWORK_LINK import NETWORK_LINK=<source_database_link>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo34.dmp NETWORK_LINK=fixed_user
NOLOGFILE import NOLOGFILE=<NO | YES>
impdp uwclass TABLES=servers,airplanes DUMPFILE=data_pump_dir:demo35.dmp NOLOGFILE=yes
PARALLEL import PARALLEL=<1 | parallel_degree>
impdp uwclass TABLES=servers DUMPFILE=data_pump_dir:demo36.dmp PARALLEL=2
PARFILE import PARFILE=<[directory_object.]file_name>
-- create this as a text file in the data_pump_dir directory

TABLES=servers DUMPFILE=data_pump_dir:demo37.dmp
LOGFILE=data_pump_dir:demo37imp.log
PARALLEL=2

impdp uwclass PARFILE=data_pump_dir:parfile.par
PARTITION_OPTIONS import PARTITION_OPTIONS=<none | departition | merge>
impdp uwclass TABLES=servers DUMPFILE=data_pump_dir:demo38.dmp PARTITION_OPTIONS=departition
QUERY import QUERY=<[schema.][table_name:]query_where_clause>
impdp uwclass TABLES=airplanes DUMPFILE=data_pump_dir:demo39.dmp QUERY=airplanes:\"WHERE program_id = '737'\"
REMAP_DATA import REMAP_DATA=[schema.]tablename.column_name:[schema.]pkg.function
CREATE TABLE holder (
holder_name VARCHAR2(30) NOT NULL,
card_number NUMBER(16)   NOT NULL);

CREATE TABLE activity (
vendor      VARCHAR2(30) NOT NULL,
card_number NUMBER(16)   NOT NULL,
sales_total NUMBER(10,2) NOT NULL);

INSERT INTO holder VALUES ('Morgan', 4124555566661234);
INSERT INTO holder VALUES ('Kyte', 3776555566665678);
INSERT INTO holder VALUES ('Norgaard', 5058555566669012);

INSERT INTO activity VALUES ('Amazon.com', 4124555566661234, 100.00);
INSERT INTO activity VALUES ('eBay.com', 4124555566661234, 59.50);
INSERT INTO activity VALUES ('Oracle', 3776555566665678, 50.50);
INSERT INTO activity VALUES ('Miracle', 5058555566669012, 42.42);

CREATE OR REPLACE PACKAGE hidedata AUTHID CURRENT_USER IS
 FUNCTION newcc(oldno IN NUMBER) RETURN NUMBER;
END hidedata;
/

CREATE OR REPLACE PACKAGE BODY hidedata IS
 TYPE cc_list IS TABLE OF NUMBER INDEX BY VARCHAR2(16);
 cc_remaps cc_list;
 cc_seed   NUMBER(15) := 000000000000010;

 FUNCTION newcc(oldno IN NUMBER) RETURN NUMBER IS
 BEGIN
   IF NOT cc_remaps.EXISTS(oldno) THEN
     cc_seed := cc_seed+10;
     cc_remaps(oldno) := ROUND(oldno, -15)+cc_seed+MOD(oldno,10);
   END IF;
   RETURN cc_remaps(oldno);
 END newcc;
END hidedata;
/

col card_number format 9999999999999999

SELECT vendor, hidedata.newcc(card_number) CARD_NUMBER, sales_total
FROM activity;

expdp uwclass/uwclass@pdbdev tables=uwclass.holder,uwclass.activity directory=ctemp dumpfile=uwemp40.dmp
remap_data=uwclass.holder.card_number:hidedata.newcc remap_data=uwclass.activity.card_number:hidedata.newcc
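
REMAP_DATA can be applied on either side of the move. To remap during import instead, omit REMAP_DATA from the expdp command above and supply it to impdp; a sketch (the hidedata package must also exist in the target database):

impdp uwclass/uwclass@pdbdev directory=ctemp dumpfile=uwemp40.dmp
remap_data=uwclass.holder.card_number:uwclass.hidedata.newcc remap_data=uwclass.activity.card_number:uwclass.hidedata.newcc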
REMAP_DATAFILE import REMAP_DATAFILE=<source_datafile_string:target_datafile_string>
TBD
REMAP_DIRECTORY import REMAP_DIRECTORY=<source_directory_string:target_directory_string>
TBD
REMAP_SCHEMA import REMAP_SCHEMA=<source_schema:target_schema>
expdp system/oracle1 DIRECTORY=data_pump_dir DUMPFILE=streamsdemo41.dmp SCHEMAS=scott

impdp system/oracle1 DUMPFILE=data_pump_dir:streamsdemo41.dmp SCHEMAS=scott REMAP_SCHEMA=scott:scottrep
REMAP_TABLE import REMAP_TABLE=<[schema.]old_tablename[.partition]:new_tablename>
TBD
REMAP_TABLESPACE import REMAP_TABLESPACE=<source_tablespace:target_tablespace>
impdp uwclass DIRECTORY=data_pump_dir dumpfile=demo43.dmp schemas=abc job_name=export_cw4ora REMAP_SCHEMA=abc:def REMAP_TABLESPACE=users:uw_data REMAP_TABLESPACE=example:uwdata exclude=GRANT
REUSE_DATAFILES import REUSE_DATAFILES=<NO | YES>
impdp system/oracle1 DIRECTORY=data_pump_dir DUMPFILE=streamsdemo44.dmp REUSE_DATAFILES=yes
SCHEMAS import SCHEMAS=<comma_delimited_schema_name_list>
impdp uwclass SCHEMAS=uwclass LOGFILE=uwdp.log DUMPFILE=dump45.dmp
SERVICE_NAME import SERVICE_NAME=<name_of_service>
impdp uwclass TABLES=airplanes DUMPFILE=ctemp:demo46.dmp SERVICE_NAME=pdbprod
SKIP_UNUSABLE_INDEXES import SKIP_UNUSABLE_INDEXES=<value_of_the_corresponding_init_parameter | NO | YES>
impdp uwclass TABLES=airplanes DUMPFILE=ctemp:demo47.dmp SKIP_UNUSABLE_INDEXES=yes
SOURCE_EDITION import SOURCE_EDITION=<default_database_edition | edition_name>
impdp uwclass TABLES=airplanes DUMPFILE=ctemp:demo48.dmp SOURCE_EDITION=ORA$BASE
SQLFILE import SQLFILE=<[directory_object:]file_name>
impdp hr DIRECTORY=dpump_dir1 DUMPFILE=demo49.dmp SQLFILE=dpump_dir2:expfull.sql
STATUS import STATUS=<0 | freq_in_seconds> -- how often job status is displayed
impdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo50.dmp STATUS=5
STREAMS_CONFIGURATION import STREAMS_CONFIGURATION=<NO | YES>
impdp uwclass SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo51.dmp STREAMS_CONFIGURATION=no
TABLES import TABLES=<[schema_name.]table_name[:partition_name] [, ...]>
impdp uwclass DUMPFILE=data_pump_dir:demo52a.dmp TABLES=servers, serv_inst

impdp uwclass DUMPFILE=data_pump_dir:demo52b.dmp TABLES=sales:sales_q1_2020
TABLESPACES import TABLESPACES=<comma_delimited_tablespace_list>
impdp uwclass DUMPFILE=data_pump_dir:demo53.dmp TABLESPACES=uwclass,users TRANSPORT_FULL_CHECK=yes
TABLE_EXISTS_ACTION import TABLE_EXISTS_ACTION=<SKIP | APPEND | TRUNCATE | REPLACE>
impdp uwclass DUMPFILE=data_pump_dir:demo54.dmp TABLESPACES=uwclass,users TABLE_EXISTS_ACTION=APPEND
TARGET_EDITIONS import TARGET_EDITIONS=<edition_name_list>
impdp uwclass DUMPFILE=data_pump_dir:demo55.dmp TARGET_EDITIONS=ora$base2019,ora$base2020
TRANSFORM import TRANSFORM=<transform_name:value[:object_type]>
TBD
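A sketch of one common use, with an assumed dump file name: TRANSFORM=SEGMENT_ATTRIBUTES:N suppresses segment attributes (physical attributes, storage clauses, tablespaces, logging) in the imported DDL so objects inherit the target database defaults.

impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo57.dmp TRANSFORM=SEGMENT_ATTRIBUTES:N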
TRANSPORT_DATAFILES import TRANSPORT_DATAFILES=<data_file_name_list>
impdp uwclass DIRECTORY=dpump_dir1 DUMPFILE=tdf.dmp TRANSPORTABLE=always TRANSPORT_DATAFILES='/app/oracle/oradata/orabase/uwclass01.dbf' TRANSPORT_FULL_CHECK=no
TRANSPORT_FULL_CHECK import TRANSPORT_FULL_CHECK=<NO | YES>
impdp uwclass DIRECTORY=data_pump_dir DUMPFILE=demo56.dmp TRANSPORT_TABLESPACES=userdata TRANSPORT_FULL_CHECK=yes LOGFILE=data_pump_dir:tts20200320imp.log
TRANSPORTABLE import TRANSPORTABLE=<ALWAYS | NEVER>
See TRANSPORT_DATAFILES Demo Above
VERIFY_CHECKSUM import VERIFY_CHECKSUM=<NO | YES>
impdp uwclass DIRECTORY=dpump_dir1 DUMPFILE=tdf.dmp VERIFY_CHECKSUM=yes
VERIFY_ONLY import VERIFY_ONLY=<NO | YES>
impdp uwclass DIRECTORY=dpump_dir1 DUMPFILE=tdf.dmp VERIFY_ONLY=yes
VERSION import VERSION=<COMPATIBLE | LATEST | version_string>
impdp uwclass/uwclass@pdbdev SCHEMAS=uwclass DUMPFILE=data_pump_dir:demo56.dmp VERSION=latest

Related Topics
Database Links
DBMS_DATAPUMP
DBMS_DATAPUMP_INT
DBMS_DATAPUMP_UTL
DBMS_PSWMG_IMPORT
Export
Import
KUPCC
PSTDY_DATAPUMP_SUPPORT
Transportable Tablespaces
TSDP$DATAPUMP
What's New In 21c
What's New In 23c
