1 The New Data Pump Caleb Small Caleb@caleb.com

2 Next Generation Import / Export
New features
Better performance
Improved security
Versatile interfaces

3 Old Import / Export
Still exists; installed and enabled by default
Supports features only through version 9i; no new releases
Still required to import pre-10g dump files
Dump files and scripts are NOT compatible between the old utilities and Data Pump

4 New Features
Data sampling and filtering
Object filtering
Estimate file size, maximum file size, multiple files
Remap datafile, schema, tablespace
Network mode
Point-in-time export
Version conversion
Column data encryption
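
For readers who want to see these on the command line, here is a minimal sketch; the directory object dp_dir, the dump file names, the SCN value and the scott/hr schemas are placeholders, while the parameters shown (ESTIMATE_ONLY, FILESIZE, FLASHBACK_SCN, VERSION, REMAP_SCHEMA, REMAP_TABLESPACE) are standard expdp/impdp options.

  # size estimate only, no dump file written
  expdp scott/tiger SCHEMAS=scott ESTIMATE_ONLY=YES

  # multiple 2 GB files, consistent as of a given SCN, written in 10.2 dump file format
  expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=exp%U.dmp FILESIZE=2G FLASHBACK_SCN=1234567 VERSION=10.2

  # remap schema and tablespace on import
  impdp system DIRECTORY=dp_dir DUMPFILE=exp%U.dmp REMAP_SCHEMA=scott:hr REMAP_TABLESPACE=users:hr_data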

5 Better Performance
Complete re-write
Parallel processing, with parallel data streams to multiple files
Runs within the instance
Uses direct path whenever possible
Restartable
Tunable (auto-tuning)
Progress monitoring

6 Improved Security
Server based only – no client-side exports!
Based on directory objects within the database
Always runs as the “oracle” OS process on behalf of the invoking database user

7 Versatile Interfaces
Command line
Parameter file
Interactive mode
DBMS_DATAPUMP package
DB Console (Enterprise Manager)
External table
Scheduled job

8 Data Pump Architecture
Master process: manages and controls the operation
Worker process(es): responsible for data movement, one for each degree of parallelism
Master table: created in the invoker's schema at job start, maintained during job execution, dropped after successful completion, used to resume a paused or failed job
Control and status queues
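
As a small illustration of the master table: while a job is running it appears as an ordinary table in the invoking user's schema, named after the job (SYS_EXPORT_SCHEMA_01 is the typical default name, assumed here).

  -- run as the invoking user while an export/import job is active
  SELECT object_name, object_type
  FROM   user_objects
  WHERE  object_name LIKE 'SYS_EXPORT%'
     OR  object_name LIKE 'SYS_IMPORT%';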

9 Types of Exports
Table
Schema
Tablespace
Database
Transportable tablespace (metadata only)
INCLUDE / EXCLUDE object filters
QUERY and SAMPLE data filters
CONTENT = ALL | DATA_ONLY | METADATA_ONLY
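
A sketch of these filters in use; dp_dir, the dump file names, the scott schema and the EMP table are placeholders. QUERY is usually easier to put in a parameter file because of shell quoting.

  # object filter: schema export without indexes or statistics
  expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp SCHEMAS=scott EXCLUDE=INDEX EXCLUDE=STATISTICS

  # data filters: a 10% sample, and a predicate on one table
  expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=emp_sample.dmp TABLES=EMP SAMPLE=10
  expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=emp_dept10.dmp TABLES=EMP QUERY=EMP:"WHERE deptno = 10"

  # metadata only
  expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott_meta.dmp SCHEMAS=scott CONTENT=METADATA_ONLY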

10 Directory Objects
Created as a database object
Requires the CREATE ANY DIRECTORY privilege
Permissions (read, write) are granted on the object to specific user(s)
Not validated: existence, syntax and OS privileges are not checked
Accessed as the “oracle” user at the OS level
Default DATA_PUMP_DIR maps to …
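
A minimal sketch of setting one up; the path and the user are placeholders, and note the path is not checked until the directory is actually used.

  -- as a DBA user; /u01/app/oracle/dpdump and scott are placeholder names
  CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpdump';
  GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;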

11 Data Access
Direct path: chosen automatically whenever possible; reads and writes data blocks directly; no undo, and redo can be turned off; see the Utilities Guide for exceptions (e.g. active triggers, clustered tables, BFILE columns)
External table: equivalent to the old “conventional path”; normal SQL and commit processing, so slower; NOT the same as the external table access driver used from SQL

12 Monitoring Data Pump
STATUS parameter
Detach from / re-attach to jobs
Stop / start / kill jobs
STATUS command (interactive mode)
Data dictionary views: DBA_DATAPUMP_JOBS, DBA_DATAPUMP_SESSIONS, V$SESSION_LONGOPS
Log file
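
A sketch of monitoring queries against these views (the LIKE filter on opname is an assumption about how the job reports itself):

  -- running Data Pump jobs and their states
  SELECT owner_name, job_name, operation, job_mode, state, degree
  FROM   dba_datapump_jobs;

  -- rough progress of the long-running work
  SELECT opname, sofar, totalwork,
         ROUND(sofar / totalwork * 100, 1) AS pct_done
  FROM   v$session_longops
  WHERE  totalwork > 0
  AND    opname LIKE 'SYS_EXPORT%';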

13 Interactive Mode
NOT the same as the old imp/exp interactive mode!
By default starts a schema-mode export
Use command-line arguments or a parameter file
“Logging” vs. “interactive command” mode
Default logging mode logs to the terminal
Ctrl-C switches to interactive command mode
The job continues to run even if the client disconnects!
expdp scott/tiger@fred parfile=myjob.par
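
A sketch of what a parameter file such as myjob.par might contain; every name below is a placeholder, not something taken from the presentation.

  # myjob.par (hypothetical contents)
  DIRECTORY=dp_dir
  DUMPFILE=scott%U.dmp
  LOGFILE=scott_exp.log
  SCHEMAS=scott
  JOB_NAME=scott_exp
  PARALLEL=2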

14 Interactive Mode Commands
STATUS: status of the current job
CONTINUE_CLIENT: resume “logging” mode
EXIT_CLIENT: exit the client, leave the job running
STOP_JOB: stop the current job, do not delete it
START_JOB: re-start the current job
KILL_JOB: delete the current job
PARALLEL: add or remove worker processes
ADD_FILE: add a dump file during export
HELP: get help
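
A short sketch of the detach / re-attach cycle, assuming the job was started with JOB_NAME=scott_exp as in the parameter file sketch above:

  # press Ctrl-C in the running client, then:
  Export> EXIT_CLIENT            # client exits, job keeps running in the database

  # later, from any client, re-attach and check on it:
  expdp scott/tiger ATTACH=scott_exp
  Export> STATUS
  Export> CONTINUE_CLIENT        # back to logging mode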

15 Obsolete Parameters
Export: BUFFER, COMPRESS, CONSISTENT, DIRECT, RECORD_LENGTH, RESUMABLE, STATISTICS, USERID, VOLSIZE
Import: BUFFER, CHARSET, COMMIT, COMPILE, FILESIZE, RECORD_LENGTH, RESUMABLE, STATISTICS
Others have changed; see the Utilities Guide!

16 Data Pump API
Grant EXECUTE on DBMS_DATAPUMP and, optionally, DBMS_METADATA
Call DBMS_DATAPUMP.OPEN
Define parameters (job type, dump file, etc.)
Call DBMS_DATAPUMP.START_JOB
Optionally monitor, detach, re-attach, stop, start or kill the job
Can be scheduled as a recurring job
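
A minimal PL/SQL sketch of a schema export through the API; the directory object DP_DIR, the job name and the SCOTT schema are placeholders, and error handling is omitted.

  -- requires EXECUTE on DBMS_DATAPUMP and a directory object named DP_DIR
  DECLARE
    h         NUMBER;
    job_state VARCHAR2(30);
  BEGIN
    h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                            job_mode  => 'SCHEMA',
                            job_name  => 'SCOTT_API_EXP');
    DBMS_DATAPUMP.ADD_FILE(handle    => h,
                           filename  => 'scott_api.dmp',
                           directory => 'DP_DIR');
    DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                  name   => 'SCHEMA_EXPR',
                                  value  => 'IN (''SCOTT'')');
    DBMS_DATAPUMP.START_JOB(h);
    DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);  -- or DBMS_DATAPUMP.DETACH(h) to leave it running
  END;
  /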

17 External Table (SQL)
Created as a database object (TABLE)
ORGANIZATION EXTERNAL
Uses the ORACLE_DATAPUMP access driver
Can both load and unload data
The dump file contains row data only; metadata is stored in the data dictionary
Not compatible with a regular Data Pump dump file
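
A sketch of unload and reload with the access driver; the table, directory and file names are placeholders.

  -- unload: CTAS writes emp_unload.dmp through the ORACLE_DATAPUMP driver
  CREATE TABLE emp_unload
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY dp_dir
      LOCATION ('emp_unload.dmp'))
    AS SELECT empno, ename, sal FROM emp;

  -- reload elsewhere: define an external table over the same file, then query or CTAS from it
  CREATE TABLE emp_reload (
      empno NUMBER(4),
      ename VARCHAR2(10),
      sal   NUMBER(7,2))
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY dp_dir
      LOCATION ('emp_unload.dmp'));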

18 Network Mode
Works across a database link
Import reads tables from the remote DB and writes directly to tables in the local DB
– No dump file created
– A directory object is still required for logging
Export reads tables from the remote DB and writes a dump file on the local server
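
A hedged example of both directions, assuming an existing database link named remote_db and the usual placeholder names:

  # network import: no dump file, rows flow straight over the link; directory needed only for the log
  impdp system DIRECTORY=dp_dir NETWORK_LINK=remote_db SCHEMAS=scott LOGFILE=net_imp.log

  # network export: reads the remote DB, writes the dump file on the local server
  expdp system DIRECTORY=dp_dir NETWORK_LINK=remote_db SCHEMAS=scott DUMPFILE=remote_scott.dmp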

19 SQL File
Import can generate an SQL file instead of actually performing the import
Contains the DDL that would have been executed, based on the job parameters
Passwords are excluded
No change is made to the target DB
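
A minimal example using the SQLFILE parameter (directory and file names are placeholders):

  # writes the DDL to scott_ddl.sql in dp_dir; nothing is created in the target database
  impdp system DIRECTORY=dp_dir DUMPFILE=scott.dmp SQLFILE=scott_ddl.sql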

20 Required Reading
Oracle Database New Features Guide
Oracle Database Utilities
PL/SQL Packages and Types Reference
Oracle Data Pump FAQ on OTN

21 Demonstration
Directory object basics
Basic Data Pump unload and load
Query / Sample data filtering
Re-attaching to and monitoring a big job
The PL/SQL API
SQL file import
External table access driver
www.caleb.com/dba Caleb@caleb.com

22 Tuning Data Pump
PARALLEL is the only Data Pump specific tuning parameter
Set it to roughly 2 times the number of CPUs
Parallelism increases resource consumption (memory, CPU, I/O bandwidth); do not overload these resources!
Individual worker processes can also use parallel query
Use multiple dump files to maximize parallelism
Place files on separate physical devices/channels
Use a wildcard in the filename, e.g. dumpfile%U.dmp
Keep dump devices/channels separate from the source tablespace(s)
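
A sketch of a parallel export spreading files across two directory objects; the directory names and the degree of parallelism are assumptions.

  # dp_dir1 and dp_dir2 are placeholder directory objects on separate physical devices
  expdp system SCHEMAS=scott PARALLEL=4 LOGFILE=dp_dir1:scott_exp.log DUMPFILE=dp_dir1:scott%U.dmp,dp_dir2:scott%U.dmp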

23 Tuning Data Pump
Instance parameters that may affect performance:
DISK_ASYNCH_IO=TRUE
DB_BLOCK_CHECKING=FALSE
DB_BLOCK_CHECKSUM=FALSE
PROCESSES
SESSIONS
PARALLEL_MAX_SERVERS
SHARED_POOL_SIZE
UNDO_TABLESPACE

