Oracle expdp Schema Export Using Data Pump

Oracle Data Pump's expdp utility lets you export a full database schema (tables, indexes, constraints, and data) into a portable dump file. This guide breaks down the single-line command that does exactly that for the andy schema.

Purpose of This Command

The goal of this command is to create a logical backup of a single Oracle database schema called andy. The export produces a binary dump file (andy.dp) saved to a pre-defined Oracle directory object (my_dir) on the server. This backup can later be imported into the same or a different Oracle database using impdp.

Schema-mode export is the default export mode in Oracle Data Pump and is ideal for migrating, cloning, or backing up a single user's objects without touching the rest of the database.

The Command

expdp system/****** schemas=andy dumpfile=andy.dp directory=my_dir

Breakdown of Each Part

Part               Value         What It Does
expdp              -             Calls the Oracle Data Pump Export executable
system/******      credentials   Connects as the system user (password hidden)
schemas=andy       andy          Tells Data Pump to export only the andy schema
dumpfile=andy.dp   andy.dp       Sets the name of the output dump file
directory=my_dir   my_dir        Points to an Oracle Directory Object for file output

Each parameter is a keyword argument passed directly on the command line.
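
Because each parameter is a plain keyword=value pair, the same options can also be kept in a parameter file and passed with the PARFILE parameter. A sketch, assuming the file is saved as andy_exp.par (an illustrative name):

```
# andy_exp.par -- pass with: expdp system parfile=andy_exp.par
schemas=andy
dumpfile=andy.dp
directory=my_dir
```

Leaving the password out of the command line this way makes expdp prompt for it interactively, which keeps it out of the shell history and process list.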

What Is a Directory Object?

An Oracle Directory Object is a database-level pointer that maps a short name (like my_dir) to a real path on the server's file system (like /backup/oracle/).

You must create this object before running the export, and you must grant READ and WRITE privileges on it to the user running expdp. The command to create and grant looks like this:

-- Run as SYSDBA or a privileged user
CREATE OR REPLACE DIRECTORY my_dir AS '/backup/oracle/';
GRANT READ, WRITE ON DIRECTORY my_dir TO system;

Data Pump will refuse to run if the directory object does not exist or if the user does not have the correct privilege on it.
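
You can check which directory objects already exist, and which server paths they map to, by querying the data dictionary, for example from SQL*Plus:

```sql
-- DBA_DIRECTORIES lists all directory objects; names are stored in uppercase
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'MY_DIR';
```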

What Gets Exported?

When you run a schema-mode export, Oracle Data Pump captures all objects that belong to that schema. This typically includes:

  • Tables (structure and all rows of data)
  • Indexes and constraints
  • Views, sequences, and synonyms
  • Stored procedures, functions, and packages
  • Triggers
  • Grants and object-level privileges assigned to the schema

If you have the DATAPUMP_EXP_FULL_DATABASE role (as system does), you can also export schema definitions and system-level privilege grants for the target schema.

Why Use system to Export Another Schema?

The system account holds the DATAPUMP_EXP_FULL_DATABASE role by default. This role allows system to export schemas that it does not own, in this case andy. Without this role, a user can only export their own schema. Running the export as system is a common DBA practice for administrative backups.

The Dump File: andy.dp

The output file andy.dp is a proprietary Oracle binary format. It is not a plain SQL script. You cannot open it in a text editor and read it. It is designed specifically to be re-loaded by the impdp (Data Pump Import) utility.

  • The file is written to the directory mapped by my_dir on the database server, not on your local client machine.
  • By default, if a file with the same name already exists, the job will fail; use REUSE_DUMPFILES=y to overwrite it.
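
If you rerun the export and the old dump file is still in place, the overwrite flag can be added directly to the command:

```shell
expdp system/****** schemas=andy dumpfile=andy.dp directory=my_dir reuse_dumpfiles=y
```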

How to Add a Log File

The command above does not include a logfile parameter, so Data Pump falls back to the default log file name (export.log), which the next unnamed job in the same directory will overwrite. It is a strong best practice to always name one:

expdp system/****** schemas=andy dumpfile=andy.dp directory=my_dir logfile=andy_exp.log

The log file records job progress, object counts, warnings, and errors. It is saved to the same my_dir directory unless you specify a different directory for it.


How to Import This Dump File

To restore or migrate the andy schema using the dump file, use impdp:

impdp system/****** schemas=andy dumpfile=andy.dp directory=my_dir logfile=andy_imp.log

You can also remap the schema to a different target user during import with REMAP_SCHEMA=andy:new_user.
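
For example, assuming a target account called new_user (an illustrative name), the remapped import looks like this:

```shell
impdp system/****** dumpfile=andy.dp directory=my_dir remap_schema=andy:new_user logfile=andy_remap.log
```

If the export was taken by a privileged account, the dump file usually contains the CREATE USER metadata, so Data Pump can create new_user during the import if it does not already exist.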


Key Points and Tips

  • Password security: Never store plain-text passwords in scripts. Use Oracle Wallet or an external password store for automated jobs.
  • File extension: Oracle does not enforce a specific extension for dump files. .dmp is common, but .dp works too. Choose a consistent naming convention.
  • Parallel export: For large schemas, add PARALLEL=4 (or higher) along with a wildcard dumpfile like dumpfile=andy_%U.dmp to write multiple files simultaneously.
  • Compression: Add COMPRESSION=ALL to reduce dump file size significantly on large schemas.
  • Exclude objects: Use EXCLUDE=STATISTICS to skip table statistics and speed up both export and import.
  • Monitor a running job: Attach to a running job with expdp system/****** attach=job_name and use the STATUS command inside the interactive prompt.
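
Several of these tips can be combined in one command. A sketch for a large schema (the parallel degree and file names are illustrative):

```shell
expdp system/****** schemas=andy directory=my_dir \
  dumpfile=andy_%U.dmp parallel=4 \
  compression=all exclude=statistics logfile=andy_exp.log
```

The %U wildcard expands to a two-digit number, so each parallel worker writes to its own file (andy_01.dmp, andy_02.dmp, and so on).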

Quick Reference: Useful expdp Parameters

Parameter         Example                        Purpose
LOGFILE           logfile=andy.log               Saves a text log of the job
PARALLEL          parallel=4                     Uses multiple worker threads
COMPRESSION       compression=all                Compresses the dump file
REUSE_DUMPFILES   reuse_dumpfiles=y              Overwrites an existing dump file
EXCLUDE           exclude=statistics             Skips specific object types
CONTENT           content=data_only              Exports only data, no DDL
FLASHBACK_TIME    flashback_time=systimestamp    Exports a consistent snapshot
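
As a worked example of the last row, FLASHBACK_TIME pins every table to the same point in time, which matters when the schema is being modified while the export runs. The file names here are illustrative:

```shell
expdp system/****** schemas=andy directory=my_dir \
  dumpfile=andy_consistent.dmp flashback_time=systimestamp logfile=andy_consistent.log
```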
