Short on time? Every step below includes the exact commands, ready to copy-paste.
1. Check or Create Data Pump Directory
Oracle Data Pump requires a logical directory mapped to a real OS path for dump files. Follow these steps:
View Existing Directory Objects
SELECT directory_name, directory_path FROM dba_directories;
Check if DATA_PUMP_DIR already exists and note its OS path.
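The output varies by version and install; on a default XE setup it might look something like this (illustrative values only):

DIRECTORY_NAME    DIRECTORY_PATH
----------------  ---------------------------------
DATA_PUMP_DIR     /opt/oracle/admin/XE/dpdump/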
Create or Update Directory Object
CREATE OR REPLACE DIRECTORY DATA_PUMP_DIR AS '/opt/oracle/dumpfiles';
You can change the path as needed. The OS folder must exist and be writable by Oracle.
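To confirm the mapping took effect, re-run the lookup for just this object:

SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';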
Grant Directory Access to User
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO system;
This example grants the SYSTEM user permission to use the directory for export and import. SYSTEM typically has directory access already through its DBA privileges, so the grant matters most when you run Data Pump as a regular user.
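If you export or import as an ordinary schema owner instead, grant that user the same privileges (HR here is just a placeholder name):

GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO hr;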
Ensure Directory Exists at OS Level
mkdir -p /opt/oracle/dumpfiles
chmod 777 /opt/oracle/dumpfiles
Log in to your Linux VM and make sure the path exists and is writable by the Oracle software owner. chmod 777 is fine for a quick lab, but prefer tighter ownership and permissions outside a sandbox, as in the sketch below.
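A tighter alternative, assuming the database runs as the conventional oracle user in the oinstall group (adjust to your installation):

chown oracle:oinstall /opt/oracle/dumpfiles   # hand the directory to the Oracle software owner
chmod 750 /opt/oracle/dumpfiles               # owner full access, group read/execute, no world access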
2. Backup (Export) a Schema
Exporting: This creates a .dmp dump file containing your schema data and metadata.
Basic Export Command
expdp system/password@dbname schemas=SYSTEM directory=DATA_PUMP_DIR dumpfile=system_backup.dmp logfile=exp.log
Explanation of Each Switch
| Parameter | Description |
|---|---|
| system/password@dbname | Connects to the database as the SYSTEM user. Replace password and dbname with your own values. |
| schemas=SYSTEM | Exports only the SYSTEM schema. List others as needed: schemas=SCOTT,HR |
| directory=DATA_PUMP_DIR | Logical directory pointing to export location. |
| dumpfile=system_backup.dmp | Output dump file name. |
| logfile=exp.log | Export log file for monitoring progress and errors. |
Useful Optional Switches
| Switch | Purpose | Example |
|---|---|---|
| tables=EMPLOYEES,DEPARTMENTS | Export only specific tables | tables=EMPLOYEES,DEPARTMENTS |
| parallel=4 | Use multiple parallel threads for faster export | parallel=4 |
| exclude=statistics | Exclude statistics from export | exclude=statistics |
| compression=all | Compress the dump file | compression=all |
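Putting several of these together, a faster compressed export of a hypothetical SCOTT schema could look like the sketch below. With parallel, use the %U substitution variable in dumpfile so each worker can write its own file; note that compression=all requires the Advanced Compression option on most editions.

expdp system/password@dbname schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott_%U.dmp parallel=4 compression=all exclude=statistics logfile=exp_scott.log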
Tip: After export, check that both .dmp and exp.log files exist and that exp.log reports success.
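A quick shell check relies on Data Pump's standard job-completion message at the end of the log:

grep -i "successfully completed" exp.log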
3. Restore (Import) a Schema
Importing: Loads schema data from the backup dump file back into your Oracle DB.
Basic Import Command
impdp system/password@dbname schemas=SYSTEM directory=DATA_PUMP_DIR dumpfile=system_backup.dmp logfile=imp.log
Explanation of Each Switch
| Parameter | Description |
|---|---|
| system/password@dbname | Connects to the database as the SYSTEM user. Adjust as needed. |
| schemas=SYSTEM | Imports the SYSTEM schema (or others). |
| directory=DATA_PUMP_DIR | Directory object where .dmp file resides. |
| dumpfile=system_backup.dmp | The backup file to import from. |
| logfile=imp.log | Log file to review import progress and errors. |
Optional Useful Import Switches
| Switch | Purpose | Example |
|---|---|---|
| tables=EMPLOYEES,DEPARTMENTS | Import only specific tables | tables=EMPLOYEES,DEPARTMENTS |
| table_exists_action=replace | Replace existing tables if present | table_exists_action=replace |
| remap_tablespace=USERS:NEW_TS | Import segments into a different tablespace | remap_tablespace=USERS:NEW_TS |
| remap_schema=SYSTEM:NEW_USER | Import data from SYSTEM schema into NEW_USER | remap_schema=SYSTEM:NEW_USER |
(Note: the legacy imp switch ignore=y does not exist in impdp; table_exists_action above covers that use case.)
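Putting these together, an import that moves everything from the dump's SYSTEM schema into a hypothetical NEW_USER, replacing tables that already exist:

impdp system/password@dbname directory=DATA_PUMP_DIR dumpfile=system_backup.dmp remap_schema=SYSTEM:NEW_USER table_exists_action=replace logfile=imp_remap.log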
After import, always check imp.log for any errors or warnings!
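Beyond the log, a quick object count against the target schema (NEW_USER as a placeholder) shows what actually arrived:

SELECT object_type, COUNT(*) FROM dba_objects WHERE owner = 'NEW_USER' GROUP BY object_type;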
4. Running Data Pump Inside Containers (Podman Example)
If Oracle is running inside a Podman or Docker container, use podman exec to run the import/export commands inside the container.
podman exec -it oracle-xe bash
impdp system/password@XE schemas=SYSTEM directory=DATA_PUMP_DIR dumpfile=system_backup.dmp logfile=imp.log
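The dump file must also be visible inside the container. If it was created on the host, copy it into the directory from step 1 first (container name and path as in the examples above):

podman cp system_backup.dmp oracle-xe:/opt/oracle/dumpfiles/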
Adjust oracle-xe to your container name. For Docker, replace podman with docker.
5. Quick Reference Table
| Task | Command Example |
|---|---|
| Check Directories | SELECT directory_name, directory_path FROM dba_directories; |
| Create Directory | CREATE OR REPLACE DIRECTORY DATA_PUMP_DIR AS '/opt/oracle/dumpfiles'; |
| Grant Permission | GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO system; |
| Export Schema | expdp system/password@dbname schemas=SYSTEM directory=DATA_PUMP_DIR dumpfile=system_backup.dmp logfile=exp.log |
| Import Schema | impdp system/password@dbname schemas=SYSTEM directory=DATA_PUMP_DIR dumpfile=system_backup.dmp logfile=imp.log |
6. Best Practices & Notes
- Always verify the dump and log files after export and import.
- Keep the DATA_PUMP_DIR path secured with appropriate OS ownership and permissions.
- Never expose database credentials in scripts or logs.
- For large exports, use parallel and compression switches.
- Test restore operations periodically to validate your backup strategy.
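One way to follow the credentials rule above is a parameter file (exp.par is an arbitrary name; # lines are parfile comments):

# exp.par - keeps Data Pump switches off the command line
schemas=SCOTT
directory=DATA_PUMP_DIR
dumpfile=scott_backup.dmp
logfile=exp.log

Then run it with the password omitted so expdp prompts for it interactively instead of leaving it in shell history:

expdp system@dbname parfile=exp.par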