Tuesday, September 25, 2007

Oracle Data Pump, easy steps – Attach to an existing job

Purpose:

To show how to resume a stopped Oracle Data Pump job. A Data Pump job can be stopped for any reason on Oracle 10g; Oracle 11g will, by default, automatically try to restart a stopped job once. Data Pump jobs can be restarted without loss of data, whether the stoppage was voluntary or involuntary.

Description:

A Data Pump job creates a Master Table in which the complete record of the job is maintained. The Master Table is the directory to the job: if the job is stopped for any reason, it can be restarted at a later point in time without losing any data.
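As a quick check, the master table can be seen as an ordinary table in the job owner's schema; it carries the same name as the job. A sketch (the owner USER and the SYS_EXPORT% pattern are only examples):

```sql
-- The master table is a regular table named after the Data Pump job.
-- Owner and name pattern below are illustrative, not fixed values.
SELECT owner, table_name
FROM   dba_tables
WHERE  owner = 'USER'
AND    table_name LIKE 'SYS_EXPORT%';
```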

The ATTACH parameter attaches the client session to an existing export job and automatically places you in the interactive-command interface. Export displays a description of the job to which you are attached and then displays the export prompt. A job name does not have to be specified if there is only one export job associated with your schema. The job you attach to can be either currently executing or stopped.

Versions: Oracle Database 10g, 11g

The main process:

To see jobs and their states, we can query the DBA_DATAPUMP_JOBS view:

SELECT owner_name, job_name, state
FROM dba_datapump_jobs;

From your terminal window, issue the following command:

expdp USER/PASSWORD ATTACH=job_name
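If the job belongs to another schema (and you hold the appropriate privileges), the owner can be prefixed to the job name. The schema and job name below are placeholders:

```
expdp system/PASSWORD ATTACH=USER.SYS_EXPORT_FULL_04
```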

Then we can issue the following commands:

Export> help
The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command           Description
------------------------------------------------------------------------------
ADD_FILE          Add dumpfile to dumpfile set.
CONTINUE_CLIENT   Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT       Quit client session and leave job running.
FILESIZE          Default filesize (bytes) for subsequent ADD_FILE commands.
HELP              Summarize interactive commands.
KILL_JOB          Detach and delete job.
PARALLEL          Change the number of active workers for current job.
                  PARALLEL=<number of workers>.
START_JOB         Start/resume current job.
STATUS            Frequency (secs) job status is to be monitored where
                  the default (0) will show new status when available.
                  STATUS[=interval]
STOP_JOB          Orderly shutdown of job execution and exits the client.
                  STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                  Data Pump job.
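Putting these commands together, a typical stop-and-resume sequence might look like the following sketch (the job name SYS_EXPORT_FULL_04 is assumed for illustration, and the output is abbreviated):

```
Export> STOP_JOB
Are you sure you wish to stop this job ([yes]/no): yes

-- later, reattach and resume:
$ expdp USER/PASSWORD ATTACH=SYS_EXPORT_FULL_04
Export> START_JOB
Export> CONTINUE_CLIENT
```

STOP_JOB performs an orderly shutdown, so START_JOB can later resume from where the job left off; CONTINUE_CLIENT returns the session to logging mode so progress is displayed.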

How to clean up rows in DBA_DATAPUMP_JOBS (removing orphaned Data Pump jobs)

If the dump files of an unfinished job have been deleted from their directory location, we can no longer attach to the job and may receive the following Oracle errors:


ORA-31640 unable to open dump file %s for read
ORA-39000 bad dump file specification
ORA-39002 invalid operation

In this state we cannot remove the row from DBA_DATAPUMP_JOBS, and the row for the job stays there indefinitely.


Oracle support explains:

When we run the Data Pump utility for export or import, it runs as a job. This job is visible in DBA_DATAPUMP_JOBS for the lifetime of the job. If we stop the job in between (or the job aborts for some reason), it remains in DBA_DATAPUMP_JOBS until we either resume it or kill it. To kill or complete the job, we need to attach to it.
For a successful attach, the uncorrupted dump file must be in the proper location;
otherwise we won't be able to attach.
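When the dump file is still intact, the clean way to clear the row is to attach and issue KILL_JOB, which detaches the session, deletes the job, and drops its master table. A sketch, with the job name assumed for illustration:

```
$ expdp USER/PASSWORD ATTACH=SYS_EXPORT_FULL_04
Export> KILL_JOB
Are you sure you wish to stop this job ([yes]/no): yes
```

After KILL_JOB completes, the job no longer appears in DBA_DATAPUMP_JOBS.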

What's the solution then?
Drop the master table of the job from the schema.

Example:
Query table dba_datapump_jobs to see "not running" jobs.

SELECT owner_name, job_name, state FROM dba_datapump_jobs;

OWNER_NAME   JOB_NAME             STATE
USER         SYS_EXPORT_FULL_04   NOT RUNNING


Try to attach to the NOT RUNNING job.

expdp USER/pass attach=SYS_EXPORT_FULL_04

Export: Release 10.2.0.3.0 - 64bit Production on Wednesday, 26 September, 2007 14:58:02

Copyright (c) 2003, 2005, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "/backup/copias/INSTANCE/expdp/expdpINSTANCE_1809200713H.dmp" for read
ORA-27037: unable to obtain file status
Linux-x86_64 Error: 2: No such file or directory
Additional information: 3


The master table is a normal table under the schema that initiated the export or import. If the dump file is still available, we can place it back in the correct location and attach again.

If we do not have the dump file, or the dump file is corrupted, the only option left is to drop the master table of the job from the schema:

drop table USER.SYS_EXPORT_FULL_04 purge;

SQL> SELECT owner_name, job_name, state FROM dba_datapump_jobs;
no rows selected

Links, References:

http://www.oracle.com/.../datapump.htm

Oracle Metalink

Oracle Database Documentation Library 10g Release 2 (10.2)

Oracle Database Documentation Library 11g Release 1 (11.1)
