Applies to:
Oracle Server - Enterprise Edition - Version 11.2.0.2 and later
Information in this document applies to any platform.
Symptoms
During a Data Pump export or import, the following error stack is raised:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
Cause
1. The problem commonly occurs when the listener process was not started under the same OS account as the database instance. The listener forks the new server process, and when that process runs under a different security context than the database, access to directories and files is likely to be affected. Verify the following information:
1) the output of:
ps -ef | grep SMON
2) the output of:
ps -ef | grep tnslsnr
3) the output of:
ps -ef | grep LIST
4) the output of:
ls -ld <full directory name of the directory to which the export/import is written>
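As a consolidated check, the sketch below compares the OS accounts that own the database and listener processes and the permissions on the dump directory. It assumes a Unix/Linux host; the path and the account name "oracle" are placeholders for your environment.
# OS account that owns the database background processes
ps -eo user,args | grep '[o]ra_smon'
# OS account that owns the listener process
ps -eo user,args | grep '[t]nslsnr'
# Ownership and permissions of the dump directory
# (placeholder path - use the DIRECTORY_PATH value from DBA_DIRECTORIES)
ls -ld /u01/app/oracle/dpdump
# If the listener runs under a different account than the database,
# restart it from the account that owns the database instance:
#   su - oracle
#   lsnrctl stop
#   lsnrctl start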
For example, an export run on the database server without a connect string completes successfully:
expdp system/xxx directory=xxxxxxx dumpfile=xxxxxx.dmp logfile=xxxxxx.log schemas=xxxx
Export: Release 11.1.0.7.0 - 64bit Production on Tuesday, 08 March, 2011 11:15:23
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_03": system/******** directory=xxxxx dumpfile=xxxxx.dmp logfile= xxxxxx.log schemas=xxxxxxx
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3.25 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "xxxxxx"."xxxxxxxxxxxx" 1.122 MB 1156 rows
. . exported "xxxxxx"."xxxxxxxxxxxx" 131.1 KB 3980 rows
. . exported "xxxxxx"."xxxxxxxxxxxx" 7.710 KB 1 rows
. . exported "xxxxxx"."xxxxxxxxxxxx" 6.054 KB 5 rows
. . exported "xxxxxx"."xxxxxxxxxxxx" 5.5 KB 2 rows
. . exported "xxxxxx"."xxxxxxxxxxxx" 5.492 KB 1 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_03" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_03 is:
/opt/xxxxxxxxxx/3.0/admin/dbbackup/dump/xxxxxxxxxxx.dmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_03" successfully completed at 11:15:41
But when the same export is run through the connect string, it fails with the error stack below:
expdp system/xxxx@db directory=xxxxxxxxxx dumpfile=xxxxxxxx.dmp logfile=xxxxxxx.log schemas=xxxxxx
Export: Release 11.1.0.7.0 - 64bit Production on Tuesday, 08 March, 2011 11:16:20
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP, Data Mining
and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
Note: Both commands were run on the same server.
2. The connect string (TNS name) used above can be a load-balancing connect string. When expdp/impdp is run through such a connect string, the session may land on another node where the directory path does not exist, or the directory may point to a local folder that is not shared between the nodes.
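To confirm this, one approach (a sketch, with placeholder names) is to connect through the same connect string used for expdp/impdp, check which instance actually serviced the session, and compare that host against the directory path:
-- Connect with the same connect string, e.g.: sqlplus system@db
-- Which instance/host did the session land on?
select instance_name, host_name from v$instance;
-- Where does the directory object point? The path must exist on that host.
select directory_name, directory_path
from dba_directories
where directory_name = '<DIRECTORY_NAME>';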
3. On a single-node or RAC instance, the folder or folder path specified in the directory object does not exist on the OS (in this case the job fails both with and without a connect string).
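A quick way to rule this out (a sketch with a placeholder path) is to check the path from DBA_DIRECTORIES on the database host and create it if it is missing:
# Placeholder path - use the DIRECTORY_PATH value from DBA_DIRECTORIES
ls -ld /u01/app/oracle/dpdump
# If missing, create it and hand it to the Oracle software owner
# (account and group names are assumptions - adjust to your install)
mkdir -p /u01/app/oracle/dpdump
chown oracle:oinstall /u01/app/oracle/dpdump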
4. The directory path/folder exists, but the CREATE DIRECTORY was executed by one database user while the export/import is run by another user who has not been granted access to the directory object.
Run the query below to verify the directory object, its owner, and its path:
set pages 999 lines 200
select * from dba_directories where directory_name = '<DIRECTORY_NAME>';
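In addition, the grants on the directory object can be cross-checked (a sketch with a placeholder directory name); the user running expdp/impdp should appear here with READ and WRITE privileges:
select grantee, privilege
from dba_tab_privs
where table_name = '<DIRECTORY_NAME>';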
Solution
1. Make sure the listener and the database instance services are started by the same OS account.
2. Make sure the directory is shared between the nodes so it can be accessed from any instance, or create the same folder locally on the other nodes. If the folder already exists locally on all nodes with the same path structure, check that its permissions are correct.
3. Make sure the folder exists on the OS exactly as specified in the CREATE DIRECTORY command (see the sketch after this list).
4. Grant the required privileges on the directory to the user running the export/import:
grant read, write on directory <directory_name> to <username>;
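As a worked sketch of fixes 3 and 4 (the directory name, path, and user below are placeholders), connect as a privileged user such as SYSTEM, make sure the directory object points to a path that exists and is writable on every node that can service the connection, and grant access to the user running the job:
-- Re-create the directory object with a path that exists on every node
create or replace directory dp_dump_dir as '/u01/app/oracle/dpdump';
-- Allow the exporting/importing user to read and write through it
grant read, write on directory dp_dump_dir to scott;
-- Then reference it in the job, e.g.:
-- expdp scott/password@db directory=dp_dump_dir dumpfile=scott.dmp logfile=scott.log schemas=scott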