Network Exports/Imports using NETWORK_LINK

The NETWORK_LINK parameter identifies a database link to be used as the source for a network export/import. The following database link will be used to demonstrate its use.

CONN test/test

CREATE DATABASE LINK REMOTE_SCOTT CONNECT TO scott IDENTIFIED BY tiger USING 'DEV';
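
Before running any Data Pump job against the link, it is worth confirming that it resolves. A quick sanity check, assuming the SCOTT.EMP table is visible over the link, is:

```sql
-- Illustrative check: query the remote EMP table through the REMOTE_SCOTT link.
SELECT COUNT(*) FROM scott.emp@REMOTE_SCOTT;
```

If this query fails, the Data Pump job will fail too, so fixing the link first saves time.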

In the case of exports, the NETWORK_LINK parameter identifies the database link pointing to the source server. The objects are exported from the source server in the normal manner, but written to a directory object on the local server, rather than one on the source server. Both the local and remote users require the EXP_FULL_DATABASE role granted to them.

expdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT
        directory=TEST_DIR dumpfile=EMP.dmp logfile=expdpEMP.log

For imports, the NETWORK_LINK parameter also identifies the database link pointing to the source server. The difference here is the objects are imported directly from the source into the local server without being written to a dump file. Although there is no need for a DUMPFILE parameter, a directory object is still required for the logs associated with the operation. Both the local and remote users require the IMP_FULL_DATABASE role granted to them.

impdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT
     directory=TEST_DIR logfile=impdpSCOTT.log
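
Network-mode imports can run for a long time, so it is useful to check on the job from another session. The USER_DATAPUMP_JOBS view shows the jobs owned by the current user:

```sql
-- List the current user's Data Pump jobs and their state (e.g. EXECUTING, STOPPED).
SELECT job_name, operation, job_mode, state
FROM   user_datapump_jobs;
```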

  • Ensure that the exporting user at the source database has the EXP_FULL_DATABASE role. This user must be specified when you create the database link.
  • Ensure that the importing user at the destination database has the IMP_FULL_DATABASE role.
  • Create and test a database link between the source and destination databases.
  • Run the following command, where import_user is the username for the importing user, and db_link is the name of the database link owned by the exporting user:
    impdp import_user/password NETWORK_LINK=db_link FULL=Y
  • A log file for the import operation is written to the DATA_PUMP_DIR directory. You can discover the location of this directory by running the following command:
    SQL> select * from dba_directories where DIRECTORY_NAME like 'DATA_PUMP_DIR';
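
The "create and test a database link" step above might look like this when run in the destination database (the link name, credentials and the SOURCE_DB TNS alias are illustrative):

```sql
-- Create a link in the destination database pointing at the source database.
CREATE DATABASE LINK exp_link CONNECT TO exp_user IDENTIFIED BY exp_pwd USING 'SOURCE_DB';

-- Test it with a trivial remote query.
SELECT * FROM dual@exp_link;
```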


Data Pump Export and Import

Data Pump takes the old export and import utilities one step further: you have total control over a running job (stop it, pause it, check it, restart it). Data Pump is a server-side technology that can transfer large amounts of data very quickly, using parallel streams to achieve maximum throughput; it can be 15-45% faster than the older import/export utilities.

The advantages of using Data Pump are:

  • ability to estimate job times
  • ability to restart failed jobs
  • perform fine-grained object selection
  • monitor running jobs
  • directly load a database from a remote instance via the network
  • remapping capabilities
  • improved performance using parallel executions
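
Two of these features, job-time estimation and restartable jobs, are exposed directly on the command line via the ESTIMATE_ONLY and ATTACH parameters (the schema and job names below are illustrative; SYS_EXPORT_SCHEMA_01 is the typical system-generated job name):

```shell
# Estimate how much dump-file space a schema export would need, without exporting.
expdp scott/tiger ESTIMATE_ONLY=y SCHEMAS=SCOTT

# Re-attach to a stopped or failed job by name, dropping into interactive mode
# where it can be restarted with START_JOB.
expdp scott/tiger ATTACH=SYS_EXPORT_SCHEMA_01
```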

A couple of notes: you cannot export to a tape device, only to disk, and the import will only work with Oracle version 10.1 or greater. Also remember that expdp and impdp are command-line tools and are run from the operating system.

Data Pump Uses

You can use Data Pump for the following:

  • migrating databases
  • copying databases
  • transferring oracle databases between different operating systems
  • backing up important tables before you change them
  • moving database objects from one tablespace to another
  • transporting tablespaces between databases
  • reorganizing fragmented table data
  • extracting the DDL for tables and other objects such as stored procedures and packages
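
The last use case, extracting DDL, is handled by impdp's SQLFILE parameter, which writes the DDL that would be executed to a script instead of performing the import (the dump and file names here are illustrative):

```shell
# Write the CREATE statements contained in the dump to ddl.sql; nothing is imported.
impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp SQLFILE=ddl.sql
```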

Data Pump components

Data Pump technology consists of three major components:

  • dbms_datapump – the main engine, which drives the loading and unloading of data and metadata
  • dbms_metadata – used to extract and load the appropriate data dictionary metadata
  • expdp and impdp – the command-line clients, the equivalents of the old export and import utilities
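
The dbms_datapump engine can also be called directly from PL/SQL. A minimal sketch of a schema export, assuming a DMPDIR directory object already exists (the file and schema names are illustrative):

```sql
DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  -- Open a schema-mode export job.
  h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');

  -- Attach a dump file in the DMPDIR directory object.
  DBMS_DATAPUMP.add_file(handle => h, filename => 'scott.dmp', directory => 'DMPDIR');

  -- Restrict the job to the SCOTT schema.
  DBMS_DATAPUMP.metadata_filter(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''SCOTT'')');

  -- Start the job and block until it completes.
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.wait_for_job(h, job_state);
END;
/
```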



For example, a simple schema export as SCOTT, followed by an import of the resulting dump file as SYSTEM:

expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

impdp system/oracle DIRECTORY=dmpdir DUMPFILE=scott.dmp